- short-paper, December 2020
A Computational Method to Automatically Detect the Perceived Origin of Full-Body Human Movement and its Propagation
ICMI '20 Companion: Companion Publication of the 2020 International Conference on Multimodal Interaction, Pages 449–453. https://doi.org/10.1145/3395035.3425971
This work reports ongoing research on a computational method, based on cooperative games on graphs, aimed at detecting the perceived origin of full-body human movement and its propagation. Compared with previous works, a larger set of movement ...
- short-paper, December 2020
Structuring Multi-Layered Musical Feedback for Digital Bodily Interaction: Two Approaches to Multi-layered Interactive Musical Feedback Systems
ICMI '20 Companion: Companion Publication of the 2020 International Conference on Multimodal Interaction, Pages 446–448. https://doi.org/10.1145/3395035.3425970
This paper describes two approaches to developing simple systems for expressive bodily interaction with music, without prior musical knowledge on the user's part. It discusses two almost oppositional models: 1. Modifying a preexisting recording through ...
- research-article, December 2020
Gravity-Direction-Aware Joint Inter-Device Matching and Temporal Alignment between Camera and Wearable Sensors
ICMI '20 Companion: Companion Publication of the 2020 International Conference on Multimodal Interaction, Pages 433–441. https://doi.org/10.1145/3395035.3425968
To analyze human interaction behavior in a group or crowd, identification and device time synchronization are essential, but both are time-consuming to perform manually. To automate the two processes jointly without any calibration steps or auxiliary sensors, ...
- research-article, December 2020
Defining and Quantifying Conversation Quality in Spontaneous Interactions
ICMI '20 Companion: Companion Publication of the 2020 International Conference on Multimodal Interaction, Pages 196–205. https://doi.org/10.1145/3395035.3425966
Social interactions are multifaceted, and a wide set of factors and events influences them. In this paper, we quantify social interactions with a holistic viewpoint on individual experiences, particularly focusing on non-task-...
- short-paper, December 2020
Group Performance Prediction with Limited Context
ICMI '20 Companion: Companion Publication of the 2020 International Conference on Multimodal Interaction, Pages 191–195. https://doi.org/10.1145/3395035.3425964
Automated prediction of group task performance normally proceeds by extracting linguistic, acoustic, or multimodal features from an entire conversation in order to predict an objective task measure. In this work, we investigate whether we can maintain ...
- research-article, December 2020
Modeling Dynamics of Task and Social Cohesion from the Group Perspective Using Nonverbal Motion Capture-based Features
ICMI '20 Companion: Companion Publication of the 2020 International Conference on Multimodal Interaction, Pages 182–190. https://doi.org/10.1145/3395035.3425963
Group cohesion is a multidimensional emergent state that manifests during group interaction. It has been extensively studied in several disciplines, such as the Social Sciences and Computer Science, and it has been investigated through both verbal and ...
- short-paper, December 2020
Inferring Student Engagement in Collaborative Problem Solving from Visual Cues
ICMI '20 Companion: Companion Publication of the 2020 International Conference on Multimodal Interaction, Pages 177–181. https://doi.org/10.1145/3395035.3425961
Automatic analysis of students' collaborative interactions in physical settings is an emerging problem with a wide range of applications in education. However, this problem has proven challenging due to the complex, interdependent and dynamic ...
- short-paper, December 2020
A Model of Team Trust in Human-Agent Teams
ICMI '20 Companion: Companion Publication of the 2020 International Conference on Multimodal Interaction, Pages 171–176. https://doi.org/10.1145/3395035.3425959
Trust is a central element of effective teamwork and successful human-technology collaboration. Although technologies such as agents are increasingly becoming autonomous team members operating alongside humans, research on team trust in human-agent ...
- short-paper, December 2020
Speech, Voice, Text, and Meaning: A Multidisciplinary Approach to Interview Data Through the Use of Digital Tools
- Arjan van Hessen,
- Silvia Calamai,
- Henk van den Heuvel,
- Stefania Scagliola,
- Norah Karrouche,
- Jeannine Beeken,
- Louise Corti,
- Christoph Draxler
ICMI '20 Companion: Companion Publication of the 2020 International Conference on Multimodal Interaction, Pages 454–455. https://doi.org/10.1145/3395035.3425657
Interview data is multimodal data: it consists of speech sound, facial expressions and gestures, captured in a particular situation, and it contains textual information and emotion. This workshop shows how a multidisciplinary approach may exploit the full ...
- research-article, December 2020
Eating Sound Dataset for 20 Food Types and Sound Classification Using Convolutional Neural Networks
ICMI '20 Companion: Companion Publication of the 2020 International Conference on Multimodal Interaction, Pages 348–351. https://doi.org/10.1145/3395035.3425656
Food identification technology potentially benefits both the food and media industries and can enrich human-computer interaction. We assembled a food classification dataset consisting of 11,141 clips, based on YouTube videos of 20 food types. This dataset ...
- research-article, December 2020
Eating Like an Astronaut: How Children Are Willing to Eat
ICMI '20 Companion: Companion Publication of the 2020 International Conference on Multimodal Interaction, Pages 341–347. https://doi.org/10.1145/3395035.3425655
How food is presented and eaten influences the eating experience. Novel gustatory interfaces have opened up new ways of eating at the dining table. For example, recent developments in acoustic technology have enabled the transportation of food and ...
- research-article, December 2020
Multimodal Interactive Dining with the Sensory Interactive Table: Two Use Cases
- Roelof A. J. de Vries,
- Gijs H. J. Keizers,
- Sterre R. van Arum,
- Juliet A. M. Haarman,
- Randy Klaassen,
- Robby W. van Delden,
- Bert-Jan F. van Beijnum,
- Janet H. W. van den Boer
ICMI '20 Companion: Companion Publication of the 2020 International Conference on Multimodal Interaction, Pages 332–340. https://doi.org/10.1145/3395035.3425654
This paper presents two use cases for a new multimodal interactive instrument: the Sensory Interactive Table. The Sensory Interactive Table is an instrumented, interactive dining table that measures eating behavior through the use of embedded load ...
- research-article, December 2020
The Effect of Different Affective Arousal Levels on Taste Perception
ICMI '20 Companion: Companion Publication of the 2020 International Conference on Multimodal Interaction, Pages 328–331. https://doi.org/10.1145/3395035.3425651
The emotions we experience shape our perceptions, and our perceptions in turn shape our emotions. Taste perception is also influenced by emotions: positive and negative emotions alter the perception of sweetness, sourness, and bitterness. However, most previous ...
- research-article, December 2020
Augmentation of Perceived Sweetness in Sugar Reduced Cakes by Local Odor Display
- Heikki Aisala,
- Jussi Rantala,
- Saara Vanhatalo,
- Markus Nikinmaa,
- Kyösti Pennanen,
- Roope Raisamo,
- Nesli Sözer
ICMI '20 Companion: Companion Publication of the 2020 International Conference on Multimodal Interaction, Pages 322–327. https://doi.org/10.1145/3395035.3425650
Multisensory augmented reality systems have demonstrated the potential of olfactory cues in the augmentation of flavor perception. Earlier studies have mainly used commercially available sample products. In this study, custom rye-based cakes with ...
- research-article, December 2020
Guess who's coming to dinner? Surveying Digital Commensality During Covid-19 Outbreak
ICMI '20 Companion: Companion Publication of the 2020 International Conference on Multimodal Interaction, Pages 317–321. https://doi.org/10.1145/3395035.3425649
Eating together is one of the most treasured human activities. Its benefits range from improving the taste of food to mitigating feelings of loneliness. In 2020, many countries adopted lockdown and social distancing policies, forcing people to ...
- research-article, December 2020
Eating with an Artificial Commensal Companion
ICMI '20 Companion: Companion Publication of the 2020 International Conference on Multimodal Interaction, Pages 312–316. https://doi.org/10.1145/3395035.3425648
Commensality is defined as "a social group that eats together", and eating in a commensal setting has a number of positive effects on humans. The purpose of this paper is to investigate the effects of technology on commensality by presenting an ...
- research-article, December 2020
An Accessible Tool to Measure Implicit Approach-Avoidance Tendencies Towards Food Outside the Lab
ICMI '20 Companion: Companion Publication of the 2020 International Conference on Multimodal Interaction, Pages 307–311. https://doi.org/10.1145/3395035.3425647
Implicit approach-avoidance tendencies can be measured by the approach-avoidance task (AAT). The emergence of mobile variants of the AAT enables its use for both in-the-lab and in-the-field experiments. Within the food domain, use of the AAT is ...
- research-article, December 2020
The Influence of Emotion-Oriented Extrinsic Visual and Auditory Cues on Coffee Perception: A Virtual Reality Experiment
ICMI '20 Companion: Companion Publication of the 2020 International Conference on Multimodal Interaction, Pages 301–306. https://doi.org/10.1145/3395035.3425646
Eating is a process that involves all the senses. Recent research has shown that both food-intrinsic and food-extrinsic sensory factors play a role in the taste of the food we consume. Moreover, many studies have explored the relationship between emotional state ...
- research-article, December 2020
Automatic Analysis of Facilitated Taste-liking
ICMI '20 Companion: Companion Publication of the 2020 International Conference on Multimodal Interaction, Pages 292–300. https://doi.org/10.1145/3395035.3425645
This paper focuses on: (i) automatic recognition of taste-liking from facial videos by comparatively training and evaluating models with engineered features and state-of-the-art deep learning architectures, and (ii) analysing the classification results ...
- research-article, December 2020
Co-Designing Flavor-Based Memory Cues with Older Adults
ICMI '20 Companion: Companion Publication of the 2020 International Conference on Multimodal Interaction, Pages 287–291. https://doi.org/10.1145/3395035.3425644
This initial study explores the design of flavor-based cues with older adults for their self-defining memories. It proposes using food to leverage the connections between odor and memory to develop new multisensory memory cues. Working with 4 older ...