DOI: 10.5555/2615731.2617401
Research Article

Robot mood is contagious: effects of robot body language in the imitation game

Published: 05 May 2014

Abstract

Mood contagion is an automatic mechanism that induces a congruent mood state through the observation of another person's emotional expression. In this paper, we address the question of whether robot mood displayed during an imitation game can (a) be recognized by participants and (b) produce contagion effects. Robot mood was displayed by applying a generic framework for mood expression using body language. By modulating the behavior parameters that this framework provides for controlling pose and motion dynamics, we adjusted the gestures performed by the humanoid robot NAO to display either a positive or a negative mood. In the study, we varied both mood and task difficulty. Our results show that participants are able to differentiate between positive and negative robot mood. Moreover, self-reported mood matches the mood of the robot in the easy task condition. Additional evidence for mood contagion is provided by the fact that we were able to replicate an expected effect of negative mood on task performance: in the negative mood condition participants performed better on difficult tasks than in the positive mood condition, even though participants' self-reported mood did not match that of the robot.
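To make the parameterized mood-expression idea more concrete, the sketch below shows, under our own assumptions rather than the authors' actual framework or the NAO SDK, how a single mood valence value could be mapped onto generic body-language parameters such as gesture amplitude, motion speed, head pitch, and hand openness. All names (MoodParameters, mood_to_parameters) and numeric ranges are hypothetical illustrations of the kind of pose and motion-dynamics modulation the abstract describes.

```python
# Hypothetical sketch: mapping a mood valence in [-1, 1] (negative .. positive)
# onto generic body-language parameters. Names and ranges are illustrative
# assumptions, not the framework described in the paper or the NAOqi API.
from dataclasses import dataclass


def lerp(low, high, t):
    """Linear interpolation between low and high for t in [0, 1]."""
    return low + (high - low) * t


@dataclass
class MoodParameters:
    amplitude: float      # spatial extent of gestures (0 = small, 1 = expansive)
    speed: float          # motion speed scaling factor
    head_pitch: float     # radians; positive = head down (slumped), negative = up
    hand_openness: float  # 0 = closed fist, 1 = fully open palm


def mood_to_parameters(valence):
    """Map a mood valence in [-1, 1] to body-language parameters.

    Negative valence yields slower, smaller, downward-oriented movement;
    positive valence yields faster, larger, upward-oriented movement.
    """
    t = (max(-1.0, min(1.0, valence)) + 1.0) / 2.0  # normalize to [0, 1]
    return MoodParameters(
        amplitude=lerp(0.3, 1.0, t),
        speed=lerp(0.5, 1.2, t),
        head_pitch=lerp(0.25, -0.15, t),
        hand_openness=lerp(0.2, 1.0, t),
    )


if __name__ == "__main__":
    # Inspect the parameter sets produced for negative, neutral, and positive mood.
    for v in (-1.0, 0.0, 1.0):
        print(v, mood_to_parameters(v))
```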

References

[1]
C. Breazeal, A. Takanishi, and T. Kobayashi. Social robots that interact with people. Springer Handbook of Robotics, pages 1349--1369. Springer, Berlin, 2008.
[2]
C. Breazeal. Role of expressive behaviour for robots that learn from people. Philosophical Transactions of the Royal Society B: Biological Sciences, 364(1535):3527--3538, 2009.
[3]
R. Neumann, and F. Strack. "Mood contagion": the automatic transfer of mood between persons. Journal of personality and social psychology, 79(2):211, 2000.
[4]
J. Hirth, N. Schmitz, and K. Berns. Towards social robots: Designing an emotion-based architecture. Int. J. Social Robotics, 3(3):273--290, 2011.
[5]
M. Häring, N. Bee, and E. Andre. Creation and evaluation of emotion expression with body movement, sound and eye color for humanoid robots. RO-MAN, pages 204--209. IEEE, Aug. 2011.
[6]
M. Zecca, Y. Mizoguchi, K. Endo, F. Iida, Y. Kawabata, N. Endo, K. Itoh, and A. Takanishi. Whole body emotion expressions for kobian humanoid robot - preliminary experiments with different emotional patterns -. RO-MAN, IEEE, pages 381--386. 2009.
[7]
J. Xu, J. Broekens, K. Hindriks, and Mark A. Neerincx. Mood Expression through Parameterized Functional Behavior of Robots. Int. Symp. Robot and Human Interactive Communication (RO-MAN), pages 533--540. IEEE, 2013.
[8]
J. Xu, J. Broekens, K. Hindriks, and Mark A. Neerincx. The Relative Importance and Interrelations between Behavior Parameters for Robots' Mood Expression. Affective Comp. and Intel. Interaction (ACII), pages 558--563. IEEE, 2013.
[9]
J. Xu, J. Broekens, K. Hindriks, and Mark A. Neerincx. Bodily Mood Expression: Recognize Moods from Functional Behaviors of Humanoid Robots. International Conference on Social Robotics (ICSR), pages 511--520. Springer, 2013.
[10]
R. Gockley, J. Forlizzi, and R. Simmons. Interactions with a moody robot. ACM SIGCHI/SIGART conference on Human-robot interaction, pages 186--193. ACM, March 2006.
[11]
R. Looije, M. A. Neerincx, and F. Cnossen. Persuasive robotic assistant for health self-management of older adults: Design and evaluation of social behaviors. International Journal of Human-Computer Studies, 68(6):386--397, 2010.
[12]
Q. A. Le, and C. Pelachaud. Evaluating an expressive gesture model for a humanoid robot: Experimental results. 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI), submitted, 2012.
[13]
J. M. Kessens, M. A. Neerincx, R. Looije, M. Kroes, and G. Bloothooft. Facial and vocal emotion expression of a personal computer assistant to engage, educate and motivate children. Affective Computing and Intelligent Interaction (ACII). pages 1--7. IEEE, September, 2009.
[14]
I. Leite, C. Martinho, A. Pereira, and A. Paiva. iCat: an affective game buddy based on anticipatory mechanisms. Proceedings of AAMAS, 3:1229--1232, May 2008.
[15]
I. Leite, G. Castellano, A. Pereira, C. Martinho, and A. Paiva. Modelling empathic behaviour in a robotic game companion for children: an ethnographic study in real-world settings. ACM/IEEE International conference on Human-Robot Interaction (HRI), pages 367--374. ACM. March 2012.
[16]
M. Tielman, M. A. Neerincx, J. J. Meyer, R. Looije. Adaptive Emotional Expression in Robot-Child interaction. ACM/IEEE Int. conf. Human-Robot Interaction (HRI), 2014.
[17]
B. Robins, K. Dautenhahn, and P. Dickerson. From isolation to communication: a case study evaluation of robot assisted play for children with autism with a minimally expressive humanoid robot. International Conferences on Advances in Computer-Human Interactions, pages 205--211. IEEE. 2009.
[18]
K. Dautenhahn, C. L. Nehaniv, M. L. Walters, B. Robins, H. Kose-Bagci, N. A. Mirza, and M. Blow. KASPAR--a minimally expressive humanoid robot for human-robot interaction research. Applied Bionics and Biomechanics, 6(3--4):369--397, 2009.
[19]
R. Beale, and C. Creed. Affective interaction: How emotional agents affect users. International Journal of Human-Computer Studies, 67(9):755--776, 2009.
[20]
C. Breazeal. Designing Sociable Robots. MIT Press, Cambridge, MA, USA, 2002.
[21]
C. Bartneck, J. Reichenbach, and V. A. Breemen. In your face, robot! The influence of a character's embodiment on how users perceive its emotional expressions. Proceedings of the Design and Emotion, pages 32--51. July 2004.
[22]
C. Pelachaud. Studies on gesture expressivity for a virtual agent. Speech Communication, 51(7):630--639, 2009.
[23]
S. Kopp, B. Jung, N. Lessmann, and I. Wachsmuth. Max - A Multimodal Assistant in Virtual Reality Construction. KI, 17(4):11, 2003.
[24]
A. Beck, B. Stevens, K. A. Bard, and L. Cañamero. Emotional body language displayed by artificial agents. ACM Trans. Interact. Intell. Syst. 2(1):1--29, 2012.
[25]
H. Wallbott. Bodily expression of emotion. European J. Social Psychology, 28(6):879--896, 1998.
[26]
R. von Laban, L. Ullmann. The Mastery of Movement. 4th ed. Dance Books, Limited, 2011.
[27]
D. Chi, M. Costa, L. Zhao, N. Badler. The emote model for effort and shape. SIGGRAPH, pages 173--182. ACM, 2000.
[28]
M. Mancini, G. Castellano, C. Peters, and P. W. McOwan. Evaluating the communication of emotion via expressive gesture copying behaviour in an embodied humanoid agent. ACII, pages 215--224. Springer Berlin Heidelberg, 2011.
[29]
J. Tsai, E. Bowring, S. Marsella, W. Wood, and M. Tambe. A study of emotional contagion with virtual characters. Intelligent Virtual Agents, pages 81--88. Springer Berlin Heidelberg, January, 2012.
[30]
M. E. Glickman. Parameter estimation in large dynamic paired comparison experiments. J. of the Royal Statistical Society: Series C (Applied Statistics), 48(3):377--394, 1999.
[31]
J. L. Tracy, and R. W. Robins. The automaticity of emotion recognition. Emotion, 8(1):81, 2008.
[32]
L. N. Jefferies, D. Smilek, E. Eich, and J. T. Enns. Emotional valence and arousal interact in attentional control. Psychological Science, 19(3):290--295, 2008.
[33]
N. Silvestrini, and G. H. Gendolla. The joint effect of mood, task valence, and task difficulty on effort-related cardiovascular response and facial EMG. International Journal of Psychophysiology, 73(3):226--234, 2009.
[34]
K. Gasper, and G. L. Clore. Attending to the Big Picture: Mood and Global Versus Local Processing of Visual Information. Psychological Science, 13(1):34--40, 2002.
[35]
M. Baas, C. K. De Dreu, and B. A. Nijstad. A meta-analysis of 25 years of mood-creativity research: Hedonic tone, activation, or regulatory focus' Psychological bulletin, 134(6):779, 2008.
[36]
M. R. Basso, B. K. Schefft, M. D. Ris, and W. N. Dember. Mood and global-local visual processing. J. of the Int. Neuropsychological Society, 2(3):249--255, 1996.
[37]
M. M. Bradley, and P. J. Lang. Measuring emotion: the Self-Assessment Manikin and the Semantic Differential. Journal of Behav. Ther. Exp. Psychiatry, 25(1):49--59, 1994.

      Published In

      AAMAS '14: Proceedings of the 2014 international conference on Autonomous agents and multi-agent systems
      May 2014
      1774 pages
      ISBN:9781450327381

      Sponsors

      • IFAAMAS

      Publisher

      International Foundation for Autonomous Agents and Multiagent Systems

      Richland, SC

      Publication History

      Published: 05 May 2014

      Author Tags

      1. behavioral cues
      2. body language
3. human-robot interaction (hri)
      4. mood expression
      5. nonverbal cues
      6. social robots

      Qualifiers

      • Research-article

      Conference

      AAMAS '14
Sponsor: IFAAMAS

      Acceptance Rates

AAMAS '14 Paper Acceptance Rate: 169 of 709 submissions, 24%
Overall Acceptance Rate: 1,155 of 5,036 submissions, 23%
