DOI: 10.1145/2617995.2618004

Collection and characterization of emotional body behaviors

Published: 16 June 2014

Abstract

This paper addresses two issues in modeling the bodily expression of emotions: the collection of emotional behaviors and the characterization of expressive movement. We describe our body movement coding schema, intended for characterizing bodily emotional expression across different movement tasks. We also describe the database that we use, through the proposed coding schema, to characterize emotion expression in those movement tasks.



Published In

MOCO '14: Proceedings of the 2014 International Workshop on Movement and Computing
June 2014
184 pages
ISBN:9781450328142
DOI:10.1145/2617995

In-Cooperation

  • SFU: Simon Fraser University

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Body movement
  2. Emotional behavior
  3. Movement characteristics

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

MOCO '14

Acceptance Rates

MOCO '14 Paper Acceptance Rate: 24 of 54 submissions, 44%
Overall Acceptance Rate: 85 of 185 submissions, 46%
