
A Natural User Interface for Gestural Expression and Emotional Elicitation to Access the Musical Intangible Cultural Heritage

Published: 12 April 2018

Abstract

This article describes a prototype natural user interface, named the Intangible Musical Instrument, which aims to provide access to the performers' knowledge that constitutes musical Intangible Cultural Heritage, using off-the-shelf motion-capture devices readily available to the general public. The prototype captures, models, and recognizes musical gestures of the upper body, including the fingers, and sonifies them. The performer's emotional state modulates the sound parameters at the synthesis level. The Intangible Musical Instrument supports both learning and performing/composing by giving the user intuitive gesture control as well as a distinctive user experience. Finally, a first evaluation of the system is presented, in which all of its functionalities are assessed; the results of this evaluation were very promising.
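The abstract states that the performer's emotional state affects sound parameters at the synthesis level, but does not specify the mapping. Below is a minimal illustrative sketch, not the authors' implementation: it assumes a valence/arousal (circumplex-style) representation of emotional state and two hypothetical synthesis parameters, brightness and loudness, linked by a simple linear mapping.

```python
from dataclasses import dataclass


@dataclass
class SynthParams:
    """Hypothetical synthesis-level parameters, each normalized to 0..1."""
    brightness: float  # e.g. spectral filter cutoff scaling
    loudness: float    # e.g. output amplitude scaling


def emotion_to_synth(valence: float, arousal: float) -> SynthParams:
    """Map a (valence, arousal) point in [-1, 1]^2 to synthesis parameters.

    Illustrative linear mapping only: valence drives brightness,
    arousal drives loudness. Inputs outside [-1, 1] are clamped.
    """
    def clamp(x: float) -> float:
        return max(-1.0, min(1.0, x))

    v, a = clamp(valence), clamp(arousal)
    # Rescale from [-1, 1] to [0, 1] for the synthesis engine.
    return SynthParams(brightness=(v + 1.0) / 2.0,
                       loudness=(a + 1.0) / 2.0)
```

A neutral emotional state (0, 0) yields mid-range values for both parameters, so the instrument still sounds when no strong emotion is detected; the emotional signal biases the timbre rather than gating it.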


      Published In

      Journal on Computing and Cultural Heritage, Volume 11, Issue 2
      June 2018, 124 pages
      ISSN: 1556-4673
      EISSN: 1556-4711
      DOI: 10.1145/3199679

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 12 April 2018
      Accepted: 01 June 2017
      Revised: 01 June 2017
      Received: 01 January 2017
      Published in JOCCH Volume 11, Issue 2


      Author Tags

      1. gesture recognition
      2. emotional status
      3. evaluation
      4. sonification

      Qualifiers

      • Research-article
      • Research
      • Refereed

      Funding Sources

      • Widget Corporation
