DOI: 10.1145/3537972.3537983
Research Article · Open Access

Professor Plucky: Expressive body movement in human–robot musical ensembles

Published: 30 June 2022

Abstract

When people play music together, they move their bodies, and that movement plays an important role in group music making. In contrast, when robots play music with people, the robots are usually stiff and mechanical in their movement. In general, it is not well understood how the movement of such robots affects how people interact with them, or how robot movement should be designed to promote particular features of interaction. As an initial exploration of these questions, we built a prototype guitar-plucking robot that plucks the strings with either a) kinetic plucking mechanisms designed to have visually appealing movement, or b) control plucking mechanisms that do not visibly move. In a pilot study, we found that when guitarists play with the robot, they move their hands more and look at the robot more when it uses the kinetic mechanisms rather than the control ones. However, they do not report preferring the kinetic mechanisms. These preliminary findings suggest clear hypotheses for follow-up studies.


Published In

MOCO '22: Proceedings of the 8th International Conference on Movement and Computing
June 2022, 262 pages
ISBN: 9781450387163
DOI: 10.1145/3537972
This work is licensed under a Creative Commons Attribution 4.0 International License.

Publisher

Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. MOCO
      2. NIME
      3. expression
      4. eye-tracking
      5. gesture
      6. guitar
      7. interaction
      8. mocap
      9. movement
      10. music
      11. musical robots
      12. plucky
      13. professor plucky
      14. robots

      Qualifiers

      • Research-article
      • Research
      • Refereed limited

      Funding Sources

      • Research Council of Norway

      Conference

      MOCO '22

      Acceptance Rates

      Overall Acceptance Rate 85 of 185 submissions, 46%

