DOI: 10.1145/3411764.3445539

SonicHoop: Using Interactive Sonification to Support Aerial Hoop Practices

Published: 07 May 2021

Abstract

Aerial hoops are circular, hanging apparatuses used for both acrobatic exercise and artistic performance; they offer a rich setting for exploring the role of interactive sonification in physical activity. We present SonicHoop, an augmented aerial hoop that generates auditory feedback via capacitive touch sensing, turning the hoop into a digital musical instrument that performers play with their bodies. We compare three sonification strategies through a structured observation study with two professional aerial hoop performers. Results show that SonicHoop fundamentally changes their perception and choreographic process: instead of translating music into movement, they search for bodily expressions that compose music. Different sound designs affect their movement differently, and auditory feedback, regardless of the type of sound, improves movement quality. We discuss opportunities for using SonicHoop as an aerial hoop training tool, a digital musical instrument, and a creative object, as well as for applying interactive sonification to other acrobatic practices to explore full-body vertical interaction.
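
The abstract describes the mechanism only at a high level: capacitive touch sensing on the hoop drives auditory feedback, so touching the apparatus with hands, feet, or other body parts produces sound. As a rough, hypothetical illustration of that general idea (not the authors' SonicHoop implementation, whose details are not given on this page), the Python sketch below assumes the hoop is divided into a fixed number of capacitive zones and maps simulated touch readings to sound (zone index to pitch, touch intensity to loudness), rendering the result offline to a WAV file with the standard library. All names and parameters here (N_ZONES, BASE_FREQ, zone_to_freq, render) are invented for the sketch.

```python
"""Illustrative sketch only: NOT the SonicHoop implementation.

Assumes a hoop split into N_ZONES capacitive electrode zones whose touch
intensities arrive as floats in [0, 1], and shows one simple direct mapping
from touch data to sound, rendered offline to a WAV file.
"""
import math
import struct
import wave

SAMPLE_RATE = 44100
N_ZONES = 8          # hypothetical number of electrode zones on the hoop
BASE_FREQ = 220.0    # zone 0 mapped to A3; each higher zone one semitone up


def zone_to_freq(zone: int) -> float:
    """Map an electrode zone index to an equal-tempered pitch."""
    return BASE_FREQ * 2 ** (zone / 12)


def render(events, path="sonichoop_sketch.wav"):
    """events: list of (zone, intensity, duration_s) simulated touch readings."""
    frames = bytearray()
    for zone, intensity, dur in events:
        freq = zone_to_freq(zone)
        amp = 0.8 * max(0.0, min(1.0, intensity))  # clamp intensity to [0, 1]
        n = int(dur * SAMPLE_RATE)
        for i in range(n):
            env = 1.0 - i / n  # linear fade-out to avoid clicks
            sample = amp * env * math.sin(2 * math.pi * freq * i / SAMPLE_RATE)
            frames += struct.pack("<h", int(sample * 32767))  # 16-bit PCM
    with wave.open(path, "wb") as wf:
        wf.setnchannels(1)
        wf.setsampwidth(2)
        wf.setframerate(SAMPLE_RATE)
        wf.writeframes(bytes(frames))


if __name__ == "__main__":
    # Simulated touches: e.g. a hand grips zone 2, a foot brushes zone 5, ...
    simulated_touches = [(2, 0.9, 0.4), (5, 0.5, 0.3), (7, 1.0, 0.6)]
    render(simulated_touches)
```

In a real-time setup, the simulated touch list would be replaced by a stream of live sensor readings and the offline WAV rendering by a low-latency synthesis engine; the zone count, pitch mapping, and envelope above are arbitrary choices for illustration.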

Supplementary Material

VTT File (3411764.3445539_videofigurecaptions.vtt): video figure captions
VTT File (3411764.3445539_videopreviewcaptions.vtt): video preview captions
Supplementary Materials (3411764.3445539_supplementalmaterials.zip)
MP4 File (3411764.3445539_videofigure.mp4): supplemental video
MP4 File (3411764.3445539_videopreview.mp4): preview video



Published In

CHI '21: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems
May 2021, 10862 pages
ISBN: 9781450380966
DOI: 10.1145/3411764

      Publisher

Association for Computing Machinery, New York, NY, United States



      Badges

      • Honorable Mention

      Author Tags

      1. Interactive sonification
      2. aerial hoop
      3. auditory feedback
      4. capacitive sensing
      5. sound

      Qualifiers

      • Research-article
      • Research
      • Refereed limited

      Conference

CHI '21

Acceptance Rates

Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%


      Article Metrics

• Downloads (last 12 months): 69
• Downloads (last 6 weeks): 6
Reflects downloads up to 23 Dec 2024

Cited By

• (2024) Movits: a Minimalist Toolkit for Embodied Sketching. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 3302-3317. DOI: 10.1145/3643834.3660706. Online publication date: 1-Jul-2024.
• (2024) Co-Designing Sensory Feedback for Wearables to Support Physical Activity through Body Sensations. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8(1), 1-31. DOI: 10.1145/3643499. Online publication date: 6-Mar-2024.
• (2024) Towards a Minimalist Embodied Sketching Toolkit for Wearable Design for Motor Learning. Proceedings of the Eighteenth International Conference on Tangible, Embedded, and Embodied Interaction, 1-7. DOI: 10.1145/3623509.3635253. Online publication date: 11-Feb-2024.
• (2023) Acrosuit: Promoting Improvisation in Acroyogis With Tactile and Visual Cues. Proceedings of the Seventeenth International Conference on Tangible, Embedded, and Embodied Interaction, 1-7. DOI: 10.1145/3569009.3573118. Online publication date: 26-Feb-2023.
• (2023) Boards Hit Back: Reflecting on Martial Arts Practices Through Soma Design. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-18. DOI: 10.1145/3544548.3580722. Online publication date: 19-Apr-2023.
• (2023) Zens. International Journal of Human-Computer Studies 178(C). DOI: 10.1016/j.ijhcs.2023.103084. Online publication date: 1-Oct-2023.