DOI: 10.1145/3597638.3608430
Research article | Open access

How Do People with Limited Movement Personalize Upper-Body Gestures? Considerations for the Design of Personalized and Accessible Gesture Interfaces

Published: 22 October 2023

Editorial Notes

A corrigendum was issued for this paper on February 13, 2024. You can download the corrigendum from the Supplemental Material section of this citation page.

Abstract

Always-on, upper-body input from sensors like accelerometers, infrared cameras, and electromyography holds promise for enabling accessible gesture input for people with upper-body motor impairments. When these sensors are distributed across the person’s body, they can enable the use of varied body parts and gestures for device interaction. Personalized upper-body gestures that enable input from diverse body parts, including the head, neck, shoulders, arms, hands, and fingers, and that match the abilities of each user could help ensure that gesture systems are accessible. In this work, we characterize the personalized gesture sets designed by 25 participants with upper-body motor impairments and develop design recommendations for personalized upper-body gesture interfaces. We found that the gesture sets participants designed were highly ability-specific. Even within a single type of disability, participants differed significantly in which muscles they used to perform upper-body gestures: some relied predominantly on shoulder and upper-arm muscles, while others used only their finger muscles. Eight percent of the gestures participants designed were performed with the head, neck, and shoulders rather than the hands and fingers, demonstrating the importance of tracking the whole upper body. To combat fatigue, participants performed 51% of gestures with their hands resting on, or barely lifting off, their armrests, highlighting the importance of sensing mechanisms that are agnostic to the location and orientation of the body. Lastly, for 10% of gestures, participants activated their muscles without visibly moving, demonstrating the need for sensors that can detect muscle activation in the absence of movement. Both inertial measurement unit (IMU) and electromyography (EMG) wearable sensors proved promising for differentiating between personalized gestures. Personalized upper-body gesture interfaces that take advantage of each person’s abilities are critical for making upper-body gestures accessible to people with upper-body motor impairments.
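The abstract’s closing claim, that IMU and EMG wearables can differentiate between personalized gestures, lends itself to a concrete illustration. Below is a minimal sketch, not the authors’ pipeline: it assumes windowed IMU (accelerometer + gyroscope) and rectified EMG signals, uses simple per-window statistics as features, and classifies with a per-user nearest-centroid rule. All names here (extract_features, PersonalizedGestureClassifier) are illustrative assumptions.

```python
import numpy as np


def extract_features(imu_window: np.ndarray, emg_window: np.ndarray) -> np.ndarray:
    """Summarize one gesture window into a fixed-length feature vector.

    imu_window: (T, 6) accelerometer + gyroscope samples.
    emg_window: (T, C) rectified EMG from C electrode channels.
    Per-axis mean/std (IMU) and mean rectified amplitude (EMG) are common,
    simple choices; the paper's actual feature set may differ.
    """
    imu_feats = np.concatenate([imu_window.mean(axis=0), imu_window.std(axis=0)])
    # Mean of the (already rectified) EMG per channel. EMG captures muscle
    # activations even when no visible movement occurs, which an IMU alone
    # would miss.
    emg_feats = emg_window.mean(axis=0)
    return np.concatenate([imu_feats, emg_feats])


class PersonalizedGestureClassifier:
    """Per-user nearest-centroid classifier over that user's own gesture set."""

    def __init__(self) -> None:
        self.centroids: dict[str, np.ndarray] = {}

    def fit(self, examples: dict[str, list[np.ndarray]]) -> None:
        # `examples` maps each user-defined gesture label to feature vectors
        # extracted from a few enrollment repetitions of that gesture.
        for label, feats in examples.items():
            self.centroids[label] = np.mean(feats, axis=0)

    def predict(self, feat: np.ndarray) -> str:
        # Return the enrolled gesture whose centroid is nearest in feature space.
        return min(self.centroids,
                   key=lambda lbl: np.linalg.norm(feat - self.centroids[lbl]))
```

A nearest-centroid scheme suits the personalization setting because it can be trained from a handful of enrollment repetitions of each user-defined gesture, and the EMG channels can keep gestures separable even when, as for 10% of the gestures observed here, muscles activate without visible movement.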

Supplemental Material

PDF File
Corrigendum to "How Do People with Limited Movement Personalize Upper-Body Gestures? Considerations for the Design of Personalized and Accessible Gesture Interfaces" by Yamagami et al., Proceedings of the 25th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS '23)




      Published In

      ASSETS '23: Proceedings of the 25th International ACM SIGACCESS Conference on Computers and Accessibility
      October 2023
      1163 pages
      ISBN: 9798400702204
      DOI: 10.1145/3597638

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. accessibility
      2. gestures
      3. input
      4. motor impairments

      Qualifiers

      • Research-article
      • Research
      • Refereed limited

      Funding Sources

      • Center for Research and Education on Accessible Technology and Experiences (CREATE)
      • Meta
      • NIDILRR

      Conference

      ASSETS '23

      Acceptance Rates

      ASSETS '23 paper acceptance rate: 55 of 182 submissions (30%)
      Overall acceptance rate: 436 of 1,556 submissions (28%)

