DOI: 10.1145/3365610.3365624

User-defined interaction for smart homes: voice, touch, or mid-air gestures?

Published: 26 November 2019

Abstract

Smart home appliances, and smart homes in general, are on the verge of ubiquity. Research and industry have proposed a range of modalities, including speech, mid-air gestures, and touch displays, to control smart homes. While previous work has designed for these modalities individually, it is unclear how they compare from a user-centered perspective. We therefore conducted an elicitation study in which participants proposed commands using speech, mid-air gestures, and a touch display. We also asked participants to rate their suggestions and the modalities. The results show that voice commands and a touch display are clearly preferred over mid-air gestures. As we found high agreement scores for voice commands, our results also highlight the potential of elicitation studies for voice interfaces.
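The abstract reports agreement scores for the elicited commands. In elicitation studies, agreement is commonly quantified with the agreement rate formalized by Vatavu and Wobbrock (CHI 2015): the share of proposal pairs for a referent in which two participants suggested the identical command. The Python sketch below illustrates that computation on invented voice-command proposals; it is not the paper's analysis code, and the data and function name are hypothetical.

```python
from collections import Counter

def agreement_rate(proposals):
    """Agreement rate for a single referent: identical proposal pairs
    divided by all possible pairs of proposals."""
    n = len(proposals)
    if n < 2:
        return 0.0
    groups = Counter(proposals)  # identical proposals form one group
    agreeing_pairs = sum(k * (k - 1) for k in groups.values())
    return agreeing_pairs / (n * (n - 1))

# Hypothetical proposals for the referent "turn on the light" (not the study's data)
voice_proposals = (["turn on the light"] * 14
                   + ["light on"] * 4
                   + ["switch the light on"] * 2)
print(round(agreement_rate(voice_proposals), 2))  # 0.52, i.e. relatively high agreement
```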




Published In

MUM '19: Proceedings of the 18th International Conference on Mobile and Ubiquitous Multimedia
November 2019
462 pages
ISBN:9781450376242
DOI:10.1145/3365610

Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. display control
  2. mid-air gestures
  3. smart home
  4. voice control

Qualifiers

  • Research-article

Conference

MUM 2019

Acceptance Rates

Overall Acceptance Rate 190 of 465 submissions, 41%
