DOI: 10.1145/3365610.3368419

Supporting cross-device interactions with gestures between personal and public devices

Published: 26 November 2019

Abstract

Seamless interaction across personal and public devices is still problematic. Gestural interaction can support this goal, but how to exploit it in cross-device frameworks remains unclear. We present an elicitation study aimed at identifying the most intuitive single or combined gestures that people perform to interact with cross-device applications. The study led us to define a candidate gesture vocabulary for the typical interactive tasks in such applications. We then applied the preferred gestures in an example cross-device Web application, which we finally tested to evaluate the gestures' actual usability in cross-device interactions.


Cited By

  • (2024) Engineering Touchscreen Input for 3-Way Displays: Taxonomy, Datasets, and Classification. Companion Proceedings of the 16th ACM SIGCHI Symposium on Engineering Interactive Computing Systems, 57--65. DOI: 10.1145/3660515.3661331. Online publication date: 24 June 2024.


    Published In

    MUM '19: Proceedings of the 18th International Conference on Mobile and Ubiquitous Multimedia
    November 2019
    462 pages
    ISBN:9781450376242
    DOI:10.1145/3365610
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. cross-device interaction techniques
    2. cross-device user interfaces
    3. elicitation study
    4. gesture vocabulary
    5. user-defined gestures

    Qualifiers

    • Poster

    Conference

    MUM 2019

    Acceptance Rates

    Overall Acceptance Rate 190 of 465 submissions, 41%

    Contributors

    Other Metrics

    Bibliometrics & Citations

    Bibliometrics

    Article Metrics

    • Downloads (last 12 months): 21
    • Downloads (last 6 weeks): 1
    Reflects downloads up to 07 Nov 2024

