DOI: 10.1145/2470654.2466114

Understanding palm-based imaginary interfaces: the role of visual and tactile cues when browsing

Published: 27 April 2013

Abstract

Imaginary Interfaces are screen-less, ultra-mobile interfaces. Previously, we showed that even though they offer no visual feedback, they allow users to interact spatially, e.g., by pointing at a location on their non-dominant hand.
The primary goal of this paper is to provide a deeper understanding of palm-based imaginary interfaces, i.e., why they work. We perform our exploration using an interaction style inspired by interfaces for visually impaired users: we implemented a system that audibly announces target names as users scrub across their palm. Based on this interface, we conducted three studies. We found that (1) even though imaginary interfaces cannot display visual content, the visual sense remains the main mechanism for controlling the interface, as users watch their hands interact; (2) when we remove the visual sense by blindfolding, the tactile cues of both hands feeling each other partly replace the missing visual cues, keeping imaginary interfaces usable; and (3) while we initially expected the cues sensed by the pointing finger to be most important, we found instead that it is the tactile cues sensed by the palm that allow users to orient themselves most effectively.
While these findings are primarily intended to deepen our understanding of Imaginary Interfaces, they also show that eyes-free interfaces located on skin outperform interfaces on physical devices. In particular, this suggests that palm-based imaginary interfaces may have benefits for visually impaired users, potentially outperforming the touchscreen-based devices they use today.
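
The browsing interaction described above maps naturally to a simple control loop. The Python sketch below is a minimal illustration, not the authors' implementation: it assumes a hypothetical 2x3 grid of named targets on the palm, a stream of tracked fingertip positions, and a print stub standing in for text-to-speech. A target is announced only when the fingertip crosses into a new region, keeping the audio feedback sparse while the user scrubs.

```python
# Minimal sketch (not the authors' implementation) of palm-based browsing with
# audio feedback: as the fingertip scrubs across the palm, the system announces
# the name of the target currently under the finger.
# The 2x3 layout, target names, and speech stub below are hypothetical.

from dataclasses import dataclass
from typing import Iterable, Optional, Tuple

COLS, ROWS = 2, 3  # hypothetical grid of targets laid out on the palm

@dataclass(frozen=True)
class Target:
    name: str

# Hypothetical target names, indexed by (column, row) on the palm grid.
TARGETS = {
    (0, 0): Target("Contacts"), (1, 0): Target("Messages"),
    (0, 1): Target("Music"),    (1, 1): Target("Maps"),
    (0, 2): Target("Clock"),    (1, 2): Target("Settings"),
}

def cell_at(x: float, y: float) -> Tuple[int, int]:
    """Map a normalized fingertip position (0..1, 0..1) to a grid cell."""
    col = min(int(x * COLS), COLS - 1)
    row = min(int(y * ROWS), ROWS - 1)
    return col, row

def announce(name: str) -> None:
    """Stand-in for text-to-speech output (e.g., a screen-reader voice)."""
    print(f"[speech] {name}")

def browse(touch_stream: Iterable[Tuple[float, float]]) -> None:
    """Announce a target each time the fingertip crosses into its cell."""
    last_cell: Optional[Tuple[int, int]] = None
    for x, y in touch_stream:       # stream of tracked fingertip positions
        cell = cell_at(x, y)
        if cell != last_cell:       # speak only when entering a new target
            announce(TARGETS[cell].name)
            last_cell = cell

# Example: a scrub from the top-left toward the bottom-right of the palm.
browse([(0.1, 0.1), (0.2, 0.15), (0.7, 0.2), (0.8, 0.6), (0.9, 0.9)])
```

Announcing only on region changes mirrors screen-reader-style scrubbing: the user sweeps the finger across the palm and listens for the desired target before lifting to select.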

Supplementary Material

suppl.mov (chi0113-file3.mp4)
Supplemental video



    Published In
    CHI '13: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
    April 2013
    3550 pages
    ISBN:9781450318990
    DOI:10.1145/2470654

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. imaginary interfaces
    2. mobile
    3. non-visual
    4. tactile feedback
    5. visual feedback
    6. wearable

    Qualifiers

    • Research-article

    Conference

    CHI '13

    Acceptance Rates

    CHI '13 Paper Acceptance Rate: 392 of 1,963 submissions, 20%
    Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%
