DOI: 10.1145/1868914.1868958

Tactile camera vs. tangible camera: taking advantage of small physical artefacts to navigate into large data collection

Published: 16 October 2010

Abstract

This paper presents the design and evaluation of two interaction techniques for navigating a large data collection displayed on a large output space through manipulations of a small physical artefact. The first technique exploits the spatial position of a digital camera, while the second uses its tactile screen. User experiments were conducted to compare the two techniques with regard to users' performance and satisfaction. Results establish that the Tactile technique is more efficient than the Tangible technique for easy pointing tasks, whereas the Tangible technique performs better for harder pointing tasks. In addition, users' feedback shows that they prefer the tangible camera, which requires less skill.
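
The abstract's distinction between "easy" and "hard" pointing tasks follows the usual Fitts' law characterisation, in which difficulty grows with the distance to the target and shrinks with its width. As a minimal illustrative sketch (not code from the paper), the snippet below computes the Shannon-formulation index of difficulty and a per-trial throughput estimate; the distances, widths, and movement times are hypothetical values chosen only to contrast an easy task with a hard one.

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def throughput(distance: float, width: float, movement_time_s: float) -> float:
    """Throughput in bits per second for a single pointing trial."""
    return index_of_difficulty(distance, width) / movement_time_s

# Hypothetical conditions (distance and width in pixels, time in seconds);
# not taken from the paper, shown only to contrast easy vs. hard tasks.
trials = [
    (300.0, 100.0, 0.6),   # short distance, large target  -> "easy"
    (1200.0, 20.0, 1.4),   # long distance, small target   -> "hard"
]
for d, w, t in trials:
    print(f"D={d:.0f}px W={w:.0f}px  "
          f"ID={index_of_difficulty(d, w):.2f} bits  "
          f"TP={throughput(d, w, t):.2f} bits/s")
```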


    Published In

    NordiCHI '10: Proceedings of the 6th Nordic Conference on Human-Computer Interaction: Extending Boundaries
    October 2010
    889 pages
    ISBN:9781605589343
    DOI:10.1145/1868914

    Sponsors

    • Reykjavik University
    • University of Iceland

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. interaction technique
    2. mixed interactive systems
    3. pointing task
    4. usability study

    Qualifiers

    • Research-article

    Conference

    NordiCHI '10

    Acceptance Rates

Overall acceptance rate: 379 of 1,572 submissions (24%)
