DOI: 10.1145/2390256.2390296

Multimodal reference resolution for mobile spatial interaction in urban environments

Published: 17 October 2012

Abstract

We present the results of a study on referring to the outside environment from within a moving vehicle. Reference resolution is the first necessary step in integrating the outside environment into the car's interactive system: it is the problem of determining which of the outside objects the user is interested in. In our study, we explored eye gaze, head pose, pointing gestures with a smartphone, and the user's field of view. The system was implemented and tested in a moving vehicle in real-life traffic; for safety reasons, the front-seat passenger used the system while the driver concentrated entirely on driving. For analysis and visualization of the user's interaction with the environment, 528 buildings of the city were modeled in 2.5D using an airborne LIDAR scan, Google Earth, and a spatial database. Based on the study results, we propose a new algorithm for spatial reference resolution together with a scanning mechanism.
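
To make the reference-resolution problem concrete, the following is a minimal illustrative sketch, not the algorithm proposed in the paper: it assumes a hypothetical 2.5D building model (footprint polygons plus roof heights in local metric coordinates) and resolves a pointing reference by casting a ray from the vehicle position along the pointing direction, returning the nearest intersected footprint. The building data, coordinate frame, and the use of the shapely package are assumptions made for illustration only.

```python
# Illustrative sketch (not the paper's algorithm): resolve a pointing
# reference by intersecting a ray from the vehicle position with 2.5D
# building footprints and returning the closest hit.
import math
from shapely.geometry import LineString, Point, Polygon

# Hypothetical 2.5D model: footprint polygon + roof height in metres.
BUILDINGS = {
    "town_hall": (Polygon([(50, 80), (90, 80), (90, 120), (50, 120)]), 22.0),
    "museum":    (Polygon([(140, 60), (180, 60), (180, 100), (140, 100)]), 15.0),
}

def resolve_reference(position, heading_deg, max_range=300.0):
    """Return the id of the nearest building hit by a ray from `position`
    (x, y in metres) along compass `heading_deg` (0 = north, clockwise)."""
    x, y = position
    rad = math.radians(heading_deg)
    # Compass heading to planar direction: north = +y, east = +x.
    ray = LineString([(x, y),
                      (x + max_range * math.sin(rad),
                       y + max_range * math.cos(rad))])
    origin = Point(x, y)
    best_id, best_dist = None, float("inf")
    for bid, (footprint, _height) in BUILDINGS.items():
        hit = ray.intersection(footprint)
        if hit.is_empty:
            continue
        dist = origin.distance(hit)  # distance to the nearest hit point
        if dist < best_dist:
            best_id, best_dist = bid, dist
    return best_id

# Example: the passenger points slightly east of north from (60, 20).
print(resolve_reference((60, 20), 10.0))  # -> "town_hall"
```

A complete system would combine several such cues (eye gaze, head pose, pointing gesture, field of view) rather than a single heading, and could use the stored roof heights to check the vertical component of the pointing direction.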


Published In

AutomotiveUI '12: Proceedings of the 4th International Conference on Automotive User Interfaces and Interactive Vehicular Applications
October 2012
280 pages
ISBN:9781450317511
DOI:10.1145/2390256
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 17 October 2012

Author Tags

  1. automotive
  2. eye gaze
  3. head pose
  4. modality choice
  5. pointing gesture
  6. reference resolution

Qualifiers

  • Research-article

Conference

AutomotiveUI '12

Acceptance Rates

Overall Acceptance Rate 248 of 566 submissions, 44%
