DOI: 10.1145/2459236.2459251

A depth cue method based on blurring effect in augmented reality

Published: 07 March 2013

Abstract

In this paper, a depth cue method based on the blurring effect in augmented reality is proposed. In contrast to previous research, the proposed method offers an algorithm that estimates the blurring effect across the whole scene from the spatial information of the real world and the intrinsic parameters of the camera. Through a one-time checkerboard calibration, the camera parameters are registered, the Point Spread Function parameters are measured, and the blur circle radius at a particular position can be predicted for later virtual object rendering. The measurement procedure for the blur circle radius and the estimation algorithm are discussed, and a prototype of the proposed AR system is implemented. Finally, an evaluation of the blur circle radius estimation algorithm is provided and future work is discussed.
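The paper's exact estimation algorithm appears only in the full text, but the standard thin lens relation underlying such a defocus-based depth cue can be sketched as follows. All names, parameter values, and the Gaussian-PSF proportionality remark are illustrative assumptions, not the authors' implementation:

```python
def blur_circle_radius(d, d_focus, f, f_number):
    """Thin-lens defocus model: radius (in the same units as f) of the
    blur circle on the sensor for a scene point at distance d, when the
    lens of focal length f is focused at distance d_focus.
    f_number = f / aperture_diameter."""
    aperture = f / f_number  # aperture diameter A
    # radius = (A / 2) * f * |d - d_focus| / (d * (d_focus - f))
    return (aperture / 2.0) * f * abs(d - d_focus) / (d * (d_focus - f))

# Example: a 50 mm lens at f/1.8 focused at 2 m; a virtual object
# placed at 5 m (all distances in millimetres).
r = blur_circle_radius(d=5000.0, d_focus=2000.0, f=50.0, f_number=1.8)
# A Gaussian PSF whose sigma is proportional to r (with the constant
# of proportionality recovered during calibration, as in the paper's
# one-time checkerboard step) could then blur the rendered object.
```

A point at the focus distance yields a radius of zero (rendered sharp); points nearer or farther receive progressively larger blur circles, which is the depth cue being exploited.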


Published In

AH '13: Proceedings of the 4th Augmented Human International Conference
March 2013
254 pages
ISBN:9781450319041
DOI:10.1145/2459236

Sponsors

  • SimTech
  • Universität Stuttgart

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. augmented reality
  2. blur
  3. depth cue
  4. point spread function
  5. thin lens model

Qualifiers

  • Research-article

Conference

AH'13: 4th Augmented Human International Conference
March 7-8, 2013
Stuttgart, Germany

Acceptance Rates

AH '13 Paper Acceptance Rate: 49 of 69 submissions, 71%
Overall Acceptance Rate: 121 of 306 submissions, 40%
