DOI: 10.1145/3639701.3656311
Research Article

360Align: An Open Dataset and Software for Investigating QoE and Head Motion in 360° Videos with Alignment Edits

Published: 07 June 2024

Abstract

This paper presents the resources used for, and gathered from, a subjective QoE experiment in which 45 participants watched 360° videos processed with offline alignment edits. These edits redirect the assumed field of view to a specified region of interest by rotating the 360° video around the horizon. The experiment included alignment edits with both gradual and instant rotation. Using the Double Stimulus method, participants rated each original-processed video pair, yielding a dataset of 5,400 comfort and sense-of-presence ratings. During video consumption, head motion (HM) was recorded with the Meta Quest 2 headset. The resulting dataset, containing the original and processed videos, is made publicly accessible, and the web application developed to run the experiment is released alongside scripts for evaluating head rotation data in a public repository. A cross-analysis of QoE and HM behavior provides insight into the efficacy of alignment edits for aligning attention within the same scene. Together, these experimental resources establish a foundation for further research on 360° videos processed with alignment edits.
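
To make the edit mechanism concrete, the minimal Python sketch below (not part of the released dataset or software; the function names, the 20°/s velocity cap, and the frame rate are illustrative assumptions) shows how an instant versus a gradual alignment edit could be expressed as a per-frame yaw offset that rotates the video around the horizon until an assumed region of interest enters the viewer's field of view:

    import numpy as np

    def wrap_deg(angle):
        # Wrap an angle to [-180, 180) degrees.
        return (angle + 180.0) % 360.0 - 180.0

    def alignment_offset(head_yaw_deg, roi_yaw_deg):
        # Shortest-path yaw rotation that brings the ROI in front of the viewer.
        return wrap_deg(roi_yaw_deg - head_yaw_deg)

    def instant_edit(offset_deg, n_frames):
        # Instant edit: the full rotation is applied at the edit frame.
        return np.full(n_frames, offset_deg)

    def gradual_edit(offset_deg, n_frames, fps=30.0, max_deg_per_s=20.0):
        # Gradual edit: ramp the rotation at a bounded angular velocity
        # (the 20°/s cap and 30 fps are assumed values, not the paper's).
        step = np.sign(offset_deg) * max_deg_per_s / fps
        ramp = np.arange(1, n_frames + 1) * step
        return np.clip(ramp, min(0.0, offset_deg), max(0.0, offset_deg))

    # Example: viewer looks toward 170° yaw; the ROI sits at -150° (i.e., 210°).
    offset = alignment_offset(170.0, -150.0)   # 40.0°, the shortest path
    print(instant_edit(offset, 3))             # [40. 40. 40.]
    print(gradual_edit(offset, 90)[:5])        # ramps at ~0.67° per frame

The two conditions compared in the experiment differ exactly along this axis: whether the rotation lands in a single frame or is distributed over time at a bounded angular velocity.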

Supplemental Material

MP4 and PDF Files
Video demonstrating the research, the proposed web platform, the experiment setup, and the Free and Informed Consent Term.


Cited By

(2024) Towards Accessible Musical Performances in Virtual Reality: Designing a Conceptual Framework for Omnidirectional Audio Descriptions. In Proceedings of the 26th International ACM SIGACCESS Conference on Computers and Accessibility, 1–17. https://doi.org/10.1145/3663548.3675618 (online publication date: 27 October 2024).

Information

Published In

IMX '24: Proceedings of the 2024 ACM International Conference on Interactive Media Experiences
June 2024, 465 pages
ISBN: 9798400705038
DOI: 10.1145/3639701
Editors: Asreen Rostami, Donald McMillan, Jonathan Hook, Irene Viola, Jun Nishida, Hanuma Teja Maddali, Alexis Clay

Publisher

Association for Computing Machinery

New York, NY, United States



Badges

  • Honorable Mention

Author Tags

  1. 360° video
  2. Alignment edits
  3. Comfort
  4. Presence
  5. Quality of experience

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Funding Sources

  • Coordination for the Improvement of Higher Education Personnel (CAPES)

Conference

IMX '24

Acceptance Rates

Overall Acceptance Rate 69 of 245 submissions, 28%


