ABSTRACT
In this paper, we present a viewing test in which 48 subjects watched 20 different entertaining omnidirectional videos on an HTC Vive Head-Mounted Display (HMD) in a task-free scenario. While the subjects watched the contents, we recorded their head movements. The resulting dataset is publicly available, together with the links and timestamps of the source contents used and the scripts for evaluating the head rotation data. After every viewing session, subjects were also asked to fill in the Simulator Sickness Questionnaire (SSQ). We first present the SSQ results, then introduce and discuss several methods for evaluating head rotation data. The paper reports the general angular ranges of the subjects' exploration behavior as well as an analysis of the areas where most of the viewing time was spent. The collected information can also be presented as head-saliency maps. For videos, head-saliency data can be used for training saliency models, as input for evaluating decisions during content creation, or as part of streaming solutions for region-of-interest-specific coding, as in the latest tile-based streaming approaches also discussed in standardization bodies such as MPEG.
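As a rough illustration of how head-rotation logs can be turned into the head-saliency maps mentioned above, the sketch below bins yaw/pitch samples into an equirectangular grid and normalizes the counts. This is a minimal example, not the paper's published evaluation script: the function name, grid resolution, and synthetic input data are all assumptions for illustration.

```python
import numpy as np

def head_saliency_map(yaw_deg, pitch_deg, width=64, height=32):
    """Accumulate head-orientation samples into an equirectangular
    saliency map. Assumes yaw in [-180, 180) and pitch in [-90, 90]."""
    yaw = np.asarray(yaw_deg, dtype=float)
    pitch = np.asarray(pitch_deg, dtype=float)
    # Map angles to pixel bins of the equirectangular grid.
    cols = ((yaw + 180.0) / 360.0 * width).astype(int) % width
    rows = ((90.0 - pitch) / 180.0 * height).astype(int).clip(0, height - 1)
    sal = np.zeros((height, width))
    np.add.at(sal, (rows, cols), 1.0)  # count samples per cell
    if sal.max() > 0:
        sal /= sal.max()  # normalize to [0, 1]
    return sal

# Synthetic example: a viewer mostly looking near the equator, straight ahead.
rng = np.random.default_rng(0)
yaw = rng.normal(0.0, 20.0, 1000)
pitch = rng.normal(0.0, 10.0, 1000)
sal = head_saliency_map(yaw, pitch)
print(sal.shape)  # (32, 64)
```

A map like this can be averaged over subjects per video frame or time window; smoothing each sample with a kernel sized to the HMD field of view would give a softer saliency estimate.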
AVtrack360: an open dataset and software recording people's head rotations watching 360° videos on an HMD