000101613 001__ 101613
000101613 005__ 20230706131410.0
000101613 0247_ $$2doi$$a10.1109/VR46266.2020.1580727911717
000101613 0248_ $$2sideral$$a118336
000101613 037__ $$aART-2020-118336
000101613 041__ $$aeng
000101613 100__ $$aMarañes, Carlos
000101613 245__ $$aExploring the impact of 360° movie cuts in users' attention
000101613 260__ $$c2020
000101613 5060_ $$aAccess copy available to the general public$$fUnrestricted
000101613 5203_ $$aVirtual Reality (VR) has grown since the first devices for personal use became available on the market. However, the production of cinematographic content in this new medium is still in an early exploratory phase. The main reason is that cinematographic language in VR is still under development, and we still need to learn how to tell stories effectively. A key element in traditional film editing is the use of different cutting techniques, in order to transition seamlessly from one sequence to another. A fundamental aspect of these techniques is the placement and control over the camera. However, VR content creators do not have full control of the camera. Instead, users in VR can freely explore the 360° of the scene around them, which potentially leads to very different experiences. While this is desirable in certain applications such as VR games, it may hinder the experience in narrative VR. In this work, we perform a systematic analysis of users' viewing behavior across cut boundaries while watching professionally edited, narrative 360° videos. We extend previous metrics for quantifying user behavior in order to support more complex and realistic footage, and we introduce two new metrics that allow us to measure users' exploration in a variety of different complex scenarios. From this analysis, (i) we confirm that previous insights derived for simple content hold for professionally edited content, and (ii) we derive new insights that could potentially influence VR content creation, informing creators about the impact of different cuts in the audience's behavior.
000101613 536__ $$9info:eu-repo/grantAgreement/EC/H2020/682080/EU/Intuitive editing of visual appearance from real-world datasets/CHAMELEON$$9This project has received funding from the European Union’s Horizon 2020 research and innovation program under grant agreement No H2020 682080-CHAMELEON$$9info:eu-repo/grantAgreement/ES/MINECO/TIN2016-78753-P
000101613 540__ $$9info:eu-repo/semantics/openAccess$$aAll rights reserved$$uhttp://www.europeana.eu/rights/rr-f/
000101613 655_4 $$ainfo:eu-repo/semantics/article$$vinfo:eu-repo/semantics/acceptedVersion
000101613 700__ $$0(orcid)0000-0002-7503-7022$$aGutierrez, Diego$$uUniversidad de Zaragoza
000101613 700__ $$0(orcid)0000-0002-7796-3177$$aSerrano, Ana$$uUniversidad de Zaragoza
000101613 7102_ $$15007$$2570$$aUniversidad de Zaragoza$$bDpto. Informát.Ingenie.Sistms.$$cÁrea Lenguajes y Sistemas Inf.
000101613 773__ $$g(2020), 73-82$$pProc. (IEEE Conf. Virtual Real. 3D User Interfaces)$$tProceedings - 2020 IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2020$$x2642-5246
000101613 8564_ $$s13270507$$uhttps://zaguan.unizar.es/record/101613/files/texto_completo.pdf$$yPostprint
000101613 8564_ $$s3234013$$uhttps://zaguan.unizar.es/record/101613/files/texto_completo.jpg?subformat=icon$$xicon$$yPostprint
000101613 909CO $$ooai:zaguan.unizar.es:101613$$particulos$$pdriver
000101613 951__ $$a2023-07-06-12:20:45
000101613 980__ $$aARTICLE