000165990 001__ 165990
000165990 005__ 20260116163026.0
000165990 0247_ $$2doi$$a10.1109/ISMAR67309.2025.00054
000165990 0248_ $$2sideral$$a147451
000165990 037__ $$aART-2025-147451
000165990 041__ $$aeng
000165990 100__ $$0(orcid)0009-0001-6833-6147$$aPina, Jorge$$uUniversidad de Zaragoza
000165990 245__ $$aA Comprehensive Analysis of the Influence of Cognitive Load on Physiological Signals in Virtual Reality
000165990 260__ $$c2025
000165990 5060_ $$aAccess copy available to the general public$$fUnrestricted
000165990 5203_ $$aCognitive load (CL) has been an active area of research across disciplines such as psychology, education, and computer graphics and visualization for decades. In the context of Virtual Reality (VR), understanding mental demand becomes particularly relevant, as immersive experiences increasingly integrate multisensory stimuli that require users to distribute their limited cognitive resources. In this work, we investigate the effects of cognitive load during a search task in VR, combining objective and subjective measurements, including physiological signals and validated questionnaires. We designed an experiment in which participants performed a visual search task under two cognitive load conditions (either alone or while responding to a concurrent auditory task) and across two visual search areas (90° and 360°). We collected a rich dataset comprising task performance, eye tracking, electrocardiogram (ECG), electrodermal activity (EDA), photoplethysmography (PPG), and inertial measurements, along with subjective assessments (NASA-TLX questionnaires). Our analysis shows that increased cognitive load hinders visual search performance and affects multiple physiological markers, offering a solid foundation for future research on cognitive load in multisensory virtual environments.
000165990 536__ $$9info:eu-repo/grantAgreement/ES/DGA/T25-24$$9info:eu-repo/grantAgreement/ES/MICINN/PRE2023-UZ-26$$9info:eu-repo/grantAgreement/ES/MICIU/PID2022-141766OB-I00
000165990 540__ $$9info:eu-repo/semantics/openAccess$$aAll rights reserved$$uhttp://www.europeana.eu/rights/rr-f/
000165990 655_4 $$ainfo:eu-repo/semantics/article$$vinfo:eu-repo/semantics/acceptedVersion
000165990 700__ $$0(orcid)0000-0002-5275-8652$$aBernal-Berdun, Edurne$$uUniversidad de Zaragoza
000165990 700__ $$0(orcid)0000-0002-0073-6398$$aMartin, Daniel$$uUniversidad de Zaragoza
000165990 700__ $$0(orcid)0000-0002-8016-7649$$aMalpica, Sandra$$uUniversidad de Zaragoza
000165990 700__ $$aReal, Carmen$$uUniversidad de Zaragoza
000165990 700__ $$aBarquero, Alberto
000165990 700__ $$0(orcid)0000-0001-5918-1043$$aArmañac-Julián, Pablo$$uUniversidad de Zaragoza
000165990 700__ $$0(orcid)0000-0001-8742-0072$$aLazaro, Jesus$$uUniversidad de Zaragoza
000165990 700__ $$0(orcid)0000-0003-0226-4950$$aMartín-Yebra, Alba
000165990 700__ $$0(orcid)0000-0003-0060-7278$$aMasia, Belen$$uUniversidad de Zaragoza
000165990 700__ $$0(orcid)0000-0002-7796-3177$$aSerrano, Ana$$uUniversidad de Zaragoza
000165990 7102_ $$15007$$2520$$aUniversidad de Zaragoza$$bDpto. Informát.Ingenie.Sistms.$$cÁrea Ingen.Sistemas y Automát.
000165990 7102_ $$15007$$2570$$aUniversidad de Zaragoza$$bDpto. Informát.Ingenie.Sistms.$$cÁrea Lenguajes y Sistemas Inf.
000165990 773__ $$g(2025), 433-443$$pProc. int. symp. mixed augment. real. ISMAR$$tProceedings - International Symposium on Mixed and Augmented Reality, ISMAR$$x1554-7868
000165990 8564_ $$s2869885$$uhttps://zaguan.unizar.es/record/165990/files/texto_completo.pdf$$yPostprint
000165990 8564_ $$s2693192$$uhttps://zaguan.unizar.es/record/165990/files/texto_completo.jpg?subformat=icon$$xicon$$yPostprint
000165990 909CO $$ooai:zaguan.unizar.es:165990$$particulos$$pdriver
000165990 951__ $$a2026-01-16-14:54:21
000165990 980__ $$aARTICLE