Measuring and Predicting Multisensory Reaction Latency: A Probabilistic Model for Visual-Auditory Integration
Abstract: Virtual/augmented reality (VR/AR) devices offer both immersive imagery and sound. With these wide-field cues, we can simultaneously acquire and process visual and auditory signals to quickly identify objects, make decisions, and take action. While vision often takes precedence in perception, our visual sensitivity degrades in the periphery. In contrast, auditory sensitivity can exhibit the opposite trend due to the elevated interaural time difference. What occurs when these senses are integrated simultaneously, as is common in VR applications such as 360° video watching and immersive gaming? We present a computational and probabilistic model to predict VR users' reaction latency to visual-auditory multisensory targets. To this end, we first conducted a psychophysical experiment in VR to measure reaction latency by tracking the onset of eye movements. Experiments with numerical metrics and user studies with naturalistic scenarios showcase the model's accuracy and generalizability. Lastly, we discuss potential applications, such as measuring the sufficiency of target appearance duration in immersive video playback and suggesting optimal spatial layouts for AR interface design.
Language: English
DOI: 10.1109/TVCG.2024.3456185
Year: 2024
Published in: IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS 30, 11 (2024), 7364-7374
ISSN: 1077-2626

Funding: info:eu-repo/grantAgreement/ES/AEI/PID2022-141539NB-I00
Type and form: Article (PostPrint)
Area (Department): Área Lenguajes y Sistemas Inf. (Dpto. Informát. Ingenie. Sistms.)
Exported from SIDERAL (2025-02-27-09:42:07)



This article appears in the following collections:
articulos > articulos-por-area > lenguajes_y_sistemas_informaticos



Record created 2025-02-10, last modified 2025-02-27


Postprint: PDF