000118843 001__ 118843
000118843 005__ 20240319081012.0
000118843 0247_ $$2doi$$a10.1007/s12193-022-00392-4
000118843 0248_ $$2sideral$$a130142
000118843 037__ $$aART-2022-130142
000118843 041__ $$aeng
000118843 100__ $$aJuan, M. Carmen
000118843 245__ $$aA SLAM-based augmented reality app for the assessment of spatial short-term memory using visual and auditory stimuli
000118843 260__ $$c2022
000118843 5060_ $$aAccess copy available to the general public$$fUnrestricted
000118843 5203_ $$aA SLAM-based Augmented Reality (AR) app has been designed, developed, and validated to assess spatial short-term memory. The app works with visual and auditory stimuli, runs on mobile devices, and can be used in any indoor environment. Its anchors and data are persistently stored in the cloud. As an authoring tool, the app allows the type of stimulus, the number of stimuli, and their specific positions in the real environment to be customized for each session. A study involving 48 participants was carried out to compare performance outcomes when locating and remembering stimuli in a real environment using visual versus auditory stimuli. The number of objects placed correctly was similar for the two types of stimuli. However, the group that used auditory stimuli spent significantly more time completing the task and required significantly more attempts. The performance outcomes were independent of age and gender. For the auditory stimuli, correlations were found among all of the variables of the AR app and the variables of two other tasks (object-recall and map-pointing). We also found that the greater the number of correctly placed auditory stimuli, the greater the perceived competence and the less the mental effort required. The greater the number of errors, the lower the perceived competence. Finally, auditory stimuli are valid stimuli that may benefit the assessment of the memorization of spatial-auditory associations, although, as our results suggest, the memorization of spatial-visual associations is dominant.
000118843 536__ $$9info:eu-repo/grantAgreement/ES/DGA-FEDER/S31-20D$$9info:eu-repo/grantAgreement/ES/MINECO-ERDF/AR3Senses-TIN2017-87044-R
000118843 540__ $$9info:eu-repo/semantics/openAccess$$aAll rights reserved$$uhttp://www.europeana.eu/rights/rr-f/
000118843 590__ $$a2.9$$b2022
000118843 592__ $$a0.629$$b2022
000118843 591__ $$aCOMPUTER SCIENCE, CYBERNETICS$$b13 / 24 = 0.542$$c2022$$dQ3$$eT2
000118843 593__ $$aSignal Processing$$c2022$$dQ2
000118843 591__ $$aCOMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE$$b89 / 145 = 0.614$$c2022$$dQ3$$eT2
000118843 593__ $$aHuman-Computer Interaction$$c2022$$dQ2
000118843 594__ $$a5.3$$b2022
000118843 655_4 $$ainfo:eu-repo/semantics/article$$vinfo:eu-repo/semantics/publishedVersion
000118843 700__ $$0(orcid)0000-0002-4249-602X$$aMéndez López, Magdalena$$uUniversidad de Zaragoza
000118843 700__ $$0(orcid)0000-0002-4732-6417$$aÁlvarez Fidalgo, Camino$$uUniversidad de Zaragoza
000118843 700__ $$aMolla, Ramón
000118843 700__ $$aVivo, Robert
000118843 700__ $$aPáramo, David
000118843 7102_ $$14009$$2735$$aUniversidad de Zaragoza$$bDpto. Psicología y Sociología$$cÁrea Psicolog.Evolut.Educac
000118843 7102_ $$14009$$2725$$aUniversidad de Zaragoza$$bDpto. Psicología y Sociología$$cÁrea Psicobiología
000118843 773__ $$g16 (2022), [319-333 pp.]$$pJournal on Multimodal User Interfaces$$tJournal on Multimodal User Interfaces$$x1783-7677
000118843 8564_ $$s1437337$$uhttps://zaguan.unizar.es/record/118843/files/texto_completo.pdf$$yVersión publicada
000118843 8564_ $$s2593565$$uhttps://zaguan.unizar.es/record/118843/files/texto_completo.jpg?subformat=icon$$xicon$$yVersión publicada
000118843 909CO $$ooai:zaguan.unizar.es:118843$$particulos$$pdriver
000118843 951__ $$a2024-03-18-15:14:07
000118843 980__ $$aARTICLE