000078272 001__ 78272
000078272 005__ 20191126134631.0
000078272 0247_ $$2doi$$a10.1016/j.ijhcs.2018.01.010
000078272 0248_ $$2sideral$$a104537
000078272 037__ $$aART-2018-104537
000078272 041__ $$aeng
000078272 100__ $$0(orcid)0000-0002-2726-6760$$aGarcía-Magariño, I.$$uUniversidad de Zaragoza
000078272 245__ $$aBodily sensation maps: Exploring a new direction for detecting emotions from user self-reported data
000078272 260__ $$c2018
000078272 5060_ $$aAccess copy available to the general public$$fUnrestricted
000078272 5203_ $$aThe ability to detect emotions is essential in different fields such as user experience (UX), affective computing, and psychology. This paper explores the possibility of detecting emotions through user-generated bodily sensation maps (BSMs). The theoretical basis that inspires this work is the proposal by Nummenmaa et al. (2014) of BSMs for 14 emotions. To make it easy for users to create a BSM of how they feel, and convenient for researchers to acquire and classify users’ BSMs, we created a mobile app, called EmoPaint. The app includes an interface for BSM creation, and an automatic classifier that matches the created BSM with the BSMs for the 14 emotions. We conducted a user study aimed at evaluating both components of EmoPaint. First, it shows that the app is easy to use, and is able to classify BSMs consistently with the considered theoretical approach. Second, it shows that using EmoPaint increases the accuracy of users’ emotion classification when compared with an adaptation of the well-known method of using the Affect Grid with the Circumplex Model, focused on the same set of 14 emotions of Nummenmaa et al. Overall, these results indicate that the novel approach of using BSMs in the context of automatic emotion detection is promising, and they encourage further developments and studies of BSM-based methods.
000078272 536__ $$9info:eu-repo/grantAgreement/ES/MINECO/TEC2013-50049-EXP$$9info:eu-repo/grantAgreement/ES/UZ/JIUZ-2017-TEC-02$$9info:eu-repo/grantAgreement/ES/UZ/JIUZ-2017-TEC-03
000078272 540__ $$9info:eu-repo/semantics/openAccess$$aAll rights reserved$$uhttp://www.europeana.eu/rights/rr-f/
000078272 590__ $$a2.006$$b2018
000078272 591__ $$aCOMPUTER SCIENCE, CYBERNETICS$$b9 / 23 = 0.391$$c2018$$dQ2$$eT2
000078272 591__ $$aPSYCHOLOGY, MULTIDISCIPLINARY$$b42 / 137 = 0.307$$c2018$$dQ2$$eT1
000078272 591__ $$aERGONOMICS$$b6 / 16 = 0.375$$c2018$$dQ2$$eT2
000078272 592__ $$a0.688$$b2018
000078272 593__ $$aEducation$$c2018$$dQ1
000078272 593__ $$aEngineering (miscellaneous)$$c2018$$dQ1
000078272 593__ $$aSoftware$$c2018$$dQ1
000078272 593__ $$aHuman Factors and Ergonomics$$c2018$$dQ1
000078272 593__ $$aHuman-Computer Interaction$$c2018$$dQ1
000078272 593__ $$aHardware and Architecture$$c2018$$dQ1
000078272 655_4 $$ainfo:eu-repo/semantics/article$$vinfo:eu-repo/semantics/acceptedVersion
000078272 700__ $$aChittaro, L.
000078272 700__ $$0(orcid)0000-0001-7550-6688$$aPlaza, I.$$uUniversidad de Zaragoza
000078272 7102_ $$15008$$2785$$aUniversidad de Zaragoza$$bDpto. Ingeniería Electrón.Com.$$cÁrea Tecnología Electrónica
000078272 7102_ $$15007$$2570$$aUniversidad de Zaragoza$$bDpto. Informát.Ingenie.Sistms.$$cÁrea Lenguajes y Sistemas Inf.
000078272 773__ $$g113 (2018), 32-47$$pInt. j. human-comput. stud.$$tInternational Journal of Human Computer Studies$$x1071-5819
000078272 8564_ $$s803669$$uhttps://zaguan.unizar.es/record/78272/files/texto_completo.pdf$$yPostprint
000078272 8564_ $$s48999$$uhttps://zaguan.unizar.es/record/78272/files/texto_completo.jpg?subformat=icon$$xicon$$yPostprint
000078272 909CO $$ooai:zaguan.unizar.es:78272$$particulos$$pdriver
000078272 951__ $$a2019-11-26-13:41:18
000078272 980__ $$aARTICLE