000101256 001__ 101256
000101256 005__ 20220405150417.0
000101256 0247_ $$2doi$$a10.1109/TASE.2020.2980246
000101256 0248_ $$2sideral$$a117972
000101256 037__ $$aART-2020-117972
000101256 041__ $$aeng
000101256 100__ $$0(orcid)0000-0002-3567-3294$$aAzagra, Pablo$$uUniversidad de Zaragoza
000101256 245__ $$aIncremental Learning of Object Models From Natural Human-Robot Interactions
000101256 260__ $$c2020
000101256 5060_ $$aAccess copy available to the general public$$fUnrestricted
000101256 5203_ $$aIn order to perform complex tasks in realistic human environments, robots need to be able to learn new concepts in the wild, incrementally, and through their interactions with humans. This article presents an end-to-end pipeline to learn object models incrementally during human-robot interaction (HRI). The pipeline we propose consists of three parts: 1) recognizing the interaction type; 2) detecting the object that the interaction is targeting; and 3) incrementally learning the models from data recorded by the robot sensors. Our main contributions lie in the target object detection, guided by the recognized interaction, and in the incremental object learning. The novelty of our approach is the focus on natural, heterogeneous, and multimodal HRIs to incrementally learn new object models. Throughout the article, we highlight the main challenges associated with this problem, such as a high degree of occlusion and clutter, domain change, low-resolution data, and interaction ambiguity. This article shows the benefits of using multiview approaches and combining visual and language features, and our approach outperforms standard baselines in our experiments.
000101256 536__ $$9info:eu-repo/grantAgreement/ES/DGA/T45-17R$$9info:eu-repo/grantAgreement/EC/FP7/248663/EU/European Coordinated Research on Long-term Challenges in Information and Communication Sciences and Technologies/CHIST-ERA$$9info:eu-repo/grantAgreement/ES/MCIU-AEI-FEDER/PGC2018-096367-B-I00$$9info:eu-repo/grantAgreement/ES/MCIU-AEI/RTC-2017-6421-7
000101256 540__ $$9info:eu-repo/semantics/openAccess$$aAll rights reserved$$uhttp://www.europeana.eu/rights/rr-f/
000101256 590__ $$a5.083$$b2020
000101256 591__ $$aAUTOMATION & CONTROL SYSTEMS$$b16 / 63 = 0.254$$c2020$$dQ2$$eT1
000101256 592__ $$a1.314$$b2020
000101256 593__ $$aElectrical and Electronic Engineering$$c2020$$dQ1
000101256 593__ $$aControl and Systems Engineering$$c2020$$dQ1
000101256 655_4 $$ainfo:eu-repo/semantics/article$$vinfo:eu-repo/semantics/acceptedVersion
000101256 700__ $$0(orcid)0000-0003-1368-1151$$aCivera, Javier$$uUniversidad de Zaragoza
000101256 700__ $$0(orcid)0000-0002-7580-9037$$aMurillo, Ana C.$$uUniversidad de Zaragoza
000101256 7102_ $$15007$$2520$$aUniversidad de Zaragoza$$bDpto. Informát.Ingenie.Sistms.$$cÁrea Ingen.Sistemas y Automát.
000101256 773__ $$g17, 4 (2020), 1883 - 1900$$pIEEE Trans. Autom. Sci. Eng.$$tIEEE Transactions on Automation Science and Engineering$$x1545-5955
000101256 8564_ $$s12099107$$uhttps://zaguan.unizar.es/record/101256/files/texto_completo.pdf$$yPostprint
000101256 8564_ $$s3063519$$uhttps://zaguan.unizar.es/record/101256/files/texto_completo.jpg?subformat=icon$$xicon$$yPostprint
000101256 909CO $$ooai:zaguan.unizar.es:101256$$particulos$$pdriver
000101256 951__ $$a2022-04-05-14:37:51
000101256 980__ $$aARTICLE