000071183 001__ 71183
000071183 005__ 20190902095627.0
000071183 0247_ $$2doi$$a10.1109/LRA.2018.2860039
000071183 0248_ $$2sideral$$a106907
000071183 037__ $$aART-2018-106907
000071183 041__ $$aeng
000071183 100__ $$aBescós, Berta$$uUniversidad de Zaragoza
000071183 245__ $$aDynaSLAM: Tracking, Mapping and Inpainting in Dynamic Scenes
000071183 260__ $$c2018
000071183 5060_ $$aAccess copy available to the general public$$fUnrestricted
000071183 5203_ $$aThe assumption of scene rigidity is typical in SLAM algorithms. Such a strong assumption limits the use of most visual SLAM systems in populated real-world environments, which are the target of several relevant applications like service robotics or autonomous vehicles. In this paper we present DynaSLAM, a visual SLAM system that, building on ORB-SLAM2 [1], adds the capabilities of dynamic object detection and background inpainting. DynaSLAM is robust in dynamic scenarios for monocular, stereo and RGB-D configurations. We are capable of detecting the moving objects either by multi-view geometry, deep learning or both. Having a static map of the scene allows inpainting the frame background that has been occluded by such dynamic objects. We evaluate our system in public monocular, stereo and RGB-D datasets. We study the impact of several accuracy/speed trade-offs to assess the limits of the proposed methodology. DynaSLAM outperforms the accuracy of standard visual SLAM baselines in highly dynamic scenarios. It also estimates a map of the static parts of the scene, which is a must for long-term applications in real-world environments.
000071183 536__ $$9info:eu-repo/grantAgreement/ES/DGA/T04$$9info:eu-repo/grantAgreement/ES/MINECO/BES-2016-077836$$9info:eu-repo/grantAgreement/ES/MINECO/DPI2015-67275$$9info:eu-repo/grantAgreement/ES/MINECO/DPI2015-68905-P
000071183 540__ $$9info:eu-repo/semantics/openAccess$$aby$$uhttp://creativecommons.org/licenses/by/3.0/es/
000071183 655_4 $$ainfo:eu-repo/semantics/article$$vinfo:eu-repo/semantics/publishedVersion
000071183 700__ $$aFácil, José María$$uUniversidad de Zaragoza
000071183 700__ $$0(orcid)0000-0003-1368-1151$$aCivera, Javier$$uUniversidad de Zaragoza
000071183 700__ $$0(orcid)0000-0003-0668-977X$$aNeira, José$$uUniversidad de Zaragoza
000071183 7102_ $$15007$$2520$$aUniversidad de Zaragoza$$bDpto. Informát.Ingenie.Sistms.$$cÁrea Ingen.Sistemas y Automát.
000071183 7102_ $$15007$$2570$$aUniversidad de Zaragoza$$bDpto. Informát.Ingenie.Sistms.$$cÁrea Lenguajes y Sistemas Inf.
000071183 773__ $$g3, 4 (2018), 4076 - 4083$$pIEEE Robot. autom. let.$$tIEEE ROBOTICS AND AUTOMATION LETTERS$$x2377-3766
000071183 8564_ $$s3349022$$uhttps://zaguan.unizar.es/record/71183/files/texto_completo.pdf$$yPublished version
000071183 8564_ $$s134226$$uhttps://zaguan.unizar.es/record/71183/files/texto_completo.jpg?subformat=icon$$xicon$$yPublished version
000071183 909CO $$ooai:zaguan.unizar.es:71183$$particulos$$pdriver
000071183 951__ $$a2019-09-02-09:52:29
000071183 980__ $$aARTICLE