000078835 001__ 78835
000078835 005__ 20210223082304.0
000078835 0247_ $$2doi$$a10.1109/LRA.2018.2889156
000078835 0248_ $$2sideral$$a111186
000078835 037__ $$aART-2019-111186
000078835 041__ $$aeng
000078835 100__ $$0(orcid)0000-0002-3821-1968$$aLee, S.H.$$uUniversidad de Zaragoza
000078835 245__ $$aLoosely-Coupled Semi-Direct Monocular SLAM
000078835 260__ $$c2019
000078835 5060_ $$aAccess copy available to the general public$$fUnrestricted
000078835 5203_ $$aWe propose a novel semi-direct approach for monocular simultaneous localization and mapping (SLAM) that combines the complementary strengths of direct and feature-based methods. The proposed pipeline loosely couples direct odometry and feature-based SLAM to perform three levels of parallel optimizations: 1) photometric bundle adjustment (BA) that jointly optimizes the local structure and motion, 2) geometric BA that refines keyframe poses and associated feature map points, and 3) pose graph optimization to achieve global map consistency in the presence of loop closures. This is achieved in real-time by limiting the feature-based operations to marginalized keyframes from the direct odometry module. Exhaustive evaluation on two benchmark datasets demonstrates that our system outperforms the state-of-the-art monocular odometry and SLAM systems in terms of overall accuracy and robustness.
000078835 540__ $$9info:eu-repo/semantics/openAccess$$aAll rights reserved$$uhttp://www.europeana.eu/rights/rr-f/
000078835 590__ $$a3.608$$b2019
000078835 592__ $$a1.555$$b2019
000078835 591__ $$aROBOTICS$$b6 / 28 = 0.214$$c2019$$dQ1$$eT1
000078835 593__ $$aArtificial Intelligence$$c2019$$dQ1
000078835 593__ $$aBiomedical Engineering$$c2019$$dQ1
000078835 593__ $$aComputer Science Applications$$c2019$$dQ1
000078835 593__ $$aMechanical Engineering$$c2019$$dQ1
000078835 593__ $$aControl and Optimization$$c2019$$dQ1
000078835 593__ $$aControl and Systems Engineering$$c2019$$dQ1
000078835 593__ $$aHuman-Computer Interaction$$c2019$$dQ1
000078835 593__ $$aComputer Vision and Pattern Recognition$$c2019$$dQ1
000078835 655_4 $$ainfo:eu-repo/semantics/article$$vinfo:eu-repo/semantics/acceptedVersion
000078835 700__ $$0(orcid)0000-0003-1368-1151$$aCivera, J.$$uUniversidad de Zaragoza
000078835 7102_ $$15007$$2520$$aUniversidad de Zaragoza$$bDpto. Informát.Ingenie.Sistms.$$cÁrea Ingen.Sistemas y Automát.
000078835 773__ $$g4, 2 (2019), 399-406$$pIEEE Robot. autom. let.$$tIEEE ROBOTICS AND AUTOMATION LETTERS$$x2377-3766
000078835 8564_ $$s811810$$uhttps://zaguan.unizar.es/record/78835/files/texto_completo.pdf$$yPostprint
000078835 8564_ $$s116314$$uhttps://zaguan.unizar.es/record/78835/files/texto_completo.jpg?subformat=icon$$xicon$$yPostprint
000078835 909CO $$ooai:zaguan.unizar.es:78835$$particulos$$pdriver
000078835 951__ $$a2021-02-23-08:20:43
000078835 980__ $$aARTICLE