000144803 001__ 144803
000144803 005__ 20251113150205.0
000144803 0247_ $$2doi$$a10.1109/TIV.2024.3418525
000144803 0248_ $$2sideral$$a139594
000144803 037__ $$aART-2025-139594
000144803 041__ $$aeng
000144803 100__ $$aLiu, Changxiang
000144803 245__ $$aPE-VINS: Accurate Monocular Visual-Inertial SLAM With Point-Edge Features
000144803 260__ $$c2025
000144803 5060_ $$aAccess copy available to the general public$$fUnrestricted
000144803 5203_ $$aVisual-Inertial Navigation Systems (VINS) are a significant research topic in computer vision, robotics, and autonomous driving. Currently, point-line VINS have attracted significant attention due to their increased robustness and accuracy compared to point-only VINS. However, their effectiveness relies on the existence of clear line structures within the scene. Point-line VINS may become inaccurate or fail when scenes contain scattered lines or other features such as arcs. Moreover, extracting and matching line features can incur computational overhead due to their complex geometric models. To address these VINS challenges without the overhead associated with lines, we propose a novel approach, denoted PE-VINS, which adds edge features to point-based VINS. Our proposed method employs edge features in scenes to establish extra correspondences between views, thereby enhancing accuracy and robustness. Our method identifies edge features using image gradients and selects the most informative ones in the front end. We leverage sparse optical flow to track the selected edge features and triangulate them using the initial pose predicted by the Inertial Measurement Unit (IMU). In the back end, we present a novel edge feature residual formulation that differs from the traditional reprojection residual. We tightly couple the new edge residual with the reprojection and IMU preintegration residuals to better refine camera poses. We test our PE-VINS on public datasets, and our results show that it outperforms existing point-line-based methods and achieves state-of-the-art VINS performance. The code will be released at https://github.com/BlueAkoasm/PE-VINS .
000144803 540__ $$9info:eu-repo/semantics/openAccess$$aAll rights reserved$$uhttp://www.europeana.eu/rights/rr-f/
000144803 590__ $$a14.3$$b2024
000144803 592__ $$a2.821$$b2024
000144803 591__ $$aCOMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE$$b6 / 204 = 0.029$$c2024$$dQ1$$eT1
000144803 593__ $$aArtificial Intelligence$$c2024$$dQ1
000144803 591__ $$aTRANSPORTATION SCIENCE & TECHNOLOGY$$b4 / 77 = 0.052$$c2024$$dQ1$$eT1
000144803 593__ $$aControl and Optimization$$c2024$$dQ1
000144803 591__ $$aENGINEERING, ELECTRICAL & ELECTRONIC$$b7 / 366 = 0.019$$c2024$$dQ1$$eT1
000144803 593__ $$aAutomotive Engineering$$c2024$$dQ1
000144803 655_4 $$ainfo:eu-repo/semantics/article$$vinfo:eu-repo/semantics/acceptedVersion
000144803 700__ $$aYu, Hongshan
000144803 700__ $$aCheng, Panfei
000144803 700__ $$aSun, Wei
000144803 700__ $$0(orcid)0000-0003-1368-1151$$aCivera, Javier$$uUniversidad de Zaragoza
000144803 700__ $$aChen, Xieyuanli
000144803 7102_ $$15007$$2520$$aUniversidad de Zaragoza$$bDpto. Informát.Ingenie.Sistms.$$cÁrea Ingen.Sistemas y Automát.
000144803 773__ $$g10, 2 (2025), 808 - 818$$pIEEE trans. intell. veh.$$tIEEE transactions on intelligent vehicles$$x2379-8858
000144803 8564_ $$s14016338$$uhttps://zaguan.unizar.es/record/144803/files/texto_completo.pdf$$yPostprint
000144803 8564_ $$s3661713$$uhttps://zaguan.unizar.es/record/144803/files/texto_completo.jpg?subformat=icon$$xicon$$yPostprint
000144803 909CO $$ooai:zaguan.unizar.es:144803$$particulos$$pdriver
000144803 951__ $$a2025-11-13-15:00:48
000144803 980__ $$aARTICLE