000075365 001__ 75365
000075365 005__ 20190709135419.0
000075365 0247_ $$2doi$$a10.1016/j.cviu.2016.04.007
000075365 0248_ $$2sideral$$a95017
000075365 037__ $$aART-2017-95017
000075365 041__ $$aeng
000075365 100__ $$0(orcid)0000-0002-8949-2632$$aPerez-Yus, A.$$uUniversidad de Zaragoza
000075365 245__ $$aStairs detection with odometry-aided traversal from a wearable RGB-D camera
000075365 260__ $$c2017
000075365 5060_ $$aAccess copy available to the general public$$fUnrestricted
000075365 5203_ $$aStairs are one of the most common structures present in human-made scenarios, but also one of the most dangerous for those with vision problems. In this work we propose a complete method to detect, locate and parametrise stairs with a wearable RGB-D camera. Our algorithm uses the depth data to determine whether the horizontal planes in the scene are valid steps of a staircase, judging by their dimensions and relative positions. As a result we obtain a scaled model of the staircase with its spatial location and orientation with respect to the subject. The visual odometry is also estimated to continuously recover the current position and orientation of the user while moving. This enhances the system, giving it the ability to return to previously detected features and providing location awareness of the user during the climb. Simultaneously, the detection of the staircase during the traversal is used to correct the drift of the visual odometry. A comparison of the stair detection results with other state-of-the-art algorithms was performed using a public dataset. Additional experiments have also been carried out, recording our own natural scenes with a chest-mounted RGB-D camera in indoor scenarios. The algorithm is robust enough to work in real time, even under partial occlusions of the stair.
000075365 536__ $$9info:eu-repo/grantAgreement/ES/MINECO/BES-2013-065834$$9info:eu-repo/grantAgreement/ES/MINECO/DPI2014-61792-EXP$$9info:eu-repo/grantAgreement/ES/MINECO/DPI2015-65962-R
000075365 540__ $$9info:eu-repo/semantics/openAccess$$aby-nc-nd$$uhttp://creativecommons.org/licenses/by-nc-nd/3.0/es/
000075365 590__ $$a2.391$$b2017
000075365 591__ $$aENGINEERING, ELECTRICAL & ELECTRONIC$$b99 / 260 = 0.381$$c2017$$dQ2$$eT2
000075365 591__ $$aCOMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE$$b43 / 132 = 0.326$$c2017$$dQ2$$eT1
000075365 592__ $$a0.717$$b2017
000075365 593__ $$aComputer Vision and Pattern Recognition$$c2017$$dQ1
000075365 593__ $$aSoftware$$c2017$$dQ1
000075365 593__ $$aSignal Processing$$c2017$$dQ1
000075365 655_4 $$ainfo:eu-repo/semantics/article$$vinfo:eu-repo/semantics/acceptedVersion
000075365 700__ $$0(orcid)0000-0002-3220-6789$$aGutierrez-Gomez, D.
000075365 700__ $$0(orcid)0000-0001-9347-5969$$aLopez-Nicolas, G.$$uUniversidad de Zaragoza
000075365 700__ $$0(orcid)0000-0001-5209-2267$$aGuerrero, J. J.$$uUniversidad de Zaragoza
000075365 7102_ $$15007$$2520$$aUniversidad de Zaragoza$$bDpto. Informát.Ingenie.Sistms.$$cÁrea Ingen.Sistemas y Automát.
000075365 773__ $$g154 (2017), 192-205$$pComput. vis. image underst.$$tCOMPUTER VISION AND IMAGE UNDERSTANDING$$x1077-3142
000075365 8564_ $$s7258850$$uhttps://zaguan.unizar.es/record/75365/files/texto_completo.pdf$$yPostprint
000075365 8564_ $$s123465$$uhttps://zaguan.unizar.es/record/75365/files/texto_completo.jpg?subformat=icon$$xicon$$yPostprint
000075365 909CO $$ooai:zaguan.unizar.es:75365$$particulos$$pdriver
000075365 951__ $$a2019-07-09-11:25:22
000075365 980__ $$aARTICLE