000128011 001__ 128011
000128011 005__ 20241125101129.0
000128011 0247_ $$2doi$$a10.1002/rob.22232
000128011 0248_ $$2sideral$$a135109
000128011 037__ $$aART-2023-135109
000128011 041__ $$aeng
000128011 100__ $$aCremona, Javier
000128011 245__ $$aGNSS-stereo-inertial SLAM for arable farming
000128011 260__ $$c2023
000128011 5060_ $$aAccess copy available to the general public$$fUnrestricted
000128011 5203_ $$aThe accelerating pace in the automation of agricultural tasks demands highly accurate and robust localization systems for field robots. Simultaneous Localization and Mapping (SLAM) methods inevitably accumulate drift on exploratory trajectories and primarily rely on place revisiting and loop closing to keep a bounded global localization error. Loop closure techniques are significantly challenging in agricultural fields, as the local visual appearance of different views is very similar and might change easily due to weather effects. A suitable alternative in practice is to employ global sensor positioning systems jointly with the rest of the robot sensors. In this paper we propose and implement the fusion of global navigation satellite system (GNSS), stereo views, and inertial measurements for localization purposes. Specifically, we incorporate, in a tightly coupled manner, GNSS measurements into the stereo-inertial ORB-SLAM3 pipeline. We thoroughly evaluate our implementation in the sequences of the Rosario data set, recorded by an autonomous robot in soybean fields, and our own in-house data. Our data includes measurements from a conventional GNSS, rarely included in evaluations of state-of-the-art approaches. We characterize the performance of GNSS-stereo-inertial SLAM in this application case, reporting pose error reductions between 10% and 30% compared to visual–inertial and loosely coupled GNSS-stereo-inertial baselines. In addition to such analysis, we also release the code of our implementation as open source.
000128011 536__ $$9info:eu-repo/grantAgreement/ES/DGA/T45-17R$$9info:eu-repo/grantAgreement/ES/MCIU-AEI-FEDER/PGC2018-096367-B-I00$$9info:eu-repo/grantAgreement/ES/MICINN/PID2021-127685NB-I00
000128011 540__ $$9info:eu-repo/semantics/openAccess$$aAll rights reserved$$uhttp://www.europeana.eu/rights/rr-f/
000128011 590__ $$a4.2$$b2023
000128011 592__ $$a1.949$$b2023
000128011 591__ $$aROBOTICS$$b14 / 46 = 0.304$$c2023$$dQ2$$eT1
000128011 593__ $$aControl and Systems Engineering$$c2023$$dQ1
000128011 593__ $$aComputer Science Applications$$c2023$$dQ1
000128011 594__ $$a15.0$$b2023
000128011 655_4 $$ainfo:eu-repo/semantics/article$$vinfo:eu-repo/semantics/submittedVersion
000128011 700__ $$0(orcid)0000-0003-1368-1151$$aCivera, Javier$$uUniversidad de Zaragoza
000128011 700__ $$aKofman, Ernesto
000128011 700__ $$aPire, Taihú
000128011 7102_ $$15007$$2520$$aUniversidad de Zaragoza$$bDpto. Informát.Ingenie.Sistms.$$cÁrea Ingen.Sistemas y Automát.
000128011 773__ $$g(2023), [11 pp.]$$pJournal of Field Robotics$$tJournal of Field Robotics$$x1556-4959
000128011 8564_ $$s8766487$$uhttps://zaguan.unizar.es/record/128011/files/texto_completo.pdf$$yPreprint
000128011 8564_ $$s2298428$$uhttps://zaguan.unizar.es/record/128011/files/texto_completo.jpg?subformat=icon$$xicon$$yPreprint
000128011 909CO $$ooai:zaguan.unizar.es:128011$$particulos$$pdriver
000128011 951__ $$a2024-11-22-11:58:24
000128011 980__ $$aARTICLE