000168185 001__ 168185
000168185 005__ 20260128122452.0
000168185 0247_ $$2doi$$a10.1177/02783649211004959
000168185 0248_ $$2sideral$$a147683
000168185 037__ $$aART-2021-147683
000168185 041__ $$aeng
000168185 100__ $$aMorales, Jesús
000168185 245__ $$aThe UMA-SAR Dataset: Multimodal data collection from a ground vehicle during outdoor disaster response training exercises
000168185 260__ $$c2021
000168185 5203_ $$aThis article presents the UMA-SAR dataset: a collection of multimodal raw data captured from a manned all-terrain vehicle in the course of two realistic outdoor search and rescue (SAR) exercises for actual emergency responders conducted in Málaga (Spain) in 2018 and 2019. The sensor suite, applicable to unmanned ground vehicles (UGVs), consisted of overlapping visible light (RGB) and thermal infrared (TIR) forward-looking monocular cameras, a Velodyne HDL-32 three-dimensional (3D) lidar, an inertial measurement unit (IMU), and two global positioning system (GPS) receivers as ground truth. Our mission was to collect a wide range of data from the SAR domain, including persons, vehicles, debris, and SAR activity on unstructured terrain. In particular, four data sequences were collected following closed-loop routes during the exercises, with a total path length of 5.2 km and a total time of 77 min. In addition, we provide three more sequences of the empty site for comparison purposes (an extra 4.9 km and 46 min). Furthermore, the data is offered both in human-readable format and as rosbag files, and two specific software tools are provided for extracting and adapting this dataset to the users' preferences. The review of previously published disaster robotics repositories indicates that this dataset can help fill a gap regarding visual and thermal datasets and can serve as a research tool for cross-cutting areas such as multispectral image fusion, machine learning for scene understanding, person and object detection, and localization and mapping in unstructured environments.
000168185 536__ $$9info:eu-repo/grantAgreement/ES/AEI/RTI2018-093421-B-I00
000168185 540__ $$9info:eu-repo/semantics/closedAccess$$aAll rights reserved$$uhttp://www.europeana.eu/rights/rr-f/
000168185 590__ $$a6.887$$b2021
000168185 591__ $$aROBOTICS$$b6 / 30 = 0.2$$c2021$$dQ1$$eT1
000168185 592__ $$a3.403$$b2021
000168185 593__ $$aApplied Mathematics$$c2021$$dQ1
000168185 593__ $$aArtificial Intelligence$$c2021$$dQ1
000168185 593__ $$aSoftware$$c2021$$dQ1
000168185 593__ $$aModeling and Simulation$$c2021$$dQ1
000168185 593__ $$aMechanical Engineering$$c2021$$dQ1
000168185 594__ $$a14.8$$b2021
000168185 655_4 $$ainfo:eu-repo/semantics/article$$vinfo:eu-repo/semantics/publishedVersion
000168185 700__ $$aVázquez-Martín, Ricardo
000168185 700__ $$aMandow, Anthony
000168185 700__ $$aMorilla-Cabello, David$$uUniversidad de Zaragoza
000168185 700__ $$aGarcía-Cerezo, Alfonso
000168185 7102_ $$15007$$2520$$aUniversidad de Zaragoza$$bDpto. Informát.Ingenie.Sistms.$$cÁrea Ingen.Sistemas y Automát.
000168185 773__ $$g40, 6-7 (2021), 835-847$$pInt. j. rob. res.$$tThe International Journal of Robotics Research$$x0278-3649
000168185 8564_ $$s3850909$$uhttps://zaguan.unizar.es/record/168185/files/texto_completo.pdf$$yVersión publicada
000168185 8564_ $$s2471002$$uhttps://zaguan.unizar.es/record/168185/files/texto_completo.jpg?subformat=icon$$xicon$$yVersión publicada
000168185 909CO $$ooai:zaguan.unizar.es:168185$$particulos$$pdriver
000168185 951__ $$a2026-01-28-11:23:02
000168185 980__ $$aARTICLE