000148181 001__ 148181 000148181 005__ 20250114175434.0 000148181 0247_ $$2doi$$a10.1109/LRA.2021.3057568 000148181 0248_ $$2sideral$$a124071 000148181 037__ $$aART-2021-124071 000148181 041__ $$aeng 000148181 100__ $$0(orcid)0000-0001-6406-7042$$aTeruel, E.$$uUniversidad de Zaragoza 000148181 245__ $$aA Practical Method to Cover Evenly a Dynamic Region with a Swarm 000148181 260__ $$c2021 000148181 5060_ $$aAccess copy available to the general public$$fUnrestricted 000148181 5203_ $$aMany applications require exploring or monitoring a region. This can be achieved by a sensor network: a large team of robots, each of which can cover only a very small fraction of the region. When the region is convex, small, and static, it suffices to deploy the robots as a Centroidal Voronoi Tessellation (CVT). Instead, we consider that the area to cover is wide, not necessarily convex, and complex. Then, a smaller simple region is maneuvered and deformed to rake the full area. A few waypoints describing the region along time are provided to the robots. The goal is that the robots coordinate to dynamically deploy over this region evenly, near a CVT. Unfortunately, the distributed CVT computation algorithm converges too slowly for such an exploration method to be practical. In this work, CVT computation is complemented with feedback- and feedforward-based control techniques, and dynamic consensus, to adjust the robot speeds so that they coordinate to cover the dynamic region. We demonstrate in simulation that the proposed method succeeds in tracking the region, with the robots evenly deployed, while maintaining connectivity and avoiding collisions. We also compare the performance of the proposed method with that of other alternatives.
000148181 536__ $$9info:eu-repo/grantAgreement/ES/MCIU-AEI-FEDER/PGC2018-098719-B-I00 000148181 540__ $$9info:eu-repo/semantics/openAccess$$aby$$uhttp://creativecommons.org/licenses/by/3.0/es/ 000148181 590__ $$a4.321$$b2021 000148181 591__ $$aROBOTICS$$b11 / 30 = 0.367$$c2021$$dQ2$$eT2 000148181 592__ $$a2.206$$b2021 000148181 593__ $$aArtificial Intelligence$$c2021$$dQ1 000148181 593__ $$aBiomedical Engineering$$c2021$$dQ1 000148181 593__ $$aMechanical Engineering$$c2021$$dQ1 000148181 593__ $$aControl and Optimization$$c2021$$dQ1 000148181 593__ $$aControl and Systems Engineering$$c2021$$dQ1 000148181 593__ $$aComputer Vision and Pattern Recognition$$c2021$$dQ1 000148181 594__ $$a8.0$$b2021 000148181 655_4 $$ainfo:eu-repo/semantics/article$$vinfo:eu-repo/semantics/acceptedVersion 000148181 700__ $$0(orcid)0000-0001-9458-6257$$aAragues, R.$$uUniversidad de Zaragoza 000148181 700__ $$0(orcid)0000-0001-9347-5969$$aLópez-Nicolás, G.$$uUniversidad de Zaragoza 000148181 7102_ $$15007$$2520$$aUniversidad de Zaragoza$$bDpto. Informát.Ingenie.Sistms.$$cÁrea Ingen.Sistemas y Automát. 000148181 773__ $$g6, 2 (2021), 1359-1366$$pIEEE Robot. autom. let.$$tIEEE Robotics and Automation Letters$$x2377-3766 000148181 8564_ $$s756622$$uhttps://zaguan.unizar.es/record/148181/files/texto_completo.pdf$$yPostprint 000148181 8564_ $$s3486275$$uhttps://zaguan.unizar.es/record/148181/files/texto_completo.jpg?subformat=icon$$xicon$$yPostprint 000148181 909CO $$ooai:zaguan.unizar.es:148181$$particulos$$pdriver 000148181 951__ $$a2025-01-14-15:48:59 000148181 980__ $$aARTICLE