000132203 001__ 132203
000132203 005__ 20250923084415.0
000132203 0247_ $$2doi$$a10.1016/j.cviu.2023.103876
000132203 0248_ $$2sideral$$a137365
000132203 037__ $$aART-2024-137365
000132203 041__ $$aeng
000132203 100__ $$aCasao, Sara$$uUniversidad de Zaragoza
000132203 245__ $$aDistributed multi-target tracking and active perception with mobile camera networks
000132203 260__ $$c2024
000132203 5060_ $$aAccess copy available to the general public$$fUnrestricted
000132203 5203_ $$aSmart cameras are an essential component in surveillance and monitoring applications, and they have typically been deployed in networks of fixed camera locations. The addition of mobile cameras, mounted on robots, can overcome some of the limitations of static networks, such as blind spots or back-lighting, allowing the system to gather the best information at all times through active positioning. This work presents a hybrid camera system, with static and mobile cameras, where all the cameras collaborate to observe people moving freely in the environment and to efficiently visualize certain attributes of each person. Our solution combines a multi-camera distributed tracking system, to precisely localize all the people, with a control scheme that moves the mobile cameras to the best viewpoints for a specific classification task. The main contribution of this paper is a novel framework that exploits the synergies resulting from the cooperation of the tracking and control modules, yielding a system closer to real-world applications and capable of high-level scene understanding. The static camera network provides global awareness to the control scheme that moves the robots. In exchange, the mobile cameras onboard the robots provide enhanced information about the people in the scene. We perform a thorough analysis of the performance of the people-monitoring application under different conditions, thanks to the use of a photo-realistic simulation environment. Our experiments demonstrate the benefits of collaborative mobile cameras with respect to static or individual camera setups.
000132203 540__ $$9info:eu-repo/semantics/openAccess$$aby-nc-nd$$uhttp://creativecommons.org/licenses/by-nc-nd/3.0/es/
000132203 590__ $$a3.5$$b2024
000132203 592__ $$a0.856$$b2024
000132203 591__ $$aENGINEERING, ELECTRICAL & ELECTRONIC$$b132 / 366 = 0.361$$c2024$$dQ2$$eT2
000132203 593__ $$aComputer Vision and Pattern Recognition$$c2024$$dQ1
000132203 591__ $$aCOMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE$$b84 / 204 = 0.412$$c2024$$dQ2$$eT2
000132203 593__ $$aSignal Processing$$c2024$$dQ1
000132203 593__ $$aSoftware$$c2024$$dQ2
000132203 655_4 $$ainfo:eu-repo/semantics/article$$vinfo:eu-repo/semantics/publishedVersion
000132203 700__ $$aSerra-Gómez, Álvaro
000132203 700__ $$aMurillo, Ana C.
000132203 700__ $$aBöhmer, Wendelin
000132203 700__ $$aAlonso-Mora, Javier
000132203 700__ $$0(orcid)0000-0002-5176-3767$$aMontijano, Eduardo$$uUniversidad de Zaragoza
000132203 7102_ $$15007$$2520$$aUniversidad de Zaragoza$$bDpto. Informát.Ingenie.Sistms.$$cÁrea Ingen.Sistemas y Automát.
000132203 773__ $$g238 (2024), 103876 [9 pp.]$$pComput. vis. image underst.$$tCOMPUTER VISION AND IMAGE UNDERSTANDING$$x1077-3142
000132203 787__ $$tCollaborative cameras data$$whttps://sites.google.com/unizar.es/poc-team/research/hlunderstanding/collaborativecameras
000132203 8564_ $$s1697241$$uhttps://zaguan.unizar.es/record/132203/files/texto_completo.pdf$$yPublished version
000132203 8564_ $$s2610769$$uhttps://zaguan.unizar.es/record/132203/files/texto_completo.jpg?subformat=icon$$xicon$$yPublished version
000132203 909CO $$ooai:zaguan.unizar.es:132203$$particulos$$pdriver
000132203 951__ $$a2025-09-22-14:32:35
000132203 980__ $$aARTICLE