Distributed multi-target tracking and active perception with mobile camera networks
Abstract: Smart cameras are an essential component of surveillance and monitoring applications, and they have typically been deployed in networks at fixed locations. Adding mobile cameras mounted on robots can overcome some of the limitations of static networks, such as blind spots or backlighting, allowing the system to gather the best information at each moment through active positioning. This work presents a hybrid camera system, with static and mobile cameras, in which all the cameras collaborate to observe people moving freely in the environment and to efficiently visualize certain attributes of each person. Our solution combines a multi-camera distributed tracking system, which localizes all the people with precision, with a control scheme that moves the mobile cameras to the best viewpoints for a specific classification task. The main contribution of this paper is a novel framework that exploits the synergies arising from the cooperation of the tracking and control modules, yielding a system closer to the real-world application and capable of high-level scene understanding. The static camera network provides global awareness to the control scheme that moves the robots. In exchange, the mobile cameras onboard the robots provide enhanced information about the people in the scene. We perform a thorough analysis of the performance of the people-monitoring application under different conditions, enabled by a photo-realistic simulation environment. Our experiments demonstrate the benefits of collaborative mobile cameras over static or individual camera setups.
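The abstract describes two cooperating modules: a distributed multi-camera tracker that fuses person detections into precise position estimates, and a controller that moves the mobile cameras to the most informative viewpoints. The record contains no implementation details, so the Python sketch below is only a minimal illustration under assumed models: a covariance-weighted (Kalman-style) fusion step for the tracker and a greedy utility over a discrete set of candidate poses for the controller. The names (`FusedTrack`, `next_best_viewpoint`) and the utility function are hypothetical, not the paper's actual algorithm.

```python
# Hypothetical sketch of the two cooperating modules described in the
# abstract. The fusion rule and viewpoint utility are illustrative
# assumptions, not the method published in the paper.
import numpy as np

class FusedTrack:
    """Per-person 2D ground-plane position fused across cameras."""
    def __init__(self, track_id, position, cov):
        self.id = track_id
        self.pos = np.asarray(position, dtype=float)  # (x, y) in metres
        self.cov = np.asarray(cov, dtype=float)       # 2x2 uncertainty

    def fuse(self, measurement, meas_cov):
        # Covariance-weighted (Kalman-style) fusion of one camera's
        # detection into the current track estimate.
        K = self.cov @ np.linalg.inv(self.cov + meas_cov)  # gain
        self.pos = self.pos + K @ (np.asarray(measurement) - self.pos)
        self.cov = (np.eye(2) - K) @ self.cov


def viewpoint_utility(viewpoint, track, preferred_range=3.0):
    """Toy utility: prefer poses near a nominal stand-off distance,
    with a bonus for observing tracks that are still uncertain."""
    dist = np.linalg.norm(np.asarray(viewpoint) - track.pos)
    range_score = np.exp(-((dist - preferred_range) ** 2) / 2.0)
    uncertainty_bonus = np.trace(track.cov)
    return range_score + 0.1 * uncertainty_bonus


def next_best_viewpoint(candidate_viewpoints, tracks):
    """Greedy choice: the candidate pose maximising summed utility."""
    scored = [(sum(viewpoint_utility(v, t) for t in tracks), v)
              for v in candidate_viewpoints]
    return max(scored, key=lambda s: s[0])[1]


if __name__ == "__main__":
    # Two people tracked by the static network, with differing certainty.
    tracks = [FusedTrack(0, [2.0, 1.0], 0.2 * np.eye(2)),
              FusedTrack(1, [6.0, 4.0], 1.5 * np.eye(2))]
    # A new detection from one static camera refines track 0.
    tracks[0].fuse([2.3, 1.1], 0.5 * np.eye(2))
    # The mobile camera picks its next pose from a discrete candidate set.
    candidates = [(0.0, 0.0), (4.0, 2.0), (7.0, 6.0)]
    print("next viewpoint:", next_best_viewpoint(candidates, tracks))
```

This captures the exchange the abstract highlights: the static network's fused estimates give the controller global awareness, and the controller repositions the mobile camera to improve what is known about each person.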
Language: English
DOI: 10.1016/j.cviu.2023.103876
Year: 2024
Published in: COMPUTER VISION AND IMAGE UNDERSTANDING 238 (2024), 103876 [9 pp.]
ISSN: 1077-3142

JCR impact factor: 3.5 (2024)
JCR category: ENGINEERING, ELECTRICAL & ELECTRONIC, rank: 132/366 = 0.361 (2024) - Q2 - T2
JCR category: COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE, rank: 84/204 = 0.412 (2024) - Q2 - T2

SCImago impact factor: 0.856 - Computer Vision and Pattern Recognition (Q1) - Signal Processing (Q1) - Software (Q2)

Type and form: Article (published version)
Area (Department): Systems Engineering and Automation Area (Department of Computer Science and Systems Engineering)
Associated dataset: Collaborative cameras data ( https://sites.google.com/unizar.es/poc-team/research/hlunderstanding/collaborativecameras )

Creative Commons: You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use. You may not use the material for commercial purposes. If you remix, transform, or build upon the material, you may not distribute the modified material.




Record created on 2024-03-01, last modified on 2025-09-23

