Distributed multi-target tracking and active perception with mobile camera networks
Abstract: Smart cameras are an essential component in surveillance and monitoring applications, and they have typically been deployed in networks at fixed locations. The addition of mobile cameras, mounted on robots, can overcome some of the limitations of static networks, such as blind spots or backlighting, allowing the system to gather the best information at each moment through active positioning. This work presents a hybrid camera system, with static and mobile cameras, where all the cameras collaborate to observe people moving freely in the environment and to efficiently visualize certain attributes of each person. Our solution combines a multi-camera distributed tracking system, which localizes all the people with precision, with a control scheme that moves the mobile cameras to the best viewpoints for a specific classification task. The main contribution of this paper is a novel framework that exploits the synergies resulting from the cooperation of the tracking and control modules, yielding a system closer to the real-world application and capable of high-level scene understanding. The static camera network provides global awareness to the control scheme that moves the robots. In exchange, the mobile cameras onboard the robots provide enhanced information about the people in the scene. Thanks to the use of a photo-realistic simulation environment, we perform a thorough analysis of the performance of the people-monitoring application under different conditions. Our experiments demonstrate the benefits of collaborative mobile cameras with respect to static or individual camera setups.
Language: English
DOI: 10.1016/j.cviu.2023.103876
Year: 2024
Published in: COMPUTER VISION AND IMAGE UNDERSTANDING 238 (2024), 103876 [9 pp.]
ISSN: 1077-3142

JCR impact factor: 3.5 (2024)
JCR category: ENGINEERING, ELECTRICAL & ELECTRONIC rank: 132 / 366 = 0.361 (2024) - Q2 - T2
JCR category: COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE rank: 84 / 204 = 0.412 (2024) - Q2 - T2

SCImago impact factor (SJR): 0.856 - Computer Vision and Pattern Recognition (Q1) - Signal Processing (Q1) - Software (Q2)

Type and form: Article (Published version)
Area (Department): Systems Engineering and Automation (Dept. of Computer Science and Systems Engineering)
Associated dataset: Collaborative cameras data (https://sites.google.com/unizar.es/poc-team/research/hlunderstanding/collaborativecameras)
Exported from SIDERAL (2025-09-22-14:32:35)





Record created 2024-03-01, last modified 2025-09-23


Published version: PDF