<?xml version="1.0" encoding="UTF-8"?>
<collection>
<dc:dc xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:invenio="http://invenio-software.org/elements/1.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd"><dc:identifier>doi:10.1016/j.isatra.2023.08.013</dc:identifier><dc:language>eng</dc:language><dc:creator>Aldana-López, Rodrigo</dc:creator><dc:creator>Aragüés, Rosario</dc:creator><dc:creator>Sagüés, Carlos</dc:creator><dc:title>PLATE: A perception-latency aware estimator</dc:title><dc:identifier>ART-2023-136053</dc:identifier><dc:description>Target tracking is a popular problem with many potential applications. Considerable effort has been devoted to improving the quality of camera-based target detection through different techniques. In general, applying higher computational effort, i.e., a longer perception-latency, yields better detection accuracy. However, it is not always useful to apply the longest perception-latency allowed, particularly when the environment does not require it or when computational resources are shared with other tasks. In this work, we propose a new Perception-LATency aware Estimator (PLATE), which uses different perception configurations at different moments in time in order to optimize a certain performance measure. This measure takes into account a perception-latency and accuracy trade-off, aiming for a good compromise between quality and resource usage. Compared to other heuristic frame-skipping techniques, PLATE comes with a formal complexity and optimality analysis. The advantages of PLATE are verified by several experiments, including an evaluation over a standard benchmark with real data using state-of-the-art deep learning object detection methods for the perception stage.</dc:description><dc:date>2023</dc:date><dc:source>http://zaguan.unizar.es/record/129653</dc:source><dc:doi>10.1016/j.isatra.2023.08.013</dc:doi><dc:identifier>http://zaguan.unizar.es/record/129653</dc:identifier><dc:identifier>oai:zaguan.unizar.es:129653</dc:identifier><dc:identifier.citation>ISA TRANSACTIONS 142 (2023), 716-730</dc:identifier.citation><dc:rights>by-nc-nd</dc:rights><dc:rights>https://creativecommons.org/licenses/by-nc-nd/4.0/deed.es</dc:rights><dc:rights>info:eu-repo/semantics/openAccess</dc:rights></dc:dc>

</collection>