<?xml version="1.0" encoding="UTF-8"?>
<collection>
<dc:dc xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:invenio="http://invenio-software.org/elements/1.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd"><dc:identifier>doi:10.1016/j.cviu.2025.104619</dc:identifier><dc:language>eng</dc:language><dc:creator>Gallego, Nerea</dc:creator><dc:creator>Plou, Carlos</dc:creator><dc:creator>Marcos, Miguel</dc:creator><dc:creator>Urcola, Pablo</dc:creator><dc:creator>Montesano, Luis</dc:creator><dc:creator>Montijano, Eduardo</dc:creator><dc:creator>Martinez-Cantin, Ruben</dc:creator><dc:creator>Murillo, Ana C.</dc:creator><dc:title>EventSleep2: Sleep activity recognition on complete night sleep recordings with an event camera</dc:title><dc:identifier>ART-2026-147979</dc:identifier><dc:description>Sleep is fundamental to health, and society is increasingly aware of the impact and relevance of sleep disorders. Traditional diagnostic methods, such as polysomnography, are intrusive and resource-intensive. Instead, research is focusing on developing novel, less intrusive or portable methods that combine intelligent sensors with activity recognition for diagnosis support and scoring. Event cameras offer a promising alternative for automated, in-home sleep activity recognition due to their excellent low-light performance and low power consumption. This work introduces EventSleep2-data, a significant extension of the EventSleep dataset, featuring 10 complete night recordings (around 7 h each) of volunteers sleeping in their homes. Unlike the original short and controlled recordings, this new dataset captures natural, full-night sleep sessions under realistic conditions. This new data incorporates challenging real-world scene variations, an efficient movement-triggered sparse data recording pipeline, and synchronized 2-channel EEG data for a subset of recordings.
We also present EventSleep2-net, a novel event-based sleep activity recognition approach with a dual-head architecture to simultaneously analyze motion classes and static poses. The model is specifically designed to handle the motion-triggered, sparse nature of complete night recordings. Unlike the original EventSleep architecture, EventSleep2-net can predict both movement and static poses even during long periods with no events. We demonstrate state-of-the-art performance on both EventSleep1-data, the original dataset, and EventSleep2-data, with comprehensive ablation studies validating our design decisions. Together, EventSleep2-data and EventSleep2-net overcome the limitations of the previous setup and enable continuous, full-night analysis for real-world sleep monitoring, significantly advancing the potential of event-based vision for sleep disorder studies.</dc:description><dc:date>2026</dc:date><dc:source>http://zaguan.unizar.es/record/168539</dc:source><dc:doi>10.1016/j.cviu.2025.104619</dc:doi><dc:identifier>http://zaguan.unizar.es/record/168539</dc:identifier><dc:identifier>oai:zaguan.unizar.es:168539</dc:identifier><dc:relation>info:eu-repo/grantAgreement/ES/DGA/T45-23R</dc:relation><dc:relation>info:eu-repo/grantAgreement/EC/H2020/101135782/EU/Trustworthy Efficient AI for Cloud-Edge Computing/MANOLO</dc:relation><dc:relation>This project has received funding from the European Union’s Horizon 2020 research and innovation program under grant agreement No H2020 101135782-MANOLO</dc:relation><dc:relation>info:eu-repo/grantAgreement/ES/MICINN-AEI/PID2021-125209OB-I00</dc:relation><dc:relation>info:eu-repo/grantAgreement/ES/MICINN/PID2021-125514NB-I00</dc:relation><dc:relation>info:eu-repo/grantAgreement/ES/MICINN/PID2024-158322OB-I00</dc:relation><dc:relation>info:eu-repo/grantAgreement/ES/MICINN/PID2024-159284NB-I00</dc:relation><dc:identifier.citation>COMPUTER VISION AND IMAGE UNDERSTANDING 264 (2026), 104619 [13 pp.]</dc:identifier.citation><dc:rights>by-nc</dc:rights><dc:rights>https://creativecommons.org/licenses/by-nc/4.0/deed.es</dc:rights><dc:rights>info:eu-repo/semantics/openAccess</dc:rights></dc:dc>

</collection>