Abstract: Human action recognition systems are typically focused on identifying different actions rather than fine-grained variations of the same action. This work explores strategies to identify different pointing directions in order to build a natural interaction system for guiding autonomous systems such as drones. Commanding a drone with hand-held panels or tablets is common practice, but intuitive user-drone interfaces could offer significant benefits. The system proposed in this work only requires the user to provide occasional high-level navigation commands by pointing towards the desired motion direction. Due to the lack of data for this setting, we present a new benchmark video dataset to validate our framework and to facilitate future research in the area. Our results show good accuracy for pointing direction recognition, while running at interactive rates and exhibiting robustness to variability in user appearance, viewpoint, camera distance, and scenery.

Language: English
DOI: 10.1109/CVPRW50498.2020.00528
Year: 2020
Published in: IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, 2020-June (2020), pp. 4480-4488
ISSN: 2160-7508
SCIMAGO impact factor: 1.122 - Electrical and Electronic Engineering - Computer Vision and Pattern Recognition
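
To make the pointing-direction recognition task described in the abstract concrete, below is a minimal sketch, not the paper's actual method: it assumes 2D arm keypoints from an off-the-shelf pose estimator and bins the elbow-to-wrist vector into discrete direction classes. The function name, the 8-way binning, and the coordinate convention are all illustrative assumptions.

```python
# Hypothetical illustration only -- not the method proposed in the paper.
# Assumes 2D keypoints in image coordinates (y grows downward), e.g. from
# an off-the-shelf pose estimator.
import math

def pointing_direction(elbow_xy, wrist_xy, num_bins=8):
    """Bin the elbow-to-wrist vector into one of `num_bins` directions."""
    dx = wrist_xy[0] - elbow_xy[0]
    dy = wrist_xy[1] - elbow_xy[1]
    angle = math.atan2(dy, dx) % (2 * math.pi)  # angle in [0, 2*pi)
    bin_width = 2 * math.pi / num_bins
    # Shift by half a bin so class 0 is centered on angle 0 ("right").
    return int(((angle + bin_width / 2) // bin_width) % num_bins)

# Example: wrist up and to the right of the elbow (negative dy = "up"
# in image coordinates) falls in the diagonal up-right bin.
print(pointing_direction((100.0, 200.0), (160.0, 140.0)))  # -> 7
```

A real system would also need per-frame pose estimation, temporal smoothing over the video, and a way to map image-plane directions to navigation commands; this sketch only shows the geometric core of classifying a pointing gesture into discrete directions.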