Towards a Dataset of Activities for Action Recognition in Open Fields
Author(s) -
Alexander Gabriel,
Nicola Bellotto,
Paul Baxter
Publication year - 2019
Publication title -
Journal of Robotics and Autonomous Systems
Language(s) - English
Resource type - Conference proceedings
ISSN - 2516-502X
DOI - 10.31256/ukras19.17
Subject(s) - computer science , robotics , human–computer interaction , human–robot interaction , action recognition , artificial intelligence , mobile robot , gesture , agricultural engineering
In an agricultural context, autonomous robots that can work side-by-side with human workers offer a range of productivity benefits. For this to be achieved safely and effectively, these robots require the ability to understand a range of human behaviors in order to facilitate task communication and coordination. The recognition of human actions is a key part of this and is the focus of this paper. Available datasets for action recognition generally feature controlled lighting and framing while recording subjects from the front. They mostly reflect good recording conditions but fail to model the data a robot will have to work with in the field, such as varying distance and lighting conditions. In this work, we propose a set of recording conditions, gestures, and behaviors that better reflect the environment an agricultural robot might find itself in, and we record a dataset with a range of sensors under these conditions.