Open Access
Puck tracking system for aerohockey game with YOLO2
Author(s) -
A. E. Tolmacheva,
D. A. Ogurcov,
Mikhail Dorrer
Publication year - 2019
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1399/3/033116
Subject(s) - robot, artificial intelligence, convolutional neural network, computer science, task (project management), computer vision, tracking (education), tracking system, artificial neural network, deep learning, trajectory, human–computer interaction, engineering, systems engineering, Kalman filter, psychology, pedagogy, physics, astronomy
This article is devoted to the preparation of the YOLO2 convolutional neural network for use in the tracking module of a robot's artificial intelligence for air hockey competitions in the human-robot or robot-robot category. Such developments are unique in the field of artificial intelligence and robotic technologies. The task was to collect and prepare training material, and then to train and test the convolutional neural network in order to give the robot "vision" and to predict the trajectory of the detected object. For this, the YOLO2 model was used, created by Joseph Redmon and Ali Farhadi in the low-level language C [1]. Testing in a subsequent game yielded a correct detection rate of around 80%, despite a non-standard camera position and poor image quality. In the future, the system can be used in the further development of robot AI systems and for devising tactics of behaviour in various air hockey matches.
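
The article itself provides no code, so the following is only a rough sketch of the kind of pipeline the abstract describes: detecting the puck in each video frame with a YOLOv2 (Darknet) model and predicting its next position. It uses OpenCV's DNN module rather than the original C Darknet runtime, and the file names, the single "puck" class, the confidence threshold, the input video, and the simple linear extrapolation are all assumptions for illustration; the authors' actual module and trajectory model may differ.

```python
# Minimal sketch (not the authors' code): YOLOv2 puck detection via OpenCV DNN
# plus naive linear extrapolation of the puck trajectory.
import cv2

CFG = "yolov2-puck.cfg"          # assumed single-class Darknet config
WEIGHTS = "yolov2-puck.weights"  # assumed trained weights
CONF_THRESHOLD = 0.5             # assumed confidence threshold

net = cv2.dnn.readNetFromDarknet(CFG, WEIGHTS)
out_names = net.getUnconnectedOutLayersNames()

def detect_puck(frame):
    """Return the (x, y) centre of the most confident detection, or None."""
    h, w = frame.shape[:2]
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
    net.setInput(blob)
    best, best_conf = None, CONF_THRESHOLD
    for output in net.forward(out_names):
        for det in output:  # det = [cx, cy, w, h, objectness, class scores...]
            conf = det[4] * det[5:].max()
            if conf > best_conf:
                best_conf = conf
                best = (det[0] * w, det[1] * h)  # coordinates are normalised
    return best

def predict_next(positions, steps=1):
    """Linear extrapolation from the last two detected centres."""
    (x0, y0), (x1, y1) = positions[-2], positions[-1]
    return (x1 + steps * (x1 - x0), y1 + steps * (y1 - y0))

if __name__ == "__main__":
    cap = cv2.VideoCapture("air_hockey.mp4")  # assumed input video
    track = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        centre = detect_puck(frame)
        if centre is not None:
            track.append(centre)
        if len(track) >= 2:
            print("predicted next position:", predict_next(track))
    cap.release()
```

A real system would likely smooth the detections (e.g. with a Kalman filter, one of the listed subjects) and account for table-edge rebounds rather than extrapolating a straight line.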
