
A Study on Intelligent Guide Stick Using YOLOv3 Algorithm – Improving Spatial Awareness with Self-made Data Set
Author(s) -
Stone Yan
Publication year - 2021
Publication title -
Proceedings of Business and Economic Studies
Language(s) - English
Resource type - Journals
eISSN - 2209-265X
pISSN - 2209-2641
DOI - 10.26689/pbes.v4i3.2203
Subject(s) - set (abstract data type) , computer science , computer vision , software , artificial intelligence , population , human–computer interaction , medicine , environmental health , programming language
The increasing neglect of "blind lanes" (tactile paving) on the streets of metropolises such as Shanghai is an inconvenience to the blind population; therefore, alternatives should be explored. These lanes should be free of obstacles for their users, yet parked bicycles and littered objects are commonly found on them, which can make walking on them more dangerous than walking on non-tactile pavement. An alternative has been developed: a camera attached to a blind staff that provides auditory feedback about the user's surroundings. Using YOLOv3 (You Only Look Once, Version 3), the software is trained on 140 images to identify three classes, namely blind lanes, waist-high obstacles, and dog feces, as well as the direction of these objects. If the camera captures any of these three categories, the system provides voice feedback, warning the user. With this system, the blind can essentially gain functional vision that better guarantees their safety when walking on the streets.
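The abstract describes a pipeline in which YOLOv3 detections over three classes are converted into spoken warnings. The following is a minimal sketch of that detection-to-feedback step, not the authors' implementation: the class names, warning phrases, and confidence threshold are assumptions, and the detections are stand-in tuples rather than the output of a trained YOLOv3 model.

```python
# Hypothetical sketch of the detection-to-voice-feedback step; the real
# system would obtain detections from YOLOv3 inference on camera frames
# and speak the warnings through a text-to-speech engine.

CLASSES = ["blind_lane", "waist_high_obstacle", "dog_feces"]  # assumed labels
CONF_THRESHOLD = 0.5  # assumed confidence cutoff

WARNINGS = {
    "blind_lane": "Blind lane ahead.",
    "waist_high_obstacle": "Warning: obstacle at waist height.",
    "dog_feces": "Caution: dog feces on the path.",
}


def feedback_for(detections):
    """Map (class_id, confidence) pairs to warning messages.

    Only detections at or above the confidence threshold produce a
    warning, mirroring how a detector's low-confidence outputs are
    typically filtered before alerting the user.
    """
    messages = []
    for class_id, confidence in detections:
        if confidence >= CONF_THRESHOLD:
            messages.append(WARNINGS[CLASSES[class_id]])
    return messages


# Example: one confident obstacle detection and one low-confidence lane
# detection; only the obstacle triggers a warning.
print(feedback_for([(1, 0.83), (0, 0.31)]))
```

In the full system each returned message would be passed to a speech synthesizer so the user hears the warning in real time.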