Open Access
Robot Orientation Estimation Based on Single-Frame of Fish-eye Image
Author(s) - Muhammad Fuad, Trihastuti Agustinah, Djoko Purwanto, Tri Arief Sardjono, Rudy Dikairono
Publication year - 2020
Publication title - Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1569/2/022092
Subject(s) - computer vision , artificial intelligence , orientation (vector space) , hough transform , computer science , heading (navigation) , robot , inertial measurement unit , machine vision , mathematics , engineering , image (mathematics) , geometry , aerospace engineering
In order to develop steering control for collision avoidance behaviour, a robot must be able to determine its heading orientation with respect to its environment. Orientation can be measured by dedicated sensors or through the perception of visual features. In vision-based orientation estimation, most approaches rely on a matching process between a pair of frames. This paper proposes a method for estimating a robot's heading orientation using only a single frame of fish-eye image. The CIE-LAB colour space is applied to handle changes in colour and illumination intensity. Straight line segments are extracted from the thresholded CIE-LAB image using the Progressive Probabilistic Hough Transform. The angle of each corresponding line segment is measured using a combination of the Law of Cosines and the quadrant principle. The heading orientation, as a yaw angle, is estimated by a voting mechanism based on region grouping and the length of the perpendicular line. Experiments were conducted in a robot soccer field environment to compare the orientation estimation system against an IMU's measurements. A discussion of the performance and limitations of the system is included in this paper.
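The abstract does not give implementation details, but the angle-measurement step it names can be sketched as follows. This is a minimal reconstruction, not the authors' code: the function name, the right-triangle construction (segment, its horizontal projection, its vertical projection), and the exact quadrant convention are assumptions.

```python
import math

def segment_angle(x1, y1, x2, y2):
    """Angle of the segment (x1, y1) -> (x2, y2) in degrees, in [0, 360).

    Sketch of a Law-of-Cosines + quadrant-principle angle measurement:
    the segment, its horizontal projection, and its vertical projection
    form a right triangle; the Law of Cosines gives the base angle, and
    the signs of (dx, dy) place it in the correct quadrant.
    """
    dx, dy = x2 - x1, y2 - y1
    c = math.hypot(dx, dy)   # segment length (hypotenuse)
    if c == 0:
        return 0.0           # degenerate segment
    a = abs(dy)              # vertical side, opposite the base angle
    b = abs(dx)              # horizontal side, adjacent to the base angle
    if b == 0:
        base = 90.0          # vertical segment: Law of Cosines undefined (b = 0)
    else:
        # Law of Cosines: a^2 = b^2 + c^2 - 2*b*c*cos(A)
        base = math.degrees(math.acos((b * b + c * c - a * a) / (2 * b * c)))
    # Quadrant principle: map the base angle onto the full circle.
    if dx >= 0 and dy >= 0:   # quadrant I
        return base
    if dx < 0 and dy >= 0:    # quadrant II
        return 180.0 - base
    if dx < 0 and dy < 0:     # quadrant III
        return 180.0 + base
    return 360.0 - base       # quadrant IV
```

For a 45° diagonal segment, `segment_angle(0, 0, 1, 1)` returns 45.0; reflecting it into the second quadrant, `segment_angle(0, 0, -1, 1)` returns 135.0.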
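The final voting step might likewise be sketched as below, under loud assumptions: the abstract says votes are based on region grouping and perpendicular-line length, so this hypothetical version groups segment angles into fixed-width bins and weights each vote by segment length; the bin width, function name, and tie-breaking are not from the paper.

```python
from collections import defaultdict

def estimate_heading(segments, bin_width=10.0):
    """Length-weighted voting over angle bins (hypothetical sketch).

    segments: iterable of (angle_deg, length) pairs for detected line
    segments. Each segment votes for its angle bin with a weight equal
    to its length; the centre of the winning bin is returned as the
    estimated yaw, or None when there are no segments.
    """
    votes = defaultdict(float)
    for angle, length in segments:
        votes[int(angle // bin_width)] += length
    if not votes:
        return None
    best_bin = max(votes, key=votes.get)
    return best_bin * bin_width + bin_width / 2.0
```

With 10° bins, two long segments near 45° outvote a short outlier near 120°, so `estimate_heading([(44.0, 10.0), (46.0, 20.0), (120.0, 5.0)])` returns 45.0, the centre of the winning bin.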
