Open Access
Person following control for a mobile robot based on color invariance corresponding to varying illumination
Author(s) - Shinsuke Oh-hara, Kaoru Saito, A. Fujimori
Publication year - 2022
Publication title - IAES International Journal of Robotics and Automation (IJRA)
Language(s) - English
Resource type - Journals
eISSN - 2722-2586
pISSN - 2089-4856
DOI - 10.11591/ijra.v11i1.pp33-42
Subject(s) - computer vision , artificial intelligence , mobile robot , robot , computer science , sight , object (grammar) , tracking (education) , particle filter , psychology , filter (signal processing) , physics , astronomy , pedagogy
In this paper, we present a method of person following control for a mobile robot using visual information. Color information is often used for object tracking, but the color of an object varies greatly under changing illumination, and a robot controlled by visual information may then lose sight of the person. We therefore propose a robust person following method that combines color invariance with image-based control. Color invariance provides features of colored objects that remain robust under changing illumination conditions. First, we estimate the lowest positions of both feet of the tracked person with particle filters based on color invariances. Then, we control the velocity of the robot to follow the person using an image-based controller. Experimental results with an actual robot demonstrate the effectiveness of the proposed method.
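To make the two stages of the abstract concrete, the sketch below combines an illumination-insensitive color feature (here assumed to be normalized rg chromaticity, one common choice of color invariant), a basic particle filter over the image position of a foot, and a simple proportional image-based velocity law. The feature choice, motion model, gains, and all function names are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

# Illustrative sketch (not the authors' code): an illumination-insensitive
# color feature, a basic particle filter over one foot's image position,
# and a simple proportional image-based velocity law.

def color_invariant(pixel_bgr):
    """Normalized rg chromaticity: largely insensitive to illumination intensity."""
    b, g, r = pixel_bgr.astype(float)
    s = r + g + b + 1e-6
    return np.array([r / s, g / s])

def particle_weight(patch_bgr, reference_invariant, sigma=0.05):
    """Weight a particle by how closely its patch matches the reference invariant."""
    mean_pixel = patch_bgr.reshape(-1, 3).mean(axis=0)
    d = np.linalg.norm(color_invariant(mean_pixel) - reference_invariant)
    return np.exp(-0.5 * (d / sigma) ** 2)

def particle_filter_step(particles, image_bgr, reference_invariant,
                         motion_std=5.0, half_patch=7):
    """One predict / update / resample cycle over (x, y) image positions."""
    h, w = image_bgr.shape[:2]
    # Predict: random-walk motion model.
    particles = particles + np.random.normal(0.0, motion_std, particles.shape)
    particles[:, 0] = np.clip(particles[:, 0], half_patch, w - half_patch - 1)
    particles[:, 1] = np.clip(particles[:, 1], half_patch, h - half_patch - 1)
    # Update: color-invariant likelihood of the patch around each particle.
    weights = np.empty(len(particles))
    for i, (x, y) in enumerate(particles.astype(int)):
        roi = image_bgr[y - half_patch:y + half_patch + 1,
                        x - half_patch:x + half_patch + 1]
        weights[i] = particle_weight(roi, reference_invariant)
    weights += 1e-12
    weights /= weights.sum()
    # Resample (multinomial, for brevity) and return the position estimate.
    idx = np.random.choice(len(particles), size=len(particles), p=weights)
    estimate = particles[idx].mean(axis=0)
    return particles[idx], estimate

def image_based_velocity(foot_x, foot_y, image_width, target_y,
                         k_linear=0.01, k_angular=0.002):
    """Proportional image-based controller on the estimated foot position."""
    angular = -k_angular * (foot_x - image_width / 2.0)  # steer to keep the person centered
    linear = k_linear * (target_y - foot_y)              # feet higher in the image => farther => speed up
    return linear, angular
```

In use, each camera frame would go through one particle_filter_step per foot, and the resulting image position would feed image_based_velocity to produce the linear and angular velocity commands; the gains and target row would have to be tuned for the actual robot and camera geometry.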
