Open Access
Bio-inspired visual self-localization in real world scenarios using Slow Feature Analysis
Author(s) -
Benjamin Metka,
Mathias Franzius,
Ute Bauer-Wersing
Publication year - 2018
Publication title -
PLOS ONE
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.99
H-Index - 332
ISSN - 1932-6203
DOI - 10.1371/journal.pone.0203994
Subject(s) - artificial intelligence, computer science, computer vision, pattern recognition, orientation, representation, invariant, position, visualization, feature, mathematics, geometry
We present a biologically motivated model for visual self-localization that extracts a spatial representation of the environment directly from high-dimensional image data using a single unsupervised learning rule. The resulting representation encodes the position of the camera as slowly varying features while being invariant to its orientation, resembling place cells in a rodent's hippocampus. Using an omnidirectional mirror allows us to manipulate the image statistics by adding simulated rotational movement for improved orientation invariance. We apply the model in indoor and outdoor experiments and, for the first time, compare its performance against two state-of-the-art visual SLAM methods. The experiments show that the proposed straightforward model enables precise self-localization with accuracies in the range of 13-33 cm, demonstrating its competitiveness with the established SLAM methods in the tested scenarios.
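The unsupervised learning rule referenced in the abstract is Slow Feature Analysis (SFA): from a multivariate time series, find projections whose outputs vary as slowly as possible over time while having unit variance. The linear case reduces to whitening the input and then solving an eigenvalue problem on the covariance of the temporal derivative. The following is a minimal illustrative sketch of linear SFA, not the authors' actual pipeline (which operates on omnidirectional image data through a hierarchical network); the function name and the toy signals are our own choices.

```python
import numpy as np

def linear_sfa(X, n_features=2):
    """Minimal linear Slow Feature Analysis sketch.

    X: (T, D) time series with T samples of D signals.
    Returns the n_features slowest-varying linear projections.
    """
    # Center and whiten the input so all directions have unit variance
    X = X - X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    keep = eigval > 1e-10                      # drop degenerate directions
    W_white = eigvec[:, keep] / np.sqrt(eigval[keep])
    Z = X @ W_white
    # Covariance of the temporal derivative (finite differences)
    dZ = np.diff(Z, axis=0)
    dcov = np.cov(dZ, rowvar=False)
    # Slowest features = directions of smallest derivative variance
    dval, dvec = np.linalg.eigh(dcov)
    order = np.argsort(dval)[:n_features]
    return Z @ dvec[:, order]

# Toy demonstration: a slow sine mixed with a fast oscillation.
# SFA should recover the slow component (up to sign and scale).
t = np.linspace(0, 4 * np.pi, 500)
slow = np.sin(t)
fast = np.sin(20 * t)
X = np.column_stack([slow + 0.5 * fast,
                     0.5 * slow - fast,
                     fast + 0.1 * slow])
Y = linear_sfa(X, n_features=1)
```

In the paper's setting, the "slow" signals learned from camera input end up encoding the spatial position of the agent, since position changes slowly relative to image-level fluctuations such as rotation.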
