Open Access
An Improved SIFT Algorithm for Monocular Vision Positioning
Author(s) - Xinyue Niu, Jiexin Pu, Chi Zhang
Publication year - 2019
Publication title - IOP Conference Series: Materials Science and Engineering
Language(s) - English
Resource type - Journals
eISSN - 1757-899X
pISSN - 1757-8981
DOI - 10.1088/1757-899x/612/3/032124
Subject(s) - scale invariant feature transform , artificial intelligence , computer vision , feature (linguistics) , computer science , monocular , matching (statistics) , template matching , monocular vision , image registration , euclidean distance , image (mathematics) , algorithm , pattern recognition (psychology) , mathematics , philosophy , linguistics , statistics
To meet the stringent real-time and accuracy requirements of monocular hand-eye vision systems during positioning, which existing image matching algorithms cannot satisfy simultaneously, this paper improves the SIFT feature matching algorithm based on local features. First, corner points detected by the Harris operator replace the keypoints found by the standard SIFT algorithm as feature points in both the template image and the image to be matched. Then, a 32-dimensional feature description vector is constructed for each selected feature point through a Gaussian circular window. In the registration phase, Euclidean distance serves as the measure function for matching the 32-dimensional feature descriptors. Finally, 100 template images acquired on a monocular hand-eye vision experimental platform are used to test the matching performance of the improved SIFT algorithm, showing that it outperforms the original algorithm in both matching time and registration accuracy. The method is suitable for image registration in monocular vision positioning in industrial practice.
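The pipeline the abstract describes (Harris corner detection for feature points, a 32-dimensional descriptor built under a Gaussian circular window, and Euclidean-distance matching) can be sketched in plain NumPy. Note the descriptor below, mean and standard deviation over a 4x4 grid of a Gaussian-weighted patch, is a hypothetical stand-in, since the abstract does not specify the 32 components; the window size, patch size, and corner count are likewise illustrative choices, not the authors' parameters.

```python
import numpy as np

def box_sum(a, win):
    # Sum over a win x win neighbourhood (wrap-around edges; adequate for a demo).
    r = win // 2
    out = np.zeros_like(a)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            out += np.roll(np.roll(a, dy, axis=0), dx, axis=1)
    return out

def harris_corners(img, n=10, k=0.04, win=5, margin=8):
    # Harris response R = det(M) - k * trace(M)^2 from the structure tensor M.
    Iy, Ix = np.gradient(img.astype(float))
    Sxx, Syy, Sxy = (box_sum(g, win) for g in (Ix * Ix, Iy * Iy, Ix * Iy))
    R = (Sxx * Syy - Sxy ** 2) - k * (Sxx + Syy) ** 2
    # Suppress the border so a full descriptor patch fits around every corner.
    R[:margin, :] = -np.inf; R[-margin:, :] = -np.inf
    R[:, :margin] = -np.inf; R[:, -margin:] = -np.inf
    idx = np.argsort(R, axis=None)[::-1][:n]          # top-n responses
    return np.column_stack(np.unravel_index(idx, R.shape))  # (row, col) pairs

def describe(img, pts, patch=16):
    # Hypothetical 32-D descriptor: a 16x16 patch weighted by a circular
    # Gaussian window, split into a 4x4 grid of cells; the mean and std of
    # each cell give 16 + 16 = 32 numbers, then L2-normalised.
    r = patch // 2
    y, x = np.mgrid[-r:r, -r:r]
    g = np.exp(-(x ** 2 + y ** 2) / (2 * (r / 2) ** 2))
    descs = []
    for row, col in pts:
        p = img[row - r:row + r, col - r:col + r] * g
        cells = p.reshape(4, 4, 4, 4).transpose(0, 2, 1, 3).reshape(16, 16)
        d = np.concatenate([cells.mean(axis=1), cells.std(axis=1)])
        norm = np.linalg.norm(d)
        descs.append(d / norm if norm else d)
    return np.array(descs)

def match(d1, d2):
    # Nearest neighbour under Euclidean distance, the measure the paper uses.
    dist = np.linalg.norm(d1[:, None, :] - d2[None, :, :], axis=2)
    return dist.argmin(axis=1), dist.min(axis=1)

# Demo on a synthetic image: a template matched against itself should map
# every feature point to itself at distance ~0.
rng = np.random.default_rng(0)
img = rng.random((64, 64))
pts = harris_corners(img, n=8)
desc = describe(img, pts)
idx, dist = match(desc, desc)
```

In a real template-matching setting one would detect corners and build descriptors in both the template and the scene image, then accept a pair only if its Euclidean distance is clearly smaller than that of the second-best candidate, which is the usual way SIFT-style descriptors are filtered.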
