
Blur‐Invariant Feature Descriptor Using Multidirectional Integral Projection
Author(s) - Lee Man Hee, Park In Kyu
Publication year - 2016
Publication title - ETRI Journal
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.295
H-Index - 46
eISSN - 2233-7326
pISSN - 1225-6463
DOI - 10.4218/etrij.16.0115.0631
Subject(s) - artificial intelligence , computer vision , motion blur , invariant (physics) , computer science , projection (relational algebra) , focus (optics) , scale invariant feature transform , rotation (mathematics) , feature matching , robustness , pattern recognition , feature extraction , image (mathematics) , algorithm , optics
Feature detection and description are key ingredients of common image processing and computer vision applications. Most existing algorithms focus on robust feature matching under challenging conditions, such as in‐plane rotations and scale changes. Consequently, they usually fail when the scene is blurred by camera shake or object motion. To solve this problem, we propose a new feature description algorithm that is robust to image blur and significantly improves feature matching performance. The proposed algorithm builds a feature descriptor by computing the integral projection along four angular directions (0°, 45°, 90°, and 135°) and combining the four projection vectors into a single high‐dimensional vector. Intensive experiments show that the proposed descriptor outperforms existing descriptors for different types of blur caused by linear motion, nonlinear motion, and defocus. Furthermore, the proposed descriptor is robust to intensity changes and image rotation.
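The core idea of the abstract, projecting a local patch along four angular directions and concatenating the resulting vectors, can be sketched as follows. This is a minimal illustrative sketch, not the authors' exact formulation: the patch size, the use of diagonal traces for the 45° and 135° projections, and the L2 normalization are all assumptions made here for clarity.

```python
import numpy as np

def integral_projection_descriptor(patch):
    """Sketch of a blur-robust patch descriptor built from integral
    projections along 0°, 45°, 90°, and 135° directions.

    Projection vectors are concatenated into one high-dimensional
    descriptor and L2-normalized (normalization is an assumption here,
    added for robustness to intensity scaling).
    """
    patch = np.asarray(patch, dtype=np.float64)
    h, w = patch.shape

    # 0°: sum over rows -> projection onto the horizontal axis
    p0 = patch.sum(axis=0)
    # 90°: sum over columns -> projection onto the vertical axis
    p90 = patch.sum(axis=1)
    # 45°: sums along diagonals (one value per diagonal offset)
    p45 = np.array([np.trace(patch, offset=k) for k in range(-(h - 1), w)])
    # 135°: sums along anti-diagonals (flip rows, then take diagonals)
    p135 = np.array([np.trace(patch[::-1], offset=k) for k in range(-(h - 1), w)])

    desc = np.concatenate([p0, p45, p90, p135])
    norm = np.linalg.norm(desc)
    return desc / norm if norm > 0 else desc
```

Because blur redistributes intensity locally but largely preserves these directional sums, descriptors computed this way change less under blur than gradient-based descriptors, which is the intuition the paper builds on. For an h x w patch, the descriptor has length w + h + 2(h + w - 1).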