Repetition Maximization based Texture Rectification
Author(s) -
Aiger Dror,
Cohen-Or Daniel,
Mitra Niloy J.
Publication year - 2012
Publication title -
Computer Graphics Forum
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.578
H-Index - 120
eISSN - 1467-8659
pISSN - 0167-7055
DOI - 10.1111/j.1467-8659.2012.03023.x
Subject(s) - computer science , computer vision , artificial intelligence , pattern recognition , image rectification , rectification , homography , perspective (graphical) , similarity (geometry) , transformation , maximization , mathematical optimization , mathematics
Abstract - Many photographs are taken in perspective. Techniques for rectifying the resulting perspective distortions typically rely on the existence of parallel lines in the scene. In scenarios where such parallel lines are hard to automatically extract or manually annotate, the unwarping process remains a challenge. In this paper, we introduce an automatic algorithm for rectifying images containing textures of repeated elements lying on an unknown plane. We unwarp the input by maximizing image self‐similarity over the space of homography transformations. We map a set of detected regional descriptors to surfaces in a transformation space, compute the intersection points among triplets of such surfaces, and then use consensus among the projected intersection points to extract the correcting transform. Our algorithm is global, robust, and does not require explicit or accurate detection of similar elements. We evaluate our method on a variety of challenging textures and images. The rectified outputs are directly useful for various tasks including texture synthesis, image completion, etc.
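The core idea of the abstract — recover the rectifying homography by maximizing the image's self-similarity under repeated texture — can be illustrated with a toy NumPy sketch. This is not the authors' algorithm (which works in a transformation space of regional descriptors with triplet intersections and consensus); it is a minimal 1-D stand-in that searches a single perspective parameter of the homography and scores each candidate by the correlation of the unwarped image with itself shifted by one assumed texel period (16 px here). All names, the synthetic stripe texture, and the parameter range are hypothetical choices for illustration.

```python
import numpy as np

def homography_warp(img, H, out_shape):
    """Warp img by homography H: inverse-map each output pixel
    through H^-1 and sample with nearest neighbor; out-of-bounds -> 0."""
    h, w = out_shape
    ys, xs = np.mgrid[0:h, 0:w]
    pts = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    src = np.linalg.inv(H) @ pts
    sx = np.round(src[0] / src[2]).astype(int)
    sy = np.round(src[1] / src[2]).astype(int)
    valid = (sx >= 0) & (sx < img.shape[1]) & (sy >= 0) & (sy < img.shape[0])
    out = np.zeros(h * w)
    out[valid] = img[sy[valid], sx[valid]]
    return out.reshape(h, w)

def perspective(p):
    """Homography with a single x-direction perspective coefficient p."""
    return np.array([[1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0],
                     [p,   0.0, 1.0]])

def self_similarity(img, period=16):
    """Correlation of a central window with its copy shifted by one
    texel period: high when the texture repeats exactly, i.e. when
    the perspective has been undone."""
    a = (img[32:96, 24:64] - img[32:96, 24:64].mean()).ravel()
    b = (img[32:96, 40:80] - img[32:96, 40:80].mean()).ravel()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return (a @ b) / denom if denom > 0 else 0.0

# Synthetic repeated texture: vertical sine stripes, period 16 px.
ys, xs = np.mgrid[0:128, 0:128]
texture = np.sin(2 * np.pi * xs / 16.0)

# Apply a known perspective distortion (hypothetical ground truth).
distorted = homography_warp(texture, perspective(-0.002), texture.shape)

# Brute-force search over candidate rectifying homographies,
# keeping the one that maximizes self-similarity of the unwarped image.
best_g, best_score = 0.0, -1.0
for cand in np.linspace(-0.003, 0.003, 13):
    score = self_similarity(homography_warp(distorted, perspective(cand),
                                            texture.shape))
    if score > best_score:
        best_g, best_score = cand, score

print(best_g, best_score)
```

The search should recover a coefficient near +0.002, the inverse of the applied distortion. The full method replaces this brute-force 1-D scan with a global search over the 8-parameter homography space, made tractable by intersecting descriptor-induced surfaces rather than rewarping the image per candidate.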