
Using turbulence scintillation to assist object ranging from a single camera viewpoint
Author(s) -
Chensheng Wu,
Jonathan Ko,
Joseph Coffaro,
Daniel A. Paulson,
John R. Rzasa,
Larry C. Andrews,
Ronald L. Phillips,
Robert Crabbs,
Christopher C. Davis
Publication year - 2018
Publication title -
Applied Optics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.668
H-Index - 197
eISSN - 2155-3165
pISSN - 1559-128X
DOI - 10.1364/ao.57.002177
Subject(s) - scintillation , ranging , computer vision , turbulence , computer science , distortion (music) , object (grammar) , optics , artificial intelligence , pixel , physics , noise (video) , image processing , perspective (graphical) , clear air turbulence , remote sensing , image (mathematics) , geology , detector , telecommunications , meteorology , amplifier , bandwidth (computing)
Image distortions caused by atmospheric turbulence are often treated as unwanted noise or errors in image processing studies. Our study, however, shows that in certain scenarios turbulence distortion can be very helpful in enhancing image processing results. This paper describes a novel approach that uses the scintillation traits recorded in a video clip to perform object ranging with reasonable accuracy from a single camera viewpoint. Conventionally, a single camera is confounded by the perspective viewing problem, where a large object far away looks the same as a small object close by. When the atmospheric turbulence phenomenon is taken into account, the edge and texture pixels of an object tend to scintillate and vary more with increased distance. This turbulence-induced signature can be analyzed quantitatively to achieve object ranging. Although turbulence inevitably causes random blurring and deformation of imaging results, it also offers convenient solutions to some remote sensing and machine vision problems that would otherwise be difficult.
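A minimal illustrative sketch of the principle described in the abstract (not the authors' exact procedure) is given below: compute a temporal scintillation index over the edge/texture pixels of an object in a short video clip, then convert it to a range estimate through a calibration curve. The OpenCV/NumPy calls, the Canny edge selection, the assumed weak-turbulence power-law scaling (scintillation growing roughly as L^(11/6), hence range ~ index^(6/11)), and the constants a and gamma are all assumptions for illustration and would need to be fitted to measured data for a given camera and path.

```python
# Sketch only: per-object temporal scintillation index -> range estimate.
# Requires OpenCV (cv2) and NumPy; constants are hypothetical placeholders.
import cv2
import numpy as np

def scintillation_index(frames, mask):
    """Temporal scintillation index sigma_I^2 = <I^2>/<I>^2 - 1,
    averaged over the masked (edge/texture) pixels of the object."""
    stack = np.stack(frames).astype(np.float64)      # shape (T, H, W)
    mean_i = stack.mean(axis=0)                      # temporal mean intensity per pixel
    mean_i2 = (stack ** 2).mean(axis=0)              # temporal mean of I^2 per pixel
    si_map = mean_i2 / np.maximum(mean_i ** 2, 1e-9) - 1.0
    return float(si_map[mask].mean())

def edge_mask(frame, low=50, high=150):
    """Select edge/texture pixels, where turbulence-induced variation is strongest."""
    return cv2.Canny(frame, low, high) > 0

def estimate_range(si, a=1.0e4, gamma=6.0 / 11.0):
    """Hypothetical calibration: invert an assumed power law si ~ L^(11/6),
    i.e. L = a * si^(6/11). Both a and gamma must be fitted to real data."""
    return a * si ** gamma

# Example usage (file name is a placeholder):
# cap = cv2.VideoCapture("clip.avi")
# frames = []
# ok, frame = cap.read()
# while ok:
#     frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
#     ok, frame = cap.read()
# mask = edge_mask(frames[0])
# si = scintillation_index(frames, mask)
# print("estimated range (hypothetical calibration):", estimate_range(si), "m")
```

In this sketch, stronger frame-to-frame intensity fluctuation at the object's edges yields a larger scintillation index, which the assumed calibration maps to a longer path length; in practice the mapping depends on the turbulence strength along the path and must be established experimentally.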