A simplified approach using deep neural network for fast and accurate shape from focus
Author(s) -
Mutahira Husna,
Muhammad Mannan Saeed,
Li Mikhail,
Shin DongRyeol
Publication year - 2021
Publication title -
Microscopy Research and Technique
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.536
H-Index - 118
eISSN - 1097-0029
pISSN - 1059-910X
DOI - 10.1002/jemt.23623
Subject(s) - focus (optics) , artificial intelligence , mean squared error , computer science , artificial neural network , computer vision , interpolation (computer graphics) , image (mathematics) , measure (data warehouse) , object (grammar) , filter (signal processing) , pattern recognition (psychology) , mathematics , optics , data mining , statistics , physics
Three‐dimensional shape recovery is an important problem in computer vision. Shape from Focus (SFF) is a passive technique that uses focus information to estimate the three‐dimensional shape of an object in the scene. Images are captured at multiple positions along the optical axis of the imaging device and stored in a stack. To reconstruct the three‐dimensional shape of the object, the best‐focused positions are acquired by maximizing the focus curves obtained by applying a focus measure operator. In this article, a Deep Neural Network (DNN) is employed to extract a more accurate depth for each object point in the image stack. Each image in the stack is first reduced in size and then fed to the proposed DNN to aggregate the shape. The initial shape is refined with a median filter, and the reconstructed shape is then resized back to its original dimensions using bilinear interpolation. The results are compared with commonly used focus measure operators using root mean squared error (RMSE), correlation, and the image quality index (Q). Compared to other methods, the proposed DNN‐based SFF method shows higher precision and lower computational cost.
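The pipeline described above (focus measure over a stack, per-pixel depth by maximization, median-filter refinement, bilinear upsampling) can be sketched in a classical form. This is a minimal illustration, not the authors' DNN: the hand-crafted modified-Laplacian focus measure below stands in for the learned depth-extraction step, and all function names are my own.

```python
import numpy as np

def focus_measure(img):
    # Modified Laplacian, a common hand-crafted focus measure operator.
    # (In the paper, a DNN replaces this step; this is only a stand-in.)
    lx = np.abs(2 * img - np.roll(img, 1, axis=0) - np.roll(img, -1, axis=0))
    ly = np.abs(2 * img - np.roll(img, 1, axis=1) - np.roll(img, -1, axis=1))
    return lx + ly

def shape_from_focus(stack):
    # stack: (n_images, H, W), one image per focus position along the axis.
    # The best-focused image index per pixel is the depth estimate.
    fm = np.stack([focus_measure(im) for im in stack])
    return np.argmax(fm, axis=0).astype(float)

def median3(depth):
    # 3x3 median filter to refine the initial shape (edges left unchanged).
    out = depth.copy()
    h, w = depth.shape
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            out[i, j] = np.median(depth[i - 1:i + 2, j - 1:j + 2])
    return out

def bilinear_resize(depth, new_h, new_w):
    # Resize the refined shape back to the original image dimensions.
    h, w = depth.shape
    ys = np.linspace(0, h - 1, new_h)
    xs = np.linspace(0, w - 1, new_w)
    y0 = np.floor(ys).astype(int); x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, h - 1); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    top = depth[np.ix_(y0, x0)] * (1 - wx) + depth[np.ix_(y0, x1)] * wx
    bot = depth[np.ix_(y1, x0)] * (1 - wx) + depth[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

# Synthetic demo: three focus positions, each sharp (textured) in one band.
true_depth = np.zeros((12, 12), dtype=int)
true_depth[:, 4:8] = 1
true_depth[:, 8:] = 2
checker = np.indices((12, 12)).sum(axis=0) % 2  # high-frequency texture
stack = np.zeros((3, 12, 12))
for k in range(3):
    stack[k][true_depth == k] = checker[true_depth == k]

depth = shape_from_focus(stack)        # initial per-pixel depth
refined = median3(depth)               # median-filter refinement
full = bilinear_resize(refined, 24, 24)  # upsample to "original" size
```

On this synthetic stack the argmax of the focus measure recovers the correct band index at interior pixels; the median filter and bilinear resize mirror the refinement and restore-to-original-size steps of the abstract's pipeline.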
