Performance of a mixed Lagrange time delay estimation autoregressive (MLTDEAR) model for single‐image signal‐to‐noise ratio estimation in scanning electron microscopy
Author(s) -
SIM K. S.,
CHUAH H. T.,
ZHENG C.
Publication year - 2005
Publication title -
Journal of Microscopy
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.569
H-Index - 111
eISSN - 1365-2818
pISSN - 0022-2720
DOI - 10.1111/j.1365-2818.2005.01488.x
Subject(s) - autoregressive model , estimator , autocorrelation , noise (video) , algorithm , lagrange multiplier , image (mathematics) , mathematics , signal to noise ratio (imaging) , computer science , statistics , mathematical optimization , artificial intelligence
Summary A novel technique based on the statistical autoregressive (AR) model has recently been developed to estimate the signal‐to‐noise ratio (SNR) in scanning electron microscope (SEM) images. In a separate study, the authors also developed an algorithm that cascades the AR model with a Lagrange time delay (LTD) estimator; this technique is named the mixed Lagrange time delay estimation autoregressive (MLTDEAR) model. In this paper, the fundamental performance limits for single‐image SNR estimation, as derived from the Cramér–Rao inequality, are presented. The experimental performance of several existing methods – the simple method, the first‐order linear interpolator, the AR‐based estimator and the MLTDEAR method – is compared against this bound. In several test cases involving different images, the efficiency of the MLTDEAR single‐image estimator proves to be significantly better than that of the other three methods. The effect of different SEM operating conditions on the shape of the autocorrelation function curve is also discussed.
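
In this line of work, single‐image SNR estimation typically proceeds by extrapolating the autocorrelation function back to zero lag: the extrapolated value approximates the noise‐free signal power, and the gap between the measured and extrapolated zero‐lag values is attributed to noise. The sketch below illustrates that general idea in Python using the first‐order linear interpolator baseline named in the summary, not the authors' MLTDEAR model; the function name, the synthetic test image and the variance‐ratio definition of SNR are assumptions made purely for illustration.

import numpy as np

def estimate_snr_single_image(img, max_lag=2):
    # Rough single-image SNR estimate from the row-wise autocorrelation.
    # The noise-free zero-lag value is approximated by first-order linear
    # extrapolation from lags 1 and 2; the remaining gap at lag 0 is taken
    # to be the noise variance (variance-ratio definition of SNR).
    x = np.asarray(img, dtype=np.float64)
    mu = x.mean()
    n = x.shape[1]

    # Biased autocorrelation estimates at lags 0..max_lag, averaged over rows.
    r = np.array([
        np.mean(np.sum(x[:, :n - k] * x[:, k:], axis=1) / n)
        for k in range(max_lag + 1)
    ])

    # Extrapolate r[1], r[2] linearly back to lag 0 to get the noise-free peak.
    r0_noise_free = 2.0 * r[1] - r[2]

    signal_var = r0_noise_free - mu ** 2   # noise-free signal variance
    noise_var = r[0] - r0_noise_free       # variance attributed to noise
    return signal_var / noise_var

# Example on a synthetic correlated image corrupted by white Gaussian noise.
rng = np.random.default_rng(0)
clean = np.cumsum(rng.normal(size=(256, 256)), axis=1)
noisy = clean + rng.normal(scale=2.0, size=clean.shape)
print(estimate_snr_single_image(noisy))

The AR‐based and MLTDEAR estimators discussed in the paper replace the simple linear extrapolation step with model‐based predictions of the zero‐lag value, which is where their accuracy advantage over the baselines arises.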
