Detecting persistent gross errors by sequential analysis of principal components
Author(s) -
Tong Hongwei,
Crowe Cameron M.
Publication year - 1997
Publication title -
aiche journal
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.958
H-Index - 167
eISSN - 1547-5905
pISSN - 0001-1541
DOI - 10.1002/aic.690430513
Subject(s) - principal component analysis , statistics , observational error , process (computing) , mathematics , statistical hypothesis testing , principal (computer security) , error detection and correction , non sampling error , econometrics , computer science , algorithm , operating system
Measurements such as flow rates from a chemical process violate conservation laws and other process constraints because they are contaminated by random errors and possibly gross errors such as process disturbances, leaks, departures from steady state, and biased instrumentation. Data reconciliation is aimed at estimating the true values of measured variables that are consistent with the constraints, at detecting gross errors, and at solving for unmeasured variables. An approach to constructing sequential principal‐component tests for detecting and identifying persistent gross errors during data reconciliation, by combining principal‐component analysis and sequential analysis, is presented. The tests detect gross errors as early as possible with fewer measurements. They are sharper in detecting gross errors and have substantially greater power in correctly identifying them than the statistical tests currently used in data reconciliation.
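To make the abstract's setting concrete, the following is a minimal sketch (not the authors' exact sequential procedure) of the two ingredients it combines: least-squares data reconciliation against linear balance constraints, and a principal-component test on the constraint residuals. The three-stream network, the measurement noise level, and the injected bias are all hypothetical illustration values.

```python
import numpy as np

# Hypothetical toy flow network: node balances A @ x = 0 for true flows x
# (stream 1 -> node 1 -> stream 2 -> node 2 -> stream 3).
A = np.array([[1.0, -1.0, 0.0],
              [0.0, 1.0, -1.0]])

true_flows = np.array([10.0, 10.0, 10.0])
sigma = 0.1                      # assumed measurement standard deviation
Sigma = sigma**2 * np.eye(3)     # measurement error covariance

rng = np.random.default_rng(0)
y = true_flows + rng.normal(0.0, sigma, 3)
y[1] += 1.0  # inject a persistent gross error (bias) on stream 2

# Constraint residuals and their covariance under the no-gross-error null.
r = A @ y
V = A @ Sigma @ A.T

# Principal-component test: rotate the residuals into uncorrelated,
# unit-variance components; each is ~N(0,1) if no gross error is present.
lam, U = np.linalg.eigh(V)
p = (U.T @ r) / np.sqrt(lam)

threshold = 1.96                 # ~5% two-sided level per component
suspect = np.abs(p) > threshold
print("principal components:", p)
print("flagged components:", suspect)

# Data reconciliation: adjust y minimally (in the Sigma metric) so that
# the reconciled estimate satisfies the constraints exactly.
x_hat = y - Sigma @ A.T @ np.linalg.solve(V, r)
print("balance check A @ x_hat:", A @ x_hat)
```

The bias on stream 2 upsets both node balances, so at least one principal component far exceeds the threshold, while the reconciled flows satisfy the balances to numerical precision. The paper's contribution is to apply such tests sequentially over successive measurement sets, so a persistent bias is detected with fewer observations.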
