Open Access
A method for obtaining a least squares fit of a hyperplane to uncertain data
Author(s) - David B. Reister, Max D. Morris
Publication year - 1994
Publication title - OSTI OAI (U.S. Department of Energy Office of Scientific and Technical Information)
Language(s) - English
Resource type - Reports
DOI - 10.2172/10153960
Subject(s) - hyperplane, eigenvalues and eigenvectors, mathematics, cartesian coordinate system, least squares function approximation, linear least squares, transformation (genetics), mathematical analysis, combinatorics, statistics, geometry, linear model, physics, biochemistry, chemistry, quantum mechanics, estimator, gene
For many least squares problems, the uncertainty is confined to one of the variables [for example, y = f(x) or z = f(x,y)]. For some problems, however, the uncertainty lies in the geometric transformation from measured data to Cartesian coordinates, so all of the calculated variables are uncertain. When we seek the best least squares fit of a hyperplane to such data, we obtain an overdetermined system (n + 1 equations for n unknowns). By neglecting one of the equations at a time, we can obtain n + 1 different solutions for the unknown parameters. However, we cannot average the n + 1 hyperplanes to obtain a single best estimate. To obtain a solution without neglecting any of the equations, we solve an eigenvalue problem and use the eigenvector associated with the smallest eigenvalue to determine the unknown parameters. We have performed numerical experiments that compare our eigenvalue method to the approach of neglecting one equation at a time.
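The report itself provides no code, but the eigenvector construction described in the abstract corresponds to the standard orthogonal (total) least squares fit of a hyperplane, in which all coordinates are treated as uncertain. The sketch below is a minimal illustration of that general technique, not a reproduction of the authors' implementation; the helper name fit_hyperplane and the synthetic test plane are assumptions introduced here. It centers the data, forms the scatter matrix, and takes the eigenvector associated with the smallest eigenvalue as the hyperplane normal.

```python
import numpy as np

def fit_hyperplane(points):
    """Fit a hyperplane n . x = d to an (m x k) array of points by
    minimizing the sum of squared orthogonal distances.

    The unit normal n is the eigenvector of the centered scatter matrix
    associated with the smallest eigenvalue; d = n . centroid.
    """
    X = np.asarray(points, dtype=float)
    centroid = X.mean(axis=0)
    A = X - centroid                      # center the data
    scatter = A.T @ A                     # k x k scatter matrix
    eigvals, eigvecs = np.linalg.eigh(scatter)   # eigenvalues in ascending order
    normal = eigvecs[:, 0]                # eigenvector of the smallest eigenvalue
    d = normal @ centroid
    return normal, d

# Hypothetical example: noisy samples of the plane z = 1 + 2x - 3y,
# i.e. 2x - 3y - z = -1.
rng = np.random.default_rng(0)
xy = rng.uniform(-1, 1, size=(200, 2))
z = 1 + 2 * xy[:, 0] - 3 * xy[:, 1] + rng.normal(scale=0.01, size=200)
pts = np.column_stack([xy, z])
n, d = fit_hyperplane(pts)
# Rescale so the z-coefficient is -1; expect roughly [2, -3, -1] and -1.
print(n / -n[2], d / -n[2])
```

Because every coordinate enters the scatter matrix symmetrically, no variable is singled out as "dependent", which is the distinction the abstract draws between this eigenvalue approach and fits of the form y = f(x) or z = f(x,y).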
