Open Access
One Line or Two? Perspectives on Piecewise Regression
Author(s) - R.P. Ewing, D.W. Meek
Publication year - 2006
Language(s) - English
Resource type - Reports
DOI - 10.2172/899336
Subject(s) - Akaike information criterion, Bayesian information criterion, piecewise, line (geometry), computer science, Bayesian probability, data mining, artificial intelligence, mathematics, machine learning, mathematical analysis, geometry
Sometimes we are faced with data that could reasonably be represented either as a single line or as two or more line segments. How do we identify the best breakpoint(s), and decide how many segments are "really" present? Most of us are taught to distrust piecewise regression because it is easily abused. The best method for identifying the breakpoint depends on the specifics of the data; for example, the minimum sum of squares method excels for "well-behaved" data, while in some cases hidden Markov methods are more likely to succeed than more "obvious" methods. Likewise, the most appropriate method for deciding between one line and two depends on your expectations and understanding of the data: an unexpected break requires more justification than an expected one, and some decision criteria (e.g., the Akaike Information Criterion) are less strict than others (e.g., the Bayesian Information Criterion). This presentation will review some options and make specific, practical recommendations.
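For readers who want to experiment with the ideas the abstract names, here is a minimal Python sketch of the minimum-sum-of-squares breakpoint search and an AIC/BIC comparison between one-line and two-segment fits. The function names, the discontinuous two-line model, and the parameter-counting convention (counting the breakpoint as a fitted parameter) are illustrative assumptions, not taken from the report itself.

```python
# Illustrative sketch only; not the authors' code.
import numpy as np

def fit_line(x, y):
    """Ordinary least squares for one line; returns (SSE, n_params)."""
    coeffs = np.polyfit(x, y, 1)
    resid = y - np.polyval(coeffs, x)
    return float(resid @ resid), 2  # slope + intercept

def fit_two_segments(x, y, min_pts=3):
    """Grid search over candidate breakpoints, minimizing total SSE.
    Fits two independent lines on either side of each candidate break,
    so the fitted model may be discontinuous at the breakpoint."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    best_sse, best_bp = np.inf, None
    for i in range(min_pts, len(xs) - min_pts):
        sse_left, _ = fit_line(xs[:i], ys[:i])
        sse_right, _ = fit_line(xs[i:], ys[i:])
        if sse_left + sse_right < best_sse:
            best_sse, best_bp = sse_left + sse_right, xs[i]
    # 2 slopes + 2 intercepts + 1 breakpoint = 5 parameters (a common
    # convention; counting the breakpoint is itself a judgment call).
    return best_sse, 5, best_bp

def aic(sse, n, k):
    """Gaussian-error AIC up to an additive constant."""
    return n * np.log(sse / n) + 2 * k

def bic(sse, n, k):
    """Gaussian-error BIC up to an additive constant; penalizes extra
    parameters more heavily than AIC once n exceeds ~8."""
    return n * np.log(sse / n) + k * np.log(n)

# Example: noisy data with a true break at x = 6.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 80)
y = np.where(x < 6, 1.0 * x, 6.0 + 0.2 * (x - 6)) + rng.normal(0, 0.3, x.size)

sse1, k1 = fit_line(x, y)
sse2, k2, bp = fit_two_segments(x, y)
n = x.size
print(f"estimated break near x = {bp:.2f}")
print(f"AIC: one line {aic(sse1, n, k1):.1f} vs two segments {aic(sse2, n, k2):.1f}")
print(f"BIC: one line {bic(sse1, n, k1):.1f} vs two segments {bic(sse2, n, k2):.1f}")
```

Because BIC's penalty grows with the sample size while AIC's does not, the two criteria can disagree on borderline data; that gap is exactly the "less strict" versus "more strict" distinction drawn in the abstract.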
