A Brief Survey of Modern Optimization for Statisticians
Author(s) - Lange, Kenneth; Chi, Eric C.; Zhou, Hua
Publication year - 2014
Publication title - International Statistical Review
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.051
H-Index - 54
eISSN - 1751-5823
pISSN - 0306-7734
DOI - 10.1111/insr.12022
Subject(s) - mathematical optimization, computer science, mathematics, mathematical analysis, algorithm, differentiable function, lasso (statistics), minimization, maximization
Summary Modern computational statistics is turning more and more to high‐dimensional optimization to handle the deluge of big data. Once a model is formulated, its parameters can be estimated by optimization. Because model parsimony is important, models routinely include non‐differentiable penalty terms such as the lasso. This sober reality complicates minimization and maximization. Our broad survey stresses a few important principles in algorithm design. Rather than view these principles in isolation, it is more productive to mix and match them. A few well‐chosen examples illustrate this point. Algorithm derivation is also emphasized, and theory is downplayed, particularly the abstractions of the convex calculus. Thus, our survey should be useful and accessible to a broad audience.
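To make the non-differentiability concrete (an illustrative sketch, not an equation quoted from the article), the lasso-penalized least-squares problem takes the form

\[
\min_{\beta \in \mathbb{R}^p} \; \frac{1}{2} \lVert y - X\beta \rVert_2^2 + \lambda \lVert \beta \rVert_1, \qquad \lambda > 0,
\]

where the $\ell_1$ penalty $\lVert \beta \rVert_1 = \sum_{j=1}^{p} |\beta_j|$ is convex but non-differentiable wherever a coefficient $\beta_j = 0$. This is what rules out plain gradient-based minimization and motivates the algorithmic principles the survey mixes and matches.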
