ℓ0-Regularized Sparsity for Probabilistic Mixture Models
Author(s) -
Dzung T. Phan,
Tsuyoshi Idé
Publication year - 2019
Publication title -
Society for Industrial and Applied Mathematics eBooks
Language(s) - English
Resource type - Book series
DOI - 10.1137/1.9781611975673.20
Subject(s) - mixture model , Bayesian probability , sparsity , anomaly detection , machine learning , algorithm , mathematical optimization , artificial intelligence , computer science , mathematics
This paper revisits the classical task of learning probabilistic mixture models. Our major goal is to learn the mixture weights sparsely, so that the right number of clusters is determined automatically. The key idea is to place a novel Bernoulli prior on the mixture weights in a Bayesian learning framework, and to formalize the task of determining the mixture weights as an ℓ0-regularized optimization problem. By leveraging a specific mathematical structure, we derive a quadratic-time algorithm for efficiently solving the non-convex ℓ0-based problem. In experiments, we evaluate the performance of our proposed approach against existing methods in recovery capability and anomaly detection on synthetic as well as real-world data sets.
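The paper's quadratic-time solver for the ℓ0-regularized weight subproblem is not reproduced here. As a loose illustration of the underlying idea, the following is a minimal EM sketch in which small mixture weights are hard-thresholded to zero each iteration (an ℓ0-style pruning step) so that an over-specified mixture sheds redundant components. The data, the component count `K`, and the threshold `tau` are all hypothetical choices for this sketch, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 1-D data drawn from 3 well-separated Gaussians
x = np.concatenate([rng.normal(-5, 1, 300),
                    rng.normal(0, 1, 300),
                    rng.normal(5, 1, 300)])

K = 8                      # deliberately over-specified number of components
mu = rng.choice(x, K)      # initialize means at random data points
sigma = np.ones(K)
pi = np.full(K, 1.0 / K)
tau = 0.02                 # hard threshold standing in for the l0 penalty (hypothetical)

for _ in range(100):
    # E-step: responsibilities under the current Gaussian components
    dens = (np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2)
            / (sigma * np.sqrt(2 * np.pi)))
    r = pi * dens + 1e-300          # tiny floor avoids 0/0 in the normalization
    r /= r.sum(axis=1, keepdims=True)
    # M-step: standard weighted updates
    nk = r.sum(axis=0)
    pi = nk / nk.sum()
    mu = (r * x[:, None]).sum(axis=0) / np.maximum(nk, 1e-12)
    sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0)
                    / np.maximum(nk, 1e-12) + 1e-6)
    # l0-style pruning: zero out weak components, renormalize on the simplex
    pi = np.where(pi < tau, 0.0, pi)
    pi /= pi.sum()

# Number of surviving components (ideally close to the 3 true clusters)
print(int((pi > 0).sum()))
```

Unlike heuristics that fit one model per candidate cluster count, pruning the weights inside a single EM run lets the data select the support directly, which is the spirit of the ℓ0 formulation; the paper's contribution is an exact, efficient solver for that weight subproblem rather than this thresholding shortcut.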