Open Access
Incremental Learning of Temporally-Coherent Gaussian Mixture Models
Author(s) -
O. Arandjelović,
Roberto Cipolla
Publication year - 2005
Publication title -
Deakin Research Online (Deakin University)
Language(s) - English
Resource type - Conference proceedings
DOI - 10.5244/c.19.59
Subject(s) - mixture model , computer science , component (thermodynamics) , artificial intelligence , algorithm , pattern recognition (psychology) , gaussian , current (fluid) , gaussian process , machine learning , physics , quantum mechanics , thermodynamics , electrical engineering , engineering
In this paper we address the problem of learning Gaussian Mixture Models (GMMs) incrementally. Unlike previous approaches, which universally assume that new data arrives in blocks representable by GMMs that are then merged with the current model estimate, our method handles the case where novel data points arrive one-by-one, while requiring little additional memory. We keep only two GMMs in memory and no historical data. The current fit is updated under the assumption that the number of components is fixed; this number is increased (or reduced) when sufficient evidence for a new component is observed. The evidence is deduced from the change relative to the oldest fit of the same complexity, termed the Historical GMM, a concept central to our method. The performance of the proposed method is demonstrated qualitatively and quantitatively on several synthetic data sets and on video sequences of faces acquired in realistic imaging conditions.
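The abstract describes the approach only at a high level. As a rough illustration of the fixed-complexity, single-point update it refers to, the sketch below shows one common way such an update can be realised: an online, responsibility-weighted revision of each component's mean, covariance, and weight as each new point arrives. This is an assumed construction for illustration, not the authors' exact procedure, and the Historical-GMM comparison used to add or remove components is not implemented here.

```python
import numpy as np


class IncrementalGMM:
    """Sketch of a fixed-complexity incremental GMM update.

    Illustrative only: it shows a single-point online update of a
    K-component GMM via responsibility-weighted sufficient statistics.
    The paper's Historical-GMM test for changing the number of
    components is not reproduced here.
    """

    def __init__(self, means, covariances, weights):
        self.means = np.asarray(means, dtype=float)        # (K, d)
        self.covs = np.asarray(covariances, dtype=float)   # (K, d, d)
        self.weights = np.asarray(weights, dtype=float)    # (K,)
        self.counts = np.ones(len(self.weights))           # effective per-component counts

    def _responsibilities(self, x):
        """Posterior probability of each component given point x."""
        K, d = self.means.shape
        r = np.empty(K)
        for k in range(K):
            diff = x - self.means[k]
            cov = self.covs[k]
            norm = (2 * np.pi) ** (-d / 2) * np.linalg.det(cov) ** -0.5
            r[k] = self.weights[k] * norm * np.exp(
                -0.5 * diff @ np.linalg.solve(cov, diff))
        return r / r.sum()

    def update(self, x):
        """Incorporate one new data point, keeping the number of components fixed."""
        x = np.asarray(x, dtype=float)
        r = self._responsibilities(x)
        self.counts += r
        for k in range(len(r)):
            lr = r[k] / self.counts[k]          # per-component learning rate
            diff = x - self.means[k]
            self.means[k] += lr * diff
            # rank-one shift of the covariance towards the new point
            self.covs[k] = (1 - lr) * self.covs[k] + lr * np.outer(diff, diff)
        self.weights = self.counts / self.counts.sum()


if __name__ == "__main__":
    # Stream points one-by-one from two synthetic Gaussian clusters.
    rng = np.random.default_rng(0)
    gmm = IncrementalGMM(means=[[0.0, 0.0], [5.0, 5.0]],
                         covariances=[np.eye(2), np.eye(2)],
                         weights=[0.5, 0.5])
    for _ in range(500):
        centre = rng.choice([0.0, 5.0])
        gmm.update(rng.normal(loc=centre, scale=1.0, size=2))
    print(np.round(gmm.means, 2))
```

In a full method of the kind the abstract describes, a second (Historical) GMM of the same complexity would be retained alongside the current fit, and a divergence between the two would trigger the increase or reduction of the component count.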
