
Variational Bayesian Inference for Finite Inverted Dirichlet Mixture Model and Its Application to Object Detection
Author(s) -
Lai Yuping,
Ping Yuan,
He Wenda,
Wang Baocheng,
Wang Jingzhong,
Zhang Xiufeng
Publication year - 2018
Publication title -
Chinese Journal of Electronics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.267
H-Index - 25
eISSN - 2075-5597
pISSN - 1022-4653
DOI - 10.1049/cje.2018.03.003
Subject(s) - mixture model , inference , expectation–maximization algorithm , dirichlet distribution , computer science , bayesian inference , object (grammar) , flexibility (engineering) , algorithm , maximization , synthetic data , artificial intelligence , bayesian probability , pattern recognition (psychology) , mathematics , mathematical optimization , maximum likelihood , statistics , mathematical analysis , boundary value problem
As a variant of the finite mixture model (FMM), the finite Inverted Dirichlet mixture model (IDMM) cannot avoid the conventional challenges, such as selecting the appropriate number of mixture components from the observed data. To ease these issues, we propose a variational inference framework for learning the IDMM, which has proved to be an efficient tool for modeling vectors with positive elements. Compared with the conventional Expectation maximization (EM) algorithm commonly used for learning FMMs, the proposed approach effectively prevents over‐fitting. Furthermore, it simultaneously determines the number of mixture components and estimates the model parameters. Experimental results on both synthetic data and real object detection data confirm that significant improvements in flexibility and efficiency are achieved.
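The inverted Dirichlet distribution underlying the IDMM has a standard closed-form density for a positive vector x = (x_1, ..., x_D) with shape parameters alpha_1, ..., alpha_{D+1}. As a minimal illustration (the function name and interface below are our own, not from the paper), it can be evaluated in log space with only the standard library:

```python
from math import lgamma, log, exp

def inverted_dirichlet_pdf(x, alpha):
    """Density of the inverted Dirichlet distribution at a positive vector x.

    x     : list of D positive values
    alpha : list of D + 1 positive shape parameters
    """
    D = len(x)
    assert len(alpha) == D + 1 and all(v > 0 for v in x)
    alpha_sum = sum(alpha)
    # Log normalizing constant: Gamma(sum(alpha)) / prod_d Gamma(alpha_d).
    log_norm = lgamma(alpha_sum) - sum(lgamma(a) for a in alpha)
    # Log kernel: prod_d x_d^(alpha_d - 1) * (1 + sum_d x_d)^(-sum(alpha)).
    log_kernel = sum((alpha[d] - 1.0) * log(x[d]) for d in range(D))
    log_kernel -= alpha_sum * log(1.0 + sum(x))
    return exp(log_norm + log_kernel)

# With D = 1 and alpha = (1, 1) the density reduces to 1 / (1 + x)^2,
# so at x = 1 the value is 0.25.
print(inverted_dirichlet_pdf([1.0], [1.0, 1.0]))  # -> 0.25
```

An IDMM is then a weighted sum of such densities over the mixture components; the paper's contribution is learning the weights and shape parameters variationally, with components whose weights collapse toward zero being pruned automatically.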