Conditional Density Approximations with Mixtures of Polynomials
Author(s) -
Varando Gherardo,
López-Cruz Pedro L.,
Nielsen Thomas D.,
Larrañaga Pedro,
Bielza Concha
Publication year - 2015
Publication title -
International Journal of Intelligent Systems
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.291
H-Index - 87
eISSN - 1098-111X
pISSN - 0884-8173
DOI - 10.1002/int.21699
Subject(s) - nonparametric statistics , density estimation , mathematics , parametric statistics , computer science , joint probability distribution , marginal distribution , conditional probability distribution , quotient , algorithm , machine learning , artificial intelligence , statistics , random variable , estimator , pure mathematics
Mixtures of polynomials (MoPs) are a nonparametric density estimation technique especially designed for hybrid Bayesian networks with continuous and discrete variables. Algorithms to learn one‐ and multidimensional (marginal) MoPs from data have recently been proposed. In this paper, we introduce two methods for learning MoP approximations of conditional densities from data. Both approaches are based on learning MoP approximations of the joint density and the marginal density of the conditioning variables, but they differ as to how the MoP approximation of the quotient of the two densities is found. We illustrate and study the methods using data sampled from known parametric distributions, and demonstrate their applicability by learning models based on real neuroscience data. Finally, we compare the performance of the proposed methods with an approach for learning mixtures of truncated basis functions (MoTBFs). The empirical results show that the proposed methods generally yield models that are comparable to or significantly better than those found using the MoTBF‐based method.
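The core idea described above — approximating a conditional density as the quotient of a polynomial approximation of the joint density and one of the marginal density of the conditioning variable — can be illustrated with a minimal sketch. This is not the authors' algorithm; it is a simplified stand-in that fits a single low-degree bivariate polynomial to a histogram estimate of the joint density by least squares (rather than learning a piecewise MoP), obtains the marginal by numerical integration, and forms the conditional as the quotient. All function names and parameter choices (grid range, degree, bin count) are illustrative assumptions:

```python
import numpy as np

def trapezoid(f, x):
    # Small trapezoidal-rule helper (avoids NumPy version differences).
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x)))

rng = np.random.default_rng(0)
# Illustrative data: samples from a known bivariate normal, as in the
# paper's use of data sampled from known parametric distributions.
data = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.6], [0.6, 1.0]], size=5000)

# Histogram-based estimate of the joint density on a grid.
H, xe, ye = np.histogram2d(data[:, 0], data[:, 1], bins=20,
                           range=[[-3, 3], [-3, 3]], density=True)
xc = 0.5 * (xe[:-1] + xe[1:])
yc = 0.5 * (ye[:-1] + ye[1:])

# Least-squares fit of a single degree-4 bivariate polynomial to the
# joint density estimate (a real MoP would be piecewise over subregions).
deg = 4
terms = [(i, j) for i in range(deg + 1) for j in range(deg + 1) if i + j <= deg]
X, Y = np.meshgrid(xc, yc, indexing="ij")
A = np.stack([X.ravel() ** i * Y.ravel() ** j for i, j in terms], axis=1)
coef, *_ = np.linalg.lstsq(A, H.ravel(), rcond=None)

def joint_poly(x, y):
    # Polynomial approximation of the joint density f(x, y).
    return sum(c * x ** i * y ** j for c, (i, j) in zip(coef, terms))

YS = np.linspace(-3, 3, 200)

def marginal_poly(x):
    # Marginal of the conditioning variable: integrate the joint over y.
    return trapezoid(joint_poly(x, YS), YS)

def conditional(y, x):
    # Conditional density f(y | x) as the quotient joint / marginal.
    return joint_poly(x, y) / marginal_poly(x)
```

By construction the quotient integrates to one over the grid: `trapezoid(conditional(YS, 0.0), YS)` evaluates to 1.0 up to floating-point error. The two methods in the paper differ precisely in how a valid MoP representation of this quotient is obtained, which the naive pointwise division here sidesteps.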
