Frobenius Norm Regularization for the Multivariate Von Mises Distribution
Author(s) -
Rodriguez-Lujan Luis,
Larrañaga Pedro,
Bielza Concha
Publication year - 2017
Publication title -
International Journal of Intelligent Systems
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.291
H-Index - 87
eISSN - 1098-111X
pISSN - 0884-8173
DOI - 10.1002/int.21834
Subject(s) - von Mises distribution , mathematics , overfitting , estimator , multivariate statistics , goodness of fit , multivariate normal distribution , Kullback–Leibler divergence , statistics , computer science , artificial intelligence
Penalizing model complexity is necessary to avoid overfitting when the number of data samples is low relative to the number of model parameters. In this paper, we introduce a penalization term that places an independent prior distribution on each parameter of the multivariate von Mises distribution. We also propose a circular distance that can be used to estimate the Kullback–Leibler divergence between any two circular distributions as a goodness-of-fit measure. We compare the resulting regularized von Mises models on synthetic data and on real neuroanatomical data, showing that the distribution fitted with the penalized estimator generally achieves better results than the nonpenalized multivariate von Mises estimator.
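To illustrate the idea of the abstract, here is a minimal sketch of penalized maximum-likelihood fitting for the *univariate* von Mises distribution. This is not the paper's method: the multivariate estimator and the exact Frobenius-norm penalty are not reproduced here. Instead, an L2 penalty `lam * kappa**2` on the concentration parameter stands in for the Frobenius-norm regularization, and the function name and `lam` value are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import i0e  # exponentially scaled Bessel I_0, avoids overflow


def penalized_vm_concentration(samples, mu, lam=0.5):
    """Estimate the von Mises concentration kappa by penalized ML.

    Minimizes the negative log-likelihood plus an L2 penalty lam * kappa**2,
    a simplified univariate stand-in for the Frobenius-norm penalty.
    The mean direction mu is assumed known for brevity.
    """
    n = len(samples)
    cos_sum = np.sum(np.cos(samples - mu))

    def neg_penalized_loglik(kappa):
        # log I_0(kappa) = log(i0e(kappa)) + kappa (scaled Bessel function)
        loglik = kappa * cos_sum - n * (np.log(2 * np.pi) + np.log(i0e(kappa)) + kappa)
        return -loglik + lam * kappa**2

    res = minimize_scalar(neg_penalized_loglik, bounds=(1e-6, 100.0), method="bounded")
    return res.x


rng = np.random.default_rng(0)
data = rng.vonmises(mu=0.0, kappa=4.0, size=50)

k_pen = penalized_vm_concentration(data, mu=0.0, lam=0.5)  # shrunk estimate
k_mle = penalized_vm_concentration(data, mu=0.0, lam=0.0)  # plain MLE
```

Because the penalty gradient `2 * lam * kappa` is strictly positive at the unpenalized optimum, the penalized estimate is always pulled toward zero, which is the shrinkage effect that combats overfitting on small samples.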
