Open Access
Fuzzy c-Means Algorithms Using Kullback-Leibler Divergence and Hellinger Distance Based on Multinomial Manifold
Author(s) - Ryo Inokuchi, Sadaaki Miyamoto
Publication year - 2008
Publication title -
Journal of Advanced Computational Intelligence and Intelligent Informatics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.172
H-Index - 20
eISSN - 1343-0130
pISSN - 1883-8014
DOI - 10.20965/jaciii.2008.p0443
Subject(s) - Hellinger distance , Kullback–Leibler divergence , multinomial distribution , statistical manifold , geodesic , Euclidean distance , statistical distance , cluster analysis , information geometry , pattern recognition , probability distribution , algorithm , mathematics , computer science , artificial intelligence , statistics
In this paper, we discuss fuzzy clustering algorithms for discrete data. The data space is represented as a statistical manifold of the multinomial distribution, and in this setting the Euclidean distance is not adequate. The geodesic distance on the multinomial manifold can be derived analytically, but it is difficult to use directly as a metric. We propose fuzzy c-means algorithms that replace the Euclidean distance with other metrics: the Kullback-Leibler divergence and the Hellinger distance. These two metrics can be regarded as approximations of the geodesic distance.
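The idea in the abstract can be sketched as a standard fuzzy c-means loop with the dissimilarity swapped out. The sketch below is not the authors' implementation: the function names, the fuzzifier m=2, and the center update (the weighted mean, which is the closed-form minimizer for the KL case and only an approximation for the Hellinger case) are assumptions made here for illustration.

```python
import numpy as np

def hellinger(x, c):
    # Hellinger distance between two probability vectors.
    return np.sqrt(0.5 * np.sum((np.sqrt(x) - np.sqrt(c)) ** 2))

def kl(x, c, eps=1e-12):
    # Kullback-Leibler divergence D(x || c); eps avoids log(0).
    return np.sum(x * np.log((x + eps) / (c + eps)))

def fuzzy_cmeans(X, n_clusters, m=2.0, dist=hellinger, n_iter=50, seed=0):
    """Fuzzy c-means on probability vectors X with a pluggable dissimilarity.

    Returns (U, V): memberships U (n x c, rows sum to 1) and cluster
    centers V (c x d, each on the probability simplex).
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.dirichlet(np.ones(n_clusters), size=n)  # random initial memberships
    for _ in range(n_iter):
        W = U ** m
        # Center update: weighted mean, renormalized onto the simplex.
        # (Exact minimizer for KL divergence; an approximation for Hellinger.)
        V = (W.T @ X) / W.sum(axis=0)[:, None]
        V /= V.sum(axis=1, keepdims=True)
        # Membership update: u_ik proportional to d_ik^(-2/(m-1)).
        D = np.array([[dist(x, v) for v in V] for x in X]) + 1e-12
        Dm = D ** (-2.0 / (m - 1.0))
        U = Dm / Dm.sum(axis=1, keepdims=True)
    return U, V
```

A small usage example: cluster points drawn from two different Dirichlet distributions (i.e., two groups of discrete distributions) and check that memberships stay on the simplex.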
