Analysis of cost function based on Kullback–Leibler divergence in independent component analysis for two uniformly distributed source signals
Author(s) -
Tanzawa Kota,
Koshita Shunsuke,
Abe Masahide,
Kawamata Masayuki
Publication year - 2018
Publication title -
Electronics and Communications in Japan
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.131
H-Index - 13
eISSN - 1942-9541
pISSN - 1942-9533
DOI - 10.1002/ecj.12088
Subject(s) - independent component analysis , blind signal separation , kullback–leibler divergence , signal processing , algorithm , mathematics , statistics , computer science
Independent component analysis plays a central role in blind source separation, leading to many applications in signal processing such as telecommunications, speech processing, and biomedical signal processing. Although independent component analysis requires cost functions for evaluating the mutual independence of observed signals, little has been reported on the theoretical characteristics of such cost functions. In this paper, we mathematically analyze the cost function based on Kullback–Leibler divergence in independent component analysis. Our analysis proves that the cost function becomes unimodal when the number of source signals is two and both source signals have uniform distributions. To derive this result, we make use of whitening of the observed signals and describe the cost function in closed form.
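The setting the abstract describes can be illustrated numerically. The Kullback–Leibler cost in ICA is the mutual information I(Y1;Y2) = KL(p(y1,y2) || p(y1)p(y2)) of the demixed outputs, and after whitening the two-source problem reduces to searching over a single rotation angle. The sketch below (an illustration of this general setup, not the paper's closed-form derivation; the histogram-based MI estimator and all parameter choices are assumptions) generates two uniform sources and evaluates the cost as a function of the rotation angle, where unimodality over one quarter turn can be observed:

```python
import numpy as np

rng = np.random.default_rng(0)

def mi_histogram(y, bins=20):
    """Estimate the mutual information I(Y1;Y2) in nats via a 2-D histogram.
    This equals KL( p(y1,y2) || p(y1)p(y2) ), the cost function in question."""
    h, _, _ = np.histogram2d(y[0], y[1], bins=bins)
    p = h / h.sum()
    px = p.sum(axis=1, keepdims=True)   # marginal of Y1
    py = p.sum(axis=0, keepdims=True)   # marginal of Y2
    nz = p > 0                          # avoid log(0) on empty bins
    return float(np.sum(p[nz] * np.log(p[nz] / (px @ py)[nz])))

# Two independent uniform sources with zero mean and unit variance.
# Being already white, the demixing search reduces to a pure rotation.
n = 200_000
s = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(2, n))

def cost(theta):
    """KL-divergence cost after rotating the whitened signals by theta."""
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return mi_histogram(R @ s)

# Sweep theta over one quarter turn: the cost is (near) zero at theta = 0,
# rises to a single peak at theta = pi/4, and falls back toward zero at pi/2.
thetas = np.linspace(0.0, np.pi / 2, 19)
costs = [cost(t) for t in thetas]
```

For two uniform sources the rotated joint density at theta = pi/4 is uniform on a diamond with triangular marginals, so the cost there is strictly positive, while at multiples of pi/2 the outputs are again independent and the cost vanishes; this is the one-hump shape the paper's unimodality result formalizes.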
