Where Does Minimum Error Entropy Outperform Minimum Mean Square Error? A New and Closer Look
Author(s) -
Ahmad Reza Heravi,
Ghosheh Abed Hodtani
Publication year - 2018
Publication title -
IEEE Access
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.587
H-Index - 127
ISSN - 2169-3536
DOI - 10.1109/access.2018.2792329
Subject(s) - aerospace , bioengineering , communication, networking and broadcast technologies , components, circuits, devices and systems , computing and processing , engineered materials, dielectrics and plasmas , engineering profession , fields, waves and electromagnetics , general topics for engineers , geoscience , nuclear engineering , photonics and electrooptics , power, energy and industry applications , robotics and control systems , signal processing and analysis , transportation
The past decade has seen rapid application of information-theoretic learning (ITL) criteria to robust signal processing and machine learning problems. The ITL literature generally reports that, under non-Gaussian assumptions, especially when the data are corrupted by heavy-tailed or multi-modal non-Gaussian distributions, information-theoretic criteria such as minimum error entropy (MEE) outperform second-order statistical ones. The objective of this research is to investigate this claimed superiority of the MEE criterion over minimum mean square error (MSE). Having found that MEE- and MSE-based methods perform similarly in non-Gaussian environments under particular conditions, we seek a precise demarcation between this occasional similarity and occasional outperformance. Based on our theoretical findings, we provide a sharper touchstone for when MEE outperforms MSE.
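To make the contrast between the two criteria concrete, the following is a minimal sketch (not the paper's implementation) of how the MSE cost and a standard MEE cost are typically computed. It assumes the common ITL formulation in which MEE minimizes Renyi's quadratic entropy of the errors, estimated with a Gaussian Parzen window; the function names, kernel bandwidth `sigma`, and sample sizes here are illustrative assumptions.

```python
import numpy as np

def mse(errors):
    # Second-order criterion: mean of squared errors.
    e = np.asarray(errors, dtype=float)
    return float(np.mean(e ** 2))

def mee_cost(errors, sigma=1.0):
    # MEE in ITL is commonly implemented by minimizing Renyi's
    # quadratic entropy estimate H2(e) = -log V(e), where the
    # information potential V(e) = (1/N^2) * sum_{i,j} G(e_i - e_j)
    # comes from a Gaussian Parzen window over pairwise error
    # differences (bandwidth sigma is a free parameter).
    e = np.asarray(errors, dtype=float)
    diffs = e[:, None] - e[None, :]
    kernel = np.exp(-diffs ** 2 / (4 * sigma ** 2))
    v = float(kernel.mean())
    return -np.log(v)

# Illustration: Gaussian vs heavy-tailed (Cauchy) error samples.
# MSE is driven by outlier magnitudes, while the MEE cost depends
# only on how concentrated the error distribution is.
rng = np.random.default_rng(0)
gaussian_errors = rng.normal(size=200)
cauchy_errors = rng.standard_cauchy(size=200)
print("MSE:", mse(gaussian_errors), mse(cauchy_errors))
print("MEE:", mee_cost(gaussian_errors), mee_cost(cauchy_errors))
```

Note that the MEE cost is shift-invariant (identical errors give `-log(1) = 0` regardless of their value), which is one reason MEE-trained systems need a separate bias adjustment, and a detail relevant to comparing the two criteria fairly.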