Bridging the Gap between Naive Bayes and Maximum Entropy Text Classification
Author(s) -
Alfons Juan,
David Vilar,
Hermann Ney
Publication year - 2007
Language(s) - English
Resource type - Conference proceedings
DOI - 10.5220/0002425700590065
Subject(s) - bridging (networking) , naive bayes classifier , computer science , artificial intelligence , principle of maximum entropy , bayes' theorem , entropy (arrow of time) , pattern recognition (psychology) , machine learning , bayesian probability , support vector machine , physics , computer network , quantum mechanics
The naive Bayes and maximum entropy approaches to text classification are typically discussed as completely unrelated techniques. In this paper, however, we show that both approaches are simply two different ways of doing parameter estimation for a common log-linear model of class posteriors. In particular, we show how to map the solution given by maximum entropy into an optimal solution for naive Bayes according to the conditional maximum likelihood criterion.
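The shared log-linear form mentioned in the abstract can be illustrated with a minimal sketch (the toy numbers and variable names below are illustrative assumptions, not taken from the paper): a multinomial naive Bayes posterior, computed via Bayes' rule, coincides with a softmax over linear scores whose weights are the log word probabilities and whose bias is the log prior.

```python
import numpy as np

# Hypothetical toy setup: 2 classes, 3-word vocabulary (illustrative only).
priors = np.array([0.6, 0.4])            # p(c)
word_probs = np.array([[0.5, 0.3, 0.2],  # p(w | c=0)
                       [0.2, 0.2, 0.6]]) # p(w | c=1)
counts = np.array([3.0, 1.0, 2.0])       # word counts of one document

# Naive Bayes posterior via Bayes' rule.
joint = priors * np.prod(word_probs ** counts, axis=1)
posterior_nb = joint / joint.sum()

# The same posterior as a log-linear (softmax) model:
#   score_c = log p(c) + sum_w counts_w * log p(w | c)
scores = np.log(priors) + counts @ np.log(word_probs).T
posterior_loglin = np.exp(scores - scores.max())
posterior_loglin /= posterior_loglin.sum()

# Both parameterizations yield the identical class posterior.
assert np.allclose(posterior_nb, posterior_loglin)
```

Maximum entropy training fits the weights of this same softmax form directly by conditional likelihood, rather than deriving them from generative estimates as naive Bayes does.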
