Evaluation of an inference network-based retrieval model. ACM Transactions on Information Systems (TOIS), special issue on research and development in information retrieval.
What is the goal of sensory coding? Neural Computation.
Independent component analysis, a new concept? Signal Processing, special issue on higher order statistics.
On the Optimality of the Simple Bayesian Classifier under Zero-One Loss. Machine Learning, special issue on learning with probabilistic representations.
Independent component analysis for identification of artifacts in magnetoencephalographic recordings. NIPS '97: Proceedings of the 1997 Conference on Advances in Neural Information Processing Systems 10.
New approximations of differential entropy for independent component analysis and projection pursuit. NIPS '97: Proceedings of the 1997 Conference on Advances in Neural Information Processing Systems 10.
IEEE Transactions on Pattern Analysis and Machine Intelligence.
A Study of Approaches to Hypertext Categorization. Journal of Intelligent Information Systems.
Naive (Bayes) at Forty: The Independence Assumption in Information Retrieval. ECML '98: Proceedings of the 10th European Conference on Machine Learning.
Pattern Classification (2nd Edition).
Machine Learning.
Latent classification models for binary data. Pattern Recognition.
A comparative study of PCA, ICA and class-conditional ICA for Naïve Bayes classifier. IWANN '07: Proceedings of the 9th International Work-Conference on Artificial Neural Networks.
NB+: An improved Naïve Bayesian algorithm. Knowledge-Based Systems.
Orthogonally rotational transformation for naive Bayes learning. CIS '05: Proceedings of the 2005 International Conference on Computational Intelligence and Security, Volume Part I.
In recent years, Naive Bayes has experienced a renaissance in machine learning, particularly in the area of information retrieval. The classifier rests on the assumption, not always realistic, that class-conditional distributions factorize into the product of their marginal densities. On the other hand, one of the most common ways of estimating the Independent Component Analysis (ICA) representation of a given random vector is to minimize the Kullback-Leibler divergence between the joint density and the product of the marginal densities, i.e. the mutual information. It follows that ICA provides a representation in which the independence assumption holds on stronger grounds. In this paper we propose class-conditional ICA as a method that yields a representation for which Naive Bayes is the classifier of choice. Experiments on two public databases are performed in order to confirm this hypothesis.
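The abstract gives no implementation details, but the idea can be sketched: fit a separate ICA unmixing per class, then score a test point under each class by summing the log marginal densities of its recovered components (plus the log-Jacobian of the linear transform and the class prior). The following is a minimal NumPy sketch under stated assumptions — symmetric FastICA with a tanh nonlinearity and Gaussian marginals on the components; the class name `CCICANaiveBayes` and the density choice are illustrative, not the authors' code.

```python
import numpy as np

def whiten(X):
    """Center X; return whitened data, the mean, the whitening matrix, and log|det K|."""
    mean = X.mean(axis=0)
    Xc = X - mean
    cov = Xc.T @ Xc / len(Xc)
    vals, vecs = np.linalg.eigh(cov)
    K = vecs @ np.diag(vals ** -0.5) @ vecs.T   # symmetric (ZCA) whitening
    logdet = -0.5 * np.sum(np.log(vals))        # log|det K|
    return Xc @ K, mean, K, logdet

def fastica(Z, n_iter=200, seed=0):
    """Symmetric FastICA (tanh nonlinearity) on whitened data Z; returns orthogonal W."""
    n, d = Z.shape
    rng = np.random.default_rng(seed)
    W = np.linalg.qr(rng.standard_normal((d, d)))[0]
    for _ in range(n_iter):
        S = Z @ W.T
        G = np.tanh(S)
        # fixed-point update: w+ = E[z g(w'z)] - E[g'(w'z)] w
        W_new = G.T @ Z / n - np.diag((1 - G ** 2).mean(axis=0)) @ W
        # symmetric decorrelation: W <- (W W^T)^{-1/2} W
        mv, me = np.linalg.eigh(W_new @ W_new.T)
        W = me @ np.diag(mv ** -0.5) @ me.T @ W_new
    return W                                    # orthogonal, so log|det W| = 0

class CCICANaiveBayes:
    """Naive Bayes on class-conditional ICA components (Gaussian marginals)."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.models_ = {}
        for c in self.classes_:
            Xc = X[y == c]
            Z, mean, K, logdet = whiten(Xc)
            W = fastica(Z)
            S = Z @ W.T                         # sources for this class
            self.models_[c] = dict(
                mean=mean, U=K @ W.T,           # full unmixing: s = (x - mean) U
                logdet=logdet,                  # log|det U| = log|det K| (W orthogonal)
                mu=S.mean(axis=0), sd=S.std(axis=0) + 1e-9,
                logprior=np.log(len(Xc) / len(X)),
            )
        return self

    def predict(self, X):
        scores = []
        for c in self.classes_:
            m = self.models_[c]
            S = (X - m["mean"]) @ m["U"]
            # log p(x|c) = log|det U| + sum_j log N(s_j; mu_j, sd_j)
            ll = (-0.5 * ((S - m["mu"]) / m["sd"]) ** 2
                  - np.log(m["sd"]) - 0.5 * np.log(2 * np.pi)).sum(axis=1)
            scores.append(m["logprior"] + m["logdet"] + ll)
        return self.classes_[np.argmax(np.stack(scores, axis=1), axis=1)]
```

On synthetic data where each class is a different linear mixture of independent Laplace sources, fitting the model per class and taking the argmax of the scores recovers the class structure; a nonparametric marginal (e.g. a kernel density per component) would be closer in spirit to ICA than the Gaussian used here for brevity.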