Naive Bayes (NB for short) is one of the popular methods for supervised classification in a knowledge management system. Currently, in many real-world applications, high-dimensional data pose a major challenge to conventional NB classifiers, due to noisy or redundant features and local relevance of these features to classes. In this paper, an automated feature weighting solution is proposed to result in a NB method effective in dealing with high-dimensional data. We first propose a locally weighted probability model, for Bayesian modeling in high-dimensional spaces, to implement a soft feature selection scheme. Then we propose an optimization algorithm to find the weights in linear time complexity, based on the Logitnormal priori distribution and the Maximum a Posteriori principle. Experimental studies show the effectiveness and suitability of the proposed model for high-dimensional data classification.