Data classification is an intensively studied machine learning problem, and classification algorithms fall into two major categories: logic-based and kernel-based. Logic-based classifiers, such as the decision tree and the rule-based classifier, have the advantage of presenting a good summary of the distinctive characteristics of the different classes of data. Kernel-based classifiers, such as the neural network and the support vector machine (SVM), typically deliver higher prediction accuracy than logic-based classifiers; however, the user of a kernel-based classifier normally cannot obtain an overall picture of the distribution of the data set. For some applications, such an overall picture provides valuable insight into the distinctive characteristics of the different classes of data and is therefore highly desirable. In this article, aiming to close the gap between logic-based and kernel-based classifiers, we propose a novel approach to density estimation based on a mixture model composed of a limited number of generalized Gaussian components. One notable feature of the classifier constructed with the proposed approach is that the user can easily obtain an overall picture of the distributions in the data set by examining the eigenvectors and eigenvalues of the covariance matrices associated with the generalized Gaussian components. Experimental results show that the classifier constructed with the proposed approach delivers superior prediction accuracy in comparison with conventional logic-based classifiers and the EM (expectation-maximization) based classifier. Although it cannot match the prediction accuracy delivered by the SVM, the proposed classifier enjoys one major advantage: it provides the user with an overall picture of the underlying distributions.
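The kind of inspection the abstract describes can be sketched with a minimal NumPy example: eigendecomposing the covariance matrix of one mixture component to read off its principal axes (eigenvectors) and the spread of the data along each axis (eigenvalues). The synthetic data and the single component below are illustrative assumptions, not the paper's method or data; in the proposed approach the covariance matrices would come from the fitted generalized Gaussian components.

```python
import numpy as np

# Illustrative data for one hypothetical component (not the paper's data set).
rng = np.random.default_rng(0)
true_cov = np.array([[3.0, 1.0],
                     [1.0, 2.0]])
X = rng.multivariate_normal(mean=[0.0, 0.0], cov=true_cov, size=5000)

# Estimated component covariance (in the paper, taken from the fitted mixture).
cov = np.cov(X, rowvar=False)

# Eigendecomposition of the symmetric covariance matrix:
# eigenvectors are the principal axes of the component,
# eigenvalues are the variances along those axes.
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]          # sort axes by decreasing spread
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

print("variance along each principal axis:", eigvals)
print("principal axes (as columns):")
print(eigvecs)
```

Reading the sorted eigenvalues tells the user which directions dominate a component's spread, and the corresponding eigenvectors show how those directions are oriented in feature space, which is the overall picture of the class distributions the abstract refers to.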