Given a number of samples possibly belonging to different classes, we say these samples live in an ICA space if all their class-conditional distributions are separable and can thus be expressed as products of one-dimensional distributions. Since this hypothesis rarely holds in real-world problems, we also provide a framework, class-conditional Independent Component Analysis (CC-ICA), in which it can be held on stronger grounds. For this representation, we focus on the problem of feature subset selection for classification, observing that divergence arises as a simple and natural criterion for class separability. Since divergence is monotonic in the dimensionality, optimality can be ensured without an exhaustive search over feature subsets. We adapt the Bayes decision scheme to our independence assumptions and framework. A first experiment on Trunk's artificial dataset, where class-conditional independence holds by construction, illustrates the robustness and accuracy of our technique. A second experiment, on the UCI letter database, evaluates the importance of the representation when assuming independence. A third experiment, on the Corel database, illustrates the performance of our criterion on high-dimensional data.
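The key property exploited above is that, under class-conditional independence, the divergence between two class-conditional densities decomposes into a sum of per-dimension terms, so the best k-feature subset is simply the k dimensions with the largest marginal divergences. The following is a minimal sketch of that selection step, assuming Gaussian marginals for illustration (the function names and the Gaussian assumption are ours, not the paper's):

```python
import numpy as np

def gaussian_sym_kl(mu0, var0, mu1, var1):
    """Symmetric KL divergence between two 1-D Gaussians N(mu0, var0), N(mu1, var1)."""
    kl01 = 0.5 * (var0 / var1 + (mu1 - mu0) ** 2 / var1 - 1.0 + np.log(var1 / var0))
    kl10 = 0.5 * (var1 / var0 + (mu0 - mu1) ** 2 / var0 - 1.0 + np.log(var0 / var1))
    return kl01 + kl10

def select_features(X0, X1, k):
    """Rank each dimension by its marginal divergence between the two classes.

    Under class-conditional independence the total divergence is the sum of
    these per-dimension terms, so taking the k top-scoring dimensions is
    optimal -- no exhaustive subset search is needed.
    """
    scores = np.array([
        gaussian_sym_kl(X0[:, j].mean(), X0[:, j].var(),
                        X1[:, j].mean(), X1[:, j].var())
        for j in range(X0.shape[1])
    ])
    order = np.argsort(scores)[::-1]  # dimensions, most separable first
    return order[:k], scores
```

In the CC-ICA setting, `X0` and `X1` would hold each class's samples after projection onto that class's independent components; with non-Gaussian marginals the per-dimension divergence would be estimated from the fitted component densities instead.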