Independent component analysis (ICA) is an essential building block for data analysis in many applications. Selecting the truly meaningful components from the result of an ICA algorithm, or comparing the results of different algorithms, is, however, a nontrivial problem. We introduce a very general technique for evaluating ICA results, rooted in information-theoretic model selection. The basic idea is to exploit the natural link between non-Gaussianity and data compression: the more the data transformation represented by one or several ICs improves the effectiveness of data compression, the higher the relevance of those ICs. We propose two methods that enable efficient data compression of non-Gaussian signals: Phi-transformed histograms and fuzzy histograms. In an extensive experimental evaluation, we demonstrate that our novel information-theoretic measures robustly select non-Gaussian components from data in a fully automatic way, that is, without requiring any restrictive assumptions or thresholds.
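The link between non-Gaussianity and compressibility can be illustrated with a minimal sketch. The abstract does not specify the construction of the Phi-transformed or fuzzy histograms, so the plain equal-width histogram coding cost below is only a hypothetical stand-in: among signals of equal variance, the Gaussian has maximal entropy, so any non-Gaussian signal (here, a Laplacian) admits a lower coding cost per sample.

```python
import numpy as np

def histogram_coding_cost(x, bins=200, value_range=(-10.0, 10.0)):
    """Approximate coding cost in bits per sample of signal x under a
    simple equal-width histogram model (a stand-in for the paper's
    Phi-transformed / fuzzy histograms, whose details are not given here)."""
    counts, _ = np.histogram(x, bins=bins, range=value_range)
    p = counts / counts.sum()
    p = p[p > 0]  # empty bins contribute no cost
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
n = 100_000
gauss = rng.normal(0.0, 1.0, n)                    # Gaussian, unit variance
laplace = rng.laplace(0.0, 1.0 / np.sqrt(2), n)    # Laplacian, unit variance

# The non-Gaussian signal needs fewer bits per sample, i.e. it compresses
# better, which under the paper's criterion marks it as the more relevant IC.
print(histogram_coding_cost(gauss) > histogram_coding_cost(laplace))
```

In this spirit, an IC would be ranked by how much its transformation lowers the total coding cost of the data relative to a Gaussian baseline.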