Keeping the neural networks simple by minimizing the description length of the weights
COLT '93 Proceedings of the sixth annual conference on Computational learning theory
A minimum description length framework for unsupervised learning
Gene Expression Data Classification Using Independent Variable Group Analysis
ISNN '08 Proceedings of the 5th international symposium on Neural Networks: Advances in Neural Networks, Part II
When modeling large problems with limited representational resources, it is important to be able to construct compact models of the data. Structuring the problem into sub-problems that can be modeled independently is one means of achieving compactness. In this article we introduce Independent Variable Group Analysis (IVGA), a practical, efficient, and general approach for obtaining sparse codes. We apply the IVGA approach to a setting in which the dependencies within each variable group are modeled using vector quantization (VQ). In particular, we derive the cost function needed for model optimization with VQ. Experimental results show that variables are grouped according to statistical independence, and that the algorithm yields a more compact model.
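The compactness argument can be illustrated with a small parameter count. This is an illustrative sketch, not the authors' implementation: it assumes two statistically independent variable groups, each modeled by its own VQ codebook, and compares the codebook sizes against a single joint VQ that must represent every combination of group patterns.

```python
def vq_parameters(codes_per_group, dims_per_group):
    """Total codebook parameters: for each group, (number of
    codevectors) * (dimensionality of that group)."""
    return sum(k * d for k, d in zip(codes_per_group, dims_per_group))

# Hypothetical example: 4 variables split into two independent groups
# of 2 dimensions each, where each group takes 3 distinct patterns.
separate = vq_parameters([3, 3], [2, 2])  # 3*2 + 3*2 = 12 parameters

# A single joint VQ over all 4 dimensions must dedicate a codevector
# to each combination of the two groups' patterns: 3*3 = 9 codes.
joint = vq_parameters([3 * 3], [4])       # 9*4 = 36 parameters

print(separate, joint)  # 12 36
```

Because the number of joint patterns grows multiplicatively while the per-group codebooks grow additively, splitting independent groups apart shrinks the model, which is the intuition behind the description-length cost IVGA minimizes.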