The present contribution proposes a multidimensional scaling algorithm as a tool for visualizing high-dimensional patterns of smoothly constrained learnable systems whose parameters lie on Riemannian manifolds. Such a visualization tool is useful in machine learning whenever learning/adaptation algorithms operate on high-dimensional Riemannian parameter manifolds. In particular, the manuscript covers three cases of interest in the recent scientific literature: parameter spaces given by the set of special orthogonal matrices, the unit hypersphere, and the manifold of symmetric positive-definite matrices. The paper also recalls the notion of multidimensional scaling and discusses its algorithmic implementation. Numerical experiments on toy problems help readers get acquainted with the problem at hand, while experiments on independent component analysis data as well as averaging data demonstrate the usefulness of the proposed visualization tool.
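As an illustrative sketch of the pipeline the abstract describes, the following Python snippet computes pairwise geodesic distances between random special orthogonal matrices (one of the three parameter spaces mentioned above) and embeds them in the plane via classical multidimensional scaling. The function names and sampling scheme are our own assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np
from scipy.linalg import logm

def geodesic_dist(R1, R2):
    """Riemannian distance on SO(n): Frobenius norm of the matrix log of R1^T R2."""
    return np.linalg.norm(logm(R1.T @ R2), 'fro')

def classical_mds(D, k=2):
    """Classical (Torgerson) MDS: double-center squared distances, then eigendecompose."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                  # Gram matrix from distances
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:k]                # top-k eigenpairs
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

# Sample random rotations in SO(3) via QR decomposition of Gaussian matrices.
rng = np.random.default_rng(0)
Rs = []
for _ in range(10):
    Q, R = np.linalg.qr(rng.standard_normal((3, 3)))
    Q = Q @ np.diag(np.sign(np.diag(R)))         # fix sign ambiguity of QR
    if np.linalg.det(Q) < 0:
        Q[:, 0] *= -1                            # force det = +1
    Rs.append(Q)

D = np.array([[geodesic_dist(A, B_) for B_ in Rs] for A in Rs]).real
Y = classical_mds(D, k=2)                        # 2-D coordinates for plotting
```

The resulting `Y` can be passed to any 2-D scatter-plot routine; the point is that the embedding is driven by the manifold's geodesic metric rather than by the raw matrix entries.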