We present a systematic approach to the mathematical treatment of stochastic neighbor embedding (SNE) and t-distributed stochastic neighbor embedding (t-SNE). This treatment allows the methods to be adapted easily and their respective modules to be exchanged. In particular, the divergence that measures the difference between the probability distributions in the original space and in the embedding space can be treated independently of the other components, such as the similarity of data points or the data distribution. We focus on extensions to different divergences and propose a general framework based on Fréchet derivatives, so that the approach can be adapted to user-specific needs.
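To make the modular view concrete, the following is a minimal sketch (not the authors' implementation) of the SNE-style cost: pairwise similarities in the original and embedding spaces are normalized into probability distributions, and a divergence between them serves as the objective. The Gaussian affinity function, the fixed bandwidth `sigma`, and all variable names are illustrative assumptions; in this framework the Kullback-Leibler divergence used below is just one interchangeable module.

```python
import numpy as np

def pairwise_affinities(X, sigma=1.0):
    # Gaussian similarities over all pairs, normalized to a probability
    # distribution (illustrative; t-SNE uses per-point bandwidths and a
    # Student-t kernel in the embedding space).
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    P = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(P, 0.0)
    return P / P.sum()

def kl_divergence(P, Q, eps=1e-12):
    # Kullback-Leibler divergence D(P || Q), the cost of classical
    # (t-)SNE; the framework allows swapping in other divergences here.
    mask = P > 0
    return float(np.sum(P[mask] * np.log(P[mask] / (Q[mask] + eps))))

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 10))   # high-dimensional data (toy example)
Y = rng.normal(size=(20, 2))    # candidate low-dimensional embedding
P = pairwise_affinities(X)
Q = pairwise_affinities(Y)
cost = kl_divergence(P, Q)      # objective to be minimized over Y
```

Replacing `kl_divergence` with another divergence (and its Fréchet derivative for the gradient) is exactly the kind of module exchange the abstract describes.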