The nature of statistical learning theory
Jacobi Angles for Simultaneous Diagonalization
SIAM Journal on Matrix Analysis and Applications
Faithful representation of separable distributions
Neural Computation
Information-theoretic approach to blind separation of sources in non-linear mixture
Signal Processing - Special issue on neural networks
Nonlinear component analysis as a kernel eigenvalue problem
Neural Computation
High-order contrasts for independent component analysis
Neural Computation
An Introduction to Support Vector Machines and Other Kernel-based Learning Methods
A Tutorial on Support Vector Machines for Pattern Recognition
Data Mining and Knowledge Discovery
Sparse Greedy Matrix Approximation for Machine Learning
ICML '00 Proceedings of the Seventeenth International Conference on Machine Learning
A Maximum Likelihood Approach to Nonlinear Blind Source Separation
ICANN '97 Proceedings of the 7th International Conference on Artificial Neural Networks
The Effect of the Input Density Distribution on Kernel-based Classifiers
ICML '00 Proceedings of the Seventeenth International Conference on Machine Learning
Kernel Independent Component Analysis
Efficient SVM training using low-rank kernel representations
The Journal of Machine Learning Research
Source separation in post-nonlinear mixtures
IEEE Transactions on Signal Processing
A blind source separation technique using second-order statistics
IEEE Transactions on Signal Processing
Input space versus feature space in kernel-based methods
IEEE Transactions on Neural Networks
An introduction to kernel-based learning algorithms
IEEE Transactions on Neural Networks
Kernel Methods for Measuring Independence
The Journal of Machine Learning Research
Independent Slow Feature Analysis and Nonlinear Blind Source Separation
Neural Computation
Nonlinear independent component analysis with minimal nonlinear distortion
Proceedings of the 24th international conference on Machine learning
Linear-Time Computation of Similarity Measures for Sequential Data
The Journal of Machine Learning Research
Nonlinear estimation of subpixel proportion via kernel least square regression
International Journal of Remote Sensing
Nonlinear Coordinate Unfolding Via Principal Curve Projections with Application to Nonlinear BSS
Neural Information Processing
Kernel-based nonlinear independent component analysis
ICA'07 Proceedings of the 7th international conference on Independent component analysis and signal separation
Extraction of signals with specific temporal structure using kernel methods
IEEE Transactions on Signal Processing
Modeling face appearance with nonlinear independent component analysis
FGR '04 Proceedings of the Sixth IEEE International Conference on Automatic Face and Gesture Recognition
Nonlinear adaptive blind source separation based on kernel function
ICNC'05 Proceedings of the First international conference on Advances in Natural Computation - Volume Part II
A novel dimension reduction procedure for searching non-Gaussian subspaces
ICA'06 Proceedings of the 6th international conference on Independent Component Analysis and Blind Signal Separation
We propose kTDSEP, a kernel-based algorithm for nonlinear blind source separation (BSS). It combines two complementary research fields: kernel feature spaces and BSS using temporal information. This yields an efficient algorithm for nonlinear BSS with invertible nonlinearity. The key assumptions are that the kernel feature space is chosen rich enough to approximate the nonlinearity and that the signals of interest contain temporal structure; both assumptions are fulfilled in a wide range of real-world applications. The algorithm works as follows. First, the data are (implicitly) mapped to a high-dimensional (possibly infinite-dimensional) kernel feature space. In practice, however, the data form a smaller submanifold in feature space (smaller even than the number of training data points), a fact that has already been exploited by, for example, reduced-set techniques for support vector machines. We propose to adapt to this effective dimension as a preprocessing step and to construct an orthonormal basis of this submanifold. This dimension-reduction step is essential for making the subsequent application of BSS methods computationally and numerically tractable. In the reduced space, we apply a BSS algorithm based on second-order temporal decorrelation. Finally, we propose a selection procedure that automatically recovers the original sources from the extracted nonlinear components. Experiments demonstrate the excellent performance and efficiency of kTDSEP on several nonlinear BSS problems, including mixtures of more than two sources.
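The pipeline in the abstract (implicit kernel map, orthonormal basis of the effective submanifold, second-order temporal decorrelation) can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: it assumes an RBF kernel, uses a kernel-PCA eigendecomposition as the dimension-reduction step, and substitutes a single-lag, AMUSE-style decorrelation for the full joint diagonalization over many time lags; the function name `ktdsep_sketch` and all parameters are illustrative.

```python
import numpy as np

def ktdsep_sketch(x, n_components=4, lag=1, sigma=1.0):
    """Hedged sketch of a kTDSEP-like pipeline on signals x (channels x time)."""
    T = x.shape[1]
    # 1) Implicit map to kernel feature space via an RBF Gram matrix.
    sq = np.sum(x**2, axis=0)
    d2 = sq[:, None] + sq[None, :] - 2.0 * x.T @ x
    K = np.exp(-d2 / (2.0 * sigma**2))
    # Center the Gram matrix in feature space.
    H = np.eye(T) - np.ones((T, T)) / T
    Kc = H @ K @ H
    # 2) Orthonormal basis of the effective submanifold (kernel-PCA step):
    # keep the leading eigendirections, discard the near-null space.
    evals, evecs = np.linalg.eigh(Kc)
    idx = np.argsort(evals)[::-1][:n_components]
    evals, evecs = evals[idx], evecs[:, idx]
    # Coordinates of the mapped data in that reduced basis (rows = dims).
    z = (evecs * np.sqrt(np.maximum(evals, 1e-12))).T
    # 3) Second-order temporal decorrelation (single-lag, AMUSE-style):
    # whiten, then diagonalize a symmetrized time-lagged covariance.
    z = z - z.mean(axis=1, keepdims=True)
    C0 = z @ z.T / T
    d, E = np.linalg.eigh(C0)
    W = E / np.sqrt(np.maximum(d, 1e-12))   # whitening transform
    zw = W.T @ z
    Ctau = zw[:, :-lag] @ zw[:, lag:].T / (T - lag)
    Ctau = 0.5 * (Ctau + Ctau.T)
    _, U = np.linalg.eigh(Ctau)
    return U.T @ zw                          # candidate nonlinear components

# Illustrative usage: an invertible post-nonlinear mixture of two
# temporally structured sources (a sine and a square wave).
t = np.linspace(0, 8 * np.pi, 400)
s = np.vstack([np.sin(t), np.sign(np.sin(3.1 * t))])
x = np.tanh(np.array([[1.0, 0.6], [0.4, 1.0]]) @ s)
y = ktdsep_sketch(x, n_components=4, lag=1, sigma=1.0)
```

The returned components are uncorrelated at lag zero and at the chosen lag; the abstract's final selection step (picking which of the extracted components correspond to the true sources) is omitted here.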