We propose a novel algebraic algorithmic framework for dealing with probability distributions represented by their cumulants, such as the mean and covariance matrix. As an example, we consider the unsupervised learning problem of finding the subspace on which several probability distributions agree. Instead of minimizing an objective function involving the estimated cumulants, we show that by treating the cumulants as elements of a polynomial ring we can solve the problem directly, at lower computational cost and with higher accuracy. Moreover, the algebraic viewpoint on probability distributions allows us to invoke the theory of algebraic geometry, which we demonstrate in a compact proof of an identifiability criterion.
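To make the subspace-finding problem concrete, here is a minimal numerical sketch, not the paper's algebraic algorithm: for distributions summarized by their first two cumulants, a direction lies in the common subspace when the differences of means and covariances vanish along it, which is a linear condition that can be solved via an SVD. The helper name `stationary_directions` and the tolerance handling are assumptions introduced for illustration only.

```python
import numpy as np

def stationary_directions(means, covs, tol=1e-8):
    """Return an orthonormal basis of directions v on which all given
    distributions agree in mean and variance, i.e. (mu_i - mu_0)^T v = 0
    and (Sigma_i - Sigma_0) v = 0 for all i.

    Illustrative helper (hypothetical, not the paper's exact method):
    stack the cumulant differences into one matrix and take its null space.
    """
    mu0, S0 = means[0], covs[0]
    rows = []
    for mu, S in zip(means[1:], covs[1:]):
        rows.append((mu - mu0)[None, :])  # linear condition on the means
        rows.append(S - S0)               # linear condition on the covariances
    M = np.vstack(rows)
    _, s, Vt = np.linalg.svd(M)
    rank = int(np.sum(s > tol * s[0])) if s.size else 0
    return Vt[rank:].T                    # right singular vectors spanning the null space
```

For example, two 3-D Gaussians that differ only along the third coordinate yield the x-y plane as the common subspace.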