Sparse non-Gaussian component analysis
IEEE Transactions on Information Theory
Sparse non-Gaussian component analysis is an unsupervised linear method for extracting structure from high-dimensional data by estimating a low-dimensional non-Gaussian component of the distribution. In this paper we discuss a new approach, assuming the reduced dimension is known a priori, that directly estimates the projector on the target space using semidefinite programming. The new approach avoids estimating the data covariance matrix and overcomes the traditional separation between estimating elements of the target space and reconstructing the target space itself. This allows the sample size to be reduced while improving sensitivity to a broad variety of deviations from normality. Moreover, the complexity of the new approach is limited to O(d log d). We also discuss procedures that allow the structure to be recovered when its effective dimension is unknown.
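To make the setting concrete, the sketch below illustrates the core idea of estimating a rank-m orthogonal projector onto a non-Gaussian subspace of whitened data. It is a deliberately simplified stand-in, not the paper's semidefinite-programming estimator: instead of an SDP, it uses the eigenvectors of the empirical fourth-moment matrix E[|z|^2 z z^T] (which equals (d+2)I for standard Gaussian z, so eigenvalues deviating from d+2 flag non-Gaussian directions). The function name and the toy data are illustrative assumptions.

```python
import numpy as np

def nongaussian_projector(X, m):
    """Estimate a rank-m projector onto a non-Gaussian subspace.

    Simplified illustration only (NOT the SDP approach of the paper):
    whiten the data, form the fourth-moment matrix M = E[|z|^2 z z^T],
    and keep the m eigenvectors whose eigenvalues deviate most from the
    Gaussian baseline d + 2.
    """
    n, d = X.shape
    Xc = X - X.mean(axis=0)
    # Whiten: C^{-1/2} via the eigen-decomposition of the covariance.
    C = Xc.T @ Xc / n
    w, V = np.linalg.eigh(C)
    W = V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    Z = Xc @ W
    # Empirical fourth-moment matrix; equals (d + 2) I for Gaussian data.
    M = (Z * (Z ** 2).sum(axis=1, keepdims=True)).T @ Z / n
    lam, U = np.linalg.eigh(M)
    # Select the m directions whose eigenvalues deviate most from d + 2.
    idx = np.argsort(np.abs(lam - (d + 2)))[::-1][:m]
    B = U[:, idx]            # orthonormal basis of the estimated subspace
    P = B @ B.T              # rank-m orthogonal projector (in whitened coords)
    return P, W

# Toy example: one uniform (non-Gaussian) direction hidden among Gaussians.
rng = np.random.default_rng(0)
n, d = 5000, 5
X = rng.standard_normal((n, d))
X[:, 0] = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), size=n)  # unit variance, non-Gaussian
P, W = nongaussian_projector(X, m=1)
```

Because the columns of B are orthonormal, P is symmetric and idempotent with trace m; on this toy data the estimated subspace aligns with the first coordinate axis, the single non-Gaussian direction.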