Many machine learning algorithms can be formulated as a generalized eigenvalue problem. A major limitation of this formulation is that the generalized eigenvalue problem is computationally expensive to solve, especially for large-scale problems. In this paper, we show that under a mild condition, a class of generalized eigenvalue problems in machine learning can be formulated as a least squares problem. This class includes classical techniques such as Canonical Correlation Analysis (CCA), Partial Least Squares (PLS), and Linear Discriminant Analysis (LDA), as well as Hypergraph Spectral Learning (HSL). As a result, various regularization techniques can be readily incorporated into the formulation to improve model sparsity and generalization ability. In addition, the least squares formulation admits efficient and scalable implementations based on iterative conjugate-gradient-type algorithms. We report experimental results that confirm the established equivalence relationship; the results also demonstrate the efficiency and effectiveness of the equivalent least squares formulations on large-scale problems.
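To make the equivalence concrete, the sketch below illustrates the best-known special case: for binary-class LDA, ordinary least squares regression with suitably coded class targets recovers the LDA projection direction up to a positive scale factor. The data, target coding (n/n1 and -n/n2), and variable names here are illustrative assumptions for a numerical check, not code from the paper; the multi-class and CCA/PLS/HSL cases require the more general construction developed in the paper.

```python
import numpy as np

# Illustrative two-class data (synthetic; not from the paper's experiments)
rng = np.random.default_rng(0)
n1, n2, d = 60, 40, 5
X1 = rng.normal(loc=0.0, size=(n1, d))
X2 = rng.normal(loc=1.0, size=(n2, d))
X = np.vstack([X1, X2])
n = n1 + n2

# Class-coded regression targets: n/n1 for class 1, -n/n2 for class 2.
# With this coding the targets have zero mean, so centring the data
# absorbs the bias term.
y = np.concatenate([np.full(n1, n / n1), np.full(n2, -n / n2)])
Xc = X - X.mean(axis=0)

# Least squares solution w_ls = argmin ||Xc w - y||^2
w_ls, *_ = np.linalg.lstsq(Xc, y, rcond=None)

# Classical LDA direction: Sw^{-1} (m1 - m2), with Sw the within-class scatter
m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
Sw = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)
w_lda = np.linalg.solve(Sw, m1 - m2)

# The two directions coincide: cosine similarity is 1 up to rounding
cos = w_ls @ w_lda / (np.linalg.norm(w_ls) * np.linalg.norm(w_lda))
print(round(float(cos), 6))  # → 1.0
```

The check works because the centred Gram matrix equals the total scatter St = Sw + Sb, and with the coding above the right-hand side Xc.T @ y is proportional to m1 - m2, so the normal equations give w_ls proportional to Sw^{-1}(m1 - m2). In practice, replacing `np.linalg.lstsq` with a sparse iterative solver such as LSQR (and its damping parameter for l2 regularization) is what makes the least squares route scale.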