Matrices that can be factored into a product of two simpler matrices serve as a useful and often natural model in the analysis of tabulated or high-dimensional data. Models based on matrix factorization (Factor Analysis, PCA) have been used extensively in statistical analysis and machine learning for over a century, with many new formulations and models suggested in recent years (Latent Semantic Indexing, Aspect Models, Probabilistic PCA, Exponential PCA, Non-Negative Matrix Factorization, and others). In this thesis we address several issues related to learning with matrix factorizations: we study the asymptotic behavior and generalization ability of existing methods, suggest new optimization methods, and present a novel maximum-margin high-dimensional matrix factorization formulation. (Copies available exclusively from MIT Libraries, Rm. 14-0551, Cambridge, MA 02139-4307. Ph. 617-253-5668; Fax 617-253-1690.)
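To make the core idea concrete, the following is a minimal sketch of rank-k matrix factorization via truncated SVD, which gives the best rank-k approximation in Frobenius norm by the Eckart–Young theorem. The data and function name are illustrative assumptions; this is not the thesis's maximum-margin formulation, only the generic low-rank model the abstract describes.

```python
import numpy as np

def low_rank_factors(X, k):
    """Factor X (n x m) into U (n x k) and V (k x m) so that U @ V
    is the best rank-k approximation of X in Frobenius norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    # Fold the top-k singular values into the left factor.
    return U[:, :k] * s[:k], Vt[:k, :]

rng = np.random.default_rng(0)
# Construct an exactly rank-2 matrix from two random factors,
# then recover it: the rank-2 truncation is exact here.
X = rng.standard_normal((6, 2)) @ rng.standard_normal((2, 5))
U, V = low_rank_factors(X, 2)
print(np.allclose(U @ V, X))  # True: rank(X) = 2, so no truncation error
```

Methods mentioned in the abstract (PCA, NMF, maximum-margin factorization) differ in the constraints and loss placed on the factors U and V, but all fit this same "product of two simpler matrices" template.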