A general procedure is described for setting up monotonically convergent algorithms to solve a broad class of matrix optimization problems, optionally subject to a wide variety of constraints. An overview is given of a number of ready-made building blocks (derived in earlier publications) from which concrete algorithms can be assembled with little effort. These algorithms are based on alternating least squares (block relaxation) and iterative majorization. It is demonstrated that constructing an algorithm for a particular problem in one of the classes of optimization problems under study reduces to a simple combination of tools. Also reviewed is a procedure for setting up a weighted least squares algorithm for any problem for which an unweighted least squares solution is available. All procedures are illustrated by means of examples.
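As an illustrative sketch (not the paper's exact algorithms), the two building blocks named in the abstract can be demonstrated on plain low-rank matrix approximation: alternating least squares solves min ||X - AB'||_F^2 by cycling over exact subproblems, and the weighted-to-unweighted device handles a weighted loss by iteratively majorizing it with an unweighted one on a working matrix Z = W*X + (1-W)*M (weights scaled into [0, 1]). The function names and the rank-r SVD solver are assumptions chosen for this sketch.

```python
import numpy as np

def als_low_rank(X, r, iters=500, seed=0):
    """Alternating least squares (block relaxation) for min ||X - A @ B.T||_F^2.
    Each block update is an ordinary least squares solve with the other block
    fixed, so the loss is monotonically non-increasing."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    B = rng.standard_normal((m, r))
    for _ in range(iters):
        # With B fixed, X.T ~ B @ A.T is a linear least squares problem in A.
        A = np.linalg.lstsq(B, X.T, rcond=None)[0].T
        # With A fixed, X ~ A @ B.T is a linear least squares problem in B.
        B = np.linalg.lstsq(A, X, rcond=None)[0].T
    return A, B

def weighted_low_rank(X, W, r, iters=200):
    """Weighted low-rank fit min sum W_ij * (X_ij - M_ij)^2, rank(M) <= r,
    via iterative majorization: each iteration solves an UNWEIGHTED problem
    on the working matrix Z = W*X + (1-W)*M, which majorizes the weighted
    loss at the current M once the weights are scaled into [0, 1]."""
    W = W / W.max()          # scale weights so the majorization argument applies
    M = X.copy()             # starting value for the current fit
    for _ in range(iters):
        Z = W * X + (1.0 - W) * M
        # Unweighted least squares solution of rank r: truncated SVD of Z.
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        M = (U[:, :r] * s[:r]) @ Vt[:r]
    return M
```

In both routines the loss can only go down at each step, which is the monotone convergence property the general procedure is built around; the weighted variant reuses the unweighted solver unchanged, exactly in the spirit of the reviewed weighting device.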