Least squares approximation is a technique for finding an approximate solution to a system of linear equations that has no exact solution. In the typical setting, one lets n be the number of constraints and d be the number of variables, with $${n \gg d}$$. Existing exact methods then find a solution vector in $${O(nd^2)}$$ time. We present two randomized algorithms that provide accurate relative-error approximations to the optimal value and the solution vector of a least squares approximation problem more rapidly than existing exact algorithms. Both of our algorithms preprocess the data with the randomized Hadamard transform. The first then samples constraints uniformly at random and solves the smaller problem on those constraints; the second performs a sparse random projection and solves the smaller problem on the projected coordinates. In both cases, solving the smaller problem provides a relative-error approximation, and, if n is sufficiently larger than d, the approximate solution can be computed in $${O(nd \ln d)}$$ time.
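The sampling-based variant can be illustrated with a short sketch. This is not the authors' exact algorithm, only the core idea under illustrative assumptions: precondition $A$ and $b$ with a diagonal matrix of random signs followed by an orthonormal Hadamard transform (so that no single constraint carries too much leverage), then sample a small number of rows uniformly and solve the reduced least squares problem. The sizes `n`, `d` and the sample count `r` are arbitrary choices for demonstration; a dense `scipy.linalg.hadamard` matrix is used here for clarity, whereas a fast $O(n \log n)$ transform would be used in practice.

```python
import numpy as np
from scipy.linalg import hadamard

rng = np.random.default_rng(0)

n, d = 1024, 10          # n must be a power of 2 for scipy's hadamard
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

# Randomized Hadamard preconditioning: form H D A and H D b,
# where D = diag(+/-1) with uniform random signs.
D = rng.choice([-1.0, 1.0], size=n)
H = hadamard(n) / np.sqrt(n)          # orthonormal Hadamard matrix
HA = H @ (D[:, None] * A)
Hb = H @ (D * b)

# Uniformly sample r rows and solve the small least squares problem.
r = 200
idx = rng.choice(n, size=r, replace=False)
x_approx, *_ = np.linalg.lstsq(HA[idx], Hb[idx], rcond=None)

# Compare residual norms against the exact solution.
x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)
res_exact = np.linalg.norm(A @ x_exact - b)
res_approx = np.linalg.norm(A @ x_approx - b)
print(res_approx / res_exact)  # near 1 when r is modestly larger than d
```

Because D and H are orthogonal, the preconditioned problem has exactly the same solution and residual as the original; the sampling step is where the approximation (and the speedup) comes from.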