When a matrix $A$ with $n$ columns is known to be well-approximated by a linear combination of basis matrices $B_1,\ldots,B_p$, we can apply $A$ to a random vector and solve a linear system to recover this linear combination. The same technique can be used to obtain an approximation to $A^{-1}$. A basic question is whether this linear system is well-conditioned. Conditioning matters for two reasons: a well-conditioned system (1) can be inverted stably and (2) keeps the error in the reconstruction under control. In this paper, we show that if the Gram matrix of the $B_j$'s is sufficiently well-conditioned and each $B_j$ has high numerical rank, then $n\propto p \log^2 n$ ensures that the linear system is well-conditioned with high probability. Our main application is probing linear operators with smooth pseudodifferential symbols, such as the wave equation Hessian in seismic imaging [L. Demanet et al., Appl. Comput. Harmonic Anal., 32 (2012), pp. 155-168]. We also demonstrate numerically that matrix probing can produce good preconditioners for inverting elliptic operators in variable media.
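The recovery step described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the basis matrices here are hypothetical dense Gaussian matrices (chosen only so that they have high numerical rank and a well-conditioned Gram matrix), and the $n\times p$ system $[B_1 w \mid \cdots \mid B_p w]\,c = Aw$ is solved by least squares.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 5

# Hypothetical basis matrices B_1, ..., B_p (dense Gaussian, so each has
# high numerical rank and their Gram matrix is well-conditioned w.h.p.).
B = [rng.standard_normal((n, n)) for _ in range(p)]

# Unknown coefficients defining A = sum_j c_j B_j.
c_true = rng.standard_normal(p)
A = sum(cj * Bj for cj, Bj in zip(c_true, B))

# Probing: apply A to a single random vector w ...
w = rng.standard_normal(n)
y = A @ w

# ... and recover the coefficients from the n-by-p linear system
# [B_1 w | ... | B_p w] c = A w, solved in the least-squares sense.
M = np.column_stack([Bj @ w for Bj in B])
c_hat, *_ = np.linalg.lstsq(M, y, rcond=None)

print(np.allclose(c_hat, c_true))  # True: coefficients recovered
```

Since $A$ lies exactly in the span of the $B_j$'s here, the system is consistent and the coefficients are recovered exactly; the paper's conditioning analysis addresses when this recovery remains stable as $p$ grows relative to $n$.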