Low Rank Approximation at Sublinear Cost by Means of Subspace Sampling

Victor Y. Pan, Qi Luan, John Svadlenka, Liang Zhao

Low Rank Approximation (LRA) of a matrix is a hot research subject, fundamental for Matrix and Tensor Computations and for Big Data Mining and Analysis. Computations with LRA can be performed at sublinear cost, that is, by using far fewer arithmetic operations and memory cells than an input matrix has entries. Although every sublinear cost algorithm for LRA fails to approximate some worst-case inputs, we prove that our sublinear cost variations of a popular subspace sampling algorithm output accurate LRA for a large class of inputs. Namely, they do so with high probability for a random input matrix that admits an LRA. In other papers we proposed and analyzed sublinear cost algorithms for other important matrix computations. Our numerical tests are in good accordance with our formal results.
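To make the sublinear-cost idea concrete, here is a minimal illustrative sketch (not the authors' exact algorithm) of LRA by subspace sampling: a CUR-type approximation that reads only a few randomly sampled rows and columns of the input, so the number of entries accessed is O((m+n)l) rather than mn. The matrix sizes, the oversampling parameter `l`, and the use of a pseudoinverse of the intersection block are all assumptions made for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random 500 x 500 matrix of exact rank 5 -- an input that admits LRA.
m = n = 500
r = 5
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))

# Illustrative subspace sampling: read only l random rows and l random columns
# of A, i.e., O((m + n) * l) entries instead of all m * n entries.
l = 20  # oversampled target rank (assumed parameter for this sketch)
rows = rng.choice(m, size=l, replace=False)
cols = rng.choice(n, size=l, replace=False)

C = A[:, cols]             # sampled columns
R = A[rows, :]             # sampled rows
W = A[np.ix_(rows, cols)]  # their l x l intersection

# CUR-type approximation: A ~ C @ pinv(W) @ R. For a rank-r input with
# l > r, random sampling recovers A exactly with high probability.
A_lra = C @ np.linalg.pinv(W) @ R

# Verification only (this step reads the full matrix and is NOT sublinear).
err = np.linalg.norm(A - A_lra) / np.linalg.norm(A)
print(f"relative error: {err:.2e}")
```

The approximation itself touches only the sampled rows and columns; the final error check is included purely to verify the sketch and would not be part of a sublinear-cost computation.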
