Refinement of Low Rank Approximation of a Matrix at Sublinear Cost

Victor Y. Pan, Qi Luan

Low rank approximation (LRA) of a matrix is a hot subject of modern computations. In applications to Big Data mining and analysis, the input matrices are so immense that one must apply sublinear cost algorithms, which access and process only a tiny fraction of the input entries. Under this restriction one cannot compute accurate LRA of a worst case input matrix, or even of the matrices of some specific small families in our Appendix, but we have recently proved that one can do so with high probability (whp) for a random input. This is in good accordance with our tests and with more than a decade of worldwide computation of LRA at sublinear cost by means of Cross-Approximation algorithms. A natural challenge is to complement such a computation of LRA with its refinement at sublinear cost; we take up that challenge and propose two algorithms for this task, together with some recipes for a posteriori estimation of the errors of LRA, also at sublinear cost.
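The sublinear-cost setting the abstract refers to can be illustrated with a minimal Cross-Approximation (CUR-type) sketch: for a rank-r input, the algorithm reads only r columns and r rows of the matrix, i.e. O((m+n)r) of its mn entries. The index choice, sizes, and random test matrix below are illustrative assumptions, not the paper's algorithms.

```python
import numpy as np

# Minimal sketch of cross-approximation (CUR-type) LRA, assuming an
# exactly rank-r random input: we access only r columns and r rows of
# the m x n matrix A, which is sublinear in its m*n entries.
rng = np.random.default_rng(0)
m, n, r = 200, 150, 5

# Synthetic random input of exact rank r (the "random input" setting
# in which such algorithms succeed whp).
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))

I = rng.choice(m, size=r, replace=False)  # sampled row indices (illustrative choice)
J = rng.choice(n, size=r, replace=False)  # sampled column indices

C = A[:, J]                           # r columns of A
R = A[I, :]                           # r rows of A
G = np.linalg.pinv(A[np.ix_(I, J)])   # generator: pseudoinverse of the r x r intersection

A_approx = C @ G @ R                  # rank-r approximation built from the samples
err = np.linalg.norm(A - A_approx) / np.linalg.norm(A)
print(f"relative error: {err:.2e}")
```

For an exactly rank-r matrix with a nonsingular r x r intersection block, this CUR identity recovers A up to rounding error; for near-rank-r inputs the accuracy depends on the quality of the chosen rows and columns, which motivates the refinement step the paper addresses.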
