The sheer size of today's datasets dictates that learning algorithms compress or reduce their input data, exploit parallelism, or both. Multiresolution Matrix Factorization (MMF) connects these computational strategies to classical themes in applied mathematics, namely Multiresolution Analysis and Multigrid Methods. In particular, the similarity matrices arising in data often have multiresolution structure, which can be exploited both for learning and to facilitate computation.
In addition to the general MMF framework, I will present our new, high-performance parallel MMF software library and show results on matrix compression/sketching and Gaussian process regression tasks (joint work with Nedelina Teneva, Pramod Mudrakarta, Yi Ding, Jonathan Eskreis-Winkler, and Vikas Garg).
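The abstract does not spell out the factorization itself, but its flavor can be sketched with a toy Jacobi-style construction: a symmetric matrix is conjugated by a sequence of sparse orthogonal (Givens) rotations, and after each rotation one coordinate is retired as a "wavelet," yielding A = Q1ᵀ Q2ᵀ … H … Q2 Q1. This is only an illustrative sketch under simplifying assumptions (single 2x2 rotations, greedy pivot choice), not the actual algorithm or library API presented in the talk; all function names here are hypothetical.

```python
import numpy as np

def givens(n, i, j, theta):
    """n x n Givens rotation acting on coordinates (i, j)."""
    Q = np.eye(n)
    c, s = np.cos(theta), np.sin(theta)
    Q[i, i] = c; Q[j, j] = c
    Q[i, j] = -s; Q[j, i] = s
    return Q

def toy_mmf(A, levels):
    """Toy multiresolution-style factorization (illustrative only):
    at each level, greedily pick the largest off-diagonal entry among
    the active coordinates, rotate it to zero with a Jacobi rotation,
    and retire one of the two coordinates as a 'wavelet'."""
    n = A.shape[0]
    active = list(range(n))
    H = A.copy()
    Qs = []
    for _ in range(levels):
        # Find the largest off-diagonal entry among active coordinates.
        best, bi, bj = -1.0, None, None
        for a in range(len(active)):
            for b in range(a + 1, len(active)):
                i, j = active[a], active[b]
                if abs(H[i, j]) > best:
                    best, bi, bj = abs(H[i, j]), i, j
        # Jacobi angle that zeroes H[bi, bj] after conjugation.
        theta = 0.5 * np.arctan2(2 * H[bi, bj], H[bj, bj] - H[bi, bi])
        Q = givens(n, bi, bj, theta)
        H = Q @ H @ Q.T
        Qs.append(Q)
        active.remove(bi)  # retire bi as a wavelet coordinate
    return Qs, H
```

Because every Q is orthogonal, the factorization is exact: conjugating H back by the rotations in reverse order recovers A. Compression would then come from truncating small wavelet entries of H, which is where the multiresolution structure of the matrix pays off.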