ARC Colloquium: David Woodruff, IBM Almaden Research Center, San Jose, CA.

Title: Low Rank Approximation and Regression in Input Sparsity Time


We improve the running times of algorithms for least squares regression and low-rank approximation to account for the sparsity of the input matrix. Namely, if nnz(A) denotes the number of non-zero entries of an input matrix A:

  • we show how to solve approximate least squares regression given an n x d matrix A in nnz(A) + poly(d log n) time
  • we show how to find an approximate best rank-k approximation of an n x n matrix in nnz(A) + n*poly(k log n) time

All approximations are relative error. Previous algorithms based on fast Johnson-Lindenstrauss transforms took at least nd log d or nnz(A)*k time. We have implemented our algorithms, and preliminary results suggest the algorithms are competitive in practice.
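The regression result can be illustrated with a sparse subspace embedding of the CountSketch type: a sketching matrix S with a single random ±1 entry per column, so that computing S·A touches each nonzero of A once. The snippet below is a minimal sketch of this idea, not the authors' implementation; the sketch size `t` and the helper name `countsketch_solve` are illustrative assumptions.

```python
# Illustrative CountSketch-style least squares sketch (not the paper's code).
import numpy as np

def countsketch_solve(A, b, t, seed=None):
    """Approximate argmin_x ||Ax - b|| by compressing the n x d problem
    to a t x d problem with a CountSketch matrix S, then solving the
    small problem. Forming S@A takes one pass over the rows of A, so the
    cost is proportional to nnz(A) plus a poly(d)-size solve."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    buckets = rng.integers(0, t, size=n)        # hash each row to a sketch row
    signs = rng.choice([-1.0, 1.0], size=n)     # random sign per row
    SA = np.zeros((t, d))
    Sb = np.zeros(t)
    for i in range(n):                          # single pass over the input
        SA[buckets[i]] += signs[i] * A[i]
        Sb[buckets[i]] += signs[i] * b[i]
    x, *_ = np.linalg.lstsq(SA, Sb, rcond=None)
    return x
```

When b lies in the column space of A and the sketched matrix S·A has full column rank, the sketched solve recovers the exact solution; in general, a sketch size t polynomial in d suffices for a relative-error guarantee.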

Joint work with Ken Clarkson.


