Ph.D. Dissertation Defense - Rakshith Srinivasa

Event Details
  • Date/Time: Monday, October 26, 2020, 1:00 pm - 3:00 pm
  • Location: https://bluejeans.com/73318142
Summaries

Summary Sentence: Subspace Learning by Randomized Sketching


Title: Subspace Learning by Randomized Sketching

Committee:

Dr. Justin Romberg, ECE, Chair, Advisor

Dr. Mark Davenport, ECE

Dr. Vladimir Koltchinskii, Math

Dr. Santosh Vempala, CoC

Dr. Kiryung Lee, Ohio State

Abstract: High-dimensional data is often accompanied by inherent low dimensionality that can be leveraged to design scalable machine learning and signal processing algorithms. Developing efficient computational frameworks that take advantage of the underlying structure in the data is crucial. In this thesis, we consider a particular form of inherent low dimensionality in data: subspace models. In many applications, data is known to lie close to a low-dimensional subspace; the underlying subspace itself may or may not be known a priori. Incorporating this structure into data acquisition systems and algorithms can aid scalability.

We first consider two specific applications in array signal processing where subspace priors on the data are commonly used. For both of these applications, we develop algorithms that require a number of measurements that scales only with the dimension of the underlying subspace. In doing so, we show that arrays demand dimensionality reduction maps that can operate on individual subsets or blocks of data at a time, without access to other blocks. Inspired by such block constraints, we consider more general problems in numerical linear algebra where the data has a natural partition into blocks, as is common in applications with distributed or decentralized data. We study the problems of sketched ridge regression and sketched matrix multiplication under this constraint and give sample-optimal theoretical guarantees for block diagonal sketching matrices.

Extending the block model to low-rank matrix sensing, we then study the problem of recovering a low-rank matrix from compressed observations of each column. While each column itself is compressed to a point beyond individual recovery, we leverage the columns' joint structure to recover the matrix as a whole. To do so, we establish a new framework for designing estimators of low-rank matrices that obey the constraints imposed by different observation models.
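To give a feel for the block diagonal sketching model described above, here is a small illustrative sketch of distributed ridge regression in NumPy. The dimensions, Gaussian sketches, and regularization value below are assumptions chosen for illustration; they are not the specific constructions analyzed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy distributed ridge regression: 4 data blocks of 50 rows, 10 features.
n_blocks, rows, d = 4, 50, 10
blocks = [rng.standard_normal((rows, d)) for _ in range(n_blocks)]
x_true = rng.standard_normal(d)
ys = [A @ x_true + 0.01 * rng.standard_normal(rows) for A in blocks]

# Block diagonal sketch: block i is compressed by its own Gaussian map S_i,
# applied locally, without access to any other block.
m = 25  # sketch size per block
SA, Sy = [], []
for A, y in zip(blocks, ys):
    S = rng.standard_normal((m, rows)) / np.sqrt(m)
    SA.append(S @ A)
    Sy.append(S @ y)

# Solve ridge regression on the stacked sketched data.
lam = 1e-3
SA_all, Sy_all = np.vstack(SA), np.concatenate(Sy)
x_hat = np.linalg.solve(SA_all.T @ SA_all + lam * np.eye(d), SA_all.T @ Sy_all)

rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
```

Because each `S_i` touches only its own block, the overall sketch is block diagonal, matching the distributed-data constraint: no node ever needs another node's raw data.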
Finally, we extend our framework for designing low-rank matrix estimators to the application of blind deconvolution. We provide a novel estimator that enjoys uniform recovery guarantees over the entire signal class while being sample-optimal.
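The column-wise compressed sensing setup can also be illustrated with a toy experiment: each column of a rank-r matrix is observed through its own short Gaussian map (too short to recover that column alone), and the matrix is recovered jointly. The recovery routine below is a generic alternating least squares with a simple spectral initialization, shown only as an assumed baseline; it is not the estimator framework developed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)

# Ground-truth rank-r matrix X = U V^T with n rows and J columns.
n, J, r, m = 30, 40, 2, 10   # m < n: no column is recoverable on its own
U_true = rng.standard_normal((n, r))
V_true = rng.standard_normal((J, r))
X = U_true @ V_true.T

# Column j is observed only through its own m x n Gaussian map Phi_j.
Phis = [rng.standard_normal((m, n)) / np.sqrt(m) for _ in range(J)]
ys = [Phi @ X[:, j] for j, Phi in enumerate(Phis)]

# Spectral initialization: E[Phi^T Phi] = I, so Phi_j^T y_j is an unbiased
# (but noisy) estimate of column j; take the top-r left singular vectors.
X0 = np.column_stack([Phi.T @ y for Phi, y in zip(Phis, ys)])
U = np.linalg.svd(X0, full_matrices=False)[0][:, :r]

# Alternating least squares over the two factors.
for _ in range(60):
    # Fix U: each column's coefficients solve a small m x r least squares.
    V = np.array([np.linalg.lstsq(Phi @ U, y, rcond=None)[0]
                  for Phi, y in zip(Phis, ys)])
    # Fix V: y_j = Phi_j U v_j = (Phi_j kron v_j^T) vec(U); stack and solve.
    M = np.vstack([np.kron(Phi, v[None, :]) for Phi, v in zip(Phis, V)])
    U = np.linalg.lstsq(M, np.concatenate(ys), rcond=None)[0].reshape(n, r)

V = np.array([np.linalg.lstsq(Phi @ U, y, rcond=None)[0]
              for Phi, y in zip(Phis, ys)])
X_hat = U @ V.T
rel_err = np.linalg.norm(X_hat - X) / np.linalg.norm(X)
```

Each column contributes only m = 10 measurements against n = 30 unknowns, yet the shared rank-2 column space ties the columns together, which is what makes joint recovery possible.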

Additional Information

In Campus Calendar
No
Groups

ECE Ph.D. Dissertation Defenses

Invited Audience
Public
Categories
Other/Miscellaneous
Keywords
PhD Defense, graduate students
Status
  • Created By: Daniela Staiculescu
  • Workflow Status: Published
  • Created On: Oct 14, 2020 - 6:24pm
  • Last Updated: Oct 14, 2020 - 6:24pm