PhD Defense by Uthaipon (Tao) Tantipongpipat

Event Details
  • Date/Time:
    • Thursday April 30, 2020
      10:30 am - 12:30 pm
  • Location: REMOTE
  • URL: BlueJeans
Summaries

Summary Sentence: Fair and Diverse Data Representation in Machine Learning


Final doctoral examination and defense of Uthaipon (Tao) Tantipongpipat

 

Thursday, April 30, 2020 - 10:30am (EDT)
Link -- https://bluejeans.com/656592653
Title:  Fair and Diverse Data Representation in Machine Learning

Advisor: Dr. Mohit Singh, ISyE, Georgia Institute of Technology

Committee:

Dr. Rachel Cummings, ISyE, Georgia Institute of Technology

Dr. Aleksandar Nikolov, Computer Science, University of Toronto

Dr. Sebastian Pokutta, Institute of Mathematics, Technical University of Berlin

Dr. Santosh Vempala, College of Computing, Georgia Institute of Technology

Reader: Dr. Santosh Vempala, College of Computing, Georgia Institute of Technology

Summary: This thesis contains two major lines of research: subset selection, and multi-criteria dimensionality reduction with an application to fairness. Subset selection applies to the classical problem of optimal design in statistics, as well as to many machine learning settings where learning is subject to a labelling budget. The thesis also extends Principal Component Analysis (PCA), arguably the most commonly used dimensionality reduction technique, to satisfy a fairness criterion of choice. We model the additional fairness constraint as multi-criteria dimensionality reduction, in which multiple objectives must be optimized simultaneously.

Our first contribution is to show that certain optimality criteria for optimal design can be approximated by novel polynomial-time sampling algorithms, improving upon the best approximation ratios previously known in the literature. We also show that the A-optimal design problem is NP-hard to approximate within a fixed constant when k = d.
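For concreteness, the A-optimality criterion referred to above can be written down directly: choose k of the n experiment rows to minimize the trace of the inverse information matrix. A minimal sketch of the objective (illustrative only, not the thesis code):

```python
import numpy as np

def a_optimal_value(X, S):
    """A-optimality objective: trace((X_S^T X_S)^{-1}).

    X is the n x d design matrix; S indexes the k chosen rows
    (k >= d so the information matrix can be invertible).
    Smaller is better: the value equals the total variance of the
    least-squares estimator under the chosen experiments.
    """
    XS = X[list(S)]
    M = XS.T @ XS
    return np.trace(np.linalg.inv(M))
```

Adding an experiment can only increase the information matrix in the positive-semidefinite order, so the objective never gets worse as S grows; the difficulty lies entirely in choosing the best k-subset.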

One of the most common heuristics used in practice to solve A- and D-optimal design problems is local search, also known as Fedorov's exchange method, owing to its simplicity and strong empirical performance. Despite its wide usage, however, no theoretical bound had been proven for this algorithm. We bridge this gap and prove approximation guarantees for the local search algorithms for A- and D-optimal design.
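The exchange idea is simple: maintain a feasible set of k experiments and repeatedly swap one chosen row for an unchosen row whenever the swap improves the objective. A minimal sketch for the D-optimal objective (maximize log det of the information matrix); function and parameter names are hypothetical, and this is not the thesis implementation:

```python
import numpy as np

def d_optimal_local_search(X, k, max_iters=1000):
    """Fedorov-style exchange heuristic for D-optimal design.

    Selects k rows of the n x d matrix X (k >= d) attempting to
    maximize log det(X_S^T X_S). Returns the selected indices and
    the achieved log-determinant of a local optimum.
    """
    n, d = X.shape
    rng = np.random.default_rng(0)
    S = set(rng.choice(n, size=k, replace=False).tolist())

    def logdet(idx):
        XS = X[sorted(idx)]
        sign, val = np.linalg.slogdet(XS.T @ XS)
        return val if sign > 0 else -np.inf

    best = logdet(S)
    for _ in range(max_iters):
        improved = False
        for i in list(S):            # try removing each chosen row
            for j in range(n):       # ...and inserting an unchosen one
                if j in S:
                    continue
                cand = (S - {i}) | {j}
                val = logdet(cand)
                if val > best + 1e-9:
                    S, best = cand, val
                    improved = True
                    break
            if improved:
                break
        if not improved:             # local optimum: no swap helps
            break
    return sorted(S), best
```

Each iteration examines at most k(n - k) swaps, so the method is cheap per step; the thesis's contribution is proving how good the local optimum it stops at must be.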

Our model of multi-criteria dimensionality reduction captures several fairness criteria for dimensionality reduction, such as the Fair-PCA problem introduced by Samadi et al. in 2018 and the Nash Social Welfare (NSW) problem. In Fair-PCA, the input data is divided into k groups, and the goal is to find a single d-dimensional representation for all groups that minimizes the maximum reconstruction error of any one group. In NSW, the goal is to maximize the product of the individual variances that the groups achieve in the common low-dimensional space. We develop algorithms for multi-criteria dimensionality reduction, prove theoretical performance guarantees, and give fast implementations in practice.
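To see why standard PCA can be unfair, it helps to write out the Fair-PCA objective: project every group onto one shared d-dimensional subspace and measure the worst group's reconstruction error. A minimal sketch, assuming Frobenius-norm reconstruction error (helper names are hypothetical; this is not the thesis code):

```python
import numpy as np

def fair_pca_objective(groups, V):
    """Max per-group reconstruction error when projecting onto span(V).

    groups: list of (n_i x m) data matrices, one per group.
    V:      m x d matrix with orthonormal columns (the shared subspace).
    """
    errs = []
    for A in groups:
        proj = A @ V @ V.T  # best reconstruction of A within span(V)
        errs.append(np.linalg.norm(A - proj, "fro") ** 2)
    return max(errs)

def vanilla_pca_basis(groups, d):
    """Standard PCA on the pooled data: top-d right singular vectors."""
    A = np.vstack(groups)
    _, _, Vt = np.linalg.svd(A, full_matrices=False)
    return Vt[:d].T
```

Standard PCA minimizes the total pooled error, so a small or atypical group can be left with a much larger error than the rest; Fair-PCA instead optimizes the max (and NSW the product of captured variances) over groups.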

 

Additional Information

In Campus Calendar
No
Groups

Graduate Studies

Invited Audience
Faculty/Staff, Public, Graduate students, Undergraduate students
Categories
Other/Miscellaneous
Keywords
Phd Defense
Status
  • Created By: Tatianna Richardson
  • Workflow Status: Published
  • Created On: Apr 29, 2020 - 12:42pm
  • Last Updated: Apr 29, 2020 - 12:42pm