Ph.D. Proposal Oral Exam - Yeo Joon Youn

Title: Theoretical Analysis of Communication Efficiency, Statistical Heterogeneity, and Privacy in Federated Optimization

Committee:

Dr. Abernethy, Advisor
Dr. Muthukumar, Co-Advisor
Dr. Romberg, Chair
Dr. Tumanov

Abstract: The objective of the proposed research is to design federated optimization algorithms that are communication-efficient, robust to statistical heterogeneity, and privacy-preserving. Federated optimization is a recent form of distributed training on very large datasets that leverages many devices, each holding local data. In Federated Learning (FL), clients collaboratively learn a global objective function by communicating with a central server, without sharing any locally stored data. Communication efficiency is of primary interest in FL when many edge computing devices must train over limited network bandwidth. Furthermore, FL must handle heterogeneous local data distributions (statistical heterogeneity), a more practical but also more challenging setting than the traditional distributed optimization framework, which assumes identically distributed data across the network. In addition, privacy-sensitive data on local devices necessitates privacy-preserving training in FL. We therefore aim to build federated optimization algorithms with theoretical guarantees that address each of these issues. The preliminary research includes a Federated optimization algorithm with Acceleration and Quantization (FedAQ), which addresses the severe communication bottleneck in FL systems. Its improved theoretical guarantees are achieved by combining an accelerated variant of federated averaging, which reduces the number of training and synchronization steps, with an efficient quantization scheme that significantly reduces communication complexity. Moreover, we propose a generalized form of the global objective function in FL that makes federated optimization algorithms robust in heterogeneous local data distribution settings. Finally, we consider a new quantization scheme with an inherent differential privacy guarantee; this scheme requires no additional noise and enables the federated optimization algorithm to improve the utility-privacy trade-off and communication efficiency at the same time.
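The sketch below illustrates, in simplified form, the two ingredients the abstract pairs in FedAQ: local training on each client followed by unbiased stochastic quantization of the updates sent to the server. It is a minimal toy example in Python/NumPy, not the proposed FedAQ algorithm itself; in particular it omits the acceleration component, and the function names, the QSGD-style quantizer, and the least-squares local objective are illustrative assumptions rather than details taken from the proposal.

```python
import numpy as np

def stochastic_quantize(v, levels=8):
    """Unbiased stochastic quantization (QSGD-style sketch, an assumption).

    Each coordinate is rounded to one of `levels` evenly spaced points in
    [0, ||v||], randomly up or down so that the result is unbiased.
    """
    norm = np.linalg.norm(v)
    if norm == 0:
        return v
    scaled = np.abs(v) / norm * levels      # each entry now lies in [0, levels]
    lower = np.floor(scaled)
    prob_up = scaled - lower                # rounding up w.p. the fractional part
    quantized = lower + (np.random.rand(*v.shape) < prob_up)
    return np.sign(v) * quantized * norm / levels

def local_sgd(w, data, lr=0.1, steps=5):
    """A few local gradient steps on a client's least-squares objective."""
    X, y = data
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def federated_round(w_global, client_data):
    """One communication round: clients train locally, send quantized
    updates, and the server averages them into a new global model."""
    updates = []
    for data in client_data:
        w_local = local_sgd(w_global.copy(), data)
        updates.append(stochastic_quantize(w_local - w_global))
    return w_global + np.mean(updates, axis=0)

# Toy setup: 4 clients with heterogeneous (mean-shifted) local data.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0, 0.5])
clients = []
for i in range(4):
    X = rng.normal(loc=i * 0.5, size=(50, 3))   # shifted features = heterogeneity
    clients.append((X, X @ w_true + 0.01 * rng.normal(size=50)))

w = np.zeros(3)
for _ in range(30):
    w = federated_round(w, clients)
print(w)  # approaches w_true despite coarsely quantized communication
```

Even in this toy setting, the averaged model approaches the true parameters while each client transmits only low-precision updates; this is the communication-efficiency effect that the proposal analyzes with formal convergence guarantees.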
