ISL Colloquium

Optimizing the Cost of Distributed Learning

Carlee Joe-Wong
Assistant Professor, Carnegie Mellon University
Thursday, October 1, 2020 at 4:30 PM • Online (Zoom)

Abstract

Federated and distributed learning enable training machine learning models across multiple agents while preserving data privacy. Practical systems, however, must grapple with non-identically distributed data and heterogeneous resources across agents.

In this talk, I will discuss our recent work on accounting for these heterogeneous costs and resources in two distributed learning settings. I will first consider a multi-teacher distillation setting, where multiple teachers independently train models on their own data and send parameter updates to a student, which learns a combination of these models. In this setting, I will describe our policy for selecting which teachers the student should learn from at each round to minimize the student’s learning cost. I will then discuss our federated learning work, where multiple agents collaboratively train a machine learning model by sharing model updates (but not raw data) with a parameter server. Here, I will describe our cost-aware client selection method, which enables heterogeneous agents to determine how much they should participate in the federated learning process based on their local costs.
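To make the federated setting concrete, the following is a minimal toy sketch, not the method presented in the talk: it assumes a hypothetical rule in which each client’s participation probability is inversely proportional to its local per-round cost, and uses a FedAvg-style server average over a synthetic quadratic objective. All names, the cost model, and the participation rule are illustrative assumptions.

```python
# Toy sketch (not the speaker's method) of cost-aware participation in
# FedAvg-style federated learning: each client decides how often to
# participate based on a hypothetical per-round cost, and the server
# averages whatever updates it receives. Everything below is illustrative.
import numpy as np

rng = np.random.default_rng(0)

DIM = 10          # model dimension
N_CLIENTS = 20
ROUNDS = 50
LR = 0.1

# Synthetic non-identically distributed data: each client's local optimum
# differs, standing in for heterogeneous local datasets.
client_targets = rng.normal(0.0, 1.0, size=(N_CLIENTS, DIM))

# Hypothetical per-round costs (e.g., energy or communication). Higher cost
# -> lower participation probability, floored so every client stays involved.
costs = rng.uniform(0.5, 2.0, size=N_CLIENTS)
participation = np.clip(1.0 / costs, 0.1, 1.0)

global_model = np.zeros(DIM)

for t in range(ROUNDS):
    # Each client independently decides whether to participate this round.
    active = rng.random(N_CLIENTS) < participation
    if not active.any():
        continue
    updates = []
    for i in np.flatnonzero(active):
        # One local gradient step on the quadratic loss ||w - target_i||^2,
        # whose gradient is 2 * (w - target_i).
        local = global_model - LR * 2.0 * (global_model - client_targets[i])
        updates.append(local - global_model)
    # FedAvg-style aggregation: average the received model updates.
    global_model += np.mean(updates, axis=0)

print("distance to mean of client optima:",
      np.linalg.norm(global_model - client_targets.mean(axis=0)))
```

Note that under this naive rule, low-cost clients participate more often and therefore bias the averaged model toward their data; deciding how much each heterogeneous agent should participate, given its local costs, is exactly the question the cost-aware selection method in the talk addresses.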

Bio

Carlee Joe-Wong is an Assistant Professor in the Electrical and Computer Engineering Department at Carnegie Mellon University. She received the Ph.D. degree in Electrical Engineering from Princeton University and the B.S. degree in Electrical Engineering from Stanford University.

Her research interests are in resource allocation and optimization for networks, with applications to wireless networks, smart grids, and cloud computing systems. She received the NSF CAREER Award in 2021 and the Google Faculty Research Award in 2019. Her research on the economics of mobile data plans received the Applied Networking Research Prize in 2016.