TY - GEN
T1 - Privacy-Preserving Federated Multi-Task Linear Regression
T2 - 47th IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2022
AU - Lee, Harlin
AU - Bertozzi, Andrea L.
AU - Kovačević, Jelena
AU - Chi, Yuejie
N1 - Funding Information:
This work was supported in part by the grants NSF CCF-2007911, ECCS-1818571 and DMS-1952339, ARO W911NF-18-1-0303, and ONR N00014-19-1-2404. Part of this work was completed when the first author was a graduate student at Carnegie Mellon University [1].
Publisher Copyright:
© 2022 IEEE
PY - 2022
Y1 - 2022
N2 - We investigate multi-task learning (MTL), where multiple learning tasks are performed jointly rather than separately to leverage their similarities and improve performance. We focus on the federated multi-task linear regression setting, where each machine possesses its own data for individual tasks and sharing the full local data between machines is prohibited. Motivated by graph regularization, we propose a novel fusion framework that only requires a one-shot communication of local estimates. Our method linearly combines the local estimates to produce an improved estimate for each task, and we show that the ideal mixing weight for fusion is a function of task similarity and task difficulty. A practical algorithm is developed and shown to significantly reduce mean squared error (MSE) on synthetic data, as well as improve performance on an income prediction task where the real-world data is disaggregated by race.
AB - We investigate multi-task learning (MTL), where multiple learning tasks are performed jointly rather than separately to leverage their similarities and improve performance. We focus on the federated multi-task linear regression setting, where each machine possesses its own data for individual tasks and sharing the full local data between machines is prohibited. Motivated by graph regularization, we propose a novel fusion framework that only requires a one-shot communication of local estimates. Our method linearly combines the local estimates to produce an improved estimate for each task, and we show that the ideal mixing weight for fusion is a function of task similarity and task difficulty. A practical algorithm is developed and shown to significantly reduce mean squared error (MSE) on synthetic data, as well as improve performance on an income prediction task where the real-world data is disaggregated by race.
KW - federated learning
KW - graph regularization
KW - linear regression
KW - multi-task learning
UR - http://www.scopus.com/inward/record.url?scp=85131231260&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85131231260&partnerID=8YFLogxK
U2 - 10.1109/ICASSP43922.2022.9746007
DO - 10.1109/ICASSP43922.2022.9746007
M3 - Conference contribution
AN - SCOPUS:85131231260
T3 - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
SP - 5947
EP - 5951
BT - 2022 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2022 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 23 May 2022 through 27 May 2022
ER -