TY - GEN
T1 - Gradient coding
T2 - 34th International Conference on Machine Learning, ICML 2017
AU - Tandon, Rashish
AU - Lei, Qi
AU  - Dimakis, Alexandros G.
AU - Karampatziakis, Nikos
N1 - Publisher Copyright:
Copyright © 2017 by the authors.
PY - 2017
Y1 - 2017
N2 - We propose a novel coding theoretic framework for mitigating stragglers in distributed learning. We show how carefully replicating data blocks and coding across gradients can provide tolerance to failures and stragglers for synchronous Gradient Descent. We implement our schemes in python (using MPI) to run on Amazon EC2, and show how we compare against baseline approaches in running time and generalization error.
AB - We propose a novel coding theoretic framework for mitigating stragglers in distributed learning. We show how carefully replicating data blocks and coding across gradients can provide tolerance to failures and stragglers for synchronous Gradient Descent. We implement our schemes in python (using MPI) to run on Amazon EC2, and show how we compare against baseline approaches in running time and generalization error.
UR - http://www.scopus.com/inward/record.url?scp=85048470484&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85048470484&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85048470484
T3 - 34th International Conference on Machine Learning, ICML 2017
SP - 5166
EP - 5178
BT - 34th International Conference on Machine Learning, ICML 2017
PB - International Machine Learning Society (IMLS)
Y2 - 6 August 2017 through 11 August 2017
ER -