Quasi-Global Momentum

Improved decentralized training with data heterogeneity

Decentralized training of deep learning models is a key enabler of data privacy and on-device learning over networks. In realistic learning scenarios, heterogeneity across clients' local datasets poses an optimization challenge and can severely degrade generalization performance. We propose a novel momentum-based method to mitigate this decentralized training difficulty.
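The idea can be illustrated with a toy simulation. The sketch below is a hypothetical rendering of the approach, not the paper's exact pseudocode: each worker takes a local step biased by a momentum buffer, gossip-averages with its neighbors, and then updates the buffer from the smoothed model change over the whole round, which approximates a global descent direction despite heterogeneous local data. The function names, update order, and toy quadratic losses are assumptions for illustration.

```python
import numpy as np

def qgm_round(x, m, targets, W, lr=0.1, mu=0.9):
    """One communication round of decentralized SGD with a quasi-global
    momentum buffer (illustrative sketch; details may differ from the paper).

    x: (n, d) worker models, m: (n, d) momentum buffers,
    targets: (n, d) local optima of toy losses f_i(x) = 0.5 * ||x - c_i||^2,
    W: (n, n) doubly stochastic gossip (mixing) matrix.
    """
    grads = x - targets                 # heterogeneous local gradients
    x_half = x - lr * (grads + mu * m)  # local step biased by the momentum buffer
    x_new = W @ x_half                  # gossip averaging with neighbors
    # Update momentum from the observed model change over the full round:
    # this tracks a "quasi-global" direction rather than a purely local one.
    m = mu * m + (1 - mu) * (x - x_new) / lr
    return x_new, m

# Toy run: 4 workers on a ring, each with a different local optimum.
n, d = 4, 2
rng = np.random.default_rng(0)
targets = rng.normal(size=(n, d))
W = np.array([[0.5, 0.25, 0.0, 0.25],
              [0.25, 0.5, 0.25, 0.0],
              [0.0, 0.25, 0.5, 0.25],
              [0.25, 0.0, 0.25, 0.5]])  # ring gossip matrix
x = np.zeros((n, d))
m = np.zeros((n, d))
for _ in range(200):
    x, m = qgm_round(x, m, targets, W)
```

Because the mixing matrix is doubly stochastic, the average model drifts toward the minimizer of the average loss (here, the mean of the workers' targets), while the momentum buffer damps the pull of each worker's heterogeneous local gradient.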

Decentralized · Deep Neural Networks · Distributed Learning
Key facts
  • Maturity: Technical, Research papers
  • Support C4DT: Inactive
  • Lab: Unknown

Machine Learning and Optimization Laboratory


Prof. Martin Jaggi

The Machine Learning and Optimization Laboratory is interested in machine learning, optimization algorithms and text understanding, as well as several application domains.

This page was last edited on 2024-04-09.