AggregaThor

Framework over TensorFlow implementing robust stochastic gradient descent

AggregaThor is a distributed SGD framework built on TensorFlow. It integrates pluggable Byzantine-resilient gradient aggregation rules (e.g., Krum, Bulyan, coordinate-wise median) and supports asynchronous parameter-server topologies. It tolerates both Byzantine worker failures and unreliable network links, and its modular design lets aggregation rules and communication backends be swapped independently.
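To illustrate what such aggregation rules compute, here is a minimal NumPy sketch of two of the rules mentioned above, Krum and coordinate-wise median. This is an illustrative reconstruction from the published rule definitions, not AggregaThor's actual TensorFlow implementation; the function names and the plain-array interface are assumptions.

```python
import numpy as np

def krum(gradients, f):
    """Select one gradient via the Krum rule.

    gradients: list of 1-D numpy arrays, one per worker (n of them).
    f: assumed upper bound on the number of Byzantine workers.
    Krum requires n >= 2*f + 3.
    """
    n = len(gradients)
    assert n >= 2 * f + 3, "Krum needs n >= 2f + 3 workers"
    G = np.stack(gradients)  # shape (n, d)
    # Pairwise squared Euclidean distances between worker gradients.
    dists = np.sum((G[:, None, :] - G[None, :, :]) ** 2, axis=-1)
    scores = np.empty(n)
    for i in range(n):
        # Score = sum of squared distances to the n - f - 2 closest
        # other gradients; outliers far from the honest cluster score high.
        others = np.delete(dists[i], i)
        scores[i] = np.sum(np.sort(others)[: n - f - 2])
    # Return the gradient with the lowest score.
    return G[np.argmin(scores)]

def coordinate_wise_median(gradients):
    """Aggregate by taking the median of each coordinate independently."""
    return np.median(np.stack(gradients), axis=0)

# Example: four honest workers near (1, 1) and one Byzantine outlier.
grads = [np.array([1.0, 1.0]), np.array([1.1, 0.9]),
         np.array([0.9, 1.1]), np.array([1.0, 1.05]),
         np.array([100.0, -100.0])]
print(krum(grads, f=1))                 # a gradient from the honest cluster
print(coordinate_wise_median(grads))    # outlier has no effect per coordinate
```

With plain averaging, the single outlier would drag the aggregate far from the honest cluster; both rules above discard or neutralize it instead.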

Tags: Byzantine Resilience · Distributed Learning · TensorFlow
Maturity
Support: C4DT: Inactive; Lab: Unknown

Distributed Computing Lab

Prof. Rachid Guerraoui

The Distributed Computing Lab currently focuses on scalable implementations of cryptocurrencies, Byzantine fault tolerance and privacy in distributed machine learning, and distributed algorithms that exploit RDMA and NVRAM.

This page was last edited on 2024-03-22.