Federated Averaging (FedAvg) is a distributed optimization algorithm in which a central server coordinates the training of a shared global model across a federation of client devices. Each client computes a local model update using its private data and sends only the updated parameters—not the raw data—to the server. The server then averages these updates, typically weighting each client by the size of its local dataset, to produce a new global model, which is redistributed to clients for the next round. Because raw data never leaves the devices, this iterative process offers a baseline of privacy by design, and it is the core protocol of federated learning.
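The round structure described above can be sketched as follows. This is a minimal illustration using NumPy and a toy linear-regression objective; the function names (`fedavg_round`, `local_update`), the learning rate, and the data setup are all illustrative choices, not part of any standard API.

```python
import numpy as np

def local_update(global_params, data, lr=0.1, epochs=5):
    # One client's local training: a few steps of gradient descent
    # on a simple mean-squared-error objective (illustrative stand-in
    # for whatever model the federation actually trains).
    w = global_params.copy()
    X, y = data
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # MSE gradient
        w -= lr * grad
    return w  # only parameters leave the client, never (X, y)

def fedavg_round(global_params, client_datasets):
    # One FedAvg round: every client trains locally, then the server
    # averages the returned parameters, weighted by local sample count.
    updates, weights = [], []
    for data in client_datasets:
        updates.append(local_update(global_params, data))
        weights.append(len(data[1]))
    weights = np.array(weights, dtype=float)
    weights /= weights.sum()
    return sum(p * u for p, u in zip(weights, updates))

# Simulated federation: three clients holding different amounts of data
# drawn from the same underlying linear model (hypothetical setup).
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for n in (50, 80, 120):
    X = rng.normal(size=(n, 2))
    y = X @ true_w + 0.01 * rng.normal(size=n)
    clients.append((X, y))

w = np.zeros(2)
for _ in range(20):  # federated rounds
    w = fedavg_round(w, clients)
print(w)  # should approach true_w
```

In this sketch the server never touches `X` or `y`; it only sees parameter vectors, which mirrors the communication pattern the paragraph describes. Real deployments subsample clients per round and add compression or secure aggregation on top of this basic loop.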
