QAFeL is a novel asynchronous federated learning framework that combines buffered aggregation with bidirectional quantized communications, achieving up to 8× lower communication costs while preserving convergence speed and accuracy.
Federated Learning is a distributed machine learning paradigm that enables training models on decentralized data without centralized storage or sharing of raw data. The invention introduces Quantized Asynchronous Federated Learning (QAFeL), an extension of FedBuff that integrates a quantization scheme and a hidden state shared between server and clients. This approach enables highly efficient client–server interactions, reducing bandwidth requirements while maintaining strong theoretical convergence guarantees. Extensive experiments on benchmarks validate its scalability and robustness under practical federated learning conditions.
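The mechanism described above can be sketched in code: the server buffers asynchronous client updates (as in FedBuff), and both directions of communication send only quantized differences against a hidden state that server and clients keep in sync. The class names, the uniform quantizer, and all hyperparameters below are illustrative assumptions for exposition, not the patented implementation.

```python
import numpy as np

def quantize(x, num_levels=16):
    """Simple uniform quantizer (a stand-in for any compression scheme)."""
    scale = np.max(np.abs(x)) + 1e-12
    return np.round(x / scale * (num_levels - 1)) / (num_levels - 1) * scale

class QAFeLServer:
    def __init__(self, dim, buffer_size=2, lr=1.0):
        self.model = np.zeros(dim)    # server model
        self.hidden = np.zeros(dim)   # hidden state shared with clients
        self.buffer = []              # buffered asynchronous client updates
        self.buffer_size = buffer_size
        self.lr = lr

    def broadcast(self):
        # Downlink: send only the quantized gap between model and hidden
        # state; both sides advance the hidden state by the same amount.
        delta = quantize(self.model - self.hidden)
        self.hidden += delta
        return delta

    def receive(self, q_update):
        # Uplink updates arrive asynchronously; aggregate once the
        # buffer is full (buffered aggregation, as in FedBuff).
        self.buffer.append(q_update)
        if len(self.buffer) >= self.buffer_size:
            self.model += self.lr * np.mean(self.buffer, axis=0)
            self.buffer = []

class QAFeLClient:
    def __init__(self, dim):
        self.hidden = np.zeros(dim)   # client copy of the shared hidden state

    def sync(self, delta):
        self.hidden += delta          # apply server's quantized broadcast

    def local_step(self, grad_fn, lr=0.1, steps=5):
        # Run local SGD starting from the hidden state, then send back
        # a quantized model difference (uplink compression).
        w = self.hidden.copy()
        for _ in range(steps):
            w -= lr * grad_fn(w)
        return quantize(w - self.hidden)
```

A toy run on a quadratic objective (gradient `w - target`) shows the server model converging to the optimum even though every exchanged message is quantized.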
| Country | Type | Number | Dated | Case |
| --- | --- | --- | --- | --- |
| United States of America | Published Application | 20240354589 | 10/24/2024 | 2023-764 |