Method And System For Quantized Machine Learning And Federated Learning

Tech ID: 34239 / UC Case 2023-764-0

Brief Description

QAFeL is a novel asynchronous federated learning framework that combines buffered aggregation with bidirectional quantized communications, achieving up to 8× lower communication costs while preserving convergence speed and accuracy.

Full Description

Federated learning is a distributed machine learning paradigm that trains models on decentralized data without centralizing storage or sharing raw data. The invention introduces Quantized Asynchronous Federated Learning (QAFeL), an extension of FedBuff that integrates a quantization scheme and a shared hidden state between server and clients. This approach enables highly efficient client–server interactions, reducing bandwidth requirements while maintaining strong theoretical convergence guarantees. Extensive experiments on standard benchmarks validate its scalability and robustness under practical federated learning conditions.
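The shared-hidden-state mechanism can be sketched as follows. This is an illustrative sketch only, not the patented implementation: the uniform stochastic quantizer, the 4-bit setting, and the gradient stand-in on the client side are assumptions for demonstration. The key idea shown is that server and client each hold the same hidden state, update it only through quantized messages, and therefore stay synchronized despite lossy compression.

```python
import numpy as np

def quantize(x, bits=4):
    """Uniform stochastic quantizer (illustrative; QAFeL's actual scheme
    may differ). Maps each coordinate of x to one of 2**bits levels."""
    levels = 2 ** bits - 1
    scale = np.max(np.abs(x)) + 1e-12
    normalized = (x / scale + 1) / 2 * levels        # map to [0, levels]
    floor = np.floor(normalized)
    prob = normalized - floor
    q = floor + (np.random.rand(*x.shape) < prob)    # stochastic rounding
    return (q / levels * 2 - 1) * scale              # map back to [-scale, scale]

# Shared hidden state: both server and clients maintain the same model
# estimate, updated only through quantized messages, so their views never
# drift apart despite compression in both directions.
hidden_state = np.zeros(10)
model = np.random.randn(10)

for step in range(100):
    # Server -> clients: quantize only the difference from the hidden state.
    delta = quantize(model - hidden_state)
    hidden_state = hidden_state + delta              # applied on both sides
    # Client -> server: quantized update computed on the client's view
    # (a plain contraction here stands in for a local gradient step).
    client_update = quantize(-0.1 * hidden_state)
    model = model + client_update
```

Quantizing the *difference* from the hidden state, rather than the full model, is what keeps the per-round quantization error small as the two sides converge.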

Suggested uses

  • Edge AI & IoT: Efficient on-device learning with constrained bandwidth. 
  • Healthcare & Finance: Privacy-preserving federated training with reduced communication overhead. 
  • Telecommunications & Mobile AI: Large-scale deployment of FL in 5G/6G networks and mobile devices.

Advantages

  • Up to 8× communication reduction with no loss in convergence speed. 
  • Error control via hidden state to mitigate quantization and staleness effects. 
  • Proven theoretical guarantees with experimental validation.
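The up-to-8× figure above is consistent with reducing message precision by a factor of eight, for example compressing 32-bit floating-point coordinates to 4 bits each. The 4-bit setting is an assumed illustration, not necessarily the configuration used in the publication:

```python
# Illustrative arithmetic for the communication-reduction claim:
# sending 4 bits per coordinate instead of 32 bits shrinks each
# message by a factor of 8 (ignoring small per-message metadata
# such as the quantizer's scale).
FULL_PRECISION_BITS = 32
QUANTIZED_BITS = 4       # assumed setting for illustration

ratio = FULL_PRECISION_BITS / QUANTIZED_BITS
print(ratio)  # 8.0
```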

Patent Status

Country: United States of America
Type: Published Application
Number: 20240354589
Dated: 10/24/2024
Case: 2023-764
 

Contact


5270 California Avenue / Irvine, CA 92697-7700
Tel: 949.824.2683