
Layer-Based Communication-Efficient Federated Learning with Privacy Preservation

Lian, Zhuotao; Wang, Weizheng; Huang, Huakun; Su, Chunhua*
Guangzhou University

Abstract

In recent years, federated learning has attracted increasing attention because it can collaboratively train a global model without gathering the users' raw data. However, it also brings new challenges, particularly in communication cost and privacy. In this paper, we proposed a layer-based federated learning system with privacy preservation. We reduced the communication cost by selecting only several layers of the model to upload for global averaging, and we enhanced privacy protection by applying local differential privacy. We evaluated our system on three datasets in a non-independently and identically distributed (non-IID) scenario. Compared with existing works, our solution achieved better performance in both model accuracy and training time.
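The abstract describes two mechanisms: uploading only a subset of model layers for global averaging, and perturbing those uploads with local differential privacy. The following is a minimal Python/PyTorch sketch of that idea, not the authors' implementation; the layer choice, the Laplace mechanism, and the clipping and epsilon values are illustrative assumptions.

```python
# Sketch only: client-side layer selection plus local DP noise, then server-side
# averaging of the uploaded (partial) updates. Not the paper's actual code.
import copy
import torch
import torch.nn as nn


def select_layers(state_dict, layer_names):
    """Keep only the chosen layers' parameters (reduces the upload size)."""
    return {k: v.clone() for k, v in state_dict.items()
            if any(k.startswith(name) for name in layer_names)}


def add_local_dp_noise(update, clip_norm=1.0, epsilon=1.0):
    """L2-clip each layer tensor and add Laplace noise scaled to the clip norm."""
    noisy = {}
    for name, tensor in update.items():
        t = tensor.float()
        norm = t.norm()
        if norm > clip_norm:                      # bound the sensitivity
            t = t * (clip_norm / norm)
        scale = clip_norm / epsilon               # Laplace scale b = sensitivity / epsilon
        noise = torch.distributions.Laplace(0.0, scale).sample(t.shape)
        noisy[name] = t + noise
    return noisy


def federated_average(client_updates):
    """Element-wise average of the partial state dicts uploaded by clients."""
    keys = client_updates[0].keys()
    return {k: torch.stack([u[k] for u in client_updates]).mean(dim=0) for k in keys}


if __name__ == "__main__":
    model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
    clients = [copy.deepcopy(model) for _ in range(3)]

    # Each client uploads only the last layer ("2.*") instead of the full model.
    uploads = []
    for client in clients:
        partial = select_layers(client.state_dict(), layer_names=["2"])
        uploads.append(add_local_dp_noise(partial, clip_norm=1.0, epsilon=2.0))

    # The server averages the noisy partial updates and merges them back
    # into the global model, leaving the unselected layers untouched.
    global_partial = federated_average(uploads)
    merged = model.state_dict()
    merged.update(global_partial)
    model.load_state_dict(merged)
    print({k: v.shape for k, v in global_partial.items()})
```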

Keywords

federated learning; privacy preservation; parameter selection; communication-efficient; non-IID data