
Trading Off Privacy, Utility, and Efficiency in Federated Learning

Xiaojin Zhang, Yan Kang, Kai Chen, Lixin Fan, Qiang Yang
2023 ACM Transactions on Intelligent Systems and Technology  
Federated learning (FL) enables participating parties to collaboratively build a global model with boosted utility without disclosing private data. Appropriate protection mechanisms must be adopted to fulfill the opposing requirements of preserving privacy and maintaining high model utility. In addition, a federated learning system must achieve high efficiency in order to enable large-scale model training and deployment. We propose a unified federated learning framework that reconciles horizontal and vertical federated learning. Based on this framework, we formulate and quantify the trade-offs between privacy leakage, utility loss, and efficiency reduction, which leads us to the No-Free-Lunch (NFL) theorem for the federated learning system. NFL indicates that it is unrealistic to expect an FL algorithm to simultaneously provide excellent privacy, utility, and efficiency in certain scenarios. We then analyze the lower bounds for the privacy leakage, utility loss, and efficiency reduction for several widely adopted protection mechanisms, including Randomization, Homomorphic Encryption, Secret Sharing, and Compression. Our analysis can serve as a guide for selecting protection parameters to meet particular requirements.
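To make the privacy-utility trade-off behind the Randomization mechanism concrete, the following is a minimal, self-contained sketch (not the paper's actual protocol) of Gaussian randomization applied to client updates before server-side averaging. The function name `randomize_update`, the noise scale, and the toy client updates are all illustrative assumptions; the intended point is only that a larger noise scale lowers privacy leakage at the cost of utility loss, while averaging over more clients partially recovers utility.

```python
import random
import statistics

def randomize_update(update, noise_scale, rng):
    """Gaussian randomization of one client's model update (illustrative).

    A larger noise_scale makes the true update harder to recover
    (less privacy leakage) but pushes the aggregate further from the
    noiseless optimum (more utility loss).
    """
    return [u + rng.gauss(0.0, noise_scale) for u in update]

rng = random.Random(0)

# Toy setting: 100 clients each hold the same true 4-dimensional
# update [0.5, 0.5, 0.5, 0.5] (a stand-in for a gradient vector).
clients = [[0.5] * 4 for _ in range(100)]

# Each client perturbs its update locally before sending it.
noisy = [randomize_update(u, noise_scale=0.1, rng=rng) for u in clients]

# Server-side averaging: the independent noise terms partially cancel,
# so the utility loss of the aggregate shrinks as clients are added.
aggregate = [statistics.fmean(col) for col in zip(*noisy)]
```

With 100 clients and noise scale 0.1, the per-coordinate noise in the aggregate has standard deviation 0.01, so the averaged model sits close to the true update even though no individual transmission reveals it exactly.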
doi:10.1145/3595185 fatcat:xqfja4antvgbhl7lmqkbyffolu