%0 Journal Article
%T A Systematic Survey for Differential Privacy Techniques in Federated Learning
%A Yi Zhang
%A Yunfan Lu
%A Fengxia Liu
%J Journal of Information Security
%P 111-135
%@ 2153-1242
%D 2023
%I Scientific Research Publishing
%R 10.4236/jis.2023.142008
%X Federated learning is a distributed machine learning technique that trains a global model by exchanging model parameters or intermediate results among multiple data sources. Although federated learning achieves physical isolation of data, the local data of federated learning clients remain at risk of leakage under attacks by malicious actors. For this reason, combining data-protection techniques (e.g., differential privacy) with federated learning is an effective way to further improve the data security of federated learning models. In this survey, we review recent advances in research on differentially private federated learning models. First, we introduce the workflow of federated learning and the theoretical basis of differential privacy. Then, we review three differentially private federated learning paradigms: central differential privacy, local differential privacy, and distributed differential privacy. Next, we review algorithmic optimization and communication-cost optimization for federated learning models with differential privacy. Finally, we review applications of federated learning models with differential privacy in various domains. By systematically summarizing the existing research, we identify future research opportunities.
%K Federated Learning
%K Differential Privacy
%K Privacy Computing
%U http://www.scirp.org/journal/PaperInformation.aspx?PaperID=123374