
On Defensive Neural Networks Against Inference Attack in Federated Learning

Title
On Defensive Neural Networks Against Inference Attack in Federated Learning
Author
조성현
Keywords
Federated Learning; Inference Attack; Deep Learning; Edge Computing; Differential Privacy
Issue Date
2021-06
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Citation
ICC 2021 - IEEE International Conference on Communications, pp. 1-6, Jun. 2021
Abstract
Federated Learning (FL) is a promising technique for edge computing environments, as it provides better data privacy protection. It enables each edge node in the system to send the central server a computed value, called a gradient, rather than sending raw data. However, recent research results show that FL is still vulnerable to an inference attack, an adversarial algorithm capable of identifying the data used to compute the gradient. One prevalent mitigation strategy is differential privacy, which computes the gradient from noised data, but this introduces another problem: accuracy degradation. To deal with this problem effectively, this paper proposes a new digestive neural network (DNN) and integrates it into FL. The proposed scheme distorts raw data with the DNN to make them unrecognizable, then computes a gradient with a classification network. The gradients generated by the edge nodes are sent to the server to complete a trained model. The simulation results show that the proposed scheme achieves, on average, 9.31% higher classification accuracy and 19.25% lower attack accuracy than the differentially private schemes.
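The abstract's pipeline (distort raw data locally, compute a gradient on the distorted data, aggregate gradients at the server) can be sketched as follows. This is a minimal illustrative toy, not the authors' architecture: `DigestiveNet`, `Classifier`, and `federated_round` are hypothetical names, the distortion is a fixed random nonlinear map, and the classifier is plain logistic regression.

```python
import numpy as np

rng = np.random.default_rng(0)

class DigestiveNet:
    """Hypothetical stand-in for the paper's digestive neural network:
    a fixed random nonlinear map that distorts raw inputs so the original
    samples are not directly exposed to an inference attack."""
    def __init__(self, in_dim, out_dim):
        self.W = rng.normal(size=(in_dim, out_dim))

    def distort(self, x):
        # Nonlinear distortion of the raw data; only the output leaves the node.
        return np.tanh(x @ self.W)

class Classifier:
    """Minimal logistic-regression classification network; each edge node
    computes a gradient on *distorted* data and sends only that gradient."""
    def __init__(self, dim):
        self.w = np.zeros(dim)

    def gradient(self, x, y):
        # Gradient of the binary cross-entropy loss w.r.t. the weights.
        p = 1.0 / (1.0 + np.exp(-(x @ self.w)))
        return x.T @ (p - y) / len(y)

def federated_round(server_clf, nodes, lr=0.1):
    """One FL round: each node distorts its raw data, computes a local
    gradient, and the server averages the received gradients."""
    grads = []
    for dnn, x_raw, y in nodes:
        x_dist = dnn.distort(x_raw)               # raw data never leaves the node
        grads.append(server_clf.gradient(x_dist, y))
    server_clf.w -= lr * np.mean(grads, axis=0)   # aggregated update on the server
    return server_clf.w

# Toy usage: two edge nodes, 4 raw features distorted into 3.
clf = Classifier(dim=3)
nodes = [
    (DigestiveNet(4, 3), rng.normal(size=(8, 4)), rng.integers(0, 2, 8)),
    (DigestiveNet(4, 3), rng.normal(size=(8, 4)), rng.integers(0, 2, 8)),
]
w = federated_round(clf, nodes)
```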
URI
https://ieeexplore.ieee.org/document/9500936
https://repository.hanyang.ac.kr/handle/20.500.11754/171618
ISBN
978-1-7281-7122-7
ISSN
1938-1883
DOI
10.1109/ICC42927.2021.9500936
Appears in Collections:
ETC[S] > Research Information
Files in This Item:
There are no files associated with this item.



