Adaptive differential privacy preserving based on multi‐objective optimization in deep neural networks
Author(s) - Fan Tian, Cui Zhihua
Publication year - 2021
Publication title - Concurrency and Computation: Practice and Experience
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.309
H-Index - 67
eISSN - 1532-0634
pISSN - 1532-0626
DOI - 10.1002/cpe.6367
Subject(s) - differential privacy, computer science, bottleneck, artificial neural network, noise (video), process (computing), artificial intelligence, deep learning, differential evolution, the internet, stochastic gradient descent, artificial noise, information sensitivity, optimization problem, data mining, algorithm, computer security, computer network, image (mathematics), channel (broadcasting), transmitter, world wide web, embedded system, operating system
Summary - The security of private data has become an important bottleneck for the overall development of artificial intelligence and a key challenge of the Internet era. Current research mainly relies on differential privacy to protect the private information in data; however, as the added noise increases, the accuracy of the trained model decreases. To address this problem, an adaptive differential privacy (ADP) method is constructed and applied to deep neural networks. ADP adds noise adaptively during training according to the importance of features. A differential privacy multi‐objective optimization model (DPMOM) is also built: it takes accuracy and privacy protection as the two optimization objectives and jointly optimizes the hyperparameters of the deep neural network and the noise of the differential privacy mechanism. In addition, to better solve the ADP model, a multi‐objective optimization algorithm based on differential privacy protection (DPPMOA) is designed with the NSGA‐II algorithm as its basic framework. Simulation experiments show that, under the same amount of noise, ADP achieves higher accuracy than other machine learning methods and differentially private stochastic gradient descent, and comparisons with NSGA‐II, IBEA, PESA‐II, and AGE‐II show that DPPMOA yields a better solution set.
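To make the adaptive-noise idea concrete, the following is a minimal Python sketch of importance-scaled Gaussian noise in the style of DP-SGD. It is an illustration under stated assumptions, not the paper's method: the abstract does not specify the importance measure, so normalized gradient magnitude is used as a hypothetical stand-in, and the function name and parameters (`adaptive_dp_noise`, `base_sigma`, `clip_norm`) are invented for this sketch.

```python
import numpy as np

def adaptive_dp_noise(grad, base_sigma=1.0, clip_norm=1.0, rng=None):
    # Minimal sketch of adaptive differential privacy (assumptions noted above).
    rng = np.random.default_rng() if rng is None else rng
    # Clip the gradient to bound its sensitivity, as in standard DP-SGD.
    norm = np.linalg.norm(grad)
    clipped = grad * min(1.0, clip_norm / (norm + 1e-12))
    # Hypothetical importance measure: per-component magnitude, normalized to [0, 1].
    importance = np.abs(clipped) / (np.abs(clipped).max() + 1e-12)
    # Important components receive less noise, unimportant ones more, so model
    # accuracy degrades less for a comparable total amount of injected noise.
    sigma = base_sigma * (2.0 - importance)  # per-component scale in [base_sigma, 2*base_sigma]
    return clipped + rng.normal(0.0, sigma, size=clipped.shape)
```

In a training loop, such a function would replace the fixed-scale noise step of differentially private stochastic gradient descent, applied to each clipped per-batch gradient before the parameter update.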
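The abstract also casts accuracy and privacy protection as competing objectives solved with NSGA‐II as the base framework. The sketch below shows what such a two-objective formulation might look like using the off-the-shelf pymoo implementation of NSGA-II; the closed-form objective surrogates (more noise lowers accuracy, privacy loss shrinks as the noise multiplier grows) are hypothetical stand-ins for an actual train-and-evaluate loop, and DPPMOA itself is not reproduced here.

```python
import numpy as np
from pymoo.core.problem import ElementwiseProblem
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize

class DPTradeoff(ElementwiseProblem):
    # Toy two-objective problem: minimize (accuracy loss, privacy loss).
    # Decision variables: x[0] = noise multiplier sigma, x[1] = learning rate.
    def __init__(self):
        super().__init__(n_var=2, n_obj=2,
                         xl=np.array([0.1, 1e-4]),
                         xu=np.array([4.0, 1e-1]))

    def _evaluate(self, x, out, **kwargs):
        sigma, lr = x
        # Surrogate objectives standing in for real training runs.
        acc_loss = 0.1 + 0.2 * sigma + 5.0 * abs(lr - 0.01)
        privacy_loss = 1.0 / sigma  # epsilon-like: shrinks as noise grows
        out["F"] = [acc_loss, privacy_loss]

res = minimize(DPTradeoff(), NSGA2(pop_size=40), ("n_gen", 50),
               seed=1, verbose=False)
print(res.F)  # Pareto front over (accuracy loss, privacy loss)
```

The result is a Pareto front of noise/hyperparameter settings rather than a single solution, which is the trade-off structure the paper's comparisons against NSGA‐II, IBEA, PESA‐II, and AGE‐II evaluate.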
