Open Access
On the Meaning and Limits of Empirical Differential Privacy
Author(s) -
Anne-Sophie Charest,
Yiwei Thomas Hou
Publication year - 2017
Publication title -
Journal of Privacy and Confidentiality
Language(s) - English
Resource type - Journals
ISSN - 2575-8527
DOI - 10.29012/jpc.v7i3.406
Subject(s) - differential privacy , computer science , conjugate prior , Bayesian probability , discretization , empirical measure , posterior probability , data mining , statistics , mathematics
Empirical differential privacy (EDP) has been proposed as an alternative to differential privacy (DP), with the important advantages that the procedure can be applied to any Bayesian model and requires less technical work on the part of the user. While EDP has been shown to be easy to implement, little is known about its theoretical underpinnings. This paper offers a careful investigation of the meaning and limits of EDP as a measure of privacy. We show that EDP cannot simply be considered an empirical version of DP, and that it could instead be thought of as a sensitivity measure on posterior distributions. We also show that EDP is not well defined, in that its value depends crucially on the choice of discretization used in the procedure, and that it can be very computationally intensive to apply in practice. We illustrate these limitations with two simple conjugate Bayesian models: the beta-binomial model and the normal-normal model.
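To make the discretization issue concrete, here is a minimal sketch of an EDP-style calculation on the beta-binomial model. It is an illustration under stated assumptions, not the authors' exact procedure: we assume a Beta(1, 1) prior, take "neighboring" datasets to differ by flipping one Bernoulli record (x versus x - 1 successes out of n), discretize the posterior on a uniform grid of bins, and report the maximum absolute log-ratio of bin probabilities as the empirical epsilon. Refining the grid changes the value, which is the non-well-definedness the abstract describes.

```python
import math

def log_beta_pdf(t, a, b):
    """Log density of a Beta(a, b) distribution at t in (0, 1)."""
    return (math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
            + (a - 1) * math.log(t) + (b - 1) * math.log(1 - t))

def discretized_posterior(a, b, n_bins):
    """Bin probabilities of Beta(a, b) on a uniform grid (midpoint rule)."""
    width = 1.0 / n_bins
    probs = [math.exp(log_beta_pdf((i + 0.5) * width, a, b)) * width
             for i in range(n_bins)]
    total = sum(probs)
    return [p / total for p in probs]  # renormalize the approximation

def empirical_epsilon(n, x, n_bins, prior=(1.0, 1.0)):
    """Max log-ratio of discretized posteriors for datasets differing
    in a single record (an EDP-style sensitivity measure)."""
    a0, b0 = prior
    p = discretized_posterior(a0 + x, b0 + n - x, n_bins)           # x successes
    q = discretized_posterior(a0 + x - 1, b0 + n - x + 1, n_bins)   # one flipped
    return max(abs(math.log(pi / qi)) for pi, qi in zip(p, q))

# The estimate grows as the grid is refined, so the "privacy level"
# reported depends crucially on the chosen discretization.
eps_coarse = empirical_epsilon(n=20, x=10, n_bins=10)
eps_fine = empirical_epsilon(n=20, x=10, n_bins=100)
```

With the coarse 10-bin grid the extreme bins sit well inside (0, 1), while the 100-bin grid probes the tails where the two posteriors diverge most, so `eps_fine` exceeds `eps_coarse` even though the model and data are unchanged.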
