Testing Differential Privacy with Dual Interpreters
Saved in:

Main authors: , , , ,
Format: Article
Language: English
Subjects:
Online access: Order full text
Summary: Applying differential privacy at scale requires convenient ways to check that programs computing with sensitive data appropriately preserve privacy. We propose here a fully automated framework for *testing* differential privacy, adapting a well-known "pointwise" technique from informal proofs of differential privacy. Our framework, called DPCheck, requires no programmer annotations, handles all previously verified or tested algorithms, and is the first fully automated framework to distinguish correct and buggy implementations of PrivTree, a probabilistically terminating algorithm that has not previously been mechanically checked.
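The "pointwise" idea is to compare, output by output, the probability that a mechanism produces a given result on two neighboring databases: ε-differential privacy requires Pr[M(D) = o] ≤ e^ε · Pr[M(D') = o] for every output o. As a rough black-box illustration of that check only (DPCheck itself instruments programs with dual interpreters rather than brute-force sampling, and `noisy_count`, the trial count, and the `slack` tolerance below are all hypothetical choices), a Monte Carlo sketch in Python might look like this:

```python
import math
import random
from collections import Counter

def laplace(scale: float) -> float:
    # The difference of two i.i.d. exponentials is Laplace-distributed.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def noisy_count(db, eps):
    # Hypothetical mechanism under test: a Laplace-noised count,
    # rounded so that outputs fall in a small discrete set.
    return round(sum(db) + laplace(1.0 / eps))

def pointwise_dp_test(mechanism, db1, db2, eps, trials=100_000, slack=0.1):
    # Estimate output probabilities on both databases by sampling.
    c1 = Counter(mechanism(db1, eps) for _ in range(trials))
    c2 = Counter(mechanism(db2, eps) for _ in range(trials))
    # Pointwise check: for each output o seen often enough on db1,
    # Pr[M(db1) = o] must not exceed e^eps * Pr[M(db2) = o].
    for o, n1 in c1.items():
        if n1 < 100:          # skip rare outputs; estimates too noisy
            continue
        p = n1 / trials
        q = c2.get(o, 0) / trials
        if q == 0 or p / q > math.exp(eps) * (1 + slack):
            return False      # evidence of a violation at output o
    return True

# Neighboring databases differ in a single record.
db = [1] * 100
db_neighbor = [1] * 99 + [0]
print(pointwise_dp_test(noisy_count, db, db_neighbor, eps=0.5))
```

A real test would also run the check in both directions (swapping the two databases) and would replace the ad-hoc `slack` cutoff with a principled statistical confidence bound, which is the role of the paper's false-acceptance analysis below.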
We analyze the probability of DPCheck mistakenly accepting a non-private program and prove that, theoretically, the probability of false acceptance can be made exponentially small by a suitable choice of test size.
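The exponential decay has the shape of a standard independent-trials argument. As an illustration only, not the paper's exact theorem: if each independently generated test instance would expose a given privacy violation with probability at least p, then the chance that all n instances miss it satisfies

```latex
\Pr[\text{false acceptance}] \;\le\; (1 - p)^n \;\le\; e^{-pn},
```

so taking n ≥ (1/p) ln(1/δ) drives the false-acceptance probability below any target δ.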
We demonstrate DPCheck's utility empirically by implementing all benchmark algorithms from prior work on mechanical verification of differential privacy, plus several others and their incorrect variants, and show that DPCheck accepts the correct implementations and rejects the incorrect variants.
We also demonstrate how DPCheck can be deployed in a practical workflow to test differential privacy for the 2020 US Census Disclosure Avoidance System (DAS).
DOI: 10.48550/arxiv.2010.04126