In previous posts, we discussed differentially private algorithms and the implementation bugs that can cause these algorithms to fail to protect privacy in practice. We also described two ways of addressing this problem: automated testing and manual proofs.
In this post, we examine a third option: automatic proof. Automatic proof tools for differential privacy analyze a program and attempt to construct a proof that the program satisfies differential privacy.
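To make this concrete, the kind of program such a tool might analyze is a simple noisy counting query. The sketch below implements the standard Laplace mechanism; the function name and structure are illustrative, not taken from any particular proof tool. A proof tool would aim to verify, from the code itself, that the noise scale matches the query's sensitivity:

```python
import numpy as np

def laplace_count(data, predicate, epsilon):
    # Exact count of records satisfying the predicate.
    true_count = sum(1 for row in data if predicate(row))
    # A counting query has sensitivity 1: adding or removing one
    # record changes the count by at most 1.
    sensitivity = 1
    # Laplace mechanism: noise with scale sensitivity / epsilon
    # yields epsilon-differential privacy for this query.
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise
```

The privacy proof for this program hinges on the relationship between the sensitivity and the noise scale; a bug that breaks that relationship (for example, forgetting to add the noise) is exactly what a proof tool would catch by failing to find a proof.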
The testing tools described in the last post can be applied to any program, written in any programming language. However, testing-based tools can only find bugs; they can never prove that no bugs exist. The automatic proof tools described in this post are less broadly applicable, but they can produce proofs of privacy equivalent to the manual proofs written by privacy experts.
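The limitation of testing can be seen in a minimal sketch of the empirical approach, loosely in the spirit of counterexample-search tools (the function name and setup here are hypothetical, not any real tool's API). It runs a mechanism many times on two neighboring databases and estimates the privacy loss for one output event; a large estimate exposes a bug, but a small one proves nothing about other events or inputs:

```python
import math

def estimate_epsilon(mechanism, db1, db2, event, trials=100000):
    # Empirically estimate the privacy loss for a single output event
    # on a pair of neighboring databases. A large estimate suggests a
    # privacy bug; a small estimate is NOT a proof of privacy.
    hits1 = sum(1 for _ in range(trials) if event(mechanism(db1)))
    hits2 = sum(1 for _ in range(trials) if event(mechanism(db2)))
    # Clamp zero counts to avoid dividing by zero.
    p1 = max(hits1, 1) / trials
    p2 = max(hits2, 1) / trials
    return abs(math.log(p1 / p2))
```

For example, a broken mechanism that releases an exact count with no noise produces a very large estimate for the event "output exceeds 10.5" on databases with counts 10 and 11, flagging the bug, while a correct mechanism yields estimates bounded by its epsilon only for the events and inputs actually tried.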