Verified Differential Privacy for Finite Computers
Differential Privacy not only ensures the anonymity of data, but also provides a way of rigorously quantifying and proving how private released information is. A drawback of Differential Privacy is that most suggested implementations are formalized using real numbers, which makes differentially private algorithms unimplementable on finite machines. A common way around this issue is to implement these algorithms using floating-point numbers. However, as shown by Ilya Mironov, such naive implementations lead to privacy breaches via an attack on the least significant bits of the floating-point outputs. Finding ways to verify implementations of Differential Privacy is therefore of significant interest. Using Coq, we have formalized a version of the Geometric Truncated Mechanism (GTM), first suggested by Arpita Ghosh, Tim Roughgarden, and Mukund Sundararajan, using fixed-precision arithmetic, and verified that the GTM is in fact differentially private.
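To illustrate the idea in the abstract, here is a minimal Python sketch of a truncated geometric mechanism over outputs {0, ..., n}, using exact rational arithmetic instead of floating point. The function names and the use of Python's `fractions.Fraction` are illustrative assumptions, not the authors' Coq development:

```python
from fractions import Fraction
import math
import random

def gtm_pmf(t, n, alpha):
    """Exact output distribution of a truncated geometric mechanism.

    t is the true count in {0, ..., n}; alpha is a Fraction in (0, 1).
    Two-sided geometric noise P(Z = z) proportional to alpha**abs(z) is
    added to t, and any mass falling outside {0, ..., n} is moved to the
    nearest boundary.  The mechanism is ln(1/alpha)-differentially
    private for sensitivity-1 (counting) queries.
    """
    assert 1 <= n and 0 <= t <= n and 0 < alpha < 1
    pmf = {}
    for y in range(n + 1):
        d = abs(y - t)
        if 0 < y < n:
            pmf[y] = (1 - alpha) / (1 + alpha) * alpha ** d
        elif (y == 0 and t == 0) or (y == n and t == n):
            pmf[y] = 1 / (1 + alpha)           # P(Z <= 0) or P(Z >= 0)
        else:
            pmf[y] = alpha ** d / (1 + alpha)  # tail mass moved to boundary
    return pmf

def sample_gtm(t, n, alpha):
    """Sample the mechanism exactly via inverse CDF over integers."""
    pmf = gtm_pmf(t, n, alpha)
    # All probabilities are rationals; scale to a common integer grid
    # so sampling needs only a bounded uniform integer, not a real.
    scale = math.lcm(*(p.denominator for p in pmf.values()))
    r = random.randrange(scale)
    acc = 0
    for y in range(n + 1):
        acc += pmf[y] * scale                  # exact integer weight
        if r < acc:
            return y
```

Because every probability is an exact rational, the ratio of output probabilities on adjacent inputs t and t+1 is bounded by 1/alpha exactly; there are no least-significant-bit artifacts of the kind Mironov's attack exploits in floating-point implementations.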
Extended abstract: Verified-Diff-Priv.pdf (389 KiB)
Sat 21 Jan · Displayed time zone: Eastern Time (US & Canada)
14:00 - 15:30

14:00 (30m, Talk, CoqPL): Verified Differential Privacy for Finite Computers — Vivien Rindisbacher (Boston University), Arthur Azevedo de Amorim (Boston University), Marco Gaboardi (Boston University). File attached.
14:30 (30m, Talk, CoqPL): Formalizing Monoidal Categories and Actions for Syntax with Binders — Benedikt Ahrens (TU Delft, The Netherlands), Ralph Matthes (IRIT, CNRS and Univ. of Toulouse), Kobe Wullaert (Technical University Delft). File attached.
15:00 (30m, Talk, CoqPL): Certifying Complexity Analysis — Clément Aubert (Augusta University), Thomas Rubiano (LIPN, UMR 7030, Université Sorbonne Paris Nord), Neea Rusch (Augusta University), Thomas Seiller (CNRS). File attached.