Differential Privacy and Secure Computation
Prof. Kobbi Nissim, Georgetown University
Dov Gordon, George Mason University
Dr. Uri Stemmer, Ben-Gurion University
As the volume and diversity of collected and processed data continue to increase, instances of misuse grow alongside them. Citizens, corporations, and governments are increasingly aware of the need for new systems and tools for preserving privacy, yet none are willing to pay too high a price in utility. Modern cryptography has introduced important frameworks for navigating the privacy-utility tradeoff, most notably secure computation, which allows a group of parties to compute on data while revealing nothing but the prescribed outcome. Secure computation, however, does not ensure that the result of that computation protects individuals’ privacy: it protects the process, but not the outcome. For this, we have differential privacy, a formal framework for limiting the exposure of individual data when it is incorporated into an analysis.
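To make the differential privacy guarantee concrete, here is a minimal sketch of the classical Laplace mechanism applied to a counting query. This is an illustrative example, not part of the proposal itself; the function name `laplace_count` and the choice of a counting query are our own assumptions for illustration.

```python
import numpy as np

def laplace_count(data, predicate, epsilon, rng=None):
    """Epsilon-differentially private count (illustrative sketch).

    A counting query has sensitivity 1: adding or removing one record
    changes the true count by at most 1. Adding Laplace noise with
    scale 1/epsilon therefore yields epsilon-differential privacy.
    """
    if rng is None:
        rng = np.random.default_rng()
    true_count = sum(1 for x in data if predicate(x))
    # Noise scale = sensitivity / epsilon = 1 / epsilon.
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

# Example: privately estimate how many records are below 50.
data = list(range(100))
estimate = laplace_count(data, lambda x: x < 50, epsilon=1.0)
```

Smaller values of `epsilon` give stronger privacy but noisier answers, which is exactly the privacy-utility tradeoff discussed above.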
We aim to explore the ways in which secure computation and differential privacy can be composed synergistically to provide utility beyond what either framework alone can offer. In particular, we aim to deepen our understanding of what is feasible, both asymptotically and concretely, for certain key applications of interest.