Title
Strategic behavior and database privacy

Author(s)
Krehbiel, Sara
Advisor(s)
Peikert, Chris
Abstract
This dissertation focuses on strategic behavior and database privacy. First, we look at strategic behavior as a tool for distributed computation. We blend the perspectives of game theory and mechanism design in proposals for distributed solutions to the classical set cover optimization problem. We endow agents with natural individual incentives, and we show that centrally broadcasting non-binding advice effectively guides the system to a near-optimal state while keeping the original incentive structure intact.

We next turn to the database privacy setting, in which an analyst wishes to learn something from a database, but the individuals contributing the data want to protect their personal information. The notion of differential privacy allows us to do both by obscuring true answers to statistical queries with a small amount of noise. Whether a task can be carried out under differential privacy depends on whether the amount of noise required for privacy still permits statistical accuracy. We show that it is possible to achieve a satisfying tradeoff between privacy and accuracy for a computational problem called independent component analysis (ICA), which seeks to decompose an observed signal into its underlying independent source variables. We do this by releasing a perturbation of a compact representation of the observed data. This approach preserves individual privacy while releasing information that can be used to reconstruct the underlying relationship between the observed variables.

In almost all of the differential privacy literature, the privacy requirement must be specified before looking at the data, and the noise added for privacy limits the statistical utility of the sanitized data. The third part of this dissertation ties together privacy and strategic behavior to answer the question of how to determine an appropriate level of privacy when data contributors prefer more privacy but an analyst prefers more accuracy. The proposed solution views privacy as a public good and uses market design techniques to collect these preferences and then privately select and enforce a socially efficient level of privacy.
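The noise-addition idea described above is most commonly instantiated by the Laplace mechanism. The following Python sketch is a generic illustration of that standard mechanism, not the specific constructions developed in the dissertation; the function name, example query, and parameter values are illustrative assumptions.

import numpy as np

def laplace_mechanism(true_answer, sensitivity, epsilon, rng=None):
    # Standard epsilon-differentially-private release of a numeric query:
    # add Laplace noise with scale equal to sensitivity / epsilon.
    rng = rng or np.random.default_rng()
    scale = sensitivity / epsilon
    return true_answer + rng.laplace(loc=0.0, scale=scale)

# Example: a counting query ("how many records satisfy some predicate?")
# has sensitivity 1, because adding or removing one individual's record
# changes the count by at most 1. Smaller epsilon means stronger privacy
# and therefore more noise, which is the privacy/accuracy tradeoff the
# abstract refers to.
true_count = 412  # hypothetical unperturbed query answer
private_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)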
Date Issued
2015-07-27
Resource Type
Text
Resource Subtype
Dissertation