Practical privacy: the SuLQ framework.

We consider a statistical database in which a trusted administrator introduces noise to the query responses with the goal of maintaining privacy of individual database entries. In such a database, a query consists of a pair (S, f), where S is a set of rows in the database and f is a function mapping database rows to {0, 1}. The true answer is $\sum_{i \in S} f(d_i)$, and a noisy version is released as the response to the query. Results of Dinur, Dwork, and Nissim show that a strong form of privacy can be maintained using a surprisingly small amount of noise -- much less than the sampling error -- provided the total number of queries is sublinear in the number of database rows. We call this query and (slightly) noisy reply the SuLQ (Sub-Linear Queries) primitive. The assumption of sublinearity becomes reasonable as databases grow increasingly large.

We extend this work in two ways. First, we extend the privacy analysis to real-valued functions f and arbitrary row types, as a consequence greatly improving the bounds on the noise required for privacy. Second, we examine the computational power of the SuLQ primitive. We show that it is very powerful indeed, in that slightly noisy versions of the following computations can be carried out with very few invocations of the primitive: principal component analysis, k-means clustering, the perceptron algorithm, the ID3 algorithm, and (apparently!) all algorithms that operate in the statistical query learning model [11].
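The primitive is simple to state in code. The sketch below is illustrative only -- the function and parameter names are our invention, and the fixed Gaussian noise scale stands in for the paper's carefully calibrated noise distribution, which depends on the sublinear query budget:

```python
import random

def sulq_query(rows, S, f, noise_scale=1.0):
    """One SuLQ query: a noisy sum of f over the selected rows.

    Illustrative sketch: the paper's privacy analysis dictates the
    actual noise distribution and magnitude, not the fixed Gaussian
    used here.
    """
    true_answer = sum(f(rows[i]) for i in S)
    return true_answer + random.gauss(0.0, noise_scale)

# Example: a database of 0/1 attributes; query a predicate over a subset.
db = [{"smoker": 1}, {"smoker": 0}, {"smoker": 1}, {"smoker": 1}]
S = {0, 1, 2}
noisy = sulq_query(db, S, lambda row: row["smoker"], noise_scale=0.5)
```

Higher-level computations such as k-means or ID3 are then built by phrasing each step of the algorithm as a small number of such noisy-sum queries.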


References in zbMATH (referenced in 16 articles)


  1. Beimel, Amos; Nissim, Kobbi; Stemmer, Uri: Private learning and sanitization: pure vs. approximate differential privacy (2016)
  2. Choromanska, Anna; Choromanski, Krzysztof; Jagannathan, Geetha; Monteleoni, Claire: Differentially-private learning of low dimensional manifolds (2016)
  3. Ullman, Jonathan: Answering $n^{2+o(1)}$ counting queries with differential privacy is hard (2016)
  4. Balcan, Maria Florina; Feldman, Vitaly: Statistical active learning algorithms for noise tolerance and differential privacy (2015)
  5. Benkaouz, Yahya; Erradi, Mohammed: A distributed protocol for privacy preserving aggregation with non-permanent participants (2015)
  6. Feldman, Vitaly; Xiao, David: Sample complexity bounds on differentially private learning via communication complexity (2015)
  7. Ghosh, Arpita; Roth, Aaron: Selling privacy at auction (2015)
  8. Beimel, Amos; Brenner, Hai; Kasiviswanathan, Shiva Prasad; Nissim, Kobbi: Bounds on the sample complexity for private learning and private data release (2014)
  9. Soria-Comas, Jordi; Domingo-Ferrer, Josep: Optimal data-independent noise for differential privacy (2013)
  10. Feldman, Vitaly: A complete characterization of statistical query learning with applications to evolvability (2012)
  11. Matthews, Gregory J.; Harel, Ofer: Data confidentiality: a review of methods for statistical disclosure limitation and methods for assessing privacy (2011)
  12. Smith, Adam: Asymptotically optimal and private statistical estimation. (Invited talk) (2009)
  13. Dwork, Cynthia: Differential privacy: A survey of results (2008)
  14. Friedman, Arik; Wolff, Ran; Schuster, Assaf: Providing $k$-anonymity in data mining (2008)
  15. Dwork, Cynthia; Kenthapadi, Krishnaram; McSherry, Frank; Mironov, Ilya; Naor, Moni: Our data, ourselves: privacy via distributed noise generation (2006)
  16. Mukherjee, Shibnath; Chen, Zhiyuan; Gangopadhyay, Aryya: A privacy-preserving technique for Euclidean distance-based mining algorithms using Fourier-related transforms (2006)