Differential Privacy

From MgmtWiki
Revision as of 09:52, 25 March 2020 by Tom (talk | contribs) (Full Title or Meme)


Full Title or Meme

Differential Privacy is a system for publicly sharing information about a data collection by describing the patterns of groups within the collection while withholding information that could identify the Subject.


Differential Privacy is a mathematical technique that injects inaccuracies, or "noise," into the data, for example making some people younger and an equal number older, or changing races or other attributes. The more noise you inject, the harder deanonymization becomes. Apple and Facebook started using this technique in 2019 to collect aggregate data without identifying particular users.
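The standard way to inject such noise into a numeric query is the Laplace mechanism. The sketch below is a minimal illustration, not any particular company's implementation; the function name and the choice of epsilon are assumptions for the example.

```python
import math
import random

def private_count(true_count, epsilon):
    """Return a differentially private count via the Laplace mechanism.

    For a counting query the sensitivity is 1, so adding noise drawn
    from Laplace(0, 1/epsilon) yields epsilon-differential privacy.
    Smaller epsilon means more noise and stronger privacy.
    """
    scale = 1.0 / epsilon
    # Sample Laplace(0, scale) via the inverse CDF of a uniform draw.
    u = random.random() - 0.5  # uniform in (-0.5, 0.5)
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise
```

With epsilon = 1.0 the reported count typically lands within a few units of the true value; with epsilon = 0.01 the noise is about a hundred times larger, which is how overly aggressive settings can produce implausible published figures.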


Too much noise can render the data useless. One analysis showed that a differentially private version of the 2010 census included households that supposedly had 90 people.[1]


U.S. Census Bureau officials said the agency is revamping its systems to prevent anyone from using published data to target individual respondents through the information they disclosed to the census. The bureau aims to use a mathematical process, called Differential Privacy, to modify census results sufficiently to reliably conceal respondents' identities. The agency will make small additions to and subtractions from each number prior to almost every table's publication, and will significantly cut the number of published statistics. Although data users are concerned that these changes will disrupt their use of census data, not addressing the danger could allow information on individuals to be exposed, violating federal privacy law and elevating the risk of identity theft and other kinds of misuse.[2]
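The "small additions and subtractions" applied to each published cell can be sketched as noise plus post-processing. This toy example is an assumption for illustration only; it is not the Census Bureau's actual disclosure-avoidance algorithm, which is considerably more sophisticated.

```python
import math
import random

def noisy_table(counts, epsilon):
    """Toy sketch of perturbing a table of counts before publication.

    Each cell gets Laplace noise, then is rounded and clamped at zero
    so published cells remain non-negative integers. This is NOT the
    Census Bureau's real method, only an illustration of the idea.
    """
    scale = 1.0 / epsilon
    published = []
    for c in counts:
        u = random.random() - 0.5
        noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
        published.append(max(0, round(c + noise)))
    return published
```

Note that rounding and clamping are post-processing steps: they cannot weaken the privacy guarantee, but they can introduce small biases, which is one reason data users worry about the usability of the adjusted tables.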


  1. Angel Chen, "Differential Privacy," Technology Review (2020-03), p. 27.
  2. Paul Overberg, "Census Overhaul Seeks to Avoid Outing Individual Respondent Data," The Wall Street Journal (2019-11-10).