Differential Privacy

March 10, 2020


Differential privacy is achieved by strategically introducing random noise into a dataset.

Berkeley-based Oasis Labs develops cryptographic techniques that allow companies to use personal data in their algorithms without seeing our individual data points. This is an example of differential privacy, a mathematical concept that has been around for over a decade.

The noise is calibrated to each query's sensitivity, that is, how much a single person's data can change the result. The technique is therefore most useful when answering simple (low-sensitivity) queries such as counts and averages.
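To make the idea concrete, here is a minimal sketch of the Laplace mechanism, one standard way to add calibrated noise. The data set, the `private_count` helper, and the choice of epsilon are illustrative assumptions, not any company's production implementation.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Inverse-transform sample from a Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def private_count(records, predicate, epsilon=0.5):
    # A counting query has sensitivity 1: adding or removing one person
    # changes the count by at most 1. Laplace noise with
    # scale = sensitivity / epsilon yields epsilon-differential privacy.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical data: ages of seven survey respondents.
ages = [34, 29, 51, 42, 60, 23, 37]
print(private_count(ages, lambda a: a >= 40))  # true count is 3, plus noise
```

Smaller values of epsilon mean more noise and stronger privacy; the analyst sees an answer that is approximately right, but can never be sure whether any one person is in the data.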

Companies like Apple and Google now use the technique to analyze aggregate data without compromising user privacy. It's useful for surfacing traffic patterns in Google Maps, determining the most popular emoji among iPhone users, and spotting ride-sharing trends across Uber's global network, all while keeping individual user behavior anonymous.

The U.S. Census Bureau will use differential privacy to protect respondent data in the 2020 Census.

A team of Amazon researchers recently proposed using differential privacy to anonymize customer-supplied data. Apple set itself apart from its competitors by integrating differential privacy into its Safari browser, and Google uses its own differential privacy tool, RAPPOR, to collect usage statistics from Chrome users.

It is important to remember that this method is still evolving. Depending on the application and data set, differential privacy is harder to maintain when variables are correlated. And at the moment, there is scant regulatory guidance for use of the approach in the tech industry.