Differential privacy allows companies to analyze data without learning too much about users
By Robert McMillan | The Wall Street Journal
Last year, Apple Inc. kicked off a massive experiment with new privacy technology aimed at solving an increasingly thorny problem: how to build products that understand users without snooping on their activities.
Its answer is differential privacy, a term virtually unknown outside of academic circles until a year ago. Today, other companies such as Microsoft Corp. and Uber Technologies Inc. are experimenting with the technology.
The problem differential privacy tackles is that modern data-analysis tools can find links across large databases, and privacy experts worry those links could be used to re-identify people in otherwise anonymous data sets.
“The availability of data has made it easier and easier for connections to be made and private information to be linked and de-anonymized. As such, many companies that collect or analyze data have started implementing tools to decrease that risk. One such tool is differential privacy, which allows a calculable amount of controlled confusion, so to speak, to be included in data as it is collected and analyzed, so that analysts cannot tie the data back to a specific person. This, of course, is an extremely simplified explanation. Although such tools are useful, they are hardly a fail-safe method of ensuring information privacy.”
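The “controlled confusion” in the quote above is, concretely, calibrated random noise. As a rough sketch of the most common approach, the Laplace mechanism, consider a simple counting query. The function private_count, the epsilon parameter, and the sample data below are illustrative, not any company's actual implementation.

import numpy as np

def private_count(records, predicate, epsilon):
    # A count query has sensitivity 1: adding or removing one person
    # changes the true answer by at most 1. Laplace noise with scale
    # 1/epsilon therefore yields epsilon-differential privacy for counts.
    true_count = sum(1 for record in records if predicate(record))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Illustrative use: count feature adoption without exposing any one user.
users = [{"id": i, "feature_on": i % 3 == 0} for i in range(1000)]
print(private_count(users, lambda u: u["feature_on"], epsilon=0.5))

Smaller values of epsilon inject more noise, trading accuracy for stronger privacy; that tunable trade-off is what makes the confusion “calculable” in the sense the quote describes.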