The Gold Standard of Privacy:
Differential Privacy

Differential Privacy (DP) is a vital concept for protecting personal data. It was created to allow organizations and researchers to apply analytics and machine learning methods on sensitive data while preserving the privacy of the individuals in the datasets.

Differential Privacy algorithms come with a mathematical proof that the result of any computation performed on a dataset is nearly the same (in distribution) whether or not any individual was part of the dataset during analysis.
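This guarantee has a standard formal statement (the classic ε-DP definition from the research literature; ε, the "privacy budget", controls how close the two distributions must be): for any two datasets $D$ and $D'$ that differ in a single individual's record, and any set of possible outputs $S$, a mechanism $M$ is ε-differentially private when

$$\Pr[M(D) \in S] \;\le\; e^{\varepsilon} \, \Pr[M(D') \in S].$$

Smaller ε means the two probabilities must be closer, i.e. stronger privacy.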

Read more about Differential Privacy on our blog

Differential Privacy guarantees the same output probability whether any individual was part of the data or not.

How Does It Work?

Differential Privacy offers the strongest privacy protection available for working with sensitive data, backed by a mathematical proof that guarantees individual-level privacy. It works by introducing just enough randomization (statistical noise) during computation to mask the contribution of any single individual's record. This small amount of variance is insignificant in a broader analytical context, but it is crucial for guaranteeing privacy.
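As a sketch of the idea, here is the classic Laplace mechanism applied to a counting query. This is the generic textbook construction, not PVML's proprietary algorithms, and the function names are illustrative:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise via the inverse-CDF method."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(records, predicate, epsilon: float) -> float:
    """Counting query with epsilon-DP: a count changes by at most 1
    when one record is added or removed (sensitivity 1), so Laplace
    noise with scale 1/epsilon masks any single individual."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Toy example: how many people are 40 or older?
ages = [34, 29, 41, 52, 38, 27, 45]
noisy = dp_count(ages, lambda a: a >= 40, epsilon=0.5)
```

The true count here is 3; the noisy answer fluctuates around it, and the fluctuation is what hides whether any one person's record was in the data.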

Highly practical:
Zero overhead in computation & memory.

No feature is left behind:
All the data's value is utilized for the computation, without leaving anything on the table.

Protects outputs against reverse-engineering:
DP has a unique property called "post-processing immunity": no amount of computational power applied to the outputs can expose any individual in the underlying data.

Flexible:
DP requirements can be incorporated in any type of data analysis: from queries through training machine learning models to sharing data and even collecting data.

Universal privacy definition:
Mathematically guaranteed privacy, in a world where over 140 countries (and growing) have continuously-evolving privacy regulations.

Tech Giants' Stamp-of-Approval

Differential Privacy is already being used by tech giants Google, Apple and Microsoft for their own competitive edge (though not offered as a service):


We are continually expanding our Differential Privacy technology across our products including Google Maps and the Assistant ... Last year we published our COVID-19 reports which use Differential Privacy to help public health officials, economists and policymakers.

developers.googleblog.com

The information we gather using Differential Privacy helps us improve our services without compromising individual privacy ... Analysis happens only after the data has gone through privacy-enhancing techniques like Differential Privacy.

apple.com/privacy

How do we create ML models that preserve the privacy of individuals while including the broadest possible data? Differential Privacy simultaneously enables researchers and analysts to extract useful insights ... and offers stronger privacy protections.

microsoft.com/en-us/ai

PVML's Differential Privacy

Applicability is our first priority: PVML combines beyond-state-of-the-art research with software engineering and applied machine learning to provide the most efficient Differential Privacy algorithms.

We developed cutting-edge technology for data analytics, training machine learning models, and synthetic data generation, all using proprietary Differential Privacy algorithms that produce privacy-preserving results with high accuracy guarantees.

DP for queries

Query the data and drill down freely: no query is off-limits, and there is no need for query auditing or thinking twice before running a query.

DP for machine learning

Train machine learning models with no reverse-engineering concerns (e.g., model inversion attacks).

DP for existing data interfaces

Integrate our Differential Privacy algorithmic capabilities with an existing interface without altering current practices.

DP for synthetic data generation

Generate high-fidelity synthetic data with Differential Privacy guarantees.

Want a simple introduction to Differential Privacy?

Watch our short video:
What's Differential Privacy?


Frequently Asked Questions

Doesn't the noise in Differential Privacy skew the results?
TL;DR: With large datasets, the skew is insignificant, and we provide an error bound with each output.
Differential Privacy, in the wrong hands, can indeed produce poor results. Our key innovation is applying our proprietary Differential Privacy algorithms in an optimized way, tailored to the desired computation for the best accuracy. Combined with our unique design and architecture, when there is a lot of data the skew in outputs is minuscule, and an error-bound guarantee is provided with each output so that data teams work with full scientific visibility.
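As an illustration of where such an error bound comes from, here is how one is derived for plain Laplace noise (a generic textbook mechanism, not PVML's proprietary algorithms; the function name is illustrative):

```python
import math

def laplace_error_bound(scale: float, confidence: float = 0.95) -> float:
    """For Laplace(0, scale) noise, P(|noise| > t) = exp(-t / scale),
    so the two-sided error bound at the given confidence level is
    t = scale * ln(1 / (1 - confidence))."""
    return scale * math.log(1.0 / (1.0 - confidence))

# A count query at epsilon = 0.5 uses noise with scale 1/epsilon = 2,
# so with 95% probability the noisy count lands within ~6 of the truth.
bound = laplace_error_bound(2.0, 0.95)
```

Note how the bound depends only on the noise scale, not on the size of the dataset: for a count over millions of records, an error of ±6 is negligible, which is why the skew shrinks in relative terms as data grows.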
Isn't Differential Privacy impractical outside of research?
TL;DR: Not with PVML.
Differential Privacy is an emerging research field with very limited practical algorithmic implementations. To provide enterprise-ready software with comprehensive capabilities, we developed cutting-edge algorithms for much more than basic analytics, including training machine learning models and generating synthetic data with Differential Privacy. We tied it all together with an easy-to-use product design that requires minimal revision to current practices.
Learn more about our product
Does Differential Privacy add computational overhead?
TL;DR: No.
Differential Privacy comes with zero overhead in computation and memory: the operation performed on the data remains practically the same, except for the addition of statistical noise either to the output or during the computation.

Have other questions? Contact us!