Frequently Asked Questions

We've compiled answers to some of the most commonly asked questions:

What is Differential Privacy, and how does PVML use it?
TL;DR: We let analytics and ML run on sensitive data while producing outputs whose privacy is mathematically guaranteed.
Differential privacy (DP) is a set of systems and practices that keeps individuals' data safe and private. It offers the strongest privacy protection available today, with a mathematical proof backing every algorithm. DP is achieved by introducing statistical noise: significant enough to protect the privacy of any individual in the data, yet small enough not to affect the accuracy of the analytics and machine learning methods applied to the data. Read more about DP on our blog
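For intuition only, here is a minimal sketch in Python (NumPy) of the classic Laplace mechanism, not PVML's proprietary algorithms: a mean query is perturbed with noise calibrated to how much any single record could change the answer. The clipping bounds and the epsilon value are assumptions chosen for the example.

    import numpy as np

    def private_mean(values, lower, upper, epsilon, rng=None):
        # Differentially private mean via the Laplace mechanism (illustrative sketch).
        rng = rng or np.random.default_rng()
        values = np.clip(values, lower, upper)       # bound each record's influence
        sensitivity = (upper - lower) / len(values)  # max change one record can cause
        noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
        return values.mean() + noise

    # Hypothetical example: 10,000 salaries. The noise barely moves the aggregate,
    # but it is large enough to mask any single individual's contribution.
    salaries = np.random.default_rng(seed=0).uniform(30_000, 200_000, size=10_000)
    print(private_mean(salaries, lower=30_000, upper=200_000, epsilon=1.0))

With many records the calibrated noise hardly affects the aggregate, which is why accuracy is preserved while any one individual's value stays hidden.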
PVML offers proprietary Differential Privacy technology for extracting useful insights and training AI models on datasets that contain sensitive information. Our algorithms run "backstage," so the outputs returned to analysts or trusted third parties are private and ready to be used or shared.
Learn more about how we use DP
How is Differential Privacy different from Homomorphic Encryption (HE)?
TL;DR: Unlike HE, DP incurs no computation or memory overhead, and it also guarantees the privacy of the outputs.
Homomorphic Encryption allows computation directly on encrypted data; however, it is not efficient. Because Homomorphic Encryption comes with a large performance overhead, computations that are already costly on unencrypted data are probably not feasible on encrypted data. Moreover, although the data itself is unreadable, the computations performed on it return exactly the same outputs as they would on the raw data. When outputs are returned with perfect accuracy, the privacy of the individuals in the data is not preserved, and the dataset remains vulnerable to re-identification attacks.
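To illustrate the point with hypothetical numbers (plain Python): when aggregates are returned exactly, two overlapping queries are enough to recover one person's value without ever decrypting anything. This differencing attack is precisely what DP's calibrated noise is designed to defeat.

    # Illustrative differencing attack on exact aggregate answers (made-up data).
    salaries = {"alice": 71_000, "bob": 58_000, "carol": 93_000}

    total_all = sum(salaries.values())                                        # query 1: everyone
    total_minus_carol = sum(v for k, v in salaries.items() if k != "carol")   # query 2: everyone except Carol
    print(total_all - total_minus_carol)                                      # recovers Carol's exact salary: 93000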
What makes PVML's technology unique?
TL;DR: PVML prioritizes algorithmic capabilities that are applicable in practice, beyond what the science in the field currently provides.
Applicability is our first priority. PVML combines beyond-state-of-the-art research with software engineering and applied machine learning to deliver highly efficient Differential Privacy algorithms that produce privacy-preserving results with higher accuracy than existing Differential Privacy solutions.
Does PVML comply with privacy regulations?
TL;DR: PVML's approach has been verified by legal and technological experts in the privacy field.
Privacy legislation requires companies to design their products and processes with privacy in mind, meaning that a company is responsible for ensuring and maintaining the privacy of the personal data it handles. We work alongside a legal team and privacy experts who provide guidance and validation throughout our development process, making sure that our Differential Privacy algorithms and overall approach maintain the individual privacy mandated by the various privacy regulations.
Can "anonymized" data still be re-identified?
TL;DR: Yes. Anonymization is an outdated technique that leaves data value on the table and fails to guarantee privacy.
Yes! Even after removing personally identifiable information (PII), the resulting records often contain unique combinations of variables that can be linked to other publicly available information to re-identify specific people. In practice, as long as useful information about individuals is included in the data, it is vulnerable to re-identification attacks (and therefore not anonymous).
Read more about the downfall of anonymization on our blog
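As a minimal sketch of why this happens (hypothetical records, pandas): rows whose combination of quasi-identifiers, such as ZIP code, birth year, and gender, is unique can be linked to outside sources, for example a public voter roll, that list the same attributes alongside a name.

    import pandas as pd

    # "De-identified" table: names removed, quasi-identifiers left intact (made-up data).
    df = pd.DataFrame({
        "zip":        ["02139", "02139", "94103", "94103", "10027"],
        "birth_year": [1985, 1990, 1985, 1985, 1972],
        "gender":     ["F", "F", "M", "M", "F"],
        "diagnosis":  ["flu", "asthma", "flu", "cold", "diabetes"],
    })

    # How many records share each (zip, birth_year, gender) combination?
    group_size = df.groupby(["zip", "birth_year", "gender"])["diagnosis"].transform("size")

    # Records with a unique combination can be re-identified by linkage.
    print(df[group_size == 1])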
Does PVML require access to, or a copy of, our sensitive data?
TL;DR: No.
Your sensitive data stays wherever it currently resides (on-premises or in the cloud), and our platform does not require any type of access to it. Learn more

Have other questions? Contact us!