
Privacy Enhancing Technologies: Top 3 Use Cases

by Vaultree, December 15th, 2021

Too Long; Didn't Read

By 2025, 60% of large organisations will adopt PET for processing data in untrusted environments and multiparty data analytics use cases. PET techniques can be applied to AI modelling, cross-border data transfers, and data analytics. They embody fundamental privacy protection principles that become essential to businesses as cybersecurity threats increase, especially since cybersecurity has become an ESG concern. The way enterprises manage and secure data is crucial for maintaining their privacy - and the privacy of their clients - as cyberattacks become more prevalent.


The way enterprises manage and secure data is crucial to maintaining their privacy - and the privacy of their clients. As data moves online and cyberattacks become more prevalent, the need for safe environments and defensive tools grows. Privacy Enhancing Technology (PET) techniques can be applied to AI modelling, cross-border data transfers, and data analytics to help security and risk stakeholders manage constraints while respecting individual privacy.


A Gartner study published in June found that, by 2025, 60% of large organizations will adopt PET for processing data in untrusted environments and multiparty data analytics use cases. PET techniques are more than a trend; they are a necessity. They embody fundamental privacy protection principles that have become essential to businesses as cybersecurity threats increase, especially since cybersecurity has become an ESG concern.


Privacy Enhancing Technologies are among the most advanced options companies worldwide have to protect their data. Here are some considerations regarding three critical use cases:

AI model training and sharing models with third parties

There are significant compliance and privacy concerns around using personal data in AI model training. Data is necessary for AI modelling, but the risks of leaks and security breaches - or at least the repercussions should anything happen - are too high to ignore.


Security and risk management professionals should mitigate privacy risks in AI model training by applying differential privacy and synthetic data, Gartner says. With these techniques, they can protect identifiable data. Federated machine learning can also be used to enhance privacy across training stages. Even if governments are behind on regulation, industry leaders should be proactive about protecting personal data and invest in PET techniques.


One option is using synthetic data from generative AI so that algorithms can be trained on a synthetic dataset. This protects privacy and prevents information from being traced back to a person, and the dataset can also be balanced for diversity, reducing bias.
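As an illustration of the idea (not of any particular generative model), the following Python sketch fits simple per-column Gaussians to a toy "real" dataset and samples synthetic rows that mimic its distribution without containing any actual record. All data and names here are hypothetical:

```python
import random
import statistics

# Toy "real" dataset: (age, annual_income) pairs, for illustration only.
real_rows = [(34, 52000), (41, 61000), (29, 48000), (55, 75000), (38, 58000)]

def fit_gaussians(rows):
    """Estimate a per-column mean and standard deviation from the real data."""
    columns = list(zip(*rows))
    return [(statistics.mean(c), statistics.stdev(c)) for c in columns]

def sample_synthetic(params, n, seed=0):
    """Draw synthetic rows that mimic the real distribution
    without reproducing any actual record."""
    rng = random.Random(seed)
    return [tuple(rng.gauss(mu, sigma) for mu, sigma in params)
            for _ in range(n)]

params = fit_gaussians(real_rows)
synthetic = sample_synthetic(params, n=100)
print(len(synthetic), synthetic[0])
```

A model trained on `synthetic` sees realistic value ranges and correlations of scale, but no individual's actual record; production systems use far richer generative models, with the same privacy intent.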

Usage of public cloud platforms

Companies that use cloud platforms have an even more pressing need to protect the information they store, transfer, and work with. An essential PET tool in this case is cryptography.


Encryption techniques are essential tools for protecting data in that sense. With fully homomorphic encryption (FHE), data can be operated on and modified while still in encrypted form, without disclosing the associated decryption keys. However, none of the FHE schemes developed to date is considered efficient enough for commercial products. Searchable encryption, on the other hand, has proven to be much more efficient: it allows an encrypted document collection to be searched and matching results to be retrieved while everything remains encrypted.
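The core idea behind searchable encryption can be sketched with keyed hashes: the client derives opaque search tokens from a secret key, and the server matches tokens against stored tags without ever seeing the plaintext keywords. This is a minimal toy illustration, not a production scheme (real designs also encrypt the documents themselves and defend against access-pattern leakage):

```python
import hmac
import hashlib

SECRET_KEY = b"client-held key; never leaves the client"  # illustrative only

def keyword_token(word: str) -> bytes:
    """Deterministic, keyed tag for a keyword. Without the key,
    the server cannot recover the plaintext word from the tag."""
    return hmac.new(SECRET_KEY, word.lower().encode(), hashlib.sha256).digest()

def build_index(docs: dict) -> dict:
    """Client side: map each document id to the tags of its keywords.
    Only the tags (plus separately encrypted documents) go to the server."""
    return {doc_id: {keyword_token(w) for w in text.split()}
            for doc_id, text in docs.items()}

def server_search(index: dict, token: bytes) -> list:
    """Server side: match the opaque token against stored tags,
    never learning the underlying keyword."""
    return [doc_id for doc_id, tags in index.items() if token in tags]

docs = {"d1": "quarterly revenue report", "d2": "employee privacy policy"}
index = build_index(docs)
print(server_search(index, keyword_token("privacy")))  # ['d2']
```

The server learns only which stored tags matched, which is why the retrieval stays efficient: matching is a set lookup, not a cryptographic computation over the whole collection.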


Another tool is confidential computing, which isolates and protects information by allowing data storage and code execution to happen only inside hardware-based trusted execution environments. Many cloud providers already offer confidential computing as part of their services, and companies can further increase protection by looking into third-party encryption solutions and providers.

Internal and external analytics and business intelligence activities

As privacy concerns grow, many restrictions and customer worries, though relevant and vital, become obstacles to sharing and analyzing personal data. In addition, internal control can be challenging to achieve when data can be easily accessed and cross-examined within a company - for example, when all parts of the data (information and users) sit on the same server, company, or cloud.


To increase internal controls, there are two leading PET techniques companies can adopt, specialists say. One is synthetic data, which also helps in other use cases such as AI model training. The second is differential privacy, which has the advantage of not changing the underlying data: it protects information by introducing alterations between the data at the source and the final answer to any query. In practice, the information is available, but not in connection with the individual it concerns.
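In its simplest form, differential privacy adds calibrated noise to a query's answer rather than to the stored data. A minimal sketch, assuming the classic Laplace mechanism for a count query (a count has sensitivity 1, so the noise scale is 1/ε); the dataset and parameter values are made up for illustration:

```python
import math
import random

def private_count(values, predicate, epsilon: float, seed=None) -> float:
    """Answer 'how many records satisfy predicate?' with Laplace noise.
    A count query has sensitivity 1, so the noise scale is 1/epsilon."""
    rng = random.Random(seed)
    true_count = sum(1 for v in values if predicate(v))
    # Inverse-transform sample from Laplace(0, 1/epsilon).
    u = rng.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))
    return true_count + noise

salaries = [48000, 52000, 58000, 61000, 75000]  # toy data
answer = private_count(salaries, lambda s: s > 55000, epsilon=1.0, seed=42)
print(answer)  # the true count (3) plus Laplace noise
```

An analyst gets a usable aggregate, but no single record can be confirmed from the answer: smaller ε means more noise and stronger privacy, larger ε means a more accurate but less private result.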


Privacy Enhancing Technology brings a set of tools that can be used to protect data and information. Cloud computing and data analysis are essential to modern-day businesses, but they don't come without risks. PET techniques can and should be used to mitigate those risks.

