Privacy-enhancing technologies are, unsurprisingly, a regular feature of products and services that need some form of privacy-preserving capability. For organizations, the integration of PETs is commonly framed in terms of risk and compliance. For example, trusted execution environments, a type of PET, are used by banking and finance companies for transaction verification and digital signing, reducing the likelihood of fraud, data breaches and other attempts by malicious actors to compromise their systems.
In this way, the use of TEEs can be seen as a risk-reduction measure. Indeed, American Express' 2024 10-K filing lists "transaction processing [and] authentication technologies and digital identification" among the technologies in which they need to continue to invest, lest their "revenue and profitability" become "materially adversely affected."
However, PETs are not just a compliance and risk-mitigation measure; they do more than enhance privacy. Increasingly, their use provides real utility to organizations pursuing data-driven innovation. This is particularly notable given the organizational, economic and policy focus on AI: in the IAPP's Navigate: Digital Risk Index 2025 report, 54% of respondents identified the potential adverse outcomes of AI technologies as one of their top digital risks.
Organizations must contend with the fact that data gathering and sharing provide innovative value yet also create risk and compliance conundrums. Refusing to participate in the ever-growing data economy likely means being left behind, so organizations are increasingly turning to PETs for assistance. The IAPP will examine specific use cases for PETs in the future. Below is an overview of these technologies and their growing role as innovation enablers.
Case studies
Since at least 2024, Mastercard has been exploring the integration of multiple PETs into their systems. The corporation found that using synthetic data provided a competitive advantage when performing analysis in emerging markets, where existing data may be of poor quality or simply too difficult to collect. Additionally, they developed a proof-of-concept model using federated learning and homomorphic encryption, both of which the IAPP has previously written about, to share financial crime information across borders, among jurisdictions including India, Singapore, the U.K. and the U.S.

The use of these technologies helps Mastercard more easily meet their compliance requirements; cross-border dataflow rules are easier to satisfy if large datasets potentially filled with personal data never have to be migrated across jurisdictions. But it is also a business advantage. Easier compliance means less resourcing is needed to deliver on said compliance. Additionally, the relative ease of sharing and processing this vital information likely helps Mastercard detect financial crime and other fraudulent activity earlier, potentially even preventing an attack altogether, which protects their brand from reputational harm and positively impacts their bottom line.
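To make the core idea of federated learning concrete, here is a minimal, hypothetical Python sketch: each jurisdiction trains a model on its own private records, and only the resulting model parameters, never the raw data, travel to a coordinator for averaging. The datasets and model are invented for illustration; Mastercard's actual proof of concept also layers homomorphic encryption and other controls on top.

```python
import numpy as np

# Toy federated averaging: each "jurisdiction" fits a linear model
# locally; only the learned weights (never the raw records) are
# shared with and averaged by a central coordinator.

rng = np.random.default_rng(0)

def local_fit(X, y):
    """Ordinary least squares on one jurisdiction's private data."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Hypothetical private datasets held in three jurisdictions.
true_w = np.array([2.0, -1.0])
local_data = []
for _ in range(3):
    X = rng.normal(size=(100, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=100)
    local_data.append((X, y))

# Each party computes a local update; only weights cross the border.
local_weights = [local_fit(X, y) for X, y in local_data]

# The coordinator averages the updates into a global model.
global_w = np.mean(local_weights, axis=0)
print("federated estimate:", global_w)  # close to [2.0, -1.0]
```

The privacy-relevant property is in the last two steps: the coordinator sees only aggregated parameters, so no jurisdiction's underlying records are ever transmitted or copied.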
Google has also implemented TEEs in their ad platform, both to enhance privacy and data protection and to provide better insights and benefits to advertisers using the platform. In what Google calls "confidential matching," advertisers upload first-party audience lists, which typically contain sensitive information such as email addresses and other contact details. Google's ad service then strips the lists of unused identifiers and outputs a list of matched Google users. Ad managers can use that list to target their audience more successfully and thus create more effective ad campaigns. Importantly, the TEE ensures the uploaded audience lists are processed in such a way that no one, not even Google, can access the data while it is being processed. Google has enabled this service by default and at no additional cost. In other words, Google's use of TEEs improves their product, enhances security and delivers clear benefits to the end user.
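Google has not published the full internals of confidential matching, but the matching step itself can be pictured as a join over normalized, hashed identifiers. The Python sketch below is a hypothetical illustration, with invented emails and account IDs; in the real service, the equivalent computation runs inside a TEE so that neither the advertiser's list nor the platform's records are exposed in the clear to operators.

```python
import hashlib

def normalize_and_hash(email: str) -> str:
    """Normalize an email address and hash it so raw identifiers
    are never handled directly during matching."""
    canonical = email.strip().lower()
    return hashlib.sha256(canonical.encode()).hexdigest()

# Hypothetical advertiser-uploaded audience list (first-party data).
advertiser_list = ["Alice@example.com ", "bob@example.com", "carol@example.com"]

# Hypothetical provider-side records, already stored as hashes.
provider_hashes = {
    normalize_and_hash("bob@example.com"): "user_1029",
    normalize_and_hash("dana@example.com"): "user_3341",
}

# Inside the enclave: hash the uploads, keep only the matches,
# discard everything else.
matched = [
    provider_hashes[h]
    for h in (normalize_and_hash(e) for e in advertiser_list)
    if h in provider_hashes
]
print(matched)  # ['user_1029'] is all that leaves the enclave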
Looking at health care
The European Health Data Space Regulation, which entered into force in early 2025, is a European Union framework designed primarily to give citizens cross-border control over their personal health records, while also unlocking large, anonymized datasets in a central location for public research and innovation. The framework's secondary purpose is only possible because of PETs; medical researchers, working through an infrastructure known as HealthData@EU, can access this data with privacy protected by the methods used to securely store and process it, such as pseudonymization, federated learning and homomorphic encryption. The EU has recently touted the benefits of the framework: easier, more streamlined access to secure health data has produced innovations in earlier detection of sepsis and breast cancer through specifically trained AI models.
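Of the techniques EHDS relies on, pseudonymization is the most straightforward to illustrate. One common approach is keyed hashing, as in the minimal Python sketch below; the record fields and key handling here are hypothetical, not a description of HealthData@EU's actual design. Records remain linkable for research, but only the key holder can connect pseudonyms back to real identities.

```python
import hmac
import hashlib

# Secret key held only by the data controller; without it, pseudonyms
# cannot be regenerated or linked back to real patients.
SECRET_KEY = b"replace-with-a-key-from-a-secure-vault"

def pseudonymize(patient_id: str) -> str:
    """Derive a stable pseudonym from an identifier via HMAC-SHA256."""
    return hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256).hexdigest()[:16]

record = {"patient_id": "NL-1987-04523", "diagnosis": "sepsis", "age_band": "60-69"}

# What a researcher receives: the same record with a pseudonymous ID
# that stays consistent across datasets, enabling linkage studies.
shared = {**record, "patient_id": pseudonymize(record["patient_id"])}
print(shared)
```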
Formally acknowledging PETs
Many governments and associated bodies are recognizing the importance of PETs both from a privacy and data protection standpoint and as an innovative technology. The U.S. National Institute of Standards and Technology has developed a "PETs Testbed" for organizations to "investigate [PETs] and their respective suitability for specific use cases." Additionally, the U.S. created the Federal Cloud Credential Exchange, which uses federated identity management to handle identity sharing between U.S. citizens and the federal government.
The U.K.'s Information Commissioner's Office released guidance in 2023 on how organizations can use PETs to achieve regulatory compliance. The guidance also identifies how PETs can provide value through the ability to "access datasets that would otherwise be too sensitive to share." Additionally, the ICO has written specifically on the value of PETs beyond compliance, citing greater data availability and collaborative statistical analysis as potential use cases.
In the EU, one of the key pillars of the Data Act is to encourage organizations to participate in data sharing while still maintaining confidentiality. The text of the act calls out pseudonymization specifically as a way for "algorithms to be brought to the data and allow valuable insights to be derived without the transmission between parties or unnecessary copying of the …data." Beyond the Data Act, other EU institutions, such as Eurostat, have seen amendments to relevant regulations that position PETs as a way to innovate and comply with data protection laws. In early 2024, the European Parliament amended the regulation governing Eurostat, under which institutes may set up "a secure infrastructure …to facilitate …further [data] sharing."
Singapore's Infocomm Media Development Authority developed a PET sandbox that provides organizations with a space to pilot their PETs integration, with regulatory guidance provided by the IMDA. The program additionally provides grant funding for organizations that are looking to implement PETs. The IMDA, as part of the PET Sandbox, also recognized the utility of PETs for AI model training and thus released the PET: Proposed Guide on Synthetic Data Generation, which assists organizations in their understanding and application of the technology.
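As a rough illustration of what the simplest synthetic data generators do (the IMDA guide covers far more sophisticated methods), the Python sketch below fits independent per-column distributions to a small, invented dataset and samples statistically similar records without copying any real row.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical real dataset: (transaction_amount, customer_age).
real = np.array([[120.5, 34], [89.9, 41], [310.0, 29], [45.2, 57], [150.8, 38]])

# Fit independent Gaussians per column. This is a deliberately naive
# model; production generators also capture correlations, categorical
# fields and privacy guarantees such as differential privacy.
mu, sigma = real.mean(axis=0), real.std(axis=0)

# Sample synthetic records: similar aggregate statistics, no real rows.
synthetic = rng.normal(mu, sigma, size=(5, 2))
print(np.round(synthetic, 1))
```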
Potential roadblocks
The above cases and government recognition exemplify the innovative value of PETs. However, there are several questions and challenges organizations may face when determining whether or not to implement them. Chief among these is balancing available resources with the projected cost of development and integration, since deploying PETs can be an expensive endeavor. This is especially true when working with AI technologies; their rapid evolution, paired with many models' need for massive amounts of data, incurs relatively heavy computational costs, even before integrating PETs.
Consider building or using a machine learning model, for example. These models are typically trained on or process unencrypted data, which could present privacy concerns. Homomorphic encryption, however, allows users to perform operations and build models on encrypted data without needing to decrypt it first, protecting the privacy of the dataset while still providing utility. For instance, Apple has identified homomorphic encryption as "one of the key technologies" they use for machine learning, ensuring "server lookups are private, efficient, and scalable." While homomorphic encryption provides clear privacy benefits, it also presents challenges. Its use is even more computationally intensive and requires expertise to implement properly, both of which can be incredibly costly.
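A minimal demonstration of the concept, assuming the open-source python-paillier library (`phe`), which implements the additively homomorphic Paillier scheme: it supports far fewer operations than the fully homomorphic schemes Apple describes, but it is enough to show arithmetic performed directly on ciphertexts. The values are hypothetical.

```python
# pip install phe  (python-paillier, an additively homomorphic scheme)
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair()

# Encrypt two hypothetical figures on the data owner's side.
a = public_key.encrypt(52000)
b = public_key.encrypt(61000)

# An untrusted server can compute on the ciphertexts without ever
# decrypting them: addition and scalar multiplication are supported.
encrypted_total = a + b
encrypted_scaled = a * 2

# Only the key holder can recover the results.
print(private_key.decrypt(encrypted_total))   # 113000
print(private_key.decrypt(encrypted_scaled))  # 104000
```

Even this toy example hints at the cost problem discussed below: every operation on ciphertexts is orders of magnitude slower than its plaintext equivalent.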
These challenges are not unique to homomorphic encryption. All PETs are use case specific and carry significant technical and talent costs. Thus, organizations should examine their use thoughtfully. Privacy-by-design principles are instructive here; consider the use of PETs from the outset of a project rather than as a bolt-on feature later down the road.
To the future
A common refrain when discussing PETs is that they are not a silver bullet for privacy compliance or risk reduction. Simply implementing these technologies does not automatically insulate an organization from regulatory enforcement or ingratiate it with customers. But this line of thinking increasingly misses the bigger picture on the benefits and efficacy of PETs. Deploying them will, inherently, always support an organization's privacy efforts. Yet those debating further investment in or deployment of these technologies should recognize that the benefits go beyond risk mitigation and compliance.
In the end, many businesses still view privacy and its implementation as a defensive necessity, a fortress wall built to shield against regulators and avoid costly fines. But this is a limited perspective. Privacy-enhancing technologies are not just another tool in the privacy toolbelt; they are a strategic advantage. By allowing companies to analyze, share and monetize insights from large datasets without ever exposing the sensitive raw data, PETs enable organizations to innovate. They transform privacy from a compliance-bound liability into an asset, enabling a new class of innovation built on a foundation of trust and security. Businesses should soon realize that PETs do not just protect them from the future of regulation; they give them an advantage in building the future of a successful, data-driven practice that is more in tune with its customers.
Brandon LaLonde is the research and insights analyst for the IAPP.