
Automating business process risk assessments



The EU General Data Protection Regulation has brought plenty of compliance challenges for companies subject to its rules. One of those tasks is to create records of business processing activities, perform a manual investigation to determine the level of risk within those processes, and then decide whether to conduct a data protection impact assessment (DPIA).

Under Article 35 of the GDPR, DPIAs are mandated when data processing is “likely to result in a high risk to the rights and freedoms of natural persons.” The problem is, no one has been able to truly define “high risk.” The Article 29 Working Party has attempted to clarify when DPIAs are required.

The European Data Protection Board recently released its opinions on the draft lists produced by the supervisory authorities of 22 EU member states on what they identify as high-risk data processing activities that would trigger a DPIA. (For more on these lists, IAPP Senior Westin Research Fellow Muge Fazlioglu, CIPP/E, CIPP/US, provided a comparative analysis for The Privacy Advisor.) For example, the supervisory authorities listed the processing of biometric, location and genetic data, as well as information collected by third parties, as activities that trigger the DPIA requirement.

TrustArc CEO Chris Babel identified this challenge as one his company had the resources to manage, which eventually led to the creation of the TrustArc Intelligence Engine, a new feature that recently debuted at the IAPP Privacy. Security. Risk. conference in Austin, Texas.

“We basically looked and said we have the database of the information you need to trigger rules,” said Babel. “We have the intelligence you need from our team of privacy pros led by [TrustArc General Counsel and Chief Data Governance Officer] Hilary [Wandall, CIPP/E, CIPP/US, CIPM, FIP], and we have the engineers to codify risk-based rules" to identify when a DPIA should be performed. 

Within the TrustArc platform, users list the data elements of each business process, which can range from names and addresses to more sensitive information such as religious and political preferences. Users of the platform will also be required to enter how the data will be used.

TrustArc's engine continuously examines an organization's business processes and data inventory through a proprietary algorithm. Should any combination surpass the level of elevated risk defined by TrustArc, it will notify the user about the issue and give them the opportunity to conduct a DPIA on the spot within the TrustArc platform.

“As soon as it sees various combinations that get above the bar we’ve defined, it triggers and shows you what could be at high risk,” said Babel. “It gives you more information and tells you why it could be at high risk, and there are a ton of different combinations. It could be what data you are collecting and what you are doing with it.”

Babel added the algorithm also takes into consideration the amount of information in a particular business process. One business process with 1,000 records will be judged by a different standard than another with 100,000.
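TrustArc's actual algorithm is proprietary, but the behavior described above — combinations of data elements and purposes scored against a defined risk bar, with record volume as an additional factor — can be sketched with a simple rule-based model. Everything here (element names, weights, thresholds) is a hypothetical illustration, not TrustArc's implementation:

```python
# Illustrative sketch only; TrustArc's real scoring rules are proprietary.
# Hypothetical rule-based DPIA trigger: score a business process on its
# data elements, stated purposes, and record volume.

HIGH_RISK_ELEMENTS = {"biometric", "location", "genetic", "third_party"}


def assess_process(data_elements, purposes, record_count,
                   volume_threshold=10_000, score_threshold=3):
    """Return (score, reasons, dpia_recommended) for one business process."""
    score = 0
    reasons = []

    # Sensitive data categories (e.g., those on supervisory authority lists)
    # contribute heavily to the risk score.
    for element in sorted(data_elements):
        if element in HIGH_RISK_ELEMENTS:
            score += 2
            reasons.append(f"high-risk data element: {element}")

    # What you do with the data matters, not just what you collect.
    if "profiling" in purposes:
        score += 2
        reasons.append("purpose includes profiling")

    # A process holding 100,000 records is judged differently than one
    # holding 1,000 — volume raises the score past the bar sooner.
    if record_count > volume_threshold:
        score += 1
        reasons.append(f"large volume: {record_count} records")

    return score, reasons, score >= score_threshold
```

In this sketch, a process combining biometric data with profiling over 100,000 records crosses the threshold and would surface a DPIA recommendation along with the reasons, while a small mailing list of names would not.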

The TrustArc Intelligence Engine after it evaluates a business process.

Entities can choose to forgo a DPIA if they wish to address the risk in a different manner. If a user decides to go forward with an assessment, the intelligence engine will help automatically populate the form with the information at risk.

“You can click in and see why it spit out that result of high risk. You can then elect to do a DPIA, which puts you into doing the assessment with a lot of pre-populated information,” said Babel. “You can say, ‘I don’t think I need to,’ and then you leave the notes and capture whatever you need to as rationale for why you didn’t think you needed to do a DPIA." As an example, Babel said a user could go in and manually fix a business process record should a business unit enter incorrect information.

Babel said PSR was a great event to debut its intelligence engine, as many of the attendees of the conference currently face these problems on a consistent basis. Adding an automated feature to assist in GDPR compliance can help entities stay on top of their obligations.

“We are now entering a new part of all of this compliance where you need to be confident that what you put down on May 26 is still accurate. That is an increasingly big worry,” said Babel. “The second big worry is what the CCPA means for all of this.”

In the wake of the GDPR, companies now face the California Consumer Privacy Act of 2018, and with it on the horizon, Babel said companies will look at their work with the GDPR and see how much of it transfers over to the CCPA.

Companies do not want to re-do all of their GDPR work for the CCPA. If it comes to that, Babel notes, companies will quite frankly “lose their minds.”


