
Privacy Perspectives | Would anyone in their right mind reopen the GDPR? The IAF’s answer is yes.


The Information Accountability Foundation believes the EU General Data Protection Regulation should be amended to explicitly include knowledge creation and scientific research as legal bases to process personal data, providing a foundation for the responsible use of artificial intelligence. The IAF’s blog summarizing its comments on the European Commission’s proposed regulation on artificial intelligence suggested that the GDPR “should make possible technology applications such as AI” and that “there are unresolved tensions between the AI Regulation and the GDPR.” In particular, it is the IAF’s view that the GDPR needs to be revised to accommodate knowledge creation. Data-driven knowledge creation and scientific research are increasingly the basis for progress. To encourage and effectively control these activities, data protection law must regulate them and set sensible boundaries, but it must not inadvertently preclude them. The AI Regulation is just the latest example of why the GDPR needs to be changed to add both knowledge creation and scientific research as legal bases to process data pertaining to people. The current approach does not provide a clear, legitimate way to process data for these purposes, and it applies to knowledge creation the same requirements it applies to the risks associated with applying that knowledge.

Critics will argue that knowledge creation does not always advance progress and that AI can be used to facilitate discrimination and other abusive outcomes. Not all progress is good. For example, advancements in carbon-based energy have resulted in global climate change. Yet standing in the way of progress carries human costs of its own: carbon-based energy has also provided the mobility that changes lives. Mobility is good, but global warming is not. Society embraces the good and bans or mitigates the bad. Examples of knowledge creation range from the discovery of fire, through the transmission of energy over wires, to the development of sustainable, non-polluting energy sources.

Recent knowledge creation has progressed from analog, to analog supported by digital, to purely digital. Knowledge creation today begins with data, and much of that data pertains to people. Data today is both more robust and more granular. Observation has always been part of the scientific or discovery method, but sensors in everything, located everywhere, fuel a thinking process that is accelerated by ever-expanding computing power. There is no question that observation in some ecosystems has been overdone and should be subject to checks. When these developments are applied to analyses involving data pertaining to people, the affected interests often turn on how the balance between good and bad is determined. The real issue, however, relates to data use. The use of data pertaining to people should be regulated, but such regulation should not preempt knowledge creation, where the risks to individuals are arguably lower.

The analysis of data about people in the European Union begins with the GDPR. There is global interest in how the GDPR is functioning because the regulation’s adequacy provision drives the enactment of GDPR-type privacy laws around the world. That global impact makes global comment on the operation of the GDPR relevant.

In his thoughtful paper, “Fixing the GDPR: Towards Version 2.0,” EU Parliament member Axel Voss points out that the GDPR begins with a general prohibition on personal data use unless there is a legal basis to process that data for a specific set of purposes. Neither knowledge creation nor scientific research involving personal data is a legal basis to process data pertaining to people under the GDPR. So, the general prohibition on processing too often precludes thinking with data.

Scientific research is a compatible use of data under Article 5 of the GDPR, but scientists still find research difficult to conduct, particularly when it involves special categories of personal data, as covered by Article 9 of the GDPR. Minor changes to the GDPR might accommodate fixes for scientific research. However, knowledge creation, most often conducted by the private sector rather than the scientific community, has no similar accommodation.

If the secondary use of data for knowledge creation is not compatible, then a new legal basis for that secondary use must exist. The IAF has spent the last four years looking for ways to make the legitimate interest legal basis work for knowledge creation. A model legitimate interest balancing tool was developed in 2017 with multiple stakeholders to support a multilateral rather than a bilateral balancing process. Some commentators thought such an approach had merit but was not aligned with the letter, if not the spirit, of the GDPR.

The IAF drew on the European experience when developing its model privacy and data protection legislation, The Fair and Open Use Act. This model legislation also requires that data about people be processed for legitimate uses and enumerates eleven such uses, among them knowledge creation and scientific research.

The European Commission is exploring ways to enhance the European digital economy. Among those efforts is the AI Regulation. AI begins with a data-driven knowledge creation process: data is used to develop and train algorithms, and much of that data began as data pertaining to people. In response to the AI Regulation consultation, the European Data Protection Board and the European Data Protection Supervisor advised that the GDPR has jurisdiction over that personal data. Therefore, either exceptions to the GDPR need to be built into the AI Regulation to permit knowledge creation and scientific research, or the GDPR needs to be amended to add knowledge creation and scientific research, with appropriate controls, as legal bases to process personal data. The IAF believes that knowledge creation and research go beyond AI, so the appropriate approach is amending the GDPR.

The IAF team understands that there are digital swashbucklers who will misuse any flexibility built into the GDPR. Those swashbucklers will use data with or without new legal tools, so tools that do not preempt knowledge creation will not change their behavior. The IAF is looking for tools for companies wishing to be highly creative, compliant and responsible at the same time. The IAF believes the best way to confront these swashbucklers is with guard posts built into any expanded legal bases: knowledge creation needs to have the appropriate safeguards built in. The Ontario Government released a discussion proposal, Modernizing Privacy in Ontario, that would allow scientific research and knowledge-creating analysis using deidentified data when consent to use personal data is not possible. This proposal contains process examples that are useful in thinking about how to gate the data necessary to drive toward new knowledge.

The GDPR was built to last 20 years. However, experience has exposed a significant issue with permissibility. The IAF welcomes 17 more years of the GDPR, but that longevity does not preclude improvements. The IAF made the following suggestions in its comments on the AI Regulation:

  • Knowledge discovery should join scientific research in being recognized as a compatible purpose under Article 5. Doing so would recognize that research uses of data, whether they meet the scientific test or not, raise fewer risks than the actual application of data to take an action that impacts individuals.
  • Scientific research and knowledge discovery should be added to Article 6 as legal bases (g) and (h). Saying a use of data is compatible with the purpose for which it was collected is not adequate; all uses require a legal basis. The inclusion of data in a research project does not infringe on the fundamental right to dignity. However, the impacts on people of research not done can affect the human condition in a manner that does implicate dignity. To protect these new legal bases from misuse, appropriate conditions, particularly transparency related to the objectives and safeguards for processing, should be added as well.
  • A pathway for using special categories of data in knowledge discovery and scientific research should be created in Article 9.
  • Article 35 should be amended to apply more clearly to the application-of-insights stage (the use of data), where the risk to individuals is much greater, and not to the knowledge discovery stage.
  • Article 22 should be revised to describe profiling and automated decision-making more clearly. Profiling is central to the knowledge discovery process. As such, it should be subject to a data protection impact assessment to ascertain that processing is conducted in a legal and fair manner. Automated decision-making is a separate process. It, too, should be subject to assessments, and the risks from a particular automated decision-making process should be understood and documented. Currently, the GDPR confuses the connections between the two.
  • Lastly, Recital 4, which is (should be) the heart and soul of the GDPR, states: "The processing of personal data should be designed to serve mankind. The right to data protection of personal data is not an absolute right; it must be considered in relation to its function in society and be balanced against other fundamental rights . . . ."

The intent of this Recital has not been adopted in the application of the GDPR in areas such as the balancing required under a legitimate interest assessment. In the IAF's view, this is a byproduct of failing to explicitly carry forward the Recital's intent into a specific Article. Remedying that failure could be accomplished through revisions to Article 5 of the GDPR.

The IAF suggests that these improvements would not only benefit and support the AI Regulation but should also be a topic of discussion for all data regulated by the GDPR. Please join the IAF as it explores how to advance the addition of knowledge creation and scientific research as legal bases to process personal data, and as it explores the right means to gate the lawful processing of data for knowledge creation and scientific research.

Note: The views expressed in this blog are those of the IAF staff that prepared the IAF comments. They do not necessarily reflect the views of the IAF Board, funders, or broader community.


