
A forgotten right gets into action in the U.K. A-Level controversy


""

""

For the past 69 years, U.K. families with school leavers have experienced the same mid-August ritual. Their steadily building anxiety would reach its climax on the day the Advanced Level results were announced, leading to great joy or utter disappointment as students opened an envelope revealing the outcome that would ultimately determine the crucial next stage in their education. 

In many respects, this year has been no different — the anxiety, the joy, the disappointment — but due to the COVID-19 crisis rampaging through the world and the lack of exams to decide A-Level grades, the process leading to that life-defining outcome has been very different.

According to the U.K. government’s own words, to bring consistency to teacher judgments across all schools and colleges and to make sure results were comparable with previous years, a standardization model was developed by Ofqual, the non-ministerial government department that regulates qualifications, exams and tests in England. The model generated a prediction of the grade in each subject and allocated grades based on data including each school’s historical grade distribution and the prior attainment of its students in previous years.
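To make that mechanic concrete, here is a minimal, hypothetical Python sketch of what distribution-based grade allocation can look like: students ranked by their teachers are mapped onto the school’s historical grade distribution. Every name, function and number below is invented for illustration, and Ofqual’s actual model was considerably more elaborate (adjusting, for example, for the prior attainment of each cohort), so treat this only as a sketch of the concept.

```python
# Hypothetical sketch of distribution-based grade allocation. This is NOT
# Ofqual's actual standardization model; it only illustrates the idea of
# fitting a ranked cohort to a school's historical grade distribution.

from typing import Dict, List

GRADES = ["A*", "A", "B", "C", "D", "E", "U"]  # best to worst

def allocate_grades(
    ranked_students: List[str],                  # teacher-supplied rank order, best first
    historical_distribution: Dict[str, float],   # share of each grade in past years
) -> Dict[str, str]:
    """Assign grades so the cohort's distribution mirrors the school's history."""
    n = len(ranked_students)
    results: Dict[str, str] = {}
    cursor = 0
    for grade in GRADES:
        # Number of students receiving this grade, proportional to the
        # school's historical share of that grade.
        count = round(historical_distribution.get(grade, 0.0) * n)
        for student in ranked_students[cursor:cursor + count]:
            results[student] = grade
        cursor += count
    # Anyone left unassigned by rounding falls into the lowest grade.
    for student in ranked_students[cursor:]:
        results[student] = GRADES[-1]
    return results

# Example: a school whose past cohorts earned 20% A, 50% B and 30% C.
print(allocate_grades(
    ["Asha", "Ben", "Cara", "Dev", "Ella"],
    {"A": 0.2, "B": 0.5, "C": 0.3},
))
```

Even in this toy version, a student’s grade depends as much on the school’s past results as on their own work, which is the heart of the fairness objection this episode raised.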

Crucially, the A-Level grades were not the result of an exam for which students prepare for two years but the outcome of a computer algorithm.

Here is where the EU General Data Protection Regulation comes in — more precisely, the Article 22 right not to be subject to fully automated decision-making that significantly affects individuals. Dwarfed in popularity by the right of access and the right to be forgotten, the Article 22 right has itself been largely forgotten for well over two decades. In a previous life, it existed as Article 15 of the old 1995 Data Protection Directive, and the GDPR simply incorporated a slightly more elaborate version of the same concept into its framework.

Partly because of its complexity compared with other data protection rights, and partly because its most obvious uses, such as challenging employment or financial decisions, are explicitly excluded by the exemption applicable to contracts, this right has traditionally hardly been exercised. But the world has changed a lot since the '90s, and the growing role of technologies like artificial intelligence in our lives is paving the way for this right to become a hugely important element of the regulatory framework governing the use of personal data.

The way in which the A-Level results have been awarded this year creates a unique situation, which, on the face of it, is precisely what this legal right was devised for: to mitigate the potentially negative and unfair consequences of relying on technology alone to make life-determining decisions. In the particular case of the A-Level results, the applicability of this right rests on the question of whether the decision was “based solely on automated processing.” There is no question that this qualifies as “automated processing,” but was the decision “solely” based on it?

Like most things in law, this is debatable, given that teachers and schools provided valuable input. What is significant is that, unlike with the exam-based A-Levels, the decision on the grades awarded appears to have been entirely generated by an algorithm. This is understandable given the sheer volume of data and the impossibility, and unfairness, of having a human being decide the appropriate grades for hundreds of thousands of students. But the bottom line is that this right exists and provides for human intervention.

In practice, the right to seek human intervention, in this case, may need to be exercised against the specific exam board that formally awards the A-Level grades in each individual subject at each school, despite the fact that Ofqual was responsible for developing the standardization process. In that scenario, the relevant exam board will be compelled to take a new decision that is not based solely on automated processing.

Aside from these legal technicalities, there is a very powerful lesson emerging from this situation: We are not yet ready to rely on fully automated decision-making for important public policy matters that affect people’s lives. So those developing and implementing these technologies will need to be mindful of the need for proper human intervention and judgment for the foreseeable future.




