
Europe Data Protection Digest | Notes from the IAPP Europe Managing Director, 21 Aug. 2020 Related reading: Notes from the IAPP Europe, 14 Aug. 2020



Greetings from Brussels!

I am recently back from the Luberon in Provence, where I spent a glorious week "en famille." Provence is an ancient and mystical region, soaked in a stunning variety of landscapes: fields of lavender — a favorite of mine — ancient olive groves, patchwork vineyards, and maquis-shrouded undulating hills. In short, spectacular.

While I was away, I did keep an eye on newsworthy stories as they unfolded, and the pick of the week for me was the debacle surrounding the final school year A-Level grades as they were announced in the United Kingdom, yet another example of the endemic COVID-19 impact. Like elsewhere, end-of-year students were unable to sit their final exams, which largely determine university admission as the class of 2020 seeks places in higher education. Consequently, A-Level students were graded by the national exam regulator, Ofqual. Results were based on teacher-assessed estimated grades, together with each student's ranking relative to every other student at the school within the same estimated grade category. This data was subsequently put through an algorithm; the most heavily weighted element within the standardized formula was the school's performance in each subject over the previous three years. The rationale was that the grades, even without exams, would be consistent with how the given schools had performed in the past.
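To make the mechanism concrete, here is an illustrative sketch of the general idea behind such a standardization model. This is a hypothetical, drastically simplified toy, not Ofqual's actual formula: it assumes only a teacher-provided ranking of students and the school's historical grade distribution for the subject, and reallocates grades so the class mirrors that history regardless of individual teacher estimates.

```python
# Illustrative toy model only: NOT Ofqual's actual algorithm.
# Assumed inputs: students ranked best-first by their teachers, and the
# school's historical grade distribution for the subject (as proportions).

def allocate_grades(ranked_students, historical_distribution):
    """Assign grades so the cohort's distribution mirrors the school's
    historical results, overriding individual teacher estimates."""
    n = len(ranked_students)
    grades = {}
    start = 0
    for grade, proportion in historical_distribution:
        count = round(proportion * n)
        for student in ranked_students[start:start + count]:
            grades[student] = grade
        start += count
    # Students left over from rounding receive the lowest grade.
    lowest_grade = historical_distribution[-1][0]
    for student in ranked_students[start:]:
        grades[student] = lowest_grade
    return grades

ranked = ["Asha", "Ben", "Chloe", "Dan", "Ella"]  # hypothetical cohort
history = [("A", 0.2), ("B", 0.4), ("C", 0.4)]    # hypothetical school history
print(allocate_grades(ranked, history))
# → {'Asha': 'A', 'Ben': 'B', 'Chloe': 'B', 'Dan': 'C', 'Ella': 'C'}
```

Even this toy version shows why downgrades were inevitable: a strong student at a school with historically weak results in a subject can only receive the grades the school's past performance allows, whatever their teacher estimated.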

The outcomes were controversial, to say the least, with 40% of school-assessed grades being adjusted downward across England. Similar trends were manifest in Wales and Northern Ireland, as well as Scotland. A public outcry ensued. Moreover, there was a public intervention by the Equality and Human Rights Commission when it emerged that students from more disadvantaged backgrounds scored worse. Furthermore, private schools fared far better, with twice as many top grades awarded to their students as to peers in public sector schools. It has been an anxious and angry week for students across the U.K. as they bid for their futures.

Having your future determined by an algorithm seems wrong, regardless of circumstance.

The ICO also weighed in, commenting that given the overall sensitivity around the importance of final grades, it was especially important that personal data be used fairly and transparently. The ICO is engaged with Ofqual to better understand the extraordinary and exceptional scoring measures necessitated by the pandemic. The GDPR is also at play here, as it places limitations on organizations making solely automated decisions that have a legal or similarly significant effect on individuals: Processing needs to be fair in its application. Eduardo Ustaran of Hogan Lovells penned an excellent Privacy Perspectives reflection on the matter with a narrative stemming from Article 22 of the GDPR. As Ustaran states, the rights protected under this article are there to mitigate "the potentially negative and unfair consequences of relying on technology alone to make life-determining decisions." The ultimate question, as posed by Ustaran, is whether the resulting grades were "based solely on automated processing." There can be no question that the process qualifies as "automated processing," but were the ultimate decisions "solely" based on it?

Initially defended by the government and, so to speak, on the ropes, Ofqual has had to reverse its attempt at mathematically standardizing student grades across the country. In a recent statement, the regulator conceded that students will broadly be awarded the grades estimated by their schools.

Ultimately, the algorithm performed as it was designed to, however imperfect that design proved to be. What is certain is that more algorithmic decision-making and decision-augmenting systems will be used in the future. The lessons from this crisis are twofold: First, our public sector actors need to take a far more carefully considered approach before deploying such automated systems. More importantly, and what seems abundantly evident, is that individuals still value personal control over life decisions, as opposed to automated direction based on empirical data. We are still, I am glad to say, emotionally and intellectually driven, despite the benefits of technology.
