
Europe Data Protection Digest | Notes from the IAPP Europe Managing Director, 13 August 2021


Greetings from Brussels!

A debate is emerging around Apple’s recent announcement that it will implement new iOS software features to combat child abuse. The initiative is certainly a noble one, and its premise is not in question. There are, however, questions over what Apple’s latest strategic move means for users’ control over their own devices, leaving them reliant on Apple’s promise that it will not overreach and abuse its power.

The new neural network-based program — which will initially be rolled out in the U.S. — is designed to scan individual users' iCloud photos for child sexual abuse material. As expected, reaction is a mixed bag, with privacy and security experts voicing criticism of varying degrees over the software deployment. Rival technology firms are also voicing apprehension at the proposal, generally remarking that an Apple-built and operated surveillance system could easily be used to scan private content for anything Apple or a government decides it wants to control. Moreover, for a company that has prided itself on privacy, and even built a competitive reputation on it, with features such as encryption and mobile application data collection controls, this will be a controversial step. Will it ultimately weaken user privacy, and the brand along with it? IAPP Associate Editor Ryan Chiavetta gives a good account of the story so far, highlighting some of the reaction to date.

Jeroen Terstegge, IAPP's Country Leader for the Netherlands, sums up the user impact rather poignantly in a recent commentary on LinkedIn — incidentally, Jeroen’s observations earned an Editors' Pick on LinkedIn News, so kudos to him. A question many users should be asking themselves is whether they are truly in control of any of their device’s functionality. Jeroen has a point here: Companies can remotely adjust the operating software of your device, exercising a high degree of control (or surveillance), even to the point where they could disable it. He highlights the need for a serious discussion around the boundaries of such intervention as it intersects with property rights and the right to privacy. What does this mean for the user, who is expected to accept such interference? In the case of smart devices, the boundary between owning a device and receiving services bound by contract is increasingly blurred.

The initiative in itself is not extraordinarily new; other tech companies already actively scan photos and videos. Facebook, Microsoft, Twitter and Google all use various technologies to scan their systems for illegal content. That tech companies are proactively implementing such programs is likely, at least in part, a response to the growing public outcry over the pervasive problem of predatory online child abuse. It stands to reason they feel a moral obligation and responsibility to act. The flip side of the coin is to ask: Are we entering a new age of privatized intelligence and investigation? And, importantly, what will this mean for our regulatory and legal systems?

On a final note, I’d like to remind the German-speaking privacy pro community of our upcoming Data Protection Intensive: Deutschland 2021 event in Munich, 14 and 15 Sept. Today is the last day to secure the early bird rate. We have some great topics and a stellar lineup of speakers, including government and regulatory officials and an array of experienced cross-sector privacy professionals, to help you map and mitigate the contemporary issues and risks facing the field of data protection and privacy. I’m looking forward to seeing those of you who can make it.
