Voter Analytics and data protection: Early findings from the ICO

In September 2013, I was invited to give an address to the private session of the International Data Protection Commissioners Conference in Warsaw on my recent research on data-driven elections and the implications for personal privacy. The previous year, I had co-authored a report for the Office of the Privacy Commissioner of Canada on the use of personal information by Canadian political parties. At the time, very few individuals within the international privacy community had researched or written about this subject, and there had been very few enforcement activities by the DPAs themselves. There had been some guidance on political communication from the U.K. Information Commissioner’s Office and the Commission Nationale de l’Informatique et des Libertés in France.

But with those exceptions, most forays by DPAs into the world of elections and political parties had been quite narrow and prosaic.

All that has obviously changed with the series of issues concerning Cambridge Analytica, Aggregate IQ and Facebook. Cambridge Analytica, as a corporate entity, no longer exists, but its name endures as a symbol of what can go disastrously wrong when companies cross ethical and legal lines. All of a sudden, the issue is not just about privacy and data protection. The misuse of personal data in elections, in violation of common legal standards, can influence the choice of governments and have profound consequences for democratic politics.

This multifaceted controversy has brought to public and media attention a number of interrelated policy issues that need to be carefully distinguished: the monopoly power of companies like Facebook in the “platform economy” and their dependence on personal data for their business models; the non-consensual harvesting of information on one’s wider social network through third-party applications; possible violations of campaign spending limits; the accountability and transparency of targeted political ads, especially those placed from fake foreign accounts; cyberthreats to election integrity; and the larger question of the role of big data analytics in modern elections.

It is that last question that the ICO has tackled in its report on voter analytics released this month. For the first time, a DPA has tried to draw the curtain back on the very complicated world of voter analytics, to paint a picture of the range of organizations involved in contemporary elections, and of the practices they engage in.

There has been a lot of hype about the importance of the “data-driven” election, as well as recent scholarly work that casts a skeptical light on the extent to which data analytics actually influence election outcomes. The report does not go there, although an accompanying research report from Demos reviews current and future trends in campaigning technologies.

It is clear that the competitiveness of current elections, in the U.K. and elsewhere, continues to place enormous pressure on major political parties to use data analytics to gain any edge over their rivals. Thus, more data on voters are being captured, and those data are increasingly shared through a complicated and dynamic network of organizations, involving some quite obscure companies that play important roles as intermediaries between the voters and their elected representatives. For the first time, we get a glimpse into that industry from a regulator who was able to use her powers to interview participants, to subpoena records where necessary, and to bring enforcement actions where appropriate.

The accompanying “Investigation Update” details the enforcement actions to date. These include an intended fine of half a million pounds for Facebook; enforcement actions against SCL Elections Ltd., the parent company of Cambridge Analytica, and against Aggregate IQ, the Victoria-based company that worked for the Vote Leave campaign in the EU referendum; and audits of the main credit reference companies. The ICO also served a notice of enforcement against Emma’s Diary, a company that provides advice to women and new parents and that allegedly sold information to the data broker Experian, information that was then used by the Labour Party. “Data crimes are real crimes,” stated Elizabeth Denham in a soundbite that is bound to endure.

Equally revealing, however, is the picture of the relationships between the main political parties and the intricate web of companies that use cutting-edge methods of audience segmentation to micro-target voters. Democracy Disrupted? provides a detailed and empirically based description of the various sources of personal data that are used to profile the electorate and of how micro-targeting works across a variety of media. Around 40 organizations were the focus of this ongoing inquiry; many other individuals assisted.

For privacy professionals, the report raises some intriguing questions about the application of the General Data Protection Regulation to political parties and election campaigns going forward. I have raised a number of these questions in my recent writing. It is gratifying to see the ICO remind political parties that although they have a “special status in the democratic process … allowing them to process political opinion data when carrying out legitimate political activities … they have responsibilities as data controllers to comply with all the requirements of the law, including the data protection principles.” Most of the findings in the report concern the lack of transparency about “fair processing.” The report criticizes the parties’ privacy policies for shortcomings in accessibility and clarity, in light of the enhanced privacy notice requirements under the GDPR.

Political opinions are sensitive forms of data.

The U.K. parties each manage voter relationship management systems that allow them to profile the entire electorate and prioritize voter contact strategies. Under the new Data Protection Act 2018, a registered political party is permitted to process political opinion data “in the course of its legitimate political activities,” provided that the processing does not cause “substantial damage or substantial distress to any individual.” Any business that supplies data to political parties, and several are mentioned in the report, “cannot repurpose that personal data for political campaigning without first explaining this to the individual and obtaining their consent.” Vague and expansive statements of purpose are not likely to be good enough. Equally, when sourcing personal information from third-party organizations (including data brokers), political parties need to ensure that appropriate consent has been obtained. This performance of “due diligence” must be recorded and auditable.

Some political parties, it is reported, use software that assigns a predicted ethnicity and age to individuals, under the contention that such “assumed” or “inferred” data is not necessarily personal information about the data subject. The ICO disagrees: once this information is linked to an individual, it does amount to personal data and is subject to the requirements on the processing of special categories of data under the GDPR. Predictions about ethnicity (based, for example, on the heritage of a name) can easily be inaccurate and carry significant risks for the individual.

The investigation also identified a lack of understanding among political parties about the legal basis for uploading contact information to social media platforms, such as through Facebook’s Core, Custom and Lookalike Audiences functions. The popular company NationBuilder also comes under scrutiny. Its platform enables political parties to match contact information with data on social media platforms to customize their voter outreach, and the ICO is concerned that parties are using it without adequate information being provided to the people affected. Even where a party obtained the personal information from publicly available sources such as the Electoral Register, it must still provide a clear privacy notice to individuals. The report also discusses the legality of micro-targeting under the GDPR’s provisions on automated decision-making and profiling: political micro-targeting may be a type of automated decision-making that has sufficiently significant effects on individuals to trigger the requirements under Article 22.

The ICO has made a series of ten recommendations, and has issued eleven political parties with warning letters detailing areas of concern and non-compliance. These letters are in advance of Assessment Notices providing for compulsory audits of a selection of the parties. The ICO has also asked the government to legislate a statutory code of practice on the use of personal data in political campaigns.  

Until then, the commissioner has called for an “ethical pause” to allow the key players to reflect on their responsibilities. The language is significant and signals a broader set of concerns beyond privacy and data protection. The provocative title Democracy Disrupted? indicates a recognition that the inappropriate processing of data in electoral contexts not only carries risks to individuals, but also threatens trust in the democratic process. It reflects a general concern that a practice may be technically legal under the GDPR and yet not be in the public interest.

Political marketing is qualitatively different from commercial marketing. “Voter surveillance” is different from “consumer surveillance.” It should not be occurring by default.

photo credit: Janitors 2016 U.S. presidential election party, Riga, Latvia via photopin (license)
