
The Privacy Advisor | Rounding up the fallout from Facebook-Cambridge Analytica


Reaction continues Monday to news over the weekend that data analytics firm Cambridge Analytica used, and allegedly did not delete as agreed, what may have been the personal information of 50 million Facebook users. The incident may have far-reaching implications for Facebook and for how people view companies' data use policies more broadly. 

Last Friday evening, Facebook VP & Deputy General Counsel Paul Grewal announced the company was suspending the accounts of Strategic Communication Laboratories (SCL), including its affiliate Cambridge Analytica, as well as those of at least two researchers involved in the incident. 

The move and announcement came just before The New York Times and The Observer published extensive stories about how Cambridge Analytica allegedly used the personal information of 50 million Facebook users, without their permission, for political purposes. According to former Cambridge Analytica employees who spoke with the Times, "one of the largest data leaks in the social network's history ... allowed the company to exploit the social media activity of a huge swath of the American electorate, developing techniques that underpinned its work on President Trump's campaign in 2016." 

Separately, an in-depth article in The Guardian interviews Cambridge Analytica co-founder Christopher Wylie, who has decided to go public about "his role hijacking the profiles of millions of Facebook users in order to target the U.S. electorate." 

According to Facebook, "In 2015, we learned that a psychology professor at the University of Cambridge named Dr. Aleksandr Kogan lied to us and violated our Platform Policies by passing data from an app that was using Facebook Login to SCL/Cambridge Analytica ... He also passed that data to ... Wylie of Eunoia Technologies." The app in question, called "thisisyourdigitallife," offered a personality prediction, and nearly 270,000 users agreed to "plug in" the app. By doing so, users consented to providing access to their data, including "limited information about friends who had their privacy settings set to allow it."

By sharing the data with a third party, Kogan violated Facebook's platform policies. Kogan, SCL/Cambridge Analytica and Wylie's Eunoia all subsequently certified to Facebook that they had deleted the data. "Several days ago," Grewal wrote over the weekend, "we received reports that, contrary to the certifications we were given, not all data was deleted." He said the company is now "moving aggressively to determine the accuracy of these claims." 

The fallout

The reports have prompted strong statements from politicians and regulators in both the U.S. and Europe. Sen. Amy Klobuchar, D-Minn., has called for Facebook Chief Executive Mark Zuckerberg to appear before the Senate Judiciary Committee to explain what his company knew about how its users' data was used "to target political advertising and manipulate voters." Massachusetts Attorney General Maura Healey said her office will investigate as well. 

Likewise, in a series of tweets, EU Justice Commissioner Vera Jourova described the incident as "horrifying, if confirmed." She added that she will be in the U.S. this week and will seek answers from Facebook, and she welcomed the U.K. Information Commissioner's Office announcement that it has already begun an investigation. 

In the U.K., Information Commissioner Elizabeth Denham, whose office has been investigating Cambridge Analytica for the past year, said, "This is a complex and far reaching investigation for my office and any criminal or civil enforcement actions arising from it will be pursued vigorously."

The incident has also sparked debate over whether it constitutes a data breach. Facebook has vehemently denied that it does, while The Guardian has characterized it as one. In an update to its blog post announcing the suspension of SCL, Grewal said such a claim "was completely false. ... People knowingly provided their information, no systems were infiltrated, and no passwords or sensitive pieces of information were stolen or hacked." Facebook Chief Security Officer Alex Stamos also posted a series of tweets defending the company against such allegations; he later deleted those tweets. 

Others, like Zeynep Tufekci and Nicole Wong, said the incident has far broader and more significant implications than a data breach. 

Questions have also been raised about whether the company has audited third parties that siphon user data. Sandy Parakilas, a former privacy manager at Facebook, told The Washington Post that during his time with the company, it did not conduct any audits of developers. He said Facebook "relied on the word of Kogan and Cambridge Analytica to delete the data, rather than conducting an audit, which they had a right to do in the case of Kogan. They did not investigate further, even after it became clear that CA had bragged about having 5,000 data points on every American, data which likely came from Facebook." 

U.K. Conservative lawmaker Damian Collins, who heads a parliamentary inquiry into fake news and alleged Russian meddling in the Brexit vote, also wants Zuckerberg to testify. He said the incident "creates a false reassurance that Facebook's stated policies are always robust and effectively policed." 

In an interview with The Washington Post, David Vladeck, former director of the U.S. Federal Trade Commission's Bureau of Consumer Protection, said the incident may have violated Facebook's 2011 consent decree. "I will not be surprised if at some point the FTC looks at this. I would expect them to," he said. 

Jessica Rich, who also served as director of the bureau and was deputy director under Vladeck, said, "Depending on how all the facts shake out, Facebook's actions could violate any or all of these provisions, to the tune of many millions of dollars in penalties. They could also constitute violations of both U.S. and EU laws," adding, "Facebook can look forward to multiple investigations and potentially a whole lot of liability here." 

Facebook pushed back on claims it may have violated the FTC's consent decree. In a statement, the company said, "We reject any suggestion of violation of the consent decree. We respected the privacy settings that people had in place. Privacy and data protections are fundamental to every decision we make." 

Max Schrems, an Austrian lawyer and privacy activist who took down the EU-U.S. Safe Harbor agreement, said he warned Ireland's data protection authority some time ago about loopholes in Facebook's policy that allowed apps to "harvest" data about users' friends without consent. "We flagged it in 2011," Schrems said. "Now it emerges that in 2014 Cambridge Analytica started doing precisely what we warned about three years earlier." 

In comments to The Privacy Advisor, Hogan Lovells Partner Eduardo Ustaran, CIPP/E, said, "These revelations expose data practices that have been an area of concern for regulators for a while. Essentially, they feed the regulators' worst fears about the digital economy. I think that in the same way Snowden had a very visible effect on the law around international data transfers, this will shake the current debate about consent and other lawful grounds for processing. It is very likely that at least in the first instance, the calls for the exclusive reliance on consent for any kind of big data analytics and profiling will be stronger than ever. We can expect a tightening in the level of tolerance by policy makers and regulators for data collection practices through our daily interaction with technology." 

In line with Ustaran's comments, The Wall Street Journal reported the incident has ignited debate about third-party access to user data. David Carroll, a U.S.-based professor who is suing Cambridge Analytica, said, "This could be a data privacy reckoning for Americans. It's a wake-up call." 

Facebook is also reviewing whether one of its research employees knew about the leak. Joseph Chancellor, who works at the social media company as a social psychology researcher, was previously co-director of Global Science Research, a company involved in the data-sharing incident. 

Many in the academic community are concerned about the effects this will have on researcher access to Facebook user data. Solomon Messing, a former data scientist at Facebook, expressed concerns about research ethics, noting that the data Facebook makes available is "tremendously valuable to social science." 

By early Monday morning, Facebook shares were down more than six percent, and it appears the fallout will continue for some time to come. 

Top image is a screenshot of The Guardian's interview with Wylie via YouTube.

4 Comments


  • William Marshall • Mar 19, 2018
    The really alarming thing that doesn't seem to be highlighted in the initial coverage is how CA used the data.  Wylie explains that they used the data to identify people vulnerable to influence and then used an algorithm to present "fake news" to the individual from seemingly disparate sources thereby bolstering its authenticity.  The subjects were in a very real sense living in an alternate reality.
  • Peter Meyler • Mar 19, 2018
    Another important issue is whether there is informed consent for the use of personal information. The article notes that Solomon Messing thinks that Facebook's data is "tremendously valuable to social science." However, an individual should need to consent to being the "monkey" for the social scientist.
  • Sheila Dean • Mar 19, 2018
    From Guardian's 'psychological warfare tool' article
    
    "Robert Mercer was a pioneer in AI and machine translation. He helped invent algorithmic trading – which replaced hedge fund managers with computer programs – and he listened to Wylie’s pitch. It was for a new kind of political message-targeting based on an influential and groundbreaking 2014 paper researched at Cambridge’s Psychometrics Centre, called: “Computer-based personality judgments are more accurate than those made by humans”.
    
    Mercer's Wikipedia page states he opposed the Civil Rights Act of 1964, saying, "It was the biggest mistake ever made" (giving all people of color, not just African Americans, the right of suffrage). 
    
    Look at what AI can do when the grey market of politics decides you're their target and their enemy.
  • Derek Lackey, CIPM • Apr 16, 2018
    This whole mess is less a privacy issue and more an ETHICS issue. What Cambridge Analytica used the data for is a breach of ethics. They crossed the line. And to whatever degree she can, Elizabeth Denham will hold them accountable using the data/privacy framework. Trump has NO incentive to investigate as it would only incriminate his questionable "win at all cost" mentality. Ask the disgraced Enron execs how that turns out.