
The Privacy Advisor | At US Senate hearing, Cambridge Analytica whistleblower takes center stage


At the Senate Judiciary Committee's hearing on "Cambridge Analytica and the Future of Data Privacy," the whistleblower serving on the three-man witness panel was the committee's clear focus. His observations on how the scandal, and others like it, could affect democracy were well received and often complimented, at least by the Democrats on hand.

You may remember his name: Chris Wylie. He's the pink-haired, nose-pierced young man you saw in all those news clips on Cambridge Analytica. He flew in from the U.K., where he's a resident, to appear before the committee, and while he was joined by Tufts University Professor Eitan Hersh and the Public Utility Research Center's Mark Jamison, the committee had little use for those two, instead zeroing in on Wylie for details about what exactly he witnessed in his time as a contractor with Cambridge Analytica and whether those practices could have affected the 2016 U.S. election.

Specifically, the committee wanted to know what kind of data-sharing relationship Facebook had with Cambridge Analytica; whether Facebook knew Cambridge Analytica would use data obtained via Facebook's platform to try to influence the presidential election or pursue other political aims; and what role the government should play in ensuring such scenarios, in which U.S. consumers' views are exploited at the potential expense of the electorate's clear thinking, can't recur.

During his time working for Cambridge Analytica and its parent company SCL from 2013 to 2014, Wylie told the committee, he was aware of a relationship between Cambridge Analytica executives and former members of both Russian and Israeli state services, as well as Russia's Lukoil, all interested in gaining insights on potential vulnerabilities in the U.S. voting population, such as neuroticism, paranoia and racial bias. It's those kinds of insights Cambridge Analytica is accused of exploiting by sharing data on 87 million people with the Trump campaign, which then allegedly targeted specific demographics with misinformation, potentially influencing the election's outcome.

Cambridge Analytica, now shuttered, denies any wrongdoing. Facebook says it wasn't aware the company would use the data for commercial or political purposes, an assertion Wylie said can't be true if Facebook read the terms of service that any app operating on its platform must submit before being onboarded.

To be clear, Wylie testified the Trump campaign was not a client while he was employed by Cambridge Analytica, and Cambridge says it deleted any Facebook data it scraped at Facebook's request in 2015, when Facebook learned of the download. But true or not, Hersh testified he doesn't believe that data could have been used effectively to sway the votes of targeted populations.

"Compared to mobilization, persuasion is very, very hard," he said, adding that research has shown an advertisement might change a voter's mind for a fleeting moment, but "no person is persuadable all the time." And further, "when you try to find out who is persuadable, predictions are often wrong." For example, using algorithms to predict who is black or Hispanic yields incorrect data about 20 percent of the time, he said. So targeted messages aren't even landing. What's important to this debate on regulating tech giants and the potential spread of misinformation, however, is that "There is a lot we do not know about how campaigns are using social media to target voters ... we don't know where the line is between trying to persuade voters and lying to voters, and Facebook hasn't taken an active role in deciphering this." 

And that's where Wylie takes issue. Of course, campaigns have always used data sets to try to gain insights that might inform their strategies. But the Cambridge Analytica scandal revealed that there's a "new game being played," he said. Specifically, Wylie points to the difference between targeted marketing and the exploitation of mental vulnerabilities.

"Traditional marketing, first of all, doesn't misappropriate tens of millions of users' data," he said, "and it is not or should not be targeted at people's mental vulnerabilities, such as neuroticism or paranoia or indeed racial biases. Traditional marketing doesn't exacerbate people's innate prejudices and doesn't coerce them and make them believe things that aren't necessarily true." 

He pointed to the $20 million that billionaire Robert Mercer invested, at the urging of Stephen Bannon, who would later join Trump's team, in Cambridge Analytica's U.K.-based front company, SCL. The investment was reportedly paid that way to avoid being labeled as a client, which would violate U.S. election laws. Wylie said Bannon was seeking a "culture war" and believes politics is downstream of culture: To influence politics, you have to focus on the culture. "They were seeking out companies that could build an arsenal of informational weapons to fight that war," which is why they went to Cambridge Analytica.

Wylie testified the Mercer investment was used in part to target messages at voters in swing states, as well as at specific ethnic groups. There were "videos created by the firm that were sadistic and Islamophobic ... to target anybody that would vote for the Democratic Party, including the African-American population," he said.

But what senators grappled with was the appropriate governmental response here. After all, Facebook isn't technically in the business of selling user data.

Wylie said, "They’ve created a platform that encourages the use of data, so it’s true you can’t go to Facebook and simply buy Facebook's data, but they make it readily available to its customers via its network of applications or the fact that the layouts of people’s profiles on Facebook make it very conducive to scraping data, for example. Although Facebook would say they don’t allow that, they still create a setup which catalyzes its misuse, in my view." 

So if the company isn't breaking the law, is the solution simply that users uncomfortable with its terms and conditions should opt out of using the service? 

Sen. Kamala Harris, D-Calif., made the comparison to the offline world. She said most Facebook users have no idea how the company is using their personal data, data it leverages for 98 percent of its revenue. She cited the company's tracking of user IP addresses, locations and activities on other websites, whether they're actively logged into Facebook or not. 

"In the real world," she said, "this would be like someone following you every single day as you walk down the street, watching what you do, where you go, for how long, and who you're with. For most people, this would feel like an invasion of privacy, and they'd call the cops. And yes, it's laid out in terms of service, but let's be honest, few Americans can decipher or understand what this contract means." 

She added it's the "government's responsibility to help create rules that enable a more fair bargain for American consumers." 

Wylie agreed. He cited other American industries, like food, cars and airlines, in which rules are put in place to keep consumers safe. 

"If this affects everybody every day, there should be some level of public oversight," he said, adding that the issue at hand is information distortion. "If there are potential disruptive distortions in the market, we regulate that. What you're talking about is a distortion to the access to information we have and engagement with that information." 

But can't users just opt out of using Facebook if they aren't down with the company's use of their data? 

To that, Wylie said even lawyers whose job it is to read through terms and conditions find them too dense, and it's unreasonable to place such a burden on an ordinary person. Further, "It is very difficult to be a functioning member of the workforce or society and refuse to use the internet. I don't know a job that would let you go in and refuse to use Google. Although we use this narrative of choice ... if that's the only way you can get a job, it's not really fair." 

He said the matter at hand isn't informed consent. The question is whether the cost of doing business with companies like Facebook is proportionate to the benefit the user gets, both now and 10 years from now, when the data harvested today still exists and technology has advanced.
