The Privacy Advisor | O'Neil to privacy pros: Help me destroy big data's lies

Cathy O'Neil is out for blood. Her target? Big data. 

That was the message she delivered to privacy pros during her keynote address at Privacy. Security. Risk. 2016, and she asked for their help in her mission. 

To be clear, what O'Neil really aims to destroy are the algorithms that users of big data employ; algorithms she calls weapons of math destruction, or WMDs, which she says have very real negative impacts on vulnerable populations. She uses the acronym WMDs to convey that such algorithms are widespread, affecting many people in many ways; mysterious, in that those affected rarely even know an algorithm is being applied to them; and destructive in their impact. The destruction O'Neil saw unfolding around her as a result of such algorithms left her "so disgusted and disillusioned" that she left her career in the finance world and wrote a book. 

But it's not so much the algorithms themselves that are the problem. We, as the human beings plugging in the data and creating the algorithms, are the problem. 

Here's a real-life, practical example: O'Neil cooks dinner for her children, and she creates an algorithm for the scenario. Her objective is for dinner to be a success. The data she curates to feed into the algorithm is the food in her house with which to make dinner. What does success mean? Well, for O'Neil, it would mean there was no crying and the kids ate their vegetables. But success to her seven-year-old son? That would mean he got a lot of delicious Nutella, and so he would have fed different data into the algorithm. 
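
To make the point concrete, here is a minimal sketch, not from the talk: the ingredients, the scoring rules and the three-item dinner size are all invented for illustration. It shows how the same data, optimized under two different definitions of success, yields two different "best" dinners.

```python
from itertools import combinations

# Toy illustration only: invented ingredients and invented scoring rules.
ingredients = ["broccoli", "pasta", "chicken", "Nutella", "carrots"]
vegetables = {"broccoli", "carrots"}

def parent_success(dinner):
    # The parent's definition of success: vegetables eaten, no dessert-for-dinner.
    return sum(item in vegetables for item in dinner) - ("Nutella" in dinner)

def kid_success(dinner):
    # The seven-year-old's definition: lots of Nutella, as few vegetables as possible.
    return 2 * ("Nutella" in dinner) - sum(item in vegetables for item in dinner)

def best_dinner(score):
    # The "algorithm": pick the three-item dinner that maximizes the chosen success metric.
    return max(combinations(ingredients, 3), key=score)

print(best_dinner(parent_success))  # leans toward vegetables
print(best_dinner(kid_success))     # leans toward Nutella
```

The data never changes; only the definition of success does, and the "optimal" answer changes with it.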

But that doesn't so much matter. No one cares if O'Neil's family had a "successful" dinner. The same dynamic, however, plays out in much broader, much more nefarious contexts with dire consequences, like people getting arrested, going to jail or not getting that loan. And these things happen without the data subjects understanding what kind of algorithm has been applied to their situation. 

"People don't know that they are being scored," O'Neil said. "Depending on the environment you're in, it can cut off opportunities for you."

For example, O'Neil cited the campaign over the last 10 or 20 years in the U.S. to "get rid of bad teachers." This was done via a statistical algorithm that would give each student in any given class an expected test score based on that student's academic history. The teacher was then graded on the difference between the expected score and the actual score. But there are a lot of uncertainties within that kind of algorithm, obviously, including whether a student had eaten breakfast that day, whether it was a hot day with no air conditioning, whether the child had slept the night before, etc. 
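
The grading scheme she describes amounts to a simple value-added calculation. Here is a hedged sketch of that idea, assuming the grading works roughly as described above; this is not the vendor's proprietary model, and the scores are invented.

```python
# Hedged sketch of the value-added idea described above, not the vendor's
# actual proprietary model. Predicted and actual scores are invented.
class_scores = [
    # (predicted_score, actual_score) for each student in one teacher's class
    (72.0, 68.0),
    (85.0, 83.0),
    (60.0, 66.0),
    (78.0, 71.0),
]

def value_added(scores):
    """Teacher's grade = average of (actual - predicted) across the class."""
    gaps = [actual - predicted for predicted, actual in scores]
    return sum(gaps) / len(gaps)

print(round(value_added(class_scores), 2))  # -1.75 here: "below expectation"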

Despite her attempts via a Freedom of Information Act request, O'Neil couldn't get access to the source code of the algorithm used to predict student success. The vendor employed by the school systems to create the scores considered the algorithm proprietary. 

For her book, O'Neil interviewed a Washington, DC, teacher who was fired based on the score the algorithm popped out. Two hundred other teachers were fired that same year for the same reason. Many teachers, frustrated by the system, left the schools they worked at, which were often inner-city schools serving poor kids, and moved to more affluent suburbs that weren't using the scoring system to weed out the bad ones, undermining the original purpose of the algorithm. 

"The original purpose was to eliminate bad teachers," O'Neil said. "What's happened is a lot of good teachers retired early or quit, because they didn't want to work in a regime that was arbitrarily imputative. But punishing teachers of poor kids is exactly what we don't want." 

O'Neil cites predictive policing as another example of where we, as a society, are using big data unethically. While research shows white people and black people smoke pot at the same rate, more or less, black people are much more likely to get arrested for doing so, anywhere from three to 10 times more likely. That's because police are using algorithms that aren't objective or fair. While the data makes it look like black people are more likely to be holding contraband, that's not true. 

"It's a choice we've made as a society to police like that," O'Neil said. "It's not objective, and it's not fair. The algorithm takes in the data, looks for patterns and establishes the definition of success as further arrests. What it says is, 'I've looked at this data, and we should go back to these black neighborhoods and look for more crime because the data says so. The algorithm isn't going to ask, 'Why is there more crime there?' It's just looking for patterns." 

In the end, the problem isn't about math, O'Neil said: 

"Word has to get out that just because it's a score, doesn't mean it's fair. I hope you spread that word." -Cathy O'Neil 

"It's about fairness. We as technologists try to put those questions at arm's length, and we try to act as if the algorithm is somehow elevated above moral values, but it's not." 

So the question is: How can we get around this if the algorithms being employed are considered proprietary, and no law requires companies, whether providing their own service or working on behalf of the government, to disclose those "secret sauce" algorithms? 

Her call to privacy pros was this: Help her solve this problem. 

She also called for data scientists to take a Hippocratic Oath "that gets them to stop pretending they're just following the numbers when they build an algorithm. Just because it's for profit doesn't mean it's benefiting society at large." 

She called for lawyers who will help her figure out how to get around the FOIA requests that have to date been denied, and she called for laws similar to those governing credit scores, so individuals can complain about unfair data being fed into algorithms affecting their lives. 

"Word has to get out that just because it's a score, doesn't mean it's fair. I hope you spread that word," she said. 

1 Comment

  • Sheila Dean • Sep 16, 2016
    Lawyers, you should totally help. 
    
    BUT... I think FOIA researchers like MuckRock and seasoned reporters who pull down CIA torture docs could help her so much more.  She should assemble her Ron Burgundy Channel 3 News Team and get cracking.