
Privacy Perspectives | The role of data in the fight for social justice



For the past couple of months, I have found myself watching videos of George Floyd, and far too many others, being slain in broad daylight. I have tried to comprehend the violence and the hatred, often processing them through my lenses as a Black woman, a mother, and an immigrant.

But today I share my thoughts from a research and data professional's perspective in the data privacy realm.

Data and social justice have always intersected, one feeding the other, and not always for the better. I believe it is time data professionals play a bigger role in ensuring data justice by working to reduce bias and racial disparity. Rooting out the race, class, gender and other "isms" in our society requires more than a legal or technological solution. It requires a paradigm shift in education and mindset in how we define ourselves within this global community and a realization that progress at the expense of others is no progress at all.

Data is a double-edged sword. Without it, decision makers are unable to invest effectively, improve, streamline, or reach their audiences. That same data also has a growing power to divide, marginalize, exclude, and reinforce abhorrent constructs in society, including the constructs of racism. Technological advancements have given organizations access to big data. We now live in the era of datafication, where our previously invisible social actions are processed into quantifiable data that is out there for the taking. Algorithms and artificial intelligence have taken organizations to new heights in terms of innovation, growth, and profit, but their inherent nature presents a huge risk and multiplies biased and discriminatory effects.

Data is an asset for organizations, but the power to control it should reside with the subjects of that data; they are the rights holders. From a data privacy standpoint, the growing power imbalance among data subjects, data aggregators, and those who act upon that data means that marginalized groups and communities of Black and Brown people have less and less power to demand justice for how their data or personal information is used. On numerous occasions, a clear choice to consent is not given, and most people don't know when or why their personal data was collected and processed. Black and Brown people are still wondering in 2020 why they keep getting pulled over, are unable to get a lower-interest loan, or are denied the same medical care at significantly higher rates than their white counterparts.

The heavy lift will have to start at the individual level. I don't recall professors placing enough emphasis on data bias detection and mitigation in business school. I don't recall emphasis on how collecting information on race, ethnicity, income, or location could reinforce inherent bias, or on the fact that analytical tools are still codified human thoughts to be challenged when necessary. So the humans behind the code have to check themselves to ensure that their inputs are not exacerbating racial disparities.

When I was starting out in the market research world, I conducted projects for Fortune 500 companies. Clients insisted on collecting data on race, gender, and income to maintain internal classification trackers that had been set up over the years. Even when that data was no longer relevant, there was pushback when we discussed the potentially harmful implications of collecting such sensitive data. As a young research analyst, I could only wish and hope that the data I collected for those companies would be used for good and not for evil.

But long gone is the time for wishing and hoping. We cannot afford to take such a passive stance; Black and Brown lives depend on it.

So how do we, as data professionals, check ourselves and contribute to this fight for social justice, working to eliminate racial disparities and discrimination? It all starts with design. We have to challenge the status quo in what we ask, how we ask it, and from whom we choose to get those answers. Discriminatory designs are so subtle that you might just miss them.

We need to look beyond processing our data to render it "clean" and "workable"; just because data is clean does not mean it is not biased and potentially harmful. We need to question the classifications and segmentations in the databases that we mine. There are many opportunities and educational resources on how to generate and apply datasets that carry, to the best of our human abilities, the least amount of bias possible. We have to approach all things data with a microscopic lens of data privacy and a great depth of social awareness. Understanding the power and equity dynamics between the data subjects and the data we collect and analyze will make us better data professionals.

I am encouraged by the voices for change and social justice, and I hope data professionals will join in and fortify the movement. Data is power. It has been used as a weapon of discrimination for far too long, but in the right hands, it can also be the fighting sword that slashes social inequity and racial injustice. 

To quote Professor Ruha Benjamin, one of my favorite researchers and female warriors for data and social justice: “Remember to imagine and craft the worlds you cannot live without, just as you dismantle the ones you cannot live within.”

Black Lives Matter!

Photo by Clay Banks on Unsplash



2 Comments


  • Suman Taneja e/v Korenhof • Aug 17, 2020
    A truly thought-provoking article. 'Data is power and it is a double-edged sword' indeed.
  • Annie Bai • Aug 18, 2020
    Excellent post. At our analytics company, I encouraged our CEO to gift this book to all employees: Algorithms of Injustice. And we've invited the author to speak to the company to break down the assumptions around the neutrality of technology.