
Privacy Perspectives | An open letter on race, privacy and the IAPP


Civil rights leader and congressman John Lewis, who died last month after a battle with cancer, was often referred to as the “conscience of Congress.” Even to the very end, he encouraged us all to get into “good trouble, necessary trouble.” Former U.S. President Barack Obama eulogized John Lewis as someone who, in Martin Luther King Jr.’s words, lived with the “fierce urgency of now.”

I, too, write today with the “fierce urgency of now.” As Martin Luther King Jr. further stated, “We are now faced with the fact that tomorrow is today.” That was 50 years ago. So what “today” do I speak about with “urgency”?  I speak about the need for the IAPP to have a constructive dialogue about race and systemic racism. My prose may not share the grace of these giants, but I share their dreams.

Is it now time to talk about change in the IAPP? Jane Stanford, in a 1904 address to the Stanford Board of Trustees, said, “Let us not be afraid to outgrow old thoughts and ways, and dare to think on new lines as to the future of the work under our care.” I would make that same challenge to the IAPP, though it is not alone in being challenged to self-reflect. Other fields and organizations, from emergency medicine to The New York Times, face the same challenge, as illustrated in the respective articles “Addressing the Elephant in the Room: Microaggressions in Medicine” and “A Reckoning Over Objectivity, Led by Black Journalists.” Addressing the underlying issues of systemic racism requires moral clarity and truth. That means all organizations must truly listen to the issues of systemic racism, not just now, but on an ongoing basis.

Does the messenger matter? I am Latino. Does that automatically make me a less credible voice? Does it make a difference that I work for a state human services agency instead of a “power” company? Do those initial status markers cause this dialogue to be dismissed before it even begins? I mention this because being casually dismissed in discussions until I mention my association with Stanford or my certifications is a routine microaggression I experience. Many times, it does seem that race matters.

A recent Atlantic Monthly article reported that Black workers receive more scrutiny, meaning small mistakes have a disproportionate impact on their jobs. As a Latino, I have experienced this additional, unreasonable scrutiny. I have also had my expertise questioned more often than my white colleagues have. It gets a bit tiresome to constantly have to prove yourself.

For my part, I have started discussions with IAPP leadership about the intersection of race with privacy and the IAPP. Part of the discussion has been whether the IAPP provides the same platform for minority issues or whether a blind spot exists. I think blind spots are there, a flaw articulated in a quote from D.H. Lawrence: “What the eye doesn't see and the mind doesn't know, doesn't exist.” And while the IAPP and the greater community of privacy professionals may not have realized it, this blind spot to race makes me feel apart from, rather than a part of, the privacy community. I am sure others like me feel the same.

Why is it important for the IAPP to address race? It goes to the IAPP’s core mission of data protection and the related issues of data and technology. And in this area my message is: Technology and data are not neutral. IAPP Market Research Specialist Christelle Kamaliza speaks to this in her article, “The role of data in the fight for social justice.” Data and technology and their uses drive discussions and solutions. If data or technology is biased or flawed, the solutions will be biased and flawed. The IAPP can and should lead the way in this area. Before that can happen, however, the IAPP — indeed, all organizations — needs to reflect on its own biases and how it may contribute to any structural or institutional racism.

Technology and data have disproportionate impacts on vulnerable populations, while often advantaging whites. These vulnerable populations include not only underrepresented minorities, Blacks, Latinos and Native Americans, but also the LGBTQIA+ community, children and women. You only need to look at surveillance technology, facial recognition and social media to see this. Facial recognition recently misidentified a Black man as the suspect in a crime he did not commit. And unlike typing in a password to access a smartphone, facial recognition poses a risk to minorities because police can unlock a phone simply by pointing it at its owner's face. With a PIN, individuals can rely on the right to remain silent to keep their phone locked. In another powerful example, Jack Dorsey, a co-founder of Twitter, admitted that the lack of female developers prevented the company from recognizing that the platform could be weaponized against women.

These kinds of examples take place because of who is at the table, usually white males. Similarly, privacy and security controls meant to mitigate harms from data and technology will be discussed and controlled by those at the table. If those at the table draw from the same white employee pool, the solutions will suffer the same blind spots as the technology.

Solutions must come from a diverse set of stakeholders who all have a seat at the table. For too long, minorities have asked, “Where is our seat?” The IAPP can help with the pipeline by encouraging and promoting diverse talent. One way to do this is through some of the most prominent platforms the IAPP offers: presentations, advisory groups and functions.

IAPP presentations are indications of IAPP values. Given that, presentations and panels can be a way to raise awareness about race and its intersection with data, technology and the law. There are very few presentations that really dive into this area. The IAPP states that its programming reflects operational matters to help privacy professionals do their jobs better. Recognizing race and the impacts technology, data and law have on race would also help privacy professionals do their jobs better. The issues related to vulnerable populations are a superset of the issues in the area of data protection. Related to that is the issue of cultural competency. How can you address issues in that area and with this segment of the population if you have corresponding blind spots?

The IAPP claims to act in a policy-neutral manner. But this is exactly the point when it comes to structural racism. Neutral by whose standards? As The New York Times article states: “The objective (neutral) truth is decided almost exclusively by white reporters and their mostly white bosses. And those selective truths have been calibrated to avoid offending the sensibilities of white readers … The views and inclinations of whiteness are accepted as the objective neutral.”

Could the same not apply here? “Neutral” policies often have built in biases based on those who developed them. Furthermore, as The New York Times article states, “No ... process is objective. And no individual ... is objective, because no human being is.”     

And you still have the issue of disparate impact: the impact is not neutral for everyone. Disparate impact occurs when policies, practices, rules or other systems that appear facially neutral result in a disproportionate impact on certain groups. It is just like technology. Race matters.
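
To make the concept concrete, here is a minimal illustrative sketch of the “four-fifths rule,” a common screening test for disparate impact drawn from U.S. equal employment guidance: compare each group's selection rate under a facially neutral policy to the most favored group's rate, and flag the policy when that ratio falls below 80%. The group names and numbers below are hypothetical and used purely for illustration, not drawn from any real organization or study.

```python
# Illustrative sketch of a disparate impact check using the "four-fifths rule."
# A facially neutral policy is flagged when any group's selection rate falls
# below 80% of the most favored group's rate. All numbers are hypothetical.

# Hypothetical outcomes of a "neutral" screening policy
outcomes = {
    "Group A": {"applicants": 200, "selected": 120},  # 60% selection rate
    "Group B": {"applicants": 150, "selected": 54},   # 36% selection rate
}

# Selection rate for each group
rates = {
    group: counts["selected"] / counts["applicants"]
    for group, counts in outcomes.items()
}

# Compare each group's rate to the most favored group's rate
best_rate = max(rates.values())
for group, rate in rates.items():
    ratio = rate / best_rate
    status = "potential disparate impact" if ratio < 0.8 else "within the four-fifths threshold"
    print(f"{group}: selection rate {rate:.0%}, ratio {ratio:.2f} -> {status}")
```

The policy looks neutral on its face; only when outcomes are disaggregated by group does the imbalance appear, which is exactly why examining data and technology with race in mind matters.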

From my understanding, presentations are filtered through the advisory groups and are based on the interests of the members. This process sets up a self-perpetuating cycle familiar to those on the outside: To be an insider, you have to be inside. But to be inside, you have to be an insider. I have been a part of state privacy legislation dealing with mental health, substance abuse and minors' privacy. I only had the opportunity to work on these issues because insiders invited me in.

How can we minorities turn these issues into issues of interest unless we are given both a platform and strong support? Presentations seem to favor Big Tech, big law and big names, mostly white. Presentations on race cannot be another token gesture sidelined to the farthest conference room. While there is a claim that the existing advisory boards have diverse enough representation to deal with this issue, I just don't see it.

While I concede the possibility that the IAPP may not receive many applications in this area, it may also be true that this is because the IAPP does not welcome them or take affirmative steps to encourage them. This mirrors the tired argument that companies cannot seem to find qualified minorities. It casts the burden and blame back on minorities for the current state of the situation. I reject that and ask whether our voices are really welcome.

That is why a minority advisory group is so important. This is the issue I most strongly disagree with IAPP leadership about. If advisory groups approve presentations and provide a voice, I cannot see how, at this time, it can even be an issue. “Tomorrow is today.” Other advisory groups cannot make up for the lack of a minority advisory board. Suggesting they can feels dismissive and is a reminder of our second-rate status. What happens to those at the lowest levels foreshadows what can happen to all of us. These issues are ignored at our peril.

Leadership has been open to my feedback, but I realize my voice alone is not enough to carry the day. Race is not about “what lies in the depth of a human's heart. It is about word and deed.” I need others to support and advocate for a more inclusive voice in the IAPP regarding race. Only additional voices can create a different feedback loop that encompasses us all.

So let’s talk about race and technology. Let’s talk about race in respect to data, privacy and information security. Let’s talk about reserving space for presentations on race and providing support for them. Let’s talk about a minority advisory group. Let’s talk about the IAPP and race. I have suggested a panel on “Race, Technology and the IAPP.” The question is not whether structural racial biases exist within the data, privacy and technology community, but how to begin to address them. Your thoughts?

Let’s dream. Let’s talk. Let’s act. Be safe. Be brave. And most importantly, be kind.


Editor's Note:

Affinity Groups
The IAPP has several Affinity Groups aimed at providing privacy pros with the opportunity to connect and unify their experiences. Affinity Groups include Cybersecurity Professionals; Cyber & Privacy Investigations, Enforcement and Litigation; Energy and Utilities; Higher Education; LGBTQs in Privacy and Technology; Minorities in Privacy; and State, Local and Municipalities.
More information can be found here.

Women Leading Privacy
The IAPP also has a Women Leading Privacy Section, where many of today’s top privacy professionals come to give and get career support, to help advance women in the field and to expand their personal and professional networks with a slew of opportunities created just for them.
More information can be found here. 



5 Comments


  • comment Gregory Hollingsworth • Aug 18, 2020
    Well said, I couldn't agree more.
    
    Data privacy is of even greater importance for vulnerable communities and should be pushed to the absolute forefront of concern, given the actions we've seen taken by the US government in the past 3 months, especially the monitoring of movements like BLM that are perceived as a threat to the current administration.
    
    Protecting the privacy rights of those groups and vulnerable communities means giving those communities a voice, and if not now, when?
  • comment Simisola Belo • Sep 1, 2020
    Well said. I'm by your side. What needs doing? I agree that affirmative action, which is actually just a way of keeping the qualified from being unfairly disadvantaged, is part of the solution. Please let us connect, talk, strategise and do what we can to address all the imbalances.
  • comment Daniel Esquibel • Sep 1, 2020
    Please share your thoughts with the IAPP and management. They need to hear from membership to make change happen. Feel free to look me up on LinkedIn and connect. Thanks!
  • comment Pamela Bosley • Sep 9, 2020
    Thank you for publishing this article.  I hope we have the opportunity to see this kind of commentary featured more often.  I also hope the specific recommendations made are given serious consideration.
  • comment Karen Rogers-Robinson • Dec 13, 2020
    Great job, Daniel! Conversation is definitely needed around data privacy and race, and around the anonymization of data, or the lack of it, as it relates to race in this new Jim Code era that we now live in. Is the group already up and running? (It seems as if the last post was 3 months ago.) How can I be of service?