
OURSA conference signals need for diversity in privacy, security design


""


The RSA Security conference made news recently after some high-profile folks on Twitter pointed out that, of the nearly two dozen keynote speakers lined up for the event — perhaps the most prominent security conference in the world — only one was female. It's no secret that the security industry has a male-biased history, but instead of just complaining about it, Access Now's Amie Stepanovich and Facebook Chief Security Officer Alex Stamos, among others, worked on a solution: a one-day, alternative conference designed to share the thought leadership and expertise of leading female and minority information security and privacy professionals.

In just six short weeks, the OUR Security Advocates Conference was born. Held Tuesday at Cloudflare's headquarters here in San Francisco, OURSA featured an impressive lineup of academics, practitioners, journalists, and advocates. The conference itself seemed to emerge as a microcosm — a signal — of what's needed in the digital ecosystem: inclusion, diversity, a recognition of at-risk populations and their wide range of needs and perspectives in the digital world, just like the physical one. 

"The first thing to know about understanding at-risk populations, is that you don't understand anything about at-risk populations," Eva Galprin of the Electronic Frontier Foundation asserted early in the day. Google Principal Engineer Lea Kissner, who put together a track session on privacy practices near day's end, also recognizes the need to serve a wide spectrum of users. As engineers and designers build new products and systems, thinking about user needs is important, but what those needs are is not always obvious. 

Jessy Irwin, head of security at Tendermint, for example, discussed what she calls "the Old Lady Gang." After moving to the city by herself, Irwin befriended a group of elderly women who lived nearby and noticed their unique set of needs when using smartphones. "There's a significant disconnect between expectation and reality for at-risk users," she pointed out.

(Left to right: Facebook's Aanchal Gupta; Fastly's Window Snyder; Facebook's Kate McKinley; the ACLU's Leigh Honeywell; Cloudflare's Jennifer Taylor; and Spotify's Kelly Lum.)

Take two-factor authentication, for example. Irwin knew that it was difficult for some of the elderly women to navigate the phone and enter the six-digit authentication code within its 30-second window. She also realized that remembering passwords — something that can be difficult for just about anyone not using a password manager — can pose another set of challenges for elderly folks already facing memory problems. That's why Irwin advocates for human-centric security.
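For readers wondering why those codes expire so quickly: the six-digit code in most authenticator apps comes from a time-based one-time password (TOTP) scheme, where the code is derived from a shared secret and the current 30-second time step, so a new code appears every half minute. Here is a minimal, generic sketch of that mechanism in Python (not tied to any product discussed at OURSA; the secret shown is illustrative):

```python
import hmac
import hashlib
import struct
import time

def totp(secret: bytes, time_step: int = 30, digits: int = 6) -> str:
    """Generate a time-based one-time password in the RFC 6238 style."""
    # Number of completed time steps since the Unix epoch.
    counter = int(time.time()) // time_step
    msg = struct.pack(">Q", counter)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    # Dynamic truncation: the low 4 bits of the last byte pick an offset.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Example: the printed code is only valid for the current 30-second step.
print(totp(b"shared-secret"))
```

Because the code is recomputed every time step, a user who types slowly, mis-taps, or has to switch between apps can easily run out the clock — exactly the gap between designer expectations and at-risk users' reality that Irwin describes.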

Thinking about a diverse set of human needs plays into privacy design as well.

Maritza Johnson, who's been at Facebook and Google in recent years and now works at ICSI, stressed the need to think about the user experience. "How do you design for people so they understand how to protect their privacy?" she asked. "What do we know about our users? What are we going to do with their data? As you plan to optimize your data, are you building a trusting relationship with your customers?" 

Sha Sundaram, a privacy engineer at Snap who focuses on bias in machine learning, said engineers must put themselves in the shoes of their users and try to think like them. She noted that biases in machine learning have the potential to harm users, but it's very difficult to identify those biases. She shared a checklist she uses to help identify bias in machine learning. What training data is used? What is being put in place to improve data quality? How sensitive is a model's accuracy to changes in test datasets? What is the risk to the user if something gets mislabeled? In what scenarios can your model be applied? When should a model be retrained?
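One way to make a checklist question like "How sensitive is a model's accuracy to changes in test datasets?" concrete is to score the same model on many resampled or sliced test sets and look at the spread. The following rough sketch is only an illustration of that idea; the helper names and the assumption that `model` is a callable returning a label are hypothetical, not from Sundaram's talk:

```python
import random
from statistics import mean, stdev

def accuracy(model, examples):
    """Fraction of (input, true_label) pairs the model labels correctly."""
    return mean(1.0 if model(x) == y else 0.0 for x, y in examples)

def accuracy_sensitivity(model, test_set, n_resamples=20, sample_frac=0.8, seed=0):
    """Estimate how much accuracy varies across random subsamples of the test set.

    A large spread suggests the reported accuracy depends heavily on which
    examples happen to be in the test data -- one warning sign on a bias checklist.
    """
    rng = random.Random(seed)
    k = int(len(test_set) * sample_frac)
    scores = [accuracy(model, rng.sample(test_set, k)) for _ in range(n_resamples)]
    return mean(scores), stdev(scores)

# Hypothetical usage, assuming `model` maps an input to a label and
# `test_set` is a list of (input, true_label) pairs:
# overall, spread = accuracy_sensitivity(model, test_set)
```

The same comparison can be run per demographic or usage slice, which speaks to Sundaram's last question: a model that looks accurate overall but swings widely across slices is a candidate for retraining.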

More broadly, however, Brown University Computer Science Professor and Cryptographer Seny Kamara pointed to the computer science curriculum, advocating for more ethics courses. "We must educate our students beyond computer science," he said. "We need to teach ethics, not just coding. Some schools are beginning to do this." 

Bank of America's Allison Miller, who moderated a panel of privacy engineers including Johnson, Sundaram and Kamara, noted a change in the privacy profession. "It used to be that the only people who got to talk about privacy at security conferences were the lawyers ... but now we're talking about privacy as design problems and engineering problems." 

Johnson advocated for a space to share privacy engineering best practices. "We need a space for privacy engineers to talk about problems, best practices, and lessons learned. We're long on lessons learned," she contended, "but short on solutions." 
