
The Privacy Advisor | Facebook’s Egan: “If people are surprised, that’s not good for me.”


Facebook Founder and CEO Mark Zuckerberg has made public his confusion and frustration over “repeated reports” of government spying. In calling on the U.S. government to “be the champion for the Internet, not a threat,” Zuckerberg said, “They need to be much more transparent about what they’re doing, or otherwise, people will believe the worst.”

And Zuckerberg knows a thing or two about struggling to maintain user trust. Facebook has faced many trials as its privacy policies and services have changed over the last 10 years.

Transparency for a company like Facebook, one predicated on users sharing personal information with one another, is a huge part of maintaining such trust. This same notion was explained in more detail at the IAPP Global Privacy Summit by Facebook Chief Privacy Officer, Policy, Erin Egan.

In a room filled to the brim with Summit attendees, Egan shared some of her insights into being the CPO for perhaps the most scrutinized and identifiable social network in the world, in a conversation with Center for Democracy and Technology President and CEO Nuala O’Connor, CIPP/US, CIPP/G.

Facebook CPO, Policy, Erin M. Egan, and interviewer, Center for Democracy & Technology President and CEO Nuala O’Connor, CIPP/US, CIPP/G

“Transparency is huge,” she said. “I spend a lot of time on what we call data literacy: making sure people understand the data that is being collected about them.” This is something “Mark” talks a lot about at Facebook, she said. Helping users understand what data is being collected about them establishes trust, or as Egan said, “at its essence, it’s understanding.” One step she has taken to bolster transparency, for example, is Facebook’s “Ask Our CPO” feature.

Egan is not alone at Facebook with such a heady privacy task. She is one of two CPOs, both roles created after the company settled with the Federal Trade Commission in 2011 over privacy violations. As the outward-facing privacy officer in charge of public policy, Egan travels the world, building relationships and keeping an ear to the ground. She recently spent three months with academics in Europe to better understand the cultural differences and data protection developments in a region that considers privacy a fundamental human right. She also pays attention to other developments in privacy, including the NTIA multi-stakeholder meetings on facial recognition and the White House’s Big Data initiative.

And her focus has not only been on the U.S. and EU. India, which just announced plans for a new privacy regime, is one of the network’s fastest-growing markets, closely followed by Latin America and several Asian countries.

By having a finger on the pulse of the meta-privacy world—one looking five, 10 years into the future—she is then able to feed the inner teams at Facebook the larger privacy developments and trends to better inform product development. Her colleague and counterpart, Chief Privacy Officer, Product, Michael Richter, works within the company, training employees and developing new services and products. Together Egan and Richter help lead a cross-functional team, she said, one that meets about three times a week to hash out the latest and greatest.

And meet consistently, they must.

As social norms change almost in real time and as larger swaths of human-level data are collected on its network, Facebook resides at the forefront of a lucrative, tempestuous and unpredictable Big Data ecosystem. Egan argued the Big Data implications are hugely important and valuable to society, citing the work of the UN Global Pulse. Though, she said, Facebook is not yet at the point of solving such lofty problems as feeding the world’s poor, there’s room for optimism. However, Facebook’s closely guarded database faces many critics, who point out the potential harms that lie therein, from facial recognition to behavioral targeting to perhaps something yet unknown.

But users should be aware of what they’re sharing, and, Egan said, Facebook tries to make sure that is accomplished.

“At the end of the day, you’re going to Facebook to share and connect,” Egan said. “So what does privacy mean? Do people understand what’s going on? If we think they don’t,” she answered, “then we’ll call it out.” She said Facebook is constantly tuning and building user controls. “You can delete, change your audience, search in bulk,” she said. “Privacy is the word we all use, but we’re really in the business to maintain trust.” And really, in some respects, Egan is more like a chief trust officer.

Additionally, there’s no single type of transparency, she allowed, but rather “different concepts.” Of course, there’s the standard long-form privacy policy, or, as Facebook calls it, its data use policy. But, Egan said, the company wants to convey the relevant message in shorter form. “For us, it’s making sure people understand the policy. We are thinking about other ways to bring that data literacy out,” she said.

“If people are surprised,” she added, “that’s not good for me.”

Another path Facebook is exploring is the use of icons. For instance, she said, a cloud pops up when a user is about to post something publicly. And as the company keeps pace with the mobile sphere, icon literacy becomes significantly more important. Long-form policies are hard enough to read on a large screen, let alone a mobile device, but Egan remained optimistic about the use of icons, noting that everyone now knows the recycling icon.

She also said Facebook is exploring other ways to be more transparent and provide user control, particularly in advertising. She noted the company has a set of advertising guidelines that companies must follow. “If a company wants to sell medical devices,” for example, “then they have to comply with applicable law. Full stop.”

Data literacy has a role here as well, she said, noting Facebook wants to educate users on why they receive the ads they see. “We have icons and controls to tell them why they are getting that ad. We want them to be relevant.” She also said if an ad on Facebook is based on information not from Facebook, a link is provided to the source. Egan also noted Facebook is looking into the development of ad preference managers. Look for something in the next six to nine months, she said.

And yet, as many companies recently found out, maintaining trust is not always in the hands of the private sector, particularly after a year of National Security Agency surveillance disclosures, including evidence that companies such as Facebook have been compelled by law to share user data with the government in bulk or, in some cases, have allegedly had backdoors installed into their servers. Facebook, along with Twitter, Apple, Yahoo, Google, Microsoft and others, has been vocal in calling on the government to allow for more transparency.

Egan said she was in Europe at the time of the earliest Snowden leaks last June. “It was a big deal for all companies,” she said. “We had never heard of the allegations that were being made.” She said the company had never heard that there were backdoors into their servers until the leaks. In response, Zuckerberg vociferously maintained the company only does what is lawful and had “never been part of any program to give the U.S. or any other government direct access to our servers.”

And though the government argues what it has done is legal, many, including Zuckerberg, maintain it is neither the safest nor the most secure approach. “So it’s up to us—all of us—to build the Internet we want,” he wrote on March 13, adding, “I’m committed to seeing this happen, and you can count on Facebook to do our part.”

Whether you do or not, Facebook is counting on that trust and hopes its consumers aren’t thinking the worst.
