If there’s any company that knows about pushback following policy changes, it’s Facebook. But that’s kind of to be expected given Facebook’s self-proclaimed “learning culture,” in which it allows itself to try new things, even at the risk that they don’t achieve the preferred result, with the expectation that what’s learned from the failure is valuable in itself.
But even given that, the remaining challenge is how to educate both consumers and regulators on what those changes are, said Facebook’s head of public policy in the EU, Lord Richard Allan. That’s because, as we now well know, no one reads privacy policies. At least not in real time, and not for information on how their data is being used or to determine whether to use a product.
Allan said the company has “tried a whole bunch of things to help people understand what we were doing, but we still seem to find ourselves in this repeated cycle where, as we update the product and service, we cause unhappiness.”
Notably, after facing threats of enforcement action from the U.K. Information Commissioner’s Office, Facebook agreed to stop using data from U.K. WhatsApp users for marketing or product improvements. Facebook acquired WhatsApp in 2014, and the two companies shared data until they were recently confronted by the Article 29 Working Party.
Allan said business models and technology are evolving and the drive to integrate products is “perfectly natural and perfectly normal,” even though such changes are often treated as exceptional.
He used the example of Facebook or other web companies moving from static products or services into the mobile space. Suddenly, location data comes into play, and updates must be made to policies to reflect a new data use. That’s a normal part of the game, he said.
“There’s no fundamental reason why a business can’t evolve its business model,” Allan said, adding the change is often based entirely on consumer interest. “Some people will object to the evolution of the business model, but there’s no core reason that can’t happen.”
And it’s a problem that’s only going to continue. Think about connected cars, for example. Today, when a consumer buys a car, it has already passed safety certification before it’s driven off the lot.
“If I buy a car, I don’t read all the service terms,” he said. “I rely on brand and trust.”
But soon, connected cars will use data, and software updates will be persistent. Car manufacturers will soon have a requirement to alert drivers when updates have occurred, and that’s just an example. The same will happen across sectors as technology evolves, Allan said.
“We can anticipate more attention and regulation in this space,” he continued. “Services are going to have to constantly be evaluating levels of compliance with existing policies.”
He said it’s necessary to find ways for companies to operate within frameworks without being paralyzed by their provisions, as well as solutions for communicating with consumers in a way that is meaningful and that appeases regulators. Earlier this year, for example, the company released a paper, “A New Paradigm for Personal Data,” the result of an 11-city consultation tour. Some have criticized Facebook’s new products in light of that very paper, but Facebook has also applied some of its findings, recently denying an insurance company’s plan to use user posts to evaluate them for discounts.
“We’re going to have to figure out what the solutions are,” he said. “We’re stuck in this repeat cycle where we’re trying to make sure we’re up to date and communicate effectively with people who use our services, but in some cases we’re spending months and years arguing over the legal document that underpins the product.”
Allan said Facebook doesn’t claim to have the recipe.
“We clearly don’t have the answer, by our own track record.”
But he said Facebook is trying to set up programs to talk with a broader group about what some of those solutions might be. Some of those solutions might include improved user-interface design, which could provide users with bite-sized pieces of information at the time it’s relevant. So if location data were to be used, the consumer might get a three-paragraph notice on how the data is to be used.
Allan hopes others will want to join in on the conversation, because, until there’s a solution, “there’s a huge amount of energy diverted to the wrong part of this problem.”
Editor's Note: This article was updated on Nov. 11, 2016, to provide more context to Facebook's privacy decision-making.