Okay, so I know I’m not the only one writing about what the Twitter world collectively calls "Ubergate," but I think it’s important to dig into the issue a bit here and flesh out why it demonstrates the organizational need for privacy professionals and privacy ethics.
Much of this current firestorm centers on the alleged unethical use of sensitive personal data collected every day by the popular rideshare start-up.
To quickly sum up this latest privacy kerfuffle: A journalist, himself using questionable ethics, quoted an Uber senior vice president's off-the-record remarks about using data collected from the company's services to go after its enemies. But this isn't the only time the media has focused on what many sources have been calling a culture problem at Uber, particularly its questionable use of customer data. For example, according to venture capitalist Peter Sims, Uber used a “God View” for stalking purposes, while others have raised concerns about how the company has mined its data, as seen in its “Rides of Glory” and “crime location knowledge demand” blog posts.
Plus, on Thursday, a representative from Uber told me, "We have a privacy team within the legal department under managing counsel Katherine Tassi. Katherine reports to our GC and joined Uber this past summer" and that they "are continually building their team internally."
Regardless, this media flap is a nightmare. In addition to a slew of negative headlines, it has drawn the attention of U.S. lawmakers, particularly Sen. Al Franken (D-MN), who on Thursday sent the company a list of 10 questions he wanted answered about its data-handling practices.
Clearly, the lesson here is privacy cannot be ignored, pushed aside or minimized in organizations. Privacy must be addressed with experts who are trained to spot privacy issues and questionable uses.
It looks like Uber is learning this important lesson, as evidenced by its internal privacy team improvements and its hiring of Harriet Pearson, CIPP/US, and the Hogan Lovells team. Uber is definitely in capable hands, but it could have avoided this firestorm, and likely saved a lot of money, if it had ingrained privacy and privacy professionals into its services years—instead of months—ago. As Pearson says, "Trust founded on confidentiality and information security lies at the very heart of Uber's business, and we will be working with the team to review and reinforce where appropriate its policies and systems."
Your customers care about what you do with their data, but more and more are feeling a loss of control. Just look at the recent Pew Internet Research report on this. UC Berkeley Law Prof. Chris Hoofnagle, in a report for The Washington Post, rightly said, “We have never in history been at a point where we were more extortion-able.” He’s correct. This is the society in which we’re taking part, and we’re all becoming more vulnerable because of it.
Earlier this year, on this blog, Trevor Hughes, CIPP, and Omer Tene called for strong ethical practices, even if laws are not being violated, in all organizations and explained how privacy pros can help fill that role by training “personnel to spot privacy issues and flag concerns.” Additionally, Prof. Ryan Calo has called for consumer subject review boards to help operationalize this need. It's high time companies start implementing strong ethical practices such as these.
Uber has been a hugely successful company for many reasons, but it appears that privacy was not as high a priority as it should have been. It's good to see the company now redoubling its privacy efforts and working to make privacy an inextricable part of its services; with Pearson and her team of privacy experts on board, it surely will. But at what cost? This would surely have cost less, both financially and reputationally, if Uber had considered privacy more deeply from the start.
As more and more companies spring up to provide killer new services, it’s likely that tons of sensitive personal information will be collected. How responsibly companies use that data will matter more than ever.
The big data lab experiments should continue, but with care. If companies aren’t careful, and don’t use their data responsibly and ingrain privacy and privacy experts into their products and services, a senator may send a highly publicized letter, or the Federal Trade Commission might one day come a-knocking alleging deception.
As Alex Howard wrote in a recent blog post, “With great data comes great power, and therefore responsibility.” This is where privacy pros have a role to play. Hopefully more start-ups realize that.