
Privacy Perspectives | What "Ubergate" Means for Privacy Pros


Okay, so I know I'm not the only one writing about what the Twitter world collectively calls "Ubergate," but I think it's important to dig into this issue a bit and flesh out why it demonstrates the organizational need for privacy professionals and privacy ethics.

Much of this current firestorm centers on the alleged unethical use of sensitive personal data collected every day by the popular rideshare start-up.

To quickly sum up this latest privacy kerfuffle: A journalist, himself using questionable ethics, quoted Uber's senior vice president's off-the-record remarks about using data collected from the company's services to go after its enemies. But this isn't the only time the media has focused on what many sources have been calling a culture problem at Uber, particularly its questionable use of customer data. For example, according to venture capitalist Peter Sims, Uber used a “God View” for stalking purposes, while others have raised concerns about how the company has mined its data, as seen in its “Rides of Glory” and “crime location knowledge demand” blog posts.

The company, which was formed in 2009, does have a privacy policy, and it has repeatedly pointed out that it has clear policies against the use of travel logs outside of legitimate business purposes. A spokesman said, “These policies apply to all employees. We regularly monitor and audit that access."

Plus, on Thursday, a representative from Uber told me, "We have a privacy team within the legal department under managing counsel Katherine Tassi. Katherine reports to our GC and joined Uber this past summer" and that they "are continually building their team internally."

Regardless, this media flap is a nightmare. In addition to slews of negative headlines, it has gotten the attention of U.S. lawmakers, particularly Sen. Al Franken (D-MN). On Thursday, he sent the company a list of 10 questions he wanted answered about its data-handling practices.

Clearly, the lesson here is privacy cannot be ignored, pushed aside or minimized in organizations. Privacy must be addressed with experts who are trained to spot privacy issues and questionable uses. 

It looks like Uber is learning this important lesson, as evidenced by its internal privacy team improvements and its hiring of Harriet Pearson, CIPP/US, and the Hogan Lovells team. Uber is definitely in capable hands, but it could have avoided this firestorm and likely saved a lot of money if it had ingrained privacy and privacy professionals into its services years—instead of months—ago. As Pearson says, "Trust founded on confidentiality and information security lies at the very heart of Uber's business, and we will be working with the team to review and reinforce where appropriate its policies and systems."

Your customers care about what you do with their data, but more and more are feeling a loss of control. Just look at the recent Pew Internet Research report on this. UC Berkeley Law Prof. Chris Hoofnagle, in a report for The Washington Post, rightly said, “We have never in history been at a point where we were more extortion-able.” He’s correct. This is the society in which we’re taking part, and we’re all becoming more vulnerable because of it.

Earlier this year, on this blog, Trevor Hughes, CIPP, and Omer Tene called for strong ethical practices, even if laws are not being violated, in all organizations and explained how privacy pros can help fill that role by training “personnel to spot privacy issues and flag concerns.” Additionally, Prof. Ryan Calo has called for consumer subject review boards to help operationalize this need. It's high time companies start implementing strong ethical practices such as these.

Uber has been a hugely successful company for many reasons, but it appears that privacy was not as high a priority as it should have been. It's good to see the company now redoubling its privacy efforts and making privacy an inextricable part of its services. Now that it has Pearson and her team of privacy experts, it surely will get there. But at what cost? Surely, this would have cost less, both financially and reputationally, if Uber had considered privacy more deeply from the start.

As more and more companies spring up to provide killer new services, it’s likely that tons of sensitive personal information will be collected. How responsibly companies use that data will matter more than ever.

The big data lab experiments should continue, but with care. If companies aren’t careful, and don’t responsibly use their data and ingrain privacy and privacy experts into their products and services, a senator may send a highly publicized letter or the Federal Trade Commission might one day come-a-knocking for deception.

As Alex Howard wrote in a recent blog post, “With great data comes great power, and therefore responsibility.” This is where privacy pros have a role to play. Hopefully more start-ups realize that.

photo credit: Pensiero via photopin cc



  • Jason • Nov 21, 2014
    Again, another nod toward a good offense being the best defense...companies must start accepting this reality (and the associated costs) if they want to avoid kerfuffles of this kind.
  • Stephen • Nov 21, 2014
    Emil Michael off the cuff boasted (threatened) that the company "could dig up dirt on unfriendly reporters and share personal details about one particularly critical journalist". Later he said that was not how he really felt.
    Privacy is cultural. Privacy is a state of affairs in which an organisation is restrained in what it does with other people's information. Restraint and respect should be ingrained in any organisation that professes to take privacy seriously. It should be beyond the staff's imagining to do the things that Mr Michael mentioned. Even off the cuff or in a moment of frustration, company officers should not display such base instincts to exploit the special position they hold in respect of customer data.
    So, if the Hogan Lovells privacy review is to get to the bottom of this, they will have to go beyond checking the Privacy Policy and the usual auditing of whether staff have undergone privacy training. Such a review should try to pierce staff attitudes, uncover the signals that employees get as they function in the company, and examine whether Uber at its heart respects what it knows about its users.
    Steve Wilson, Constellation Research