
User-Managed Access for the New IoT Economy


No one, not even privacy pros, likes the consent experiences we’re given in ordinary online contexts. We go from forced opt-in checkboxes to cookie-directive-compliant OK buttons, wondering where our lofty Fair Information Practice Principles about user control went.

And just where did they go?

Often they’re buried under a pile of business risk mitigation. The IAPP’s own salary survey showed that the top driver for corporate privacy funding was meeting compliance obligations, and more than half of privacy groups reported into a legal or compliance department.

But a funny thing happened on the way to the API economy: A new wave of companies developed two new kinds of consent tools to meet the needs of emerging data-sharing requirements—OAuth and the “Share” button—and go beyond the minimums required by certain regulations.

OAuth: The Standard that Protects Any API

When I worked on consumer identity services at PayPal in 2009–2010, I saw the first new consent tool flower in real time. Facebook was building an API platform for user data access and was using the OAuth mechanism to secure it. OAuth lets users consent to and revoke the sewing together of third-party apps and Facebook APIs at will. We very much wanted to borrow this new tool to replicate that success, as the platform was going viral.

Yes, I did just say that: Facebook popularized a new, powerful user-consent tool.

OAuth has some very nice properties. It can protect any API, so it’s not just for personal data attributes. It can be used for any type of data retrieval—think Twitter feeds and Flickr photos—and it can even be used to put data and content back into the service. The extent of access granted can be limited to a subset of the possible operations. Moreover, OAuth is a standard, now in its second version, meaning that many companies can build to the same protocol and achieve interoperability.
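The scope-limiting property is easy to see in code. Here is a minimal, hypothetical sketch (the scope names are invented, not any real API's) of how a resource server can check a token's granted scopes before allowing an operation:

```python
# Sketch of OAuth-style scoped access. A token carries only the scopes the
# user consented to, and the API enforces them per operation.
# Scope names here are illustrative, not from any real provider.

GRANTED_SCOPES = {"photos:read", "feed:read"}  # user consented to read-only access


def is_allowed(required_scope, granted_scopes):
    """Return True if the token's granted scopes cover the requested operation."""
    return required_scope in granted_scopes


print(is_allowed("photos:read", GRANTED_SCOPES))   # reading photos was granted
print(is_allowed("photos:write", GRANTED_SCOPES))  # writing was never granted
```

The point is that consent is expressed as a bounded grant the API can enforce, rather than a blanket yes/no.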

Share: A Modern Approach to Consent for Data Access

The second consent tool is something you might already use every day but not think of in “consent” terms. This tool is the “Share” feature in Google Apps and similar online productivity services.

Think about it. You hit that button specifically to share sensitive data with someone else, and you can select the scope of sharing depending on the type of service. For example, in Google Docs, a drop-down menu allows a document owner to give individuals the ability to edit, comment on or view documents. Using TripIt, a traveler can share trip details with trip planners, fellow travelers and viewers, giving different levels of access automatically to each category. Even more compelling, file-sharing services such as Box enable users to configure the length of time individuals can have access to content.
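The Share pattern combines a role (view, comment, edit) with an optional expiry. A toy model, loosely mirroring Google Docs roles and Box-style time-limited access (the class and field names are my own, not any vendor's API), might look like this:

```python
from datetime import datetime, timedelta, timezone

# Toy model of a "Share" grant: a role plus an optional expiry.
# Higher-numbered roles imply the lower ones (edit implies comment and view).
ROLES = {"view": 1, "comment": 2, "edit": 3}


class ShareGrant:
    def __init__(self, grantee, role, expires_at=None):
        self.grantee = grantee
        self.role = role
        self.expires_at = expires_at  # None means the grant does not expire

    def permits(self, action, now=None):
        """Check whether the grant still allows the requested action."""
        now = now or datetime.now(timezone.utc)
        if self.expires_at is not None and now >= self.expires_at:
            return False  # time-limited access has lapsed
        return ROLES[self.role] >= ROLES[action]


grant = ShareGrant("ted@example.com", "comment",
                   expires_at=datetime.now(timezone.utc) + timedelta(days=7))
print(grant.permits("view"))  # commenters can also view
print(grant.permits("edit"))  # edit was never granted
```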

If that isn’t consent for data access, what is?

The Share model makes a lot of sense. It’s proactive, so users get ahead of the consent curve, and people can share with other parties rather than just between applications that they themselves log into and use.

Cynicism in the IoT Economy

In the post-Snowden era, people have finally become cynical about where their personal data is flowing. But demands and opportunities for data sharing are only going to increase as we rush headlong into the IoT economy, a world where net-connected devices are built on top of the API economy.

These tensions are born of fear and greed.

Sure, people are looking to protect themselves. But they’re finally finding killer apps in cool devices that give them new ways to selectively share content. I recently saw an ad for a smart CPAP machine (used to treat sleep apnea) that comes with a free iOS app to “view and share data.” People aren’t posting that data on Facebook; it’s meant to be shared with medical professionals and perhaps in online forums with fellow patients.

So we now have the million-dollar question: How are we going to handle data access requirements when all of the devices, appliances and vehicles in our lives are talking to the net? We need to step up to new consent tools, and quickly. That’s where the new User-Managed Access (UMA) standard comes in.

UMA is like a combination of OAuth, the standard on which it’s built, and the Share paradigm, all wrapped up into one. It protects any API on a standardized, scoped basis. It lets people share directly with other parties and revoke access authorization whenever they see fit. Individuals can set, view and change sharing preferences from a single online control console—like a digital footprint dashboard.
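To make the dashboard idea concrete, here is a toy sketch of a central control console. This models the user experience UMA enables, not the UMA protocol itself; all names are illustrative:

```python
# Toy model of a UMA-style central dashboard: the resource owner registers
# sharing preferences once, then grants and revokes access per party and per
# scope from a single place. This illustrates the concept, not the wire protocol.


class ShareDashboard:
    def __init__(self):
        # (resource, party) -> set of granted scopes
        self.policies = {}

    def grant(self, resource, party, scopes):
        self.policies[(resource, party)] = set(scopes)

    def revoke(self, resource, party):
        self.policies.pop((resource, party), None)

    def check(self, resource, party, scope):
        return scope in self.policies.get((resource, party), set())


hub = ShareDashboard()
hub.grant("allergy-record", "ted", {"view"})
print(hub.check("allergy-record", "ted", "view"))  # access granted
hub.revoke("allergy-record", "ted")
print(hub.check("allergy-record", "ted", "view"))  # access revoked at will
```

The key property is that grant and revoke live in one place, no matter how many services and devices hold the underlying data.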

Healthcare Information Sharing: A Real-World Practical Application

Imagine a health data-sharing ecosystem in which service providers such as Dr. Bob’s BHealthy practice and devices such as the Beat blood pressure monitor work with Alice’s UMA service, called ShareHub. She can hit the UMA-enabled Share button in the BHealthy portal to share the allergy portion of her health record with her husband Ted ahead of an overseas trip she’s taking and manage that preference through ShareHub.

Separately, Alice can hit Share in the portal of the Beat blood pressure monitor service to ensure Dr. Bob gets an ongoing view of her blood pressure and then manage that preference through ShareHub. Alice can visit ShareHub any time she wants to update her privacy settings. She can remove her previous doctor’s authorization and ensure her new doctor gets access, add her husband or other family members or simply revoke sharing entirely.

This scenario is not far from becoming a reality.

The UMA standard reached stability at Version 1.0 in March this year. And a new standards group called Health Relationship Trust (HEART) kicked off in January to work through and document the specifics of using OAuth, UMA and two other standards, OpenID Connect and the Fast Healthcare Interoperability Resources (FHIR) API. Together, these building blocks support security- and privacy-sensitive use cases for sharing patient-centric health data.
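FHIR gives the data itself a standard shape. A heavily abbreviated, FHIR-flavored blood pressure Observation, sketched here as a Python dict (a real resource would carry coded terminology systems such as LOINC and richer metadata), looks roughly like this:

```python
# Abbreviated FHIR-style Observation for a blood pressure reading.
# Field names follow the FHIR Observation resource; the content is a sketch,
# not a complete or validated resource.

observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {"text": "Blood pressure"},
    "subject": {"display": "Alice"},
    "component": [
        {"code": {"text": "Systolic"},
         "valueQuantity": {"value": 120, "unit": "mmHg"}},
        {"code": {"text": "Diastolic"},
         "valueQuantity": {"value": 80, "unit": "mmHg"}},
    ],
}

# A consuming service (say, Dr. Bob's portal) can read the structured values.
systolic = observation["component"][0]["valueQuantity"]["value"]
print(systolic)
```

Because the shape is standardized, an UMA-protected API can expose exactly this resource, scoped and shared on Alice's terms.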

The bottom line is that if we just had to worry about providing personal information to websites, then opt-out checkboxes might suffice—if they offered true choice! In the API economy, OAuth and proprietary Share buttons might offer a sufficiently robust consent framework.

If I’m ever going to outfit my condo with smart light bulbs and appliances made by more than one manufacturer, list my condo on AirBnB, head out of town for a week and expect to get back full control of all my devices at the end of that week, then my device data streams had better be integrated with and controllable through a convenient central dashboard. Otherwise, fear—and one really devastating household hack—will kill the consumer IoT proposition before greed gets a chance to take hold.


4 Comments


  • comment Jon • Jul 14, 2015
    Both Eve and her idea are brilliant and often necessary.  Beware -- all you context-sensitive, norm-loving post-consenters -- the IoT, lest it turn you from a surfer into a serf.     http://datalaw.net/my-privacy-solution-for-the-internet-of-things/
  • comment Susan • Jul 14, 2015
    Thank you for this article. Using "share" is a great idea and using Google Docs as an example makes its usefulness clear. The other concern for consumers is once the doc or data is shared, what will happen next? Can the share be limited to the person intended or can that person share the data (or sell it) to another? I'm sure limitations can be built into the equation. Good information, Eve.
  • comment Jay • Jul 15, 2015
    All this is true, and excellent advice, as far as it goes. But what Share does not address is the default-"sharing" which we wouldn't necessarily (or wouldn't at all) choose to do. "Share" as described above is always user-voluntary, directly in the proactive interest of the data subject. The problem with "Consent" in its current, broken model is that what we're asked, or demanded, or opted-in to, or just plain forced (sometimes silently) to "Consent" to is sharing of things _that we don't want to share, or can't even understand_.
    So, while "Share" is great, it doesn't solve the real problem of the current, broken Consent model :-(
  • comment Eve • Jul 16, 2015
    Jon -- I've never been called "brilliant and often necessary"; thanks for that. :-)
    
    Susan -- We've found that a UMA-protected API would be the entity in control of the resharing behavior you describe. Of course, without encryption (or DRM or other) technical controls layered on top, all kinds of downstream sharing shenanigans are hard to avoid 100%, but such technical controls add enough friction to the equation that they inhibit the adoption of UMA-style trackable sharing ecosystems in favor of passing around copies illicitly. Another approach is to add enforceable business/legal layers, which we're working on enabling now for a complete "BLT sandwich". :-)
    
    Jay -- So true that UMA isn't close to the whole story, sad to say. But there is an interesting design pattern of UMA applicability in traditional consent scenarios that can turn disempowered consent into empowered "access approval". Where business models allow -- and perhaps where future regulatory regimes encourage or require? (I'm thinking here of interesting parts of the forthcoming EU Data Protection revisions) -- this design pattern could prove interesting.
    
    Thank you all for joining the conversation, and I look forward to other comments.