
Privacy Perspectives | CES 2015 Dispatch: Challenges Multiply for Privacy Professionals, Part Two

Editor's Note:

This is the second and final dispatch from George Mason University’s Mercatus Center Senior Research Fellow Adam Thierer. In his first installment, Thierer discussed some of his major takeaways from the Consumer Electronics Show last week in Las Vegas, NV, including how the Internet of Things (IoT) has come of age and wearables have become more fashionable. You can find part one here.

Miniaturization of Cameras Continues

Toshiba Glass was one of the most interesting wearable technologies I demoed at this year’s CES. It is Toshiba’s rival to Google Glass and works in much the same way, with a pop-up lens display offering the user an augmented reality experience. In that sense, it breaks no new ground and raises no novel privacy concerns beyond those already associated with Google Glass. There was one important difference, however: Toshiba Glass was lighter and more stylish than Google Glass. It felt more like a regular pair of spectacles that you could fold and put in your pocket. That signals the broader trend I noticed at CES 2015 of wearables becoming more fashionable and practical. In turn, that could create new privacy challenges due to the increased potential for hidden, surreptitious uses of these technologies.

I was thinking about that problem as I walked through the GoPro booth. The irony of the GoPro exhibit at CES 2015 was that the company created a massive booth to highlight its increasingly miniature camera technologies. While mini wearable cameras have been around for many years now, what GoPro and its many competitors at CES demonstrated again this year is that we should never underestimate just how small, lightweight and portable innovators can make these technologies.

Just a few years ago, a minicam mounted on your helmet, handlebars or car dash could be large, heavy and unsightly. Today, however, consumers don’t always need to “mount” some of these wearable cameras. Instead, with technologies such as Narrative Clip, they can pin them on their lapels or wear them like jewelry.

Again, privacy professionals will need to think creatively about how to curtail inappropriate uses, as well as how to limit the aggregate amount of data and/or video that is retained or opened to secondary use. But they will be fighting their own customer base in the process, since the whole point of most of these technologies is to capture as many of life’s moments as possible.

It strikes me that this is another area where “tech etiquette” and privacy education become more important than ever. There’s only so much that can be done up front to bake privacy and security into some of these products. And “notice and choice” frameworks are increasingly unworkable for this class of technologies, since cameras that record continuously in public spaces leave no practical way to enforce those principles strictly. Use-based restrictions are likely to become the focus of legal discussions about wearable cameras going forward. In the meantime, much more could be done by innovators, trade associations, government bodies, advocates and educators to address the use of body cameras and to educate users about where, when and how these technologies should and should not be used.

Drones Getting Smaller, Cheaper and Easier To Use

Relatedly, as cameras continue to get smaller, it becomes even easier to get them airborne with various unmanned aerial systems, or “drone” technologies. As Sam Pfeifle reported from CES last week, a wide variety of drones were on display, although they were required to be operated inside netted cages. In the real world, however, there won’t be any nets as average citizens increasingly integrate drones into their lives as a sort of personal assistant. And that raises both safety and privacy concerns.

For example, “drone selfies” have become a thing. One drone exhibitor (Zano) used the motto: “Taking Your Selfies to New Heights,” showcasing its personal drone technology that can fly above you and your friends to snap shots or video. Personal drones could also accompany you on nature walks and runs or perhaps even down city sidewalks.

A huge policy battle is already playing out over drone safety, with the Federal Aviation Administration (FAA) taking extremely cautious steps to open up the nation’s airspace to private drone technologies. But the FAA is not as concerned about the privacy implications of drones, which fall outside its jurisdiction. If the new personal drone technologies on display at CES 2015 really do take off, privacy professionals will have to think hard about how to limit inappropriate use. Some privacy advocates have suggested preemptive constraints on drone use need to be established, while others, including myself, have argued that existing statutes and torts might be able to handle the most egregious misuses of these technologies.

Again, at a minimum, privacy professionals should be pushing these innovators to take steps to better educate their users about inappropriate uses. Late last month, a collaborative “Know Before You Fly” drone education campaign took flight, offering new users advice about how to safely use these aerial technologies. Developers will need to step up these efforts and make sure they highlight specific privacy guidelines in addition to tips for safe use.

Robotics Gets More Personal

A final trend from CES 2015 that privacy professionals should be paying attention to is the rapid growth of robotic technologies, which were ubiquitous at the show. As if Toshiba Glass weren’t amping up the “creepy” factor enough, right next to it Toshiba was displaying its new “Communications Android,” an extremely lifelike female robot that had more than a few people wondering if we’ve arrived at the “Uncanny Valley,” i.e., the point where robots are so lifelike that people develop a sense of unease about them.

Privacy professionals should care about “personal care robotics” because that market is poised to take off in a major way.

The Toshiba Communications Android was billed as “A Robot for Tomorrow’s Service Industry and Homes.” This foreshadows a day when the world’s rapidly aging population will need more care than we have actual humans to provide. To the extent personal care robots fill that gap, we can expect them to be fully networked, data-collecting machines that will know as much about us as any human caregiver, possibly much more. And much of the information these machines will collect about us will be highly sensitive in nature because we will be bringing them into the most intimate quarters of our lives.

For privacy professionals advising innovators in this space, there are some obvious steps that can be taken to ensure that privacy and security are baked into robotic assistants from the start. The Health Insurance Portability and Accountability Act and other privacy laws and standards could be implicated, and privacy professionals will need to make sure the robotic innovators they are advising are in compliance.

Meanwhile, a host of thorny ethical issues will also accompany robotic assistants, many of which have been explored in a captivating collection of essays on Robot Ethics, edited by Patrick Lin, Keith Abney and George A. Bekey. This signals an evolving role for privacy professionals, as they will increasingly be expected to grapple with profound ethical concerns that transcend traditional privacy and security issues. Some academics even suggest that we need to supplement chief privacy officers and chief information officers with “data ethicists” or “robot ethicists” who can advise companies and trade associations on these matters.

It’s hard to argue against that notion, but the challenge is that, as noted above about many other technologies, companies will often be expected to broaden, not shrink, the potential range of uses associated with these technologies. Many consumers will demand a level of personalization and customization that puts robotics on a collision course with many traditional ethical norms and potential social taboos. If you think sexual relations with robots will be an uncomfortable issue to address, just wait till some consumers demand—and innovators offer—robots that can help administer drugs! Can an algorithm be written that lets a robot administer a safe dose of prescribed medicine while refusing an owner’s request for a recreational drug, or perhaps even a fatal dose of something as part of an attempt to end suffering?
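To make that question concrete, here is a minimal sketch of the kind of guard logic such a robot might need: a check that refuses any request outside a prescription's limits. All names, drugs and limits here are hypothetical illustrations, not any vendor's actual design, and a real system would need far more safeguards (identity checks, logging, clinician overrides).

```python
from dataclasses import dataclass

@dataclass
class Prescription:
    drug: str
    max_single_dose_mg: float
    max_daily_dose_mg: float

def authorize_dose(prescriptions, drug, dose_mg, given_today_mg):
    """Approve a dose only if it fits within the prescribed limits."""
    rx = prescriptions.get(drug)
    if rx is None:                       # drug not prescribed at all: refuse
        return False
    if dose_mg > rx.max_single_dose_mg:  # exceeds single-dose ceiling: refuse
        return False
    if given_today_mg + dose_mg > rx.max_daily_dose_mg:  # cumulative ceiling
        return False
    return True

# Hypothetical prescription: 10 mg per dose, 30 mg per day
rx = {"drug_a": Prescription("drug_a", 10.0, 30.0)}
print(authorize_dose(rx, "drug_a", 10.0, 0.0))   # True: within limits
print(authorize_dose(rx, "drug_a", 50.0, 0.0))   # False: exceeds single dose
print(authorize_dose(rx, "drug_b", 5.0, 0.0))    # False: not prescribed
```

The harder problem, of course, is not writing the check but deciding who sets the limits and whether the owner may ever override them; that is an ethical question, not an engineering one.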

In sum: The world of technology is changing rapidly and so, too, must the role of the privacy professional. The technologies on display at CES 2015 make it clear that a whole new class of concerns is emerging that will require IAPP members to broaden their issue set and find constructive solutions to the many challenges ahead. If nothing else, the questions on IAPP certification exams promise to get a lot more interesting in years to come!

Top photo courtesy of Adam Thierer.

