
Privacy Perspectives | On building an internet of safe and useful things


With the booming market for voice-controlled virtual personal assistant devices like Google’s Home and Amazon’s Echo, and warnings from the former head of MI5 about hackable smart toilets (which, frankly, doesn’t scare me quite as much as a hackable Boeing 757), the attention of regulators is finally turning to the impact of the internet of things on our privacy and security.

When every little device can be connected to the internet, the questions include:

  • Will the device stop working if it needs a security patch?
  • Will it be up to the consumer to check for, find and install security patches? But what if the manufacturer doesn’t even issue security patches?
  • Can the device be remotely controlled by the manufacturer? What will prevent a garage door opener from being remotely disabled by the manufacturer because the customer left a bad product review?
  • Can IoT data be ‘monetized’ by the manufacturer for secondary purposes? 
  • Are customers being adequately informed about how their data is used? A global sweep by data protection authorities found that 71 percent of IoT devices do not offer a privacy policy or collection notice to the consumer. And for those that do, the terms and conditions are ridiculously long and complex. (I love this stunt, in which a consumer advocacy group asks the Norwegian Consumer Affairs Minister to go for a jog while they read her the privacy policy from her fitness tracker. The Minister manages to run 11km in the time it takes for the policy to be read out to her.)
  • Can insurers access consumer data collected by smart devices? As if I want my life insurer finding out from my Fitbit data that I decided to stay home with a packet of Pringles instead of going out for a jog. Every night.
  • Can the data be accessed by law enforcement? Can driverless cars be controlled by police?
  • Can the device be manipulated by third parties? My favorite recent stories include the Burger King TV ad designed to trigger Google Home devices into describing its products; and the Amazon Echo being used by a child to accidentally buy a dollhouse from Amazon, which, when reported on the TV news, in turn prompted the devices in homes where that story was playing to also buy dollhouses.
  • Can the device be remotely controlled by hackers? Everything from kids’ toys to your own, er, adult toys may be vulnerable.
  • Can simple devices be used to access more valuable systems or data? When a casino was hacked via its IoT fish tank, I felt like real life is now disturbingly like the absurdist plot of one of those bad sequels to Ocean’s Eleven.

When stories like these abound, it is no wonder that consumers’ fears about privacy are suppressing demand for IoT devices.

And so just in time, it seems, the IoT Alliance Australia has published its industry-led Good Data Practice guidelines. Like the many examples cited above, the IoTAA guidelines focus on consumer-facing IoT devices, rather than bigger-ticket "smart cities" programs to manage lighting, parking, traffic, energy and waste in public spaces, which, as the IoTAA recognizes, raise their own privacy issues and thus need more specific guidance.

In addition to addressing the kinds of security vulnerabilities illustrated above, the IoTAA guidelines delve deeper into privacy concerns. Indeed, some of their Good Data Practice principles will sound pretty familiar to anyone who already works in privacy, like: one, follow privacy law; two, build in privacy by default and use privacy by design; and three, be accountable. These are sound principles which should apply in any sector.
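For readers on the engineering side, here is a minimal sketch of what "privacy by default" can mean at the level of a device's shipped configuration: every invasive feature is off until the consumer turns it on. The setting names below are invented for illustration, not drawn from any real product.

```python
# A hypothetical illustration of "privacy by default": the device ships with
# every data-hungry feature switched off, and collects more only if the
# consumer explicitly opts in. All setting names are invented for this sketch.
from dataclasses import dataclass


@dataclass
class DeviceSettings:
    retain_voice_recordings: bool = False      # nothing stored unless opted in
    share_telemetry_with_vendor: bool = False  # no silent phoning home
    use_data_for_marketing: bool = False       # no secondary purposes
    process_audio_locally: bool = True         # on-device processing by default


def factory_defaults() -> DeviceSettings:
    """Privacy by default: the least invasive configuration is the start state."""
    return DeviceSettings()


settings = factory_defaults()
assert not settings.share_telemetry_with_vendor  # opt-in, never opt-out
```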

But there are also some privacy challenges which are comparatively novel in this world of IoT, and the IoTAA guidelines call these out. For example, IoT devices in the home may be used by people other than the consumer who purchased them; and indeed those other individuals may have no awareness that the devices exist, let alone are already monitoring them. 

(Which reminds me of the time a couple of years ago when I popped over to a friend’s place and we chatted for a while before he suddenly said ‘Hey Alexa, turn down the music’ and, once I realized that there wasn’t actually some other person called Alexa hiding around the corner, it completely freaked me out to think that the device on the kitchen bench I had thought was some trendy new Scandi pepper grinder was actually listening to everything I said. Luckily, all we had been talking about was the cost of kitchen renovations, because, apparently, that’s what we Sydneysiders do.)

The guidelines also note: “Many B2C IoT services reach into homes and other domestic, sometimes intimate environments, and enable observations and inferences as to private behaviour that otherwise are not possible." So, like, for all those times when conversations and activities in the home are a little more spicy than discussions about what type of splashback to choose.

The proposed responses are eminently sensible too, with the guidelines stating that device designers and manufacturers need to design on the basis that the device must be safe for a child to use, and that communications to consumers must be understandable by someone with a “reasonable but below average” level of literacy. The guidelines also criticize attempts by manufacturers to shift data security risks onto consumers, noting that consumers cannot be expected to constantly monitor the use of their devices, or to be knowledgeable or skilled enough to install updates and patches.
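The patching point, in particular, is one designers can act on directly. As a rough sketch, assuming a vendor-hosted update service (fetch_latest_manifest below is a made-up stand-in, not any real manufacturer's API), the device itself checks for, verifies and applies patches, so the consumer never has to:

```python
# A hedged sketch of manufacturer-driven patching: the device checks for
# updates on a schedule, verifies the image, and installs it with no consumer
# steps. fetch_latest_manifest() is a stand-in for a real vendor API call;
# nothing here reflects any actual manufacturer's update service.
import hashlib

INSTALLED_VERSION = (1, 4, 2)


def fetch_latest_manifest() -> dict:
    """Stand-in for an HTTPS call to the vendor's update endpoint."""
    image = b"...signed firmware bytes..."
    return {
        "version": (1, 4, 3),
        "image": image,
        "sha256": hashlib.sha256(image).hexdigest(),
    }


def apply_update_if_available() -> bool:
    manifest = fetch_latest_manifest()
    if manifest["version"] <= INSTALLED_VERSION:
        return False  # already up to date
    if hashlib.sha256(manifest["image"]).hexdigest() != manifest["sha256"]:
        return False  # refuse corrupted or tampered images
    # A real device would now flash manifest["image"]; the point is that
    # the manufacturer, not the consumer, carries the patching burden.
    return True


print("updated:", apply_update_if_available())
```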

The guidelines also deliberately take a broader view of the data that needs to be considered in a device’s design: beyond information about identifiable individuals (i.e., what privacy law now protects as ‘personal information’), they also cover data they describe as ‘private’ because it is “domestic or confidential in nature,” even if no individual is identifiable from it.

Bravo, IoTAA, for recognizing that privacy harms can arise through what I describe as individuation, as well as identification. And hooray, the guidelines also stress data minimization. In other words, let the consumer be just a consumer, not the product.
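To make data minimization concrete, consider a hedged sketch for something like a fitness tracker: derive the single value the feature needs on the device itself, and discard the raw sensor stream instead of shipping it to the vendor. The function names, threshold and payload shape here are purely illustrative.

```python
# An illustrative take on data minimization for a step-counting wearable:
# only the derived daily total ever leaves the device, never the raw sensor
# stream. The threshold, names and payload shape are all invented here.
from typing import Iterable

STEP_THRESHOLD = 1.2  # hypothetical acceleration magnitude treated as a step


def count_steps(samples: Iterable[float]) -> int:
    """Derive the one statistic the feature needs, on the device itself."""
    return sum(1 for magnitude in samples if magnitude > STEP_THRESHOLD)


def build_sync_payload(samples: list[float]) -> dict:
    steps = count_steps(samples)
    samples.clear()                # the raw readings are discarded, not uploaded
    return {"daily_steps": steps}  # the consumer's behaviour stays at home


print(build_sync_payload([0.9, 1.5, 1.4, 0.8, 1.3]))  # {'daily_steps': 3}
```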

I believe these new guidelines are a really positive step toward helping designers and manufacturers figure out how to build us the "Internet of Safe And Useful Things," instead of an "Internet of Stupid Dangerous Invasive Things."

Hopefully they will give manufacturers pause for thought before they continue down the road of just sticking a chip in everything. 

From smart clothes pegs telling you when your clothes are dry, to a smart umbrella telling you it’s raining, to hair brushes telling you, um, something about how to brush your hair, to smart toilet rolls that can identify you (why??), there is mirth to be found in poking fun at all the dumb things allegedly turned "smart."

Because really, does anyone genuinely need their dental floss dispenser to be connected to the rest of the world? I would suggest that our obsession with building connectivity into every little thing is getting out of hand, and it is time for a sensible re-think. 

As technologist Vikram Kumar told a Technology & Privacy forum, sometimes, “the best interface to control your lights is a light switch."

photo credit: marcoverch Gelbes Band zum Abgrenzen von Tatorten, Unglücksstellen und Baustellen via photopin (license)
