
The Privacy Advisor | Emerging technologies invite new marketplace competition


On March 16 and March 23, 2017, UC Hastings hosted symposia on internet and privacy law. During discussions of the new legal landscape made possible by the internet of things, audience questions prompted panelists to comment on the potential for user choice to push companies to adopt privacy policies more favorable to users. Here is a summary of what happened.

Competition based on privacy by physical design

At the Evolution of Privacy in the Age of Technology symposium at UC Hastings on March 16, 2017, an audience member asked the panel on regulating emerging technology, and the competing considerations involved, whether privacy-by-design considerations should include ways for users to set privacy settings on the connected devices themselves, rather than through an app interface.

In answering the question, the panelists, Robert Callahan, Joanne McNabb, Hannah Poteat, and Jamie Lee Williams, noted issues including the size of a connected device and how realistic it would be to include the buttons and/or screens necessary for this sort of functionality. Overall, the panel seemed to agree that it was potentially a good idea, but certainly not one that should be required by law; rather, such functionality could and should be encouraged by the marketplace.

Participants in the conversation noted that, as recently as five years ago, consumer-facing companies that were not required to comply with HIPAA began complying anyway and advertised that compliance as an additional feature to customers. That particular practice is not especially common today, but more and more companies are embracing privacy protections as a significant feature of their products. Ultimately, the experts present agreed that, in an era when data breaches are constantly in the news and consumers' homes are filling with more and more "always on" devices, companies that include physical switches on their devices, allowing older or simply less tech-savvy users to communicate their privacy preferences, will build better rapport with their customers and encourage wider adoption because they will have an easier time establishing user trust.

As a concrete example, imagine you purchase a smart TV with all the standard features: a power button, a menu button, multiple HDMI ports, and other video and audio ports. A modern smart TV is typically at least 32 inches, leaving plenty of space on the front, sides, and back for several ports and buttons. Some companies may offer the following settings as menu options, but for users who simply are not interested in digging through hundreds of menu options, perhaps the following feature set could be incorporated:

A switch (much like the one on a clock for turning an alarm on or off) with the following setting options: "always listening," "remote controlled" (for instructing the TV to listen only after pressing a button on the remote), and "never listening." Perhaps also a second switch with the following positions: "share data for improving user experience" and "only share data necessary to provide services." The actual labels on the switch settings would need to be short and to the point, but a reference to the printed manual that came with the TV would be acceptable, the panelists said.
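The switch positions described above amount to a small state table: the microphone may activate only when the physical switch permits it. A minimal sketch of that logic, using hypothetical names (MicrophoneMode, DataSharingMode, may_capture_audio) invented for illustration and not drawn from any real product:

```python
from enum import Enum


class MicrophoneMode(Enum):
    """Hypothetical positions of a physical microphone switch on a smart TV."""
    ALWAYS_LISTENING = "always listening"
    REMOTE_CONTROLLED = "remote controlled"  # listen only after a remote button press
    NEVER_LISTENING = "never listening"


class DataSharingMode(Enum):
    """Hypothetical positions of a physical data-sharing switch."""
    IMPROVE_EXPERIENCE = "share data for improving user experience"
    NECESSARY_ONLY = "only share data necessary to provide services"


def may_capture_audio(mode: MicrophoneMode, remote_button_pressed: bool) -> bool:
    """Return True only when the physical switch position permits listening."""
    if mode is MicrophoneMode.ALWAYS_LISTENING:
        return True
    if mode is MicrophoneMode.REMOTE_CONTROLLED:
        return remote_button_pressed
    return False  # NEVER_LISTENING: the switch overrides everything else
```

The point of a physical switch is exactly this hard override: whatever the software wants to do, the "never listening" position wins.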

Of course, there are other possible combinations which could provide competitive advantage to companies.

Lastly, privacy experts agreed that designers of IoT devices could create new competition in the marketplace by providing easy-to-use physical switches on their products which correspond to the types of privacy settings users are used to seeing inside apps and websites.

Competition based on privacy by regulation

At the Internet & Privacy Law Symposium on March 23, 2017, an audience member asked the panel discussing recent legal developments relating to the EU-U.S. Privacy Shield and the GDPR whether a user in Malaysia would have standing to sue a telemedicine doctor in Germany. The question arose because another member of the audience was advising a client that provides software allowing such a communication to take place.

For the record, the panelists, Mark Aldrich, Alexandra Ross, and Mark Webber, affirmed that, yes, the Malaysian user would have standing. More importantly, the panel clarified that the GDPR does not simply cover the personal data of EU citizens. Rather, the GDPR applies to the processing of personal data by controllers and processors in the EU, regardless of whether the processing takes place in the EU, and seemingly regardless of whether the specific data in question is in fact the personal data of a data subject residing in the EU. That last piece is critical: if the cost of services from a U.S. data processor is similar to the cost of services from an EU data processor, the choice for a U.S.-based user becomes a choice between a company required to comply with the GDPR's strict data protection guarantees and a company that is not. This choice could open a new avenue for business between users and European companies.

Panelists also discussed how large companies such as Google, Microsoft, and Adobe have so many European users that they must sort out how to abide by the GDPR, and it seems likely they will abide by it with respect to their European and non-European users alike. But they do not necessarily have to do so. Conversely, a European data processor likely has to comply no matter whose data is being processed. So, if you are a user who stores sensitive data in the cloud, a hosting service with physical storage in the EU suddenly sounds like a much better business proposition than a hosting service with physical storage outside the EU (and with no ties to the EU). That is, unless the second hosting service has affirmatively promised users that its data security mechanisms are at least as protective as those required under the GDPR, or advertises even more protection than the GDPR provides. Such a decision would reflect the company's recognition that "what the market will bear" is no less than the requirements promulgated by another jurisdiction.

Experts expanded the conversation by noting that, before the internet, users did not have similar choices. It would not have been reasonable for someone living in Malaysia to choose to see a doctor in Germany unless they frequently traveled to Germany. But now, with telemedicine, a patient may quite reasonably decide that video chatting with their doctor provides that doctor with all the information necessary for an adequate diagnosis. Once video chat is accepted as sufficient for adequate diagnosis, other considerations shift: the length of the drive to see the doctor, for instance, is no longer one of them.

Steering the conversation back towards legal issues, experts pointed out that now a new cost question comes into play: which regulatory body does a patient trust to best protect their session with the doctor? Patients are no longer limited to their home jurisdiction's regulatory body. Perhaps a patient is from India, but they think the EU regulatory bodies are doing a better job at preventing doctors from committing malpractice. The internet provides a window through which doctors can physically remain in the jurisdiction in which they are licensed, but patients can "visit" that jurisdiction from as far away as Antarctica.

In conclusion, privacy experts attending the symposium confirmed that service providers (including healthcare service providers) are liable for violations of the laws governing the provider's geographic location, irrespective of the geographic location of the user. This phenomenon could give users the opportunity to push for more user-protective privacy policies by declining to work with service providers subject to less protective regulations.
