Virtual reality and augmented reality, collectively known as extended reality, or XR, have been pitched as the next great communications platform and user interface. At next week’s Facebook Connect conference, the company will provide the next glimpse into its ambitions for immersive technology. Facebook is hardly the only Silicon Valley company with an interest in XR, but what should be noteworthy to privacy professionals is that Facebook has been explicitly discussing the need to bake privacy into XR.

Whatever your take on Facebook’s agenda, the time to begin developing XR privacy guidelines and controls is now. Growing numbers of consumers are expressing worry about how data collected via VR headsets and AR apps are used, and privacy compliance has emerged as the top legal risk impacting XR companies. An XR industry survey found that companies were more concerned with consumer privacy and data security than product liability, health and safety, or intellectual property. Industry groups, privacy advocates and technical standards bodies are positioning themselves to develop guidance for responsible XR and spatial technologies.

Privacy laws, like the California Consumer Privacy Act and the EU General Data Protection Regulation, may establish baseline privacy compliance expectations for companies building and using immersive technologies. But as a thoughtful piece for Privacy Perspectives has cautioned, compliance too often amounts to little more than California-specific disclosures, general-purpose access and opt-out portals, and generic privacy policies that say nothing about the company’s XR-specific practices at all.

More can be done. As a privacy advocate and XR enthusiast, I would recommend the following three actions.

First, improve transparency and begin making XR-specific data disclosures. The unique nature of immersive technologies offers companies yet another chance to do a better job with long-broken privacy policies. Even Facebook has acknowledged that XR raises new challenges and opportunities for privacy notification design, and the company specifically calls out the sensitivity of features like location mapping and eye-tracking technologies. While we wait to see how layered notices and just-in-time disclosures work in XR, companies can start providing more transparency about sensitive functionality now.
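What a just-in-time disclosure could look like in code is worth sketching. The following is a minimal Python sketch that gates a sensitive sensor behind an XR-specific notice shown at the moment of first use; the SensorDisclosure structure and request_sensor function are hypothetical illustrations, not any headset vendor’s actual API.

```python
# Hypothetical sketch of a just-in-time XR sensor disclosure.
# SensorDisclosure and request_sensor are illustrative names, not a real XR API.

from dataclasses import dataclass

@dataclass
class SensorDisclosure:
    sensor: str     # e.g., "eye_tracking", "room_mapping"
    purpose: str    # plain-language reason the app needs this sensor
    retention: str  # how long the data is kept, and where

DISCLOSURES = {
    "eye_tracking": SensorDisclosure(
        sensor="eye_tracking",
        purpose="Used only for foveated rendering; gaze data never leaves the device.",
        retention="Discarded each frame; no raw gaze data is stored.",
    ),
}

def request_sensor(sensor: str, prompt_user) -> bool:
    """Show the XR-specific disclosure at the moment of first use and
    enable the sensor only on an explicit opt-in."""
    disclosure = DISCLOSURES.get(sensor)
    if disclosure is None:
        return False  # no disclosure written for it -> the sensor stays off
    return prompt_user(f"{disclosure.purpose}\nRetention: {disclosure.retention}")

# Stand-in for an in-headset prompt; a real app would render UI here.
granted = request_sensor("eye_tracking", prompt_user=lambda text: True)
print(f"eye_tracking enabled: {granted}")
```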

AR point clouds and other sensor-based mapping functionality are necessary to sell the illusion of immersive technology, but companies need to recognize that XR users will come to view mapping functionality with the same apprehension they already direct at location tracking. This isn’t necessarily unique to XR. Consumers were surprised to learn their Roombas were mapping rooms as they cleaned, which suggests that any technology that analyzes, and may share, the layout of intimate spaces should be far more transparent about those capabilities.
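To make the sensitivity of mapping data concrete, the sketch below runs a crude bounding-box pass over a synthetic point cloud, assuming the kind of (x, y, z) points an AR session might accumulate, and recovers the room’s dimensions. The data is invented; the point is how little analysis a floor plan requires.

```python
# Illustrative only: synthetic (x, y, z) points standing in for what an AR
# mapping session might accumulate while a headset scans a room.

import numpy as np

rng = np.random.default_rng(0)
points = rng.uniform(low=[0.0, 0.0, 0.0], high=[4.2, 2.6, 3.8], size=(5000, 3))

# A simple bounding box already yields a floor plan: width, depth, ceiling height.
mins, maxs = points.min(axis=0), points.max(axis=0)
width, height, depth = maxs - mins
print(f"Inferred room: {width:.1f} m x {depth:.1f} m, ceiling {height:.1f} m")
print(f"Floor area: {width * depth:.1f} square meters")
```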

Similarly, with growing regulatory interest in facial recognition and biometrics, XR companies should be much more open about the types of biometric data that are collected or derived from these technologies. Clear explanations of how eye- or gaze-tracking technology is used must be a priority. Eye tracking has huge potential for XR, both as a touchless user interface option and as a way to reduce processing demands through a technique known as foveated rendering. However, eyes divulge the subconscious in ways that are difficult, if not impossible, for individuals to control.
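As a rough illustration of the foveated rendering idea, this sketch computes a per-pixel shading level that stays at full quality near the gaze point and falls off toward the periphery. Real systems implement this inside the GPU pipeline; the function, resolution and falloff parameters here are simplifying assumptions.

```python
# Conceptual sketch of foveated rendering: full shading quality near the gaze
# point, reduced quality in the periphery. Real implementations live in the
# GPU pipeline; the resolution and falloff here are simplifying assumptions.

import numpy as np

def foveation_levels(h: int, w: int, gaze_xy: tuple, fovea_radius: float) -> np.ndarray:
    """Per-pixel shading level: 1.0 inside the fovea, tapering to 0.25 outside."""
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.hypot(xs - gaze_xy[0], ys - gaze_xy[1])
    return np.clip(1.0 - 0.75 * (dist - fovea_radius) / fovea_radius, 0.25, 1.0)

# One eye of a hypothetical 1200x1080 display, gaze slightly left of center.
levels = foveation_levels(1080, 1200, gaze_xy=(500, 540), fovea_radius=150.0)
print(f"Mean shading rate: {levels.mean():.2f} (1.0 = full resolution everywhere)")
```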

The privacy implications are considerable. Eye tracking has been called “advertising’s holy grail,” serving as an unconscious “like” button for everything. Research has shown that pupil dilation, which eye-tracking technologies can measure with relative ease, indicates an individual’s level of interest in what they are looking at and can reveal a user’s age, gender or race. It can be used to ascertain sexual attraction and arousal. And that’s before we consider galvanic skin response, EEG (a measure of brain waves) and other XR mechanisms for detecting users’ states of mind. XR companies would be wise to take some uses off the table entirely, but the lack of clear communication about how this technology is developing is equally problematic.

Second, embrace transparency reporting and technical solutions that restrain data sharing. Facebook’s recent announcement that its Oculus headsets would locally record users’ experiences in its Horizon VR app on a rolling basis highlights the surveillance potential of XR. Researchers at the University of Washington have examined how XR presents another tension point for the Fourth Amendment’s third-party doctrine, and it is not an exaggeration to say that metaverse enthusiasts worry immersive technology opens the door to a “total surveillance state.”

XR companies will need not only to develop clear policies for what happens when law enforcement comes calling, but also to consider whether certain XR data, like raw eye-tracking streams or sensor image feeds, should be stored only locally. Device-level machine learning and daisy-chaining to more powerful local devices could be mechanisms to improve users’ experiences without creating a potential data free-for-all.
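Here is a minimal sketch of that “process locally, share coarsely” approach, assuming a stream of normalized gaze coordinates: raw samples are reduced to an aggregate summary on the device, and only the summary would ever be a candidate for leaving it. The function name and data shapes are illustrative, not any platform’s real API.

```python
# Illustrative "process locally, share coarsely" sketch. The raw gaze stream
# never leaves the device; only coarse aggregate statistics are shareable.

import statistics

def summarize_gaze_on_device(raw_samples):
    """Reduce raw normalized (x, y) gaze samples to coarse statistics.
    The caller discards the raw stream once the summary is computed."""
    xs = [x for x, _ in raw_samples]
    ys = [y for _, y in raw_samples]
    return {
        "mean_gaze": (round(statistics.mean(xs), 1), round(statistics.mean(ys), 1)),
        "sample_count": len(raw_samples),
    }

raw = [(0.42, 0.61), (0.44, 0.60), (0.45, 0.63)]  # stand-in raw eye-tracking data
summary = summarize_gaze_on_device(raw)
del raw  # the raw stream is never persisted or uploaded
print(summary)
```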

Moving forward, companies will need to consider how to process XR data in privacy- or obscurity-protecting ways. The privacy issues Google Street View initially faced could be instructive. After capturing snapshots of bystanders picking their noses and entering adult bookstores, Google developed tools to automatically blur faces and license plates and offered options for individuals to request additional blurring of houses and bodies. In addition to this more visible data collection, however, Street View also captured network SSIDs and other Wi-Fi data, forcing the company to halt the practice and answer to regulators around the world.
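For a sense of what such tooling involves, the sketch below blurs detected faces with OpenCV’s bundled Haar cascade detector. It illustrates the general technique rather than Google’s actual pipeline, and the input filename is a stand-in.

```python
# General face-blurring technique (not Google's actual pipeline), using
# OpenCV's bundled Haar cascade. "street.jpg" is a stand-in input file.

import cv2

image = cv2.imread("street.jpg")
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
faces = detector.detectMultiScale(
    cv2.cvtColor(image, cv2.COLOR_BGR2GRAY), scaleFactor=1.1, minNeighbors=5
)

for (x, y, w, h) in faces:
    # Replace each detected face region with a heavy Gaussian blur.
    face = image[y:y + h, x:x + w]
    image[y:y + h, x:x + w] = cv2.GaussianBlur(face, (51, 51), 0)

cv2.imwrite("street_blurred.jpg", image)
```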

Third, recognize that XR implicates identity, inclusion and user safety. Social VR remains a small and exclusive community. It should surprise absolutely no one that women are vastly outnumbered in social VR, and early research suggests rampant harassment in virtual environments.

Women often choose non-gendered avatars, like robots, to disguise themselves and avoid harassment. The integrity of digital representation and identity goes both ways. In VR, individuals can use an avatar to broadcast how they wish to be seen by friends and strangers alike. Augmented reality, meanwhile, may give individuals the ability to see other people however they wish. Filters that let people give themselves bunny ears or cat whiskers can be presented as a novelty, but one can imagine AR filters that make strangers on the street look more or less attractive, creating new dynamics for human empathy and cruelty.

Snapchat, for instance, received blowback for offering a Bob Marley selfie lens to promote “4/20,” raising questions about digital blackface and the promotion of weed-culture stereotypes. Sephora’s Virtual Artist lets users try on digital makeup, but what is to stop digital makeup from becoming an individual’s permanent reality? These types of augmentations and filters may have spillover effects on users’ physical bodies and sense of self: “Snapchat dysmorphia” is a new term for the trend of people getting plastic surgery to look like the filtered versions of themselves that appear on social media. Too often, unfortunately, marketing and product teams are not considering how immersive technologies can negatively affect underrepresented communities.

These are less hard-and-fast privacy issues, but privacy professionals need to understand that XR will bring together an uncertain collision of privacy rules, tort regimes, intellectual property law and accessibility considerations. Companies entering this brave new world need to craft a vision for how they expect to use XR. Privacy and security will need to be considered alongside inclusive design and user safety. The fate of our new digital reality may hang in the balance.
