In a sign of the dangerous times we're living in, and with most of the U.S. self-quarantining to protect health during the COVID-19 pandemic, the U.S. Senate Committee on Commerce, Science, and Transportation held an unusual "paper hearing" on how to responsibly employ big data to combat the spread of the virus. Witnesses submitted testimony to the committee and will now have a "96-business-hour turnaround time to answer member questions," which will be made public via the Senate committee's website. 

"As the public and private sectors race to develop a vaccine for this deadly disease, government officials and health-care professionals have turned to what is known as 'big data' to help fight the global pandemic," said Committee Chairman Roger Wicker, R-Miss., in his opening statement. "In recognition of the value of big data, Congress recently authorized the [Center for Disease Control], through the bipartisan coronavirus relief package, to develop a modern data surveillance and analytics system. This system is expected to use public health data inputs — including big data — to track the coronavirus more effectively and reduce its spread. State governments are also using big data to monitor the availability of hospital resources and manage supply chains for the distribution of masks and other personal protective medical equipment." 

But it's not only the CDC that's using big data to track the disease's potential spread. Private companies, such as mobile advertisers, are sharing location data — reportedly in anonymized and aggregated form — under the umbrella term "Data for Good" to inform researchers when large crowds are gathering and the likelihood of spread is greater. Wicker said he wants to learn from witness testimony how such data is being collected, with whom it's being shared, how it's being anonymized and how consumers are being notified. While the benefits of using big data to protect Americans from disease are plentiful, privacy risks have to be minimized, he said. 

Wicker also noted, as did several of the witnesses in their testimony, that the risks to consumer privacy this hearing aims to address underscore the need for a strong baseline privacy law in the U.S., an effort that's seemingly come to a screeching halt during these uncertain times. Wicker's own proposed bill includes provisions on data sharing in cases of national emergency. 

In her testimony, Future of Privacy Forum Senior Counsel Stacey Gray, CIPP/US, noted federal and local government interest in accessing commercial data and said it's possible to collect and use location data in concert with data protection and privacy principles. 

"In many cases, commercial data can be shared that does not reveal any information about identified or identifiable individuals," Gray wrote. "For example, private companies may process aggregated data about the use of public transportation or supply chain management in partnership with local governments. In other cases, data originally collected from individuals can be transformed or deidentified to a sufficient extent that it only reveals aggregate trends, such as movements of people at the city, county or state level."

Gray recommended that technology companies aiming to share data "follow the lead of public health experts" on what data is even useful. She also recommended that companies be transparent with consumers about what's being shared, that privacy-enhancing technologies such as differential privacy be employed to protect the data, and that purpose limitation principles be followed. 
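
Differential privacy, one of the technologies Gray points to, generally works by adding calibrated random noise to aggregate statistics before they are released, so that no single person's presence in the data can be inferred from the output. The sketch below shows the basic Laplace mechanism for a counting query; the epsilon value and the count are invented for illustration and do not reflect any real deployment.

```python
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) noise as the difference of two exponentials."""
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def dp_count(true_count, epsilon):
    """Release a count with epsilon-differential privacy.
    A counting query changes by at most 1 when one person is added or
    removed (sensitivity 1), so the noise scale is 1 / epsilon."""
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical example: 1,200 devices observed in a given area.
print(round(dp_count(1200, epsilon=0.5)))  # e.g., 1197 -- varies per run
```

A smaller epsilon means more noise and stronger privacy; the trade-off is less precise counts for researchers.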

Gray echoed Wicker in saying the crisis highlights the need for a federal privacy law, noting that while there are strong protections for data collected in certain contexts — health data collected at hospitals is covered under the Health Insurance Portability and Accountability Act — data collected via "wearables" does not enjoy the same protection. 

In her testimony, the Center for Democracy & Technology's Michelle Richardson agreed with Gray on the need for federal privacy legislation. 

"While some may argue that a privacy law would only hamper innovations around the coronavirus response, failure to impose reasonable protections may backfire," Richardson wrote. "First, improper use of consumer health data leads to an erosion in consumer trust that may deter people from voluntarily sharing information with legitimate entities for important public health purposes." 

Noting the increasingly widespread use of location and proximity data to track the virus, Richardson testified that location data is particularly difficult to anonymize given its specificity to a particular individual. "Part of what makes location information difficult to anonymize is the wealth of information that can be combined with the dataset to enable conclusions about identity to be drawn," she wrote. "For example, an anonymized route of a morning commute can reveal where the commuter likely lives and works. This information almost always uniquely describes one person and is identifiable to that person."
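
Richardson's commute example can be made concrete with a toy join: even when names are stripped and replaced with pseudonyms, the pair of places where a route starts and ends is often unique, so matching it against any public record of where people live and work re-identifies them. Every record below is invented purely for illustration.

```python
# Fabricated "anonymized" commute endpoints, keyed by a pseudonym.
anonymized_commutes = {
    "user-7f3a": ("Block 1200, Maple St", "Office Park A"),
    "user-c219": ("Block 400, Oak Ave", "Hospital B"),
}

# A separate, publicly available directory (e.g., property and employer records).
directory = [
    {"name": "Resident X", "home": "Block 1200, Maple St", "work": "Office Park A"},
    {"name": "Resident Y", "home": "Block 400, Oak Ave", "work": "Hospital B"},
]

# Joining on the (home, work) pair re-identifies each pseudonym, because
# that pair rarely describes more than one person.
for pseudonym, (home, work) in anonymized_commutes.items():
    matches = [p["name"] for p in directory if p["home"] == home and p["work"] == work]
    print(pseudonym, "->", matches)
```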

As a result, she suggested the government focus on using aggregated data rather than location data said to have been anonymized or deidentified. 

Ryan Calo, a law professor at the University of Washington, isn't convinced that location tracking is useful at all, especially given the risks.

"I understand the intuition behind digital contact tracing," he wrote in his testimony. "But I see the gains in the fight against the virus as unproven and the potential for unintended consequences, misuse, and encroachment on privacy and civil liberties to be significant."

Calo noted that many of the technology companies that would ostensibly be sharing commercial data with the government — think Google, Facebook, Uber — are under consent decrees with the Federal Trade Commission for privacy and security lapses. 

"It seems fair to wonder whether these apps, developed by small teams, will be able to keep such sensitive information private and secure. To the extent digital contact tracing— or any private, technology-driven response to the pandemic — involves the sharing of health care data with private parties, there is also the specter of inadequate transparency or consent."

On the industry side, the Network Advertising Initiative's Leigh Freund aimed to assuage the Senate committee's concerns about industry's role in disseminating location data to the government. 

A number of the NAI's 100-plus members are assisting government groups with the COVID-19 response, Freund testified, including through the "COVID-19 Mobility Data Network," which pairs infectious disease epidemiologists with technology companies globally to share aggregated mobility data. 

But she said NAI members adhere to its code of conduct, which is enforced through annual compliance reviews and "rooted in the widely accepted Fair Information Practice Principles." The code requires its members to collect precise location information only with opt-in consent and "detailed, just-in-time notice about the collection, use, and sharing of such information." The NAI is also recommending that its members "subject their efforts and partnerships in this area to legal and ethical review and be guided by contractual controls that narrow the amount and granularity of data provided, restrict secondary uses or sharing of the data, and limit the timeframe the data are permitted to be used to achieve the approved research objectives," she testified. 

Freund also echoed Gray's and Richardson's calls for federal privacy legislation. 

The Interactive Advertising Bureau, Kinsa Smart Thermometers and the App Association also submitted written testimony, all of which can be found on the Senate committee's hearing page. 

The Privacy Advisor will track updates to this hearing once the 96-business-hour period has concluded and witnesses have responded to lawmakers' questions based on the submitted testimony. 

Photo by Andy Feliciotti on Unsplash