At the U.S. Senate Committee on Banking, Housing and Urban Affairs hearing June 11, lawmakers sought answers on how the data broker industry operates and what Congress should do about it. They wanted to hear from privacy advocates, government and the industry itself. Committee Chairman Mike Crapo, R-Idaho, said at the outset of the hearing he has been "troubled by government agencies' and private companies' collection of personally identifiable data for a long time."
But there was one problem standing between Crapo and the answers he and the committee sought: The data broker companies asked to participate in the hearing declined.
Ranking Member Sherrod Brown, D-Ohio, noted many consumers probably had never heard the names of the 4,000 or so entities operating as "shadow companies," which collect and share or sell private information about them.
"Stunningly, not one of them has been willing to show up and speak in front of this committee today, not one," Brown said. "These companies expect to be trusted with the most personal and private information you could imagine about millions and millions of Americans ... they're not even willing to show up and explain how their industry works? ... I think it tells you all you need to know how much they want their names and faces associated with that industry."
The committee sought insights on whether new legislation is required to rein in the power data brokers currently enjoy given the industry's vast collection and storage of consumer data, largely without consumers' knowledge or consent. Notably, various proposals for baseline federal privacy legislation call for the regulation of data brokers, but it's not yet clear whether Congress will choose to include those kinds of provisions or instead amend current U.S. laws on financial privacy, children's privacy, health privacy and the like to capture an industry that has so far evaded policing.
Part of the problem is such practices are done in secret, using algorithms to which no regulator, not even the U.S. Federal Trade Commission, is privy. Additionally, many data collection and profile-building practices have evolved quickly with technology and are being used in ways unimagined in an analog age.
Individuals denied a housing loan, for example, may not know there's a profile of them out there somewhere that was developed by an entity unknown to them, using disparate pieces of their online lives, including location data, social media posts or the neighborhood they live in.
Alicia Puente Cackley, director of Financial Markets and Community Investment at the Government Accountability Office, testified there is "no overarching federal privacy law governing the collection, use and sale of personal information among private-sector companies, including information resellers."
There are sectoral laws, like the Fair Credit Reporting Act, Gramm-Leach-Bliley Act, Health Insurance Portability and Accountability Act and the Children's Online Privacy Protection Act. But no federal law directly addresses tracking consumers online to build profiles on them that will then be used by entities for marketing purposes or to evaluate creditworthiness, among other purposes.
"Under most circumstances, information that many people may consider very personal or sensitive can be collected, shared and used for marketing. This can include information about physical and mental health, income and assets, political affiliations, and sexual habits and orientation. For health information, HIPAA rule provisions generally apply only to covered entities, such as health care providers," Cackley wrote in her testimony.
Pam Dixon, executive director of the World Privacy Forum, said the Fair Credit Reporting Act, passed by the same committee 50 years ago, was "and still is the most important American privacy law that we have, but it's not as important as it was." That's because, she said, credit scores are now being sold in an "unregulated market" and they're created "without due process." Dixon has coined the term "consumer scores" for these unregulated scores and wrote extensively about the problem in her "Scoring of America" report, co-written with Robert Gellman.
Dixon is especially focused on potential harms resulting from those unregulated scores, which she described as "formed by using data about consumer characteristics, past behaviors and other attributes in statistical models that produce a numeric score. ... Businesses and others use consumer scores for everything from predicting fraud to predicting the health care costs of an individual to eligibility decisions to almost anything," she testified.
"There are now literally tens of thousands of consumer scores that have been created by data brokers and others to predict aspects of consumer behavior, group behavior, various types of risk, and more," Dixon said in her written testimony.
She called on Congress to "expand the Fair Credit Reporting Act to regulate currently unregulated financial scores," as well as to "enact a standards law that will provide due process and fair standard setting in the area of privacy."
Cackley agreed with Dixon's assertion that Congress should consider expanding the FCRA to protect more types of information than the statute currently regulates, and she testified consumers should have the legal right to access the information data brokers collect and store on them. In addition, consumers should have the right to correct inaccuracies that may negatively affect their scores.
Additionally, Dixon testified, data brokers are specifically using algorithms to develop consumer scores that serve as proxies for race or economic bracket, as well as using predictive analysis to forecast how consumers may behave in the future.
She said it's no longer possible to say, "Here are the data brokers, let's regulate them," because the practices are so pervasive and used by so many actors for so many purposes.
Sen. Brian Schatz, D-Hawaii, wondered if it might be possible to impose conditions where companies are required to reveal their algorithms to regulators. He said he is working on legislation now that would impose a "duty of care" on companies to address what Dixon referred to as "cracks and fissures" in current laws that allow potentially harmful and discriminatory practices within the data broker industry to occur.
"Because I think the problem is in a sectoral approach ... they sort of evade various regulations because it's not clear where they belong, and in any case, once the data has been collected, either voluntarily or not ... the question is what is the obligation of the company who is in the possession of your data. And the duty of care is the most simple way to say, cross-sectorally, you may not intentionally harm any person whose data you're in possession of," said Schatz.
"I think that is a potentially very good approach," Dixon told Schatz, citing Vermont's data broker law which states entities cannot purchase data with the intent to defraud or discriminate. "I do think that ensuring that fairness is percolating throughout the system is a really good remedy."
Committee Chairman Crapo and Ranking Member Brown said they planned to take action to try once again to bring representatives from the data broker industry before the committee in the near future.