Traditional models of what counts as sensitive data are crumbling, replaced by heightened scrutiny of an ever-widening set of personal data. This is as true for health-related data as it is for children's data. The same trend appears in rules covering biometrics, including considerations of whether collected data could be used as a biometric.
Consumer financial data is, apparently, also due for an expansion. Rohit Chopra, director of the Consumer Financial Protection Bureau, announced this week the agency is considering formal rule changes to expand its enforcement of the Fair Credit Reporting Act, which governs the behavior of organizations that sell reports related to consumers' credit, character, general reputation, personal characteristics or mode of living. The FCRA grants consumers rights to access, correct and restrict the use of covered reports.
Chopra previewed the proposed rule changes at a White House roundtable that brought together regulators, administration leaders and a variety of civil society voices to discuss "harmful data broker practices." The official readout of the gathering shows the conversation was wide-ranging, helping to underscore the plethora of underlying concerns leading to data privacy inflation.
Advocates and regulators seem to agree the harms from the "data broker economy" are worsening, and flow from a wide variety of factors:
- The scale of data collection, "large volumes of exceedingly detailed data."
- The sensitivity of data, "including geolocation and health information."
- The ease of drawing inferences, including about "individuals' lifestyles, desires, and weaknesses," which is rapidly expanding due to advancements in artificial intelligence.
- The types of entities that purchase data, "including advertisers, financial institutions, employers, landlords, and fraudsters."
- And inconsistent data quality, "including data that was inaccurate, outdated, or not fit for purpose."
These same concerns are reflected in Chopra's remarks. He reports the agency's earlier inquiry into specific data broker practices led to the decision to "launch a rulemaking to ensure that modern-day digital data brokers are not misusing or abusing our sensitive data." The results of the inquiry also helped the agency learn "more about the significant harms – from the identification of victims for financial scams to the facilitation of harassment and fraud."
Chopra's remarks offer a rare glimpse into the ideas behind a proposed rulemaking before the agency has even announced its proposal. Two major proposed rule changes were highlighted in Chopra's preview.
The first would change the scope of most operative provisions of the FCRA by broadening the application of the term "consumer reporting agency" to include anyone that sells certain types of consumer data, "for example, a consumer's payment history, income, and criminal records."
The FCRA relies on interdependent definitions for covered "consumer reporting agencies" that sell covered "consumer reports." Fifty years of practice and enforcement by the Federal Trade Commission and others, including most recently the CFPB, have led to a complex set of definitions and exceptions to these terms, which are most recently outlined in great detail in the FTC's 2011 staff report.
In keeping with Director Chopra's regulatory philosophy of crafting simplified and streamlined rules, some of these exceptions and limitations may be on the chopping block. Of course, the agency can't adjust the statutory text of the FCRA, which itself limits the scope of its coverage, so it remains to be seen how the proposed expansion will be effectuated.
The second proposal raised by Chopra would "address confusion around whether so called 'credit header data' is a consumer report."
Consumer advocates have long insisted that the exclusion of credit header data from the definition of reports leads to the over-sharing of personal identifiers in the marketplace. A 2021 letter from the National Consumer Law Center explains in detail how an FCRA rule change would address the group's concerns, particularly about consumers who do not wish to be located.
The NCLC's letter indicated those seeking to remain unidentified may include "not only undocumented immigrants but debtors seeking refuge from harassing collectors, domestic violence survivors seeking to flee abusers, or consumers who simply do not wish to be contacted." The group added, "These consumers, who might take great pains to avoid publicizing their home addresses or phone numbers, should not be forced to give up that privacy in order to obtain essential services such as cell phone, Internet, or utility service."
Echoing these concerns, Chopra explained the possible rule change: "The CFPB expects to propose to clarify the extent to which credit header data constitutes a consumer report, reducing the ability of credit reporting companies to impermissibly disclose sensitive contact information that can be used to identify people who don't wish to be contacted, such as domestic violence survivors."
Based on Chopra's statements, the CFPB is expected to publish an Advance Notice of Proposed Rulemaking next month, to be formalized through a public comment period "in 2024." For now, the agency is seeking proactive engagement with small businesses to help it craft the rule. "We are encouraging small businesses looking to participate in the process to contact us."
Data broker scrutiny is heating up across the policymaking world. Proposed federal reforms continue to pop up, even as the California Legislature considers a bill, Senate Bill 362, that would create an online portal through which consumers could request that data brokers delete their data.
Whether or not new laws succeed, regulators have made it clear they will continue to refine existing laws to meet the shifting risks of our modern digital world.
Here's what else I'm thinking about:
- Politico reported privacy lobbying efforts have shifted to the state level. Though it glosses over the innovations among new state privacy laws, the reporting highlights stakeholders' intensified efforts to influence consumer privacy debates in the states. The federal conversation, if not on the back burner, remains for the moment at a low simmer.
- Where are duties of care and loyalty in recent privacy legislation? In an IAPP analysis, Westin Research Fellow Anokhy Desai, CIPP/US, CIPM, CIPT, and Hintze Law Partner Sam Castic, CIPP/US, CIPM, FIP, PLS, provide an edifying breakdown of how these ideas are translating into real policy at the state level.
- Christopher Wolf reflects on "the early days of privacy law." In Hogan Lovells' Data Chronicles podcast, Scott Loughlin hosted an engaging and inspiring conversation with privacy pioneer Wolf on the past and future of privacy law.
- Do androids dream of real-world ceilings? A "glitch" in Snapchat's "My AI" chatbot feature — which takes the form of a virtual "friend" to converse with through the popular camera and chat app — reportedly led to a video featuring a nondescript wall and ceiling in some real-world room being posted to the virtual character's story. If AI chatbots have secret bedrooms, one wonders whether they could soon demand their own privacy rights.
Please send feedback, updates and rulemaking ideas to email@example.com.