
Emerging trends in fintech privacy: 5 key areas to watch in 2024


Over the past decade, financial technology companies have transformed how consumers think about banking and financial services with user-centric technologies and practices. Now, the innovative products and services they offer are facing increased scrutiny from state and federal regulators.

In 2024, fintechs should be prepared to navigate the following five data privacy and security topics.

Scrutiny of third-party tracking pixels, other tracking technologies may increase 

Many companies use third-party pixels, cookies, software development kits and similar technologies to track user activity or serve targeted advertising. Regulators have been scrutinizing these practices when they involve potentially sensitive personal data, especially when it is used for purposes consumers may not understand or consent to, like profiling or advertising.

Taking a broad view of what constitutes "sensitive" personal data, the U.S. Federal Trade Commission recently warned five tax preparation companies that they must obtain affirmative express consent before allowing these technologies to share financial data, marital status or family information for advertising purposes. Lawmakers are also taking notice. In July, six senators released a report calling for the investigation and prosecution of tax-prep companies for potential violations of privacy law in their sharing of "sensitive" consumer data via pixels and other tracking technologies. Lawsuits against the companies named in the report quickly followed.

These actions follow regulatory scrutiny and enforcement actions against companies transmitting medical or health-related data via pixels under similar theories. In light of the FTC's actions and the U.S. Consumer Financial Protection Bureau's past attention to behavioral advertising practices, fintechs that share financial or transaction data with third parties for advertising or other purposes may soon face scrutiny from regulators under the same theories.
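For teams reviewing their own tag setups, the sketch below shows one way a consent check might gate a third-party advertising pixel so no financial or transaction data is transmitted without affirmative opt-in. It is a minimal TypeScript sketch under assumed names: the `ConsentStore` interface, its `hasOptedIn` method and the pixel URL are hypothetical stand-ins, not any particular vendor's API.

```typescript
// Hypothetical sketch: gate a third-party advertising pixel behind affirmative consent.
// `ConsentStore`, `hasOptedIn` and the pixel URL are illustrative stand-ins.

type ConsentPurpose = "advertising" | "analytics";

interface ConsentStore {
  // Returns true only if the user has given affirmative, opt-in consent for this purpose.
  hasOptedIn(purpose: ConsentPurpose): boolean;
}

function loadAdvertisingPixel(consent: ConsentStore): void {
  if (!consent.hasOptedIn("advertising")) {
    // No consent: do not inject the tag and do not transmit any financial or
    // transaction data to the third party.
    return;
  }

  const script = document.createElement("script");
  script.src = "https://example-ad-vendor.invalid/pixel.js"; // placeholder URL
  script.async = true;
  document.head.appendChild(script);
}
```

Running a check like this before any third-party scripts load keeps "no tracking" as the default state until consent is actually recorded.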

Regulators will increasingly expect appropriate use, governance of AI technologies

In June, the CFPB published an issue spotlight on the use of AI chatbots by banks and financial institutions. The spotlight notes that, while financial institutions increasingly rely on AI chatbots for cost savings, their use poses a number of risks, including privacy and security risks that, if left unmitigated, may put companies out of compliance with federal consumer financial laws.

AI applications can have tremendous benefits for consumers and companies alike, but regulators are likely to scrutinize companies that use AI in ways that harm consumers, whether by design or failure of proper governance.

Potential mandates around financial data rights 

The CFPB may require fintechs to provide specific personal financial data rights. The bureau recently proposed a new rule to give consumers increased access and portability rights over their financial data. The proposed rule would implement Section 1033 of the 2010 Dodd-Frank Act, a long-dormant provision, and would apply to certain financial institutions, card issuers, digital wallet providers, and other companies and fintechs with data about covered financial products and services.

It would require these entities to provide consumers with certain transaction, balance, upcoming bill, account and other personal data upon request. Consumers could also authorize third-party companies, including those offering competing services, to request this data, and covered companies would need to maintain developer interfaces that comply with the rule's technical standards so consumers can have their data ported to those third parties.
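To make the developer-interface obligation concrete, here is a hedged TypeScript sketch of what a covered provider's interface might return for an authorized request. The field names, the `DataAccessRequest` shape and the authorization check are illustrative assumptions; the proposed rule's actual technical standards would govern the real format.

```typescript
// Illustrative sketch only: field names and the authorization check are assumptions,
// not the proposed rule's actual technical standard.

interface CoveredAccountData {
  accountId: string;
  balance: { current: number; available: number; currency: string };
  transactions: { postedAt: string; amount: number; description: string }[];
  upcomingBills: { dueDate: string; amount: number; payee: string }[];
}

interface DataAccessRequest {
  consumerId: string;
  requestingParty: string;        // the consumer or an authorized third party
  consumerAuthorization: boolean; // explicit consumer authorization on file
}

// A covered provider's developer interface would return data only for
// authorized requests and refuse everything else.
function handleDataAccess(
  req: DataAccessRequest,
  lookup: (consumerId: string) => CoveredAccountData
): CoveredAccountData | { error: string } {
  if (!req.consumerAuthorization) {
    return { error: "Consumer authorization required" };
  }
  return lookup(req.consumerId);
}
```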

The CFPB hopes this right will allow consumers to seamlessly switch between financial service providers, increasing competition. Third-party financial service providers that obtain data in this way would be subject to specific data privacy and security requirements, including requirements around transparency, consent, use limitations and onward transfer restrictions.

Data brokers may face new requirements 

In remarks given at the White House in August, CFPB Director Rohit Chopra announced plans to introduce new Fair Credit Reporting Act rules to regulate so-called "data brokers."

The proposal outline defines data brokers broadly as companies that collect, aggregate, sell, license or otherwise share personal data with other companies. The proposed rules would treat many such data brokers as consumer reporting agencies, requiring them to comply with an array of FCRA requirements, including ensuring data accuracy, preventing misuse and significantly restricting the use of data for marketing or advertising purposes.

States have also increasingly regulated data broker practices, with Texas enacting one of the strictest data broker laws in the country and California enacting the Delete Act. As with these state efforts, the CFPB is concerned with compliance challenges posed by AI and the changing financial data landscape.

New information security, breach response obligations likely

Federal and state regulators may continue to impose new information security and breach response obligations on fintechs.

Just this month, the New York Department of Financial Services updated its cybersecurity regulations for financial services companies, enhancing chief information security officer reporting and board oversight responsibilities, imposing new obligations for privileged system access and multifactor authentication, and requiring regulator notifications within 24 hours whenever ransomware or extortion payments are made.

Last month, the FTC amended its Safeguards Rule to impose new FTC data breach reporting obligations on certain nonbanking financial institutions.
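As a small illustration of how an incident response plan might operationalize new reporting windows, the sketch below tracks notification deadlines per obligation. Only the 24-hour window for ransomware or extortion payments comes from the regulation described above; the `ReportingObligation` structure and helper names are assumptions, not regulatory text.

```typescript
// Hedged sketch: a registry of incident-notification deadlines an incident
// response plan might track. Structure and names are illustrative assumptions.

interface ReportingObligation {
  regulator: string;
  trigger: string;
  deadlineHours: number;
}

const obligations: ReportingObligation[] = [
  { regulator: "NYDFS", trigger: "ransomware or extortion payment made", deadlineHours: 24 },
  // Other obligations (e.g., under the amended FTC Safeguards Rule) would be added
  // here with the deadlines specified in the applicable regulation.
];

// Given the time a trigger occurred, compute when each notification is due.
function notificationDeadlines(triggerAt: Date): { regulator: string; dueBy: Date }[] {
  return obligations.map((o) => ({
    regulator: o.regulator,
    dueBy: new Date(triggerAt.getTime() + o.deadlineHours * 3_600_000),
  }));
}
```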

Tips for how to prepare 

Fintechs can leverage or enhance existing privacy programs to prepare for and respond to these emerging trends. Specifically:

  • Personal data inventories and mapping can help define what data is shared with third parties via tracking technologies, used with AI services and disclosed under new financial data rights, and can identify whether a company's data sharing could make it a data broker. This foundational information enables appropriate responses to each of these trends (a minimal inventory-record sketch follows this list).
  • Pixel and tracker governance approaches, which set standards for the use and configuration of third-party technologies and ensure the technologies in use are inventoried and configured to interoperate with consent management tools, can help address state-law obligations and minimize potential scrutiny from regulators.
  • Consent-management tools can be leveraged to obtain opt-in consent or allow opt-out rights, in line with federal and state laws for use of third-party cookies, pixels, SDKs or other technologies that transmit data for advertising, profiling or other potentially unexpected purposes.
  • Accurate descriptions of personal data practices in privacy notices, webforms, and customer-facing apps and webpages that reflect actual use and sharing practices help maintain trust and avoid regulatory scrutiny. Review these disclosures regularly, including in connection with privacy impact assessments of new technologies and use cases.
  • Privacy and AI risk assessment processes can be utilized to identify potential risks with AI uses, and to help define the necessary controls and testing to mitigate them.
  • Incident response programs and plans used by security and privacy teams can be updated to confirm that new incident reporting obligations are considered and addressed.
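As referenced in the first tip above, a data inventory can capture, per data category, the facts needed to respond to each of these trends. The record below is a minimal TypeScript sketch; the field names and enumerated values are illustrative assumptions rather than any standard schema.

```typescript
// Minimal data-inventory record sketch; fields are illustrative assumptions
// chosen to support the trends discussed above, not a standard schema.

interface DataInventoryRecord {
  dataCategory: string;                  // e.g., "transaction history"
  sensitive: boolean;                    // financial, health or similar sensitive data
  sharedWithThirdParties: string[];      // vendors receiving the data (pixels, SDKs, APIs)
  sharingPurposes: ("advertising" | "analytics" | "service-delivery")[];
  usedInAISystems: boolean;              // feeds chatbots, models or other AI services
  coveredByFinancialDataRights: boolean; // in scope for data access/portability requests
  consentBasis: "opt-in" | "opt-out" | "not-required";
}

// Example entry for a single data category.
const example: DataInventoryRecord = {
  dataCategory: "transaction history",
  sensitive: true,
  sharedWithThirdParties: ["example-analytics-sdk"],
  sharingPurposes: ["analytics"],
  usedInAISystems: false,
  coveredByFinancialDataRights: true,
  consentBasis: "opt-out",
};
```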

By tracking these trends and continuing to support and evolve privacy programs and operations, fintechs can innovate and grow while navigating these emerging privacy and cybersecurity challenges.

