Six U.S. senators and a representative released a report this week titled "Attacks on Tax Privacy: How the Tax Prep Industry Enabled Meta to Harvest Millions of Taxpayers' Sensitive Data." The seven progressive legislators are Sens. Elizabeth Warren, D-Mass., Ron Wyden, D-Ore., Richard Blumenthal, D-Conn., Tammy Duckworth, D-Ill., Bernie Sanders, I-Vt., and Sheldon Whitehouse, D-R.I., along with Rep. Katie Porter, D-Calif.
The report follows a series of letters the policymakers sent to tax preparation companies, including H&R Block, TaxAct and TaxSlayer. The scrutiny was sparked by The Markup's investigation last year into the use of tracking pixels on these companies' websites, including the parts of the sites taxpayers use to process their returns. Of course, external investigations can only reveal so much about the settings and procedures at play in the use of tracking pixels, so the legislators asked the companies themselves for more information.
Adtech experts may at first bristle at the report's alarmist style, but the main conclusion should resonate: "Tax preparation companies shared millions of taxpayers' data with Meta, Google, and other Big Tech firms. The tax prep companies used computer code — known as pixels — to send data to Meta and Google. While most websites use pixels, it is particularly reckless for online tax preparation websites to use them on webpages where tax return information is entered unless further steps are taken to ensure that the pixels do not access sensitive information."
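The report does not reproduce the code involved, but a tracking pixel is conceptually simple: client-side code serializes page context, sometimes including form-field values, into the query string of a request for a tiny image hosted by the third party. A minimal, hypothetical Python sketch of that serialization step follows; the endpoint and parameter names are illustrative assumptions, not Meta's or Google's actual interfaces.

```python
from urllib.parse import urlencode

def build_pixel_url(endpoint: str, event: str, fields: dict) -> str:
    """Serialize page context into query parameters, roughly as a
    tracking pixel does before requesting a 1x1 image from the
    third party's server."""
    params = {"ev": event}
    params.update({f"cd[{key}]": value for key, value in fields.items()})
    return f"{endpoint}?{urlencode(params)}"

# If pixel code runs on a tax-filing page without limiting which
# fields it reads, sensitive values ride along in the request URL:
url = build_pixel_url(
    "https://tracker.example.com/tr",  # hypothetical endpoint
    "PageView",
    {"page": "/refund-estimate", "refund_amount": "1250"},
)
print(url)
```

The sketch shows why placement matters so much: the pixel itself is generic, and whatever data the page exposes to it is what gets transmitted.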
This underscores an important point that privacy pros would do well to remember: Online tracking technologies are often perceived as creepy, especially when they are associated with sensitive contexts. Even after more than a decade of widespread use, the features of these tools still violate the expectations of consumers and policymakers, so organizations must remain mindful of when and how they are deployed. After all, consumer expectations lie at the heart of privacy practice — and enforcement.
The report fits with a theme of recent FTC enforcement. The basic act of "revealing" sensitive data to a third party, even if that data was not subsequently used for the third party's own purposes, is enough to trigger liability in certain contexts. This was most apparent through the FTC's use of the Health Breach Notification Rule in its recent GoodRx enforcement. According to the FTC, the company's alleged disclosure of sensitive data fields to third parties, even if the uses were strictly limited, was enough to trigger the "breach" requirements of the HBNR.
Similarly, throughout the legislators' report, the responses of the tax prep companies about the types of information shared through tracking pixels are interspersed with claims about what third parties could have done with the data they received. That is, even if these imagined practices were not in fact occurring, the very fact that the information is shared in an identifiable — or even a pseudonymized but re-identifiable — form triggers heightened scrutiny.
Again, the very act of revealing sensitive data to any third party, for almost any purpose, may be a privacy violation.
The report also raises a series of more nuanced concerns about the handling of highly sensitive financial data and the shared responsibilities of first- and third-party entities in ensuring that sensitive data does not leak into targeted advertising systems. For example, the report alleges that Meta and Google pointed to their contractual terms for the use of their tracking technologies, which prohibit the sharing of sensitive financial data with the companies, as their primary defense against the allegation that they were receiving this data. Meta also allegedly pointed to technical controls it has in place to filter out sensitive data ingested through pixels. But the report highlights potential gaps in this automated "signal filtering mechanism," which may have difficulty discovering sensitive data in forms that are less structured than a Social Security number or bank account number.
On the flip side of this argument, unstructured data that could not be flagged as financial data would likely also prove difficult to make use of in any meaningful way. Nevertheless, we should show even more care in contexts where sensitive personal data is collected through or inferable from web browsing activity.
Tax information is an especially sensitive form of personal data, and, as the report takes pains to point out, is subject to rigorous privacy laws. Specifically, 26 USC 7216 requires that "a tax return preparer may not disclose or use a taxpayer's tax return information prior to obtaining a written consent from the taxpayer." The code makes it a criminal misdemeanor to violate the provision, with statutory penalties of USD1,000 per violation. The IRS implementing regulations are also cited repeatedly in the report, including the definition of disclosure: "the act of making tax return information known to any person in any manner whatever."
Just like under the HBNR, the bare disclosure of this information is a privacy invasion that can trigger liability.
Tax prep companies can also turn over data to "auxiliary service providers in connection with the preparation of a tax return." But the report alleges that Meta and Google likely do not meet the definition of "auxiliary service providers" and points out that the data sharing with Meta was for advertising purposes, rather than "in connection with the preparation of a tax return."
The legislators released a letter addressed to a handful of agencies calling on them to investigate the alleged data sharing: "The Internal Revenue Service, the Treasury Inspector General for Tax Administration, the Federal Trade Commission, and the Department of Justice should fully investigate this matter and prosecute any company or individuals who violated the law."
As ever, the devil is in the details of the consent mechanism. It will be up to those with enforcement powers over the disclosure and use of tax information to determine whether the facts alleged in the report could lead to liability, including whether the types of consent received by the companies could meet the regulatory requirements for "knowing and voluntary" written consent from the taxpayer.
Here's what else I'm thinking about:
- The EU-U.S. Data Privacy Framework received much-awaited adequacy recognition from the European Commission. Guidance from the Department of Commerce clarified the next steps for companies. Those U.S. organizations already self-certified under the Privacy Shield can begin relying on the new framework for EU-U.S. data transfers as soon as they update their privacy policies. The same is true for Swiss-U.S. transfers starting 17 July. However, existing participants who wish to use the "UK Extension to the DPF" for U.K.-U.S. transfers will need to submit an application, starting 17 July, through the new Data Privacy Framework website. Similarly, those who are not already self-certified to the Privacy Shield — including those who have withdrawn over the past few years — will need to submit an application to the new framework and follow the steps to self-certify. This process will likely be very similar to the instructions under Privacy Shield.
- A chorus grew louder about the role of comprehensive privacy legislation in regulating AI. Brookings' Cam Kerry wrote a detailed analysis of "How privacy legislation can help address AI." And in an IAPP op-ed, R Street's Brandon Pugh and Steven Ward concluded that AI regulation needs to start with a "comprehensive federal data privacy and security law." Both pieces analyze the AI-relevant requirements in the American Data Privacy and Protection Act, which still has not been reintroduced in the 118th Congress, even as we near 200 days since the start of the session.
- Twitter thumbed its nose at the U.S. Federal Trade Commission. Rather than cooperate with the FTC's ongoing investigation into the company's compliance with its consent agreement with the agency, X Corp. asked a federal judge to reconsider the consent order. The company formerly known as Twitter seems primarily concerned with avoiding an FTC deposition of Elon Musk, currently scheduled for 25 July. As it happens, that is also the day that the next independent audit of Twitter's privacy program will be due for filing with the FTC. After Ernst & Young reportedly stopped performing its assessments due to unpaid bills and "constant turnover" on the executive team, it is unclear which independent assessor is conducting the FTC-mandated audit.
- Meanwhile, the FTC launched an investigation into OpenAI, according to The Washington Post, which published a copy of the Civil Investigative Demand the FTC sent to OpenAI. The CID provides a rare look into the questions the FTC is asking about the governance processes in place for the production of foundational models like OpenAI's large language models.
Upcoming happenings
- 17 July at 11:00 AM EDT, Future of Privacy Forum hosts Immersive Tech Panel Series: Health and Wellness (virtual).
Please send feedback, updates and tax returns to cobun@iapp.org.