A newly declassified report from the Office of the Director of National Intelligence is making waves in U.S. policy circles. The report, from January 2022, includes a 90-day look back on the intelligence community's acquisition and use of what it calls "commercially available information," or CAI; an explanation of the current policies in place around CAI practices; and recommendations for internal changes.
The report comes at a charged moment for the commercial acquisition of personal information by U.S. government entities.
I’ve written previously about the data broker community's claims that access to its systems has become essential for modern law enforcement. Reportedly, concerns over hampering law enforcement's use of commercial data tools have become a major sticking point in the House Energy and Commerce Committee's efforts to update the text of the American Data Privacy and Protection Act ahead of its expected reintroduction. This week, Chair Cathy McMorris Rodgers, R-Wash., was quoted in a PoliticoPRO article saying, "We want to make sure that law enforcement has the tools to do what they need to get done."
The ODNI is an independent agency that oversees and directs the implementation of the National Intelligence Program, which funds the activities of the 18 organizations that make up the U.S. intelligence community. Director of National Intelligence Avril Haines also acts as the principal adviser to the president, the National Security Council and the Homeland Security Council for intelligence matters related to national security.
Transparency and accountability for the intelligence community are core to the mission of ODNI, which was created in 2004 based on the recommendations of the 9/11 Commission Report. The newly released report showcases the agency’s commitment to this mandate. As the Wall Street Journal explains, the report was commissioned after Sen. Ron Wyden, D-Ore., requested during DNI Haines' confirmation hearing that the intelligence community detail and make public how it uses commercially available data.
Written in a conversational style, the report calls for additional safeguards and, at a minimum, deep reflection within the intelligence community about the use of CAI. In a section titled "CAI increases the power of the government," the report explains:
"The government would never have been permitted to compel billions of people to carry location tracking devices on their persons at all times, to log and track most of their social interactions, or to keep flawless records of all their reading habits. Yet smartphones, connected cars, web tracking technologies, the Internet of Things, and other innovations have had this effect without government participation. While the IC cannot willingly blind itself to this information, it must appreciate how unfettered access to CAI increases its power in ways that may exceed our constitutional traditions or other societal expectations."
The report draws attention to the qualitative and quantitative distinctions between CAI and traditional publicly available information, or PAI, like phone book listings and newspaper reports, repeatedly citing the U.S. Supreme Court’s Carpenter v. U.S. decision. Notably, the publicness of CAI seems to be the primary basis upon which the intelligence community has made use of this data, consistent with laws and internal guidance.
On this point, the report highlights that internal agency guidance differs. The CIA policy, for example, states "information is publicly available only if it is made available to the CIA under conditions or on terms generally applicable to the public," including through subscription or purchase. But intelligence elements within the Department of Defense add their own gloss to this policy, stating that PAI also includes information that is "generally available to persons in a military community even though the military community is not open to the civilian general public."
In keeping with its mandate, the ODNI report does not directly analyze the legal implications of judicial decisions like Carpenter in the context of intelligence activities, nor does it weigh in on any proposed legislative interventions. However, it does offer three recommendations for internal policy changes within the intelligence community:
- "First, the IC should develop a multi-layered process to catalog, to the extent feasible, the CAI that IC elements acquire. This will be a complex undertaking requiring attention to procurement contracts, functionally equivalent data acquisition processes, data flows, and data use. The IC cannot understand and improve how it deals with CAI unless and until it knows what it is doing with CAI.
- Second, based on that knowledge, the IC should develop a set of adaptable standards and procedures for CAI, governing and requiring regular re-evaluation of acquisition and other decisions.
- Third, as part of this set of standards and procedures, and/or as a complement to it, the IC should develop more precise sensitivity and privacy-protecting guidance for CAI. PAI is no longer a good proxy for non-sensitive information; today, much CAI is very sensitive, and the IC therefore needs to develop more refined approaches."
These recommendations come at a time when regulators are also increasingly scrutinizing the ways in which artificial intelligence applications and other advanced technologies may require an expansion of our understanding of what counts as biometric information, health information, sensitive personal information or even personal information generally. The attention generative AI applications have received also puts pressure on the definition of publicly available information in an unprecedented manner. Could these evolutions mean future changes in the commercial community and the intelligence community alike?
Here's what else I’m thinking about:
- The U.S. Federal Communications Commission will launch a "Privacy and Data Protection Task Force." At a Center for Democracy and Technology event this week, FCC Chair Jessica Rosenworcel announced the new task force, led by Loyaan A. Egal, current chief of the FCC Enforcement Bureau. The task force will coordinate several critical cross-agency projects, including updating the agency’s data breach rules, overseeing data breach investigations, tackling SIM-swapping fraud and enforcing the newly enacted Safe Connections Act. Notably, Rosenworcel hinted that the FCC is bringing enforcement actions against two mobile carriers based on their responses to inquiries about their use of geolocation data last year.
- A letter from more than 60 civil society groups asked the White House to continue leading on AI civil rights issues. The letter focuses on the fact that "AI is being used in a growing set of high-stakes applications, including immigration, policing, housing, and hiring — despite mounting evidence that automated systems often reflect and deepen existing patterns of discrimination." The groups ask for a set of specific actions from the administration, including making the AI Bill of Rights binding policy through guidance from the Office of Management and Budget, and ensuring follow-through from sector-specific agencies such as the Equal Employment Opportunity Commission, Occupational Safety and Health Administration, Justice Department and Labor Department.
- Researchers explore the sensitivity of telemetry data for immersive tech applications. A new analysis by Berkeley researchers, in collaboration with extended reality pioneer Louis Rosenberg, showcases how simple motion data from head and hand telemetry sensors built into XR devices can help to infer surprisingly sensitive information about users. They write, "What appears at first to be random variations in movement may perhaps be more akin to a DNA sequence, revealing the identity, biometrics, demographics, and even health information of XR users to anyone else in the same virtual world."
Upcoming happenings
- 20 June at 13:00 EDT, IAPP hosts The State of the States: U.S. state-level privacy legislation (LinkedIn Live).
- 21 June at 9:30 EDT, the Hudson Institute hosts U.S. Leadership in Tech Diplomacy: A Conversation with Ambassador Nathaniel C. Fick (hybrid).
- 22 June at 13:00 EDT, IAPP hosts What can LGBTQ+ privacy teach us about privacy for all? (LinkedIn Live).
- 5 July, deadline to complete the 2023 IAPP-EY Governance Survey.
Please send feedback, updates and multi-layered processes to cobun@iapp.org.