Editor's note: The IAPP is policy neutral. We publish contributed opinion and analysis pieces to enable our members to hear a broad spectrum of views in our domains.

The recent near-ban on U.S. state artificial intelligence regulation tucked into President Donald Trump's One Big Beautiful Bill Act should have been a wake-up call. Crafted with limited input from technical AI experts, the proposed pause would have put the privacy, security and safety of Americans at risk.

The provision was ultimately removed, but its inclusion reveals a troubling pattern of lawmakers making consequential decisions about technology they don't fully understand.

This problem has plagued the tech policy landscape for decades. A 2023 report from the Brennan Center for Justice pointed out that only a handful of congressional staffers have advanced degrees, let alone science or technology training. It also noted that staffing on the House Science, Space and Technology Committee declined by nearly 45% between 1994 and 2016. The Brookings Institution traces the start of Congress' decline in tech competence to the 1995 defunding of the Office of Technology Assessment.

This isn't the fault of regulators, who are making the best decisions they can. But the result is a patchwork of well-intentioned but often misguided rules that fail to protect consumers.

Take the proposed AI transparency bills in California and Florida, both of which would require watermarking certain forms of AI-generated content. The idea sounds reasonable: mandate watermarks so users know when they're looking at AI output. But technologists know we don't yet have airtight tools to create and preserve watermarks, and models like Gemini 2.0 can remove them.
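To make that fragility concrete, consider a deliberately toy sketch in Python. It "watermarks" generated text by hiding zero-width characters in it, then shows that a one-line normalization strips the mark completely. Everything here is illustrative: no bill or product uses this exact scheme, and real approaches such as statistical token biasing are more sophisticated. But they face the same cat-and-mouse dynamic.

```python
# Toy illustration of watermark fragility: mark AI-generated text by hiding
# zero-width spaces after ordinary spaces, then strip the mark with a single
# replace() call. Purely illustrative, not any real watermarking scheme.

ZWSP = "\u200b"  # zero-width space: invisible when the text is rendered

def add_watermark(text: str) -> str:
    """Hide a zero-width space after every ordinary space."""
    return text.replace(" ", " " + ZWSP)

def is_watermarked(text: str) -> bool:
    return ZWSP in text

def launder(text: str) -> str:
    """What an adversary, or even an innocent re-encoding, does to the mark."""
    return text.replace(ZWSP, "")

marked = add_watermark("This paragraph was generated by a model.")
print(is_watermarked(marked))           # True: the mark is present but invisible
print(is_watermarked(launder(marked)))  # False: one line of code destroys it
```

Production schemes are harder to strip than this, but none reliably survive a determined adversary, and that is the gap a watermarking mandate papers over.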

This raises the question: How involved were technologists, really, in drafting these bills?

The contrast is stark when you look at regulators who do understand technology, or who hire people who do. Former U.S. Federal Trade Commission Chair Lina Khan launched an Office of Technology in 2023 to strengthen the agency's in-house expertise. Khan's team understood how data brokers operate and how algorithms can discriminate, and that knowledge led to actions against companies like Mobilewalla and Gravy Analytics, which had exploited previous regulators' technical blind spots.

The good news is that states are catching on. Delaware, California, Connecticut, Minnesota, Oregon and Texas have hired experts with deep technical backgrounds to support investigations and settlements. These staffers bring insight into encryption, data flows and system architectures, not just law.

The Minnesota Consumer Data Privacy Act, for example, is the first state privacy law to require companies to maintain a data inventory. It also contains prescriptive content requirements for data privacy and protection assessments. I recently spoke with the act's architect, State Rep. Steve Elkins, D-Minn., who shared how his deep IT background helped shape the law's design.
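The statute mandates an inventory without dictating its shape. As a purely hypothetical sketch, a minimal machine-readable inventory record might look like the following; every field name is my assumption, not the law's.

```python
# A hypothetical sketch of a minimal machine-readable data inventory.
# The Minnesota statute requires an inventory but does not prescribe a
# format; every field name below is an illustrative assumption.
from dataclasses import dataclass, field

@dataclass
class InventoryRecord:
    system: str                 # where the data lives
    data_categories: list[str]  # e.g., "email", "precise geolocation"
    purpose: str                # why it was collected
    retention_days: int         # how long it is kept
    shared_with: list[str] = field(default_factory=list)  # downstream recipients

inventory = [
    InventoryRecord(
        system="crm",
        data_categories=["name", "email"],
        purpose="customer support",
        retention_days=365,
        shared_with=["ticketing-vendor"],
    ),
]

# Once an inventory exists, basic oversight questions become mechanical,
# e.g., "what do we share with third parties, and why?"
for rec in inventory:
    if rec.shared_with:
        print(f"{rec.system}: shares {rec.data_categories} "
              f"with {rec.shared_with} for {rec.purpose}")
```

The design insight is that a structured inventory turns vague compliance questions into queries a privacy team can actually run.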

Recent actions in California against Honda and Todd Snyder also show a shift toward more granular, technically grounded enforcement. When regulators can match the technical complexity of the companies they oversee, enforcement better protects consumers. Yet we need the same effort at the federal level.

During a recent meeting with Congressional staff focused on crafting a national data privacy framework, I was asked why I support data minimization. I explained that collecting less data inherently reduces risk; there's simply less information to steal or misuse. Staff responded with genuine interest, noting they hadn't considered the concept in those terms.
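The concept is easy to show in code. In the minimal sketch below, with hypothetical field names of my choosing, the application stores only an explicit allowlist of fields; anything never collected can never be breached, subpoenaed or misused.

```python
# A minimal sketch of data minimization at the point of collection: keep only
# an explicit allowlist of fields rather than everything the client submits.
# Field names are illustrative assumptions, not any specific company's schema.

ALLOWED_FIELDS = {"email", "display_name"}  # the only data this feature needs

def minimize(submitted: dict) -> dict:
    """Keep allowlisted fields and silently drop everything else."""
    return {k: v for k, v in submitted.items() if k in ALLOWED_FIELDS}

payload = {
    "email": "user@example.com",
    "display_name": "Ada",
    "birthdate": "1990-01-01",           # never stored...
    "precise_location": "44.97,-93.26",  # ...so it can't be stolen or misused
}

stored = minimize(payload)
assert "precise_location" not in stored
print(stored)  # {'email': 'user@example.com', 'display_name': 'Ada'}
```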

It was a helpful reminder that even smart, well-intentioned policymakers and their staff are often juggling countless issues; it's on us to explain complex or technical topics in a way that resonates and supports sound policy decisions.

I'm not arguing against regulation. I'm arguing for smarter regulation. When policy ignores technical and operational reality, it produces requirements that make no practical sense and consumer protections that don't actually protect anyone.

The solution is bringing experts into the process from the start: technologists drafting AI bills, chief privacy officers advising on data privacy laws, and security professionals guiding efforts like the proposed Healthcare Cybersecurity Act.   

Some worry that this gives tech insiders too much influence. It's a fair concern. As MIT Technology Review put it: "Leaving it to tech executives to educate lawmakers is like having the fox tell you how to build your henhouse." But the alternative — a regulatory system full of blind spots or shaped by lobbyists behind closed doors — caters to industry even more.

What the public needs is clear, informed regulation that levels the playing field and makes it harder for powerful players to keep dodging the rules.

The stakes are too high to keep stumbling forward with technically illiterate policy. Technology is complex, but the guiding principle is simple: If you're going to regulate something, you have to understand how it works.

Ron De Jesus, AIGP, CIPM, CIPP/A, CIPP/C, CIPP/E, CIPP/US, CIPT, CDPSE, CISSP, FIP, is field chief privacy officer at Transcend.