At the Federal Trade Commission's workshop on drones yesterday, agency technologists presented an example of the risks inherent in drones, in which an off-the-shelf drone's administrative controls were taken over by a standard smartphone. The third-party smartphone needed no password to connect, and the drone's video and control signals were sent unencrypted over a wifi network.
The demonstration kicked off two panel debates on whether drones pose privacy risks and, if so, what to do about them. Thus far, the main regulation on the books is the Federal Aviation Administration's Part 107, which went into effect in August and applies to non-recreational use.
Following the demonstration, however, industry representatives were quick to dismiss it, saying the risks the FTC's technologists presented were outdated and had already been mitigated by industry of its own accord.
Data is data. There shouldn't be a distinction between the tool you collect that data from. — Mark Aitken, AUVSI
And that was pretty much how the day went. Industry, heavily represented, said it was already on top of any concerns advocates could dream up and therefore required no regulation or legislation. Industry also vehemently denied that drones pose unique privacy risks and said they should be considered no more invasive than other technology-enabled devices.
Diana Marina Cooper, vice president of legal and policy affairs at drone-manufacturer PrecisionHawk, said there's not some vacuum that requires legislation addressing drones specifically.
"I think if regulators start to regulate individual technologies as opposed to uses of any type of technology that create the same harm, we're really going down the wrong path," she said.
Mark Aitken, director of government relations for the Association for Unmanned Vehicle Systems International, said, "data is data. There shouldn't be a distinction between the tool you collect that data from."
But Margot Kaminski, assistant professor of law at Ohio State University's Moritz College of Law, said the assertion that privacy protections on the books will protect people from drones' privacy risk ignores the fact that there aren't enough privacy protections on the books to begin with.
"We don't have adequate privacy law with respect to any technologies," she said.
Panelists countered that trespassing and peeping Tom laws already protect citizens from being filmed or watched in ways with which they are uncomfortable, but Kaminski said that's not true. Such laws often don't apply to drones for various reasons; peeping Tom statutes, for example, may require that someone physically peek through a window, and a drone camera peering in remotely doesn't necessarily meet that requirement.
"We just need regulation, period," she said.
On an earlier panel, though, Gregory McNeal, professor of law and public policy at Pepperdine University School of Law, said the conversation about drone legislation is misguided. He said it's not that we're worried about an aircraft, but "something bigger we need to grapple with as a society when it comes to technology." Our concerns about what drones might do with location data they may collect, for example, are more about "how we feel about location-sharing on any device."
He did concede, however, that drones are unique in two ways: they can surveil from the air, slipping below tree lines to reach places other recording devices generally can't, and they have a remote operator.
From my perspective, if you can't innovate around some baseline protections, you aren't being very innovative. — Jeramie Scott, EPIC
Kaminski said she takes particular issue with the remote operator. Other forms of surveillance are less covert but also more avoidable: if a person stands in front of you recording with a cell phone, you know the camera is on you as you move, and you can tell them to stop, or, if it comes to it, publicly shame them into stopping by causing a scene. But with a remote operator, it's impossible to know who is even doing the watching, let alone interrupt it.
On a previous panel, EPIC's Jeramie Scott made a similar point about transparency. He said the public should know whether a drone is collecting data, what its purpose is, what the collected data will be used for, and how long it will be retained.
"I think we'll see pushback against integration of drones unless we're more proactive about implementing certain baseline safeguards," he said.
McNeal said that kind of baseline regulation kills innovation and, in some cases, is so expensive for manufacturers that it makes drones inoperable.
"The idea that every drone would have to broadcast in some way is a substantial burden on industry," he said.
Scott wasn't buying it.
"From my perspective, if you can't innovate around some baseline protections, you aren't being very innovative," he said.
But McNeal and his industry co-panelists said they've taken it upon themselves to mitigate risks as they've arisen, citing, for example, work with 125 airports on how to notify them of nearby drones once they're deployed. McNeal wants drones treated the way cell phones have been: consumers identify perceived harms, and the industry adjusts to accommodate them.
"I'm convinced that within the next two years, most privacy problems will be addressed by this industry," he said.
But Scott said we shouldn't "sit on our hands" and wait for trouble. He compared drones to cookies in the late '90s, when consumers didn't learn they were being tracked, or how pervasively, until well after the fact. Once they knew, they didn't like it.
"If (drones) are going to be used to collect data, and sophisticated equipment is going to be added, maybe the public should know about that so they can actually provide a voice," he said.
The FTC is accepting public comments on drones' privacy implications until November 14.