The U.S. Federal Trade Commission said it would follow the letter of the law as its rulemaking process began in earnest with an exchange of perspectives from relevant parties at the FTC's virtual public forum Sept. 8.

The official comment period for the advance notice of proposed rulemaking and the 95 questions it poses does not close until Oct. 21, but the forum served as a leap toward a formal rulemaking proposal. In addition to giving the agency a platform to collect feedback, the forum featured panel discussions in which industry players and consumer advocates shed light on their expectations and hopes for final rules.

FTC Chair Lina Khan stressed that everything brought to the table during the forum would be considered as the commission "determines how to proceed both with determining whether to proceed with a proposed rule as well as what form a proposed rule could take."

"We know that today's digital tools can deliver huge conveniences, but we also know that these tools and the business models that underline them can also be used to track and surveil individuals in entirely new ways," Khan added. 

Industry talks best practices, clarifying rules

FTC Senior Advisor Olivier Sylvain walked industry representatives through commission-generated prompts on how rules would affect businesses across sectors. The FTC asked whether businesses had learned lessons from consumer harms or developed best practices to address them, and what types of rules would balance real consequences with benefits for consumers.

Within the conversation on hypothetical rules, Digital Content Next CEO Jason Kint suggested the FTC formulate guardrails for limiting out-of-context data use or tracking, saying that would be a step in the right direction.

"If you're choosing to visit a website … using that data in order to make recommendations on the product or target advertising on it, that's all in the same context. The data is being collected while you're choosing to use that app or website and being used in that exact context," Kint said. "It isn't some other party that you're not choosing to interact with that is collecting that data or informing its algorithmic recommendations."

Kint said the crux of the issue falls back to big companies that use data outside its original context, pointing to a search engine repurposing data for its advertising business as an example of nonconsensual use and deception. He said everybody should be "playing by the same rules," but rules need to be heightened for "the companies that have dominance."

Out-of-context data use is prevalent because it has become "too easy," according to Mozilla Chief Security Officer Marshall Erwin. Growing a business and its services on data collection and use is one thing, but companies are moving further from purpose limitation principles and appropriate safeguards knowing they likely won't face deterrent consequences beyond a one-off fine, nothing that would force them to change their practices.

"The internet is fundamentally a sort of consequence-free zone right now," Erwin said. "A lot of what we do in browsers is crack down on some of the behaviors like cross-site tracking. Users don't understand it, it's opaque and it violates their privacy. What we see as clamping down on that is decreasing the benefits … making commercial surveillance practices less efficient and I think that's what we are doing. But on the converse side, there's actually creating a real cost when there is bad behavior. Like financial penalties are a meaningful way to move the ball."  

Erwin also injected data security concerns into the best practices dialogue. He said encryption in transit, multi-factor authentication and password requirements are "baseline things everyone should be doing to buy down a lot of risk for a lot of consumers," and suggested "scrutinizing a company more closely" if those basics aren't in place.
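For readers less familiar with those controls, the sketch below illustrates one of them, a password-handling baseline, using Node's built-in crypto module. The function names and the 12-character minimum are illustrative assumptions, not requirements drawn from the forum or any FTC guidance.

```typescript
// Minimal sketch of one baseline control: salted, slow-hashed password storage
// plus a minimum-length check. The policy floor of 12 characters is illustrative.
import { randomBytes, scryptSync, timingSafeEqual } from "node:crypto";

const MIN_PASSWORD_LENGTH = 12; // illustrative policy choice

function hashPassword(password: string): string {
  if (password.length < MIN_PASSWORD_LENGTH) {
    throw new Error(`Password must be at least ${MIN_PASSWORD_LENGTH} characters`);
  }
  const salt = randomBytes(16);
  const hash = scryptSync(password, salt, 64);
  return `${salt.toString("hex")}:${hash.toString("hex")}`;
}

function verifyPassword(password: string, stored: string): boolean {
  const [saltHex, hashHex] = stored.split(":");
  const candidate = scryptSync(password, Buffer.from(saltHex, "hex"), 64);
  return timingSafeEqual(candidate, Buffer.from(hashHex, "hex"));
}
```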

Consumers in crisis, changes required

While potential FTC rules could benefit companies by clarifying limits and fostering better customer relationships, the forum's consumer advocate panel made the case for why rules are needed, outlining some of the most glaring impacts the growing data economy and its attendant malpractices have on individuals.

Electronic Privacy Information Center Deputy Director Caitriona Fitzgerald discussed how surveillance of internet browsing and app activity is "most problematic" because "it's unavoidable and beyond what reasonable consumers can grasp or understand." She noted how data collection in those contexts reveals a range of sensitive information or characteristics that is then spun off to "hundreds, if not millions" of companies for various secondary uses, all without consent and with attendant risks and harms.

The panel explored commercial surveillance's discriminatory harms, including discriminatory advertising and ad auctions involving marginalized communities. Joint Center for Political and Economic Studies President Spencer Overton said ongoing digital harms "threaten to expand" existing disparities for minorities, while Upturn Executive Director Harlan Yu said data-driven practices like one-click background checks and similar tools that evaluate people based on discriminatory traits are increasingly common problems.

Consumer consent remains a challenging area for individuals as well. The industry panel pointed to the peaks and valleys the Global Privacy Control has undergone, but noted it could ultimately be a useful tool once fully understood by companies and consumers. Regardless of GPC uptake, advocates don't want companies to rely on user choice alone.
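As context for that discussion, the GPC proposal defines a simple signal: participating browsers send a "Sec-GPC: 1" request header and expose a navigator.globalPrivacyControl property to page scripts. The sketch below shows, under those assumptions, how a site might detect the header server-side; the request handler and the response header it sets are illustrative, not part of the specification.

```typescript
// Minimal sketch: detecting the Global Privacy Control signal in a Node server.
import { createServer, type IncomingHttpHeaders } from "node:http";

// Per the GPC proposal, the only defined value of the Sec-GPC header is "1".
function gpcOptOutRequested(headers: IncomingHttpHeaders): boolean {
  return headers["sec-gpc"] === "1"; // Node lowercases incoming header names
}

const server = createServer((req, res) => {
  if (gpcOptOutRequested(req.headers)) {
    // Hypothetical application hooks: suppress third-party tracking tags and
    // record the opt-out. The response header below is illustrative only.
    res.setHeader("X-GPC-Honored", "1");
  }
  res.end("ok");
});

server.listen(8080);
```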

"Individual choice is relevant but ought not to be the sole or dispositive factor in these sorts of balancing tests when we talk about fairness," Future of Privacy Forum Senior Director for U.S. Policy Stacey Gray, CIPP/US, said, noting practicality as the key reasoning for that position. She cited California's opt-out privacy regime and the burden of opt-out requests to hundreds of companies as examples of how consent is becoming "meaningless if not harmful."

Gray also asked the FTC to keep an open mind to consent issues around emerging technologies.

"We're increasingly entering a world involving voice-enabled devices, smart city devices and connected vehicles. In all of these contexts, you face a situation where it's either not possible to obtain (consent) … or where it's simply a bad idea to give an individual choice," Gray said. "We need other safeguards and to expand out of the notice-and-choice framework. We need to be talking about data minimization, pseudonymization, deidentification. All these (are) ways to mitigate risks without placing the burden on the individual."