It is becoming clear that governments are bound to see a backlash when they do not engage the public in deals with tech vendors that affect public policy.
“I remain of the opinion that we shouldn’t make public policy with a vendor, so this was a bad deal at the beginning and is still a bad deal today.” Citizen activist Bianca Wylie made this declaration following a public meeting last August about a plan to turn a neighborhood in Toronto, Canada, into a hyper-connected data-fueled “global hub for urban innovation.”
Wylie is a vocal critic of Sidewalk Toronto, a partnership between the group overseeing the city’s waterfront revitalization and Google-affiliated Sidewalk Labs that envisions automated trash collection, traffic systems and public transport. In October, former Information and Privacy Commissioner of Ontario Ann Cavoukian stepped down from a consultant role on the project after learning that data privacy policies were not as strict as she originally understood.
Still in the planning stages, Sidewalk Toronto serves as a lesson to governments across the globe. Open-cities advocates and tech policy wonks say that when governments use technologies such as algorithmic decision-making systems, the old procurement playbook needs a rewrite.
“Right now, what we see is an absence of democratic process,” said Deirdre Mulligan, co-director of the Berkeley Center for Law and Technology, during a keynote presentation on “Procurement as Policy” at the Association for Computing Machinery’s Conference on Fairness, Accountability and Transparency in Atlanta in January.
Agencies are acquiring systems “almost as though they are off-the-shelf products…without actually realizing that what is embedded in this system includes numerous policy decisions,” she said.
In response to a request to comment for this story regarding claims that the Sidewalk Toronto project did not offer enough community engagement, Sidewalk Labs directed Privacy Tech to a Feb. 19 opinion piece in the Toronto Star penned by Sidewalk CEO Dan Doctoroff, who served as New York City's deputy mayor for economic development and rebuilding from 2002 to 2008: “We have met in person with more than 18,000 Torontonians and hundreds of public officials. We’ve never said we have all the answers, and we haven’t always gotten it right. But we are committed to this city, have relished the give and take, and believe that these interactions have made us more sensitive to critical issues and meaningfully improved our plans.”
The standard government agency procurement process was devised to eliminate graft and cronyism, reduce costs, and enable fair bidding opportunities. Technologies that make algorithmic decisions — whether automating responses to constituent letters, calculating tax assessments or scoring pretrial risk in policing — carry new policy implications. As Mulligan’s AI tech procurement policy research partner and fellow UC Berkeley law professor, Kenneth Bamberger, puts it, these systems not only demand a policy-centric evaluation and procurement process; they also make policy themselves, through decisions that affect citizens.
“What we know with these technologies, especially with machine learning, is that the policy is happening all the time,” he said in an interview with Privacy Tech.
Community participation, impact assessments and more
So what needs to change? Those focused on these issues point to the need for impact assessments, community notice and participation, expert input and evaluation, a process for feedback loops and contracts that specify details of data privacy, ownership and use.
Last year, AI ethics group AI Now published a framework for algorithmic impact assessments, suggesting that government agencies evaluate potential community effects on fairness, justice and other factors, establish external researcher review of systems to measure and track impact over time, allow for public notice and comment, and provide enhanced due-process mechanisms for affected individuals.
“If governments deploy systems on human populations without frameworks for accountability, they risk losing touch with how decisions have been made, thus making it difficult for them to identify or respond to bias, errors, or other problems,” stated the paper.
Both Mulligan and Bamberger stressed the need for feedback loops and approaches that allow humans affected by automated decisions to challenge or contest system decisions, ideally improving algorithms over time. “You have to have systems in place where you have to come back and keep testing the outputs,” Bamberger said.
The contracting process itself should be transparent and account for data collection, use and ownership, said Katya Abazajian, director of the Sunlight Foundation’s Open Cities team. In other words, who owns the data, how can it be used, and how long should it be stored?
Ensuring citizens can access the details of partnerships with tech vendors, so they can participate in the deliberation and procurement process, is not easy, though. Abazajian noted that many of the cities the Sunlight Foundation works with still conduct procurement on paper, meaning the information is not always available online for education, notice and comment.
And when it comes to the actual algorithms that enable automated-decision tools, the question of transparency is moot if governments do not have the information to begin with. A 2017 Yale Journal of Law and Technology paper by Robert Brauneis and Ellen Goodman, titled “Algorithmic Transparency for the Smart City,” stated, “Our research suggested that governments simply did not have many records concerning the creation and implementation of algorithms, either because those records were never generated or because they were generated by contractors and never provided to the governmental clients.”
Cities start the process
Abazajian said she has spent the last six months talking to open government partners who are “collectively tackling” approaches to government procurement and use of AI and other technology. By her estimate, only a few cities in the U.S. have established initiatives to determine appropriate processes for evaluating AI technologies and partnering with tech vendors for things like street-level security or predictive policing. New York City established its Automated Decision Systems Task Force in May, assembling government, nonprofit, research and academic stakeholders to devise a process for reviewing such technologies for government use.
In 2016, Oakland established a privacy commission to advise the city on best practices for the purchase and use of surveillance equipment and other data-gathering technologies. Seattle in 2017 released a list of 28 surveillance and sensor technologies employed throughout the city for purposes ranging from measuring traffic to detecting electricity theft. A public comment period for some of these systems will stay open until March 5, 2019.
Farther south on I-5, Portland, Oregon, adopted its Smart City PDX priorities framework in June, promising to develop policies ensuring “that projects protect privacy, promote safety and address equity in a way that results in measurable community benefits” before Smart City PDX projects advance.
Despite this handful of nascent initiatives, sources indicated that most governments have yet to think about, much less establish, frameworks for new AI partnerships and procurement.
“There is a big umbrella of cities buying technologies that they don’t understand are a huge risk to people,” Abazajian said.