The Boston City Council could ban the use of facial surveillance technology in the city on Wednesday, making Boston the second largest community in the country to do so.
That move comes even as city officials say the technology isn’t yet used by the Boston Police Department — though the department could get access to it with a software upgrade.
The ordinance would ban the use of the technology and prohibit any city official from obtaining facial surveillance data through third parties.
Councilor Ricardo Arroyo, who sponsored the bill along with Councilor Michelle Wu, noted the technology is wildly inaccurate for people of color. An MIT study found that facial analysis programs had an error rate of up to 35% for darker-skinned women.
“It has an obvious racial bias and that’s dangerous,” Arroyo said. “But it also has sort of a chilling effect on civil liberties. And so, in a time where we’re seeing so much direct action in the form of marches and protests for rights, any kind of surveillance technology that could be used to essentially chill free speech or … more or less monitor activism or activists is dangerous.”
Arroyo said he expects the ban to pass with at least nine votes — which would provide a veto-proof majority.
During a hearing earlier this month, Boston Police Commissioner William Gross said the current technology isn’t reliable, and that it isn’t used by the department.
“Until this technology is 100%, I’m not interested in it,” he said.
“I didn’t forget that I’m African American and I can be misidentified as well,” he added.
While the police department isn’t using facial recognition technology now, an upgraded version of BriefCam — a video analysis software the department currently uses — does have facial analysis capability. Boston police said at a recent city council working session that they would not sign up for that part of the new version of the software.
Wu said government often chases new technologies and tries to put in regulations after the fact — from ride hailing to home sharing. She said in this case, the disproportionate impact on people of color makes acting now even more important.
“The harm from chasing the technology after it would already have been deployed and then trying to rein it in later would mean that we would be continuing racist politics and the harmful impacts of systemic racism on our residents of color,” she said. “So it’s important to put parameters now.”
If the ban is enacted, Boston will become the second largest city in the country to ban facial recognition technology, behind San Francisco. Five other communities in Massachusetts have a ban: Somerville, Brookline, Northampton, Springfield and Cambridge.
The Massachusetts chapter of the American Civil Liberties Union pushed for the bans in those communities, and is lobbying state lawmakers to act. There is no statewide ban, though a bill that would put a moratorium on face recognition systems is currently pending before the joint judiciary committee. The Boston ordinance would not prevent federal agencies, like the FBI, from using the technology.
Kade Crockford, with the ACLU, said the state should act now to prevent harm down the line.
“Let’s just ensure that we put the policy horse before the technology cart and lead with our values so we don’t accidentally wake up someday in a dystopian surveillance state,” Crockford said, “because behind the scenes, police departments and technology companies have created an architecture of oppression that is very difficult to dismantle.”
The Boston City Council ordinance notes governments around the world have responded to the COVID-19 pandemic with “an unprecedented use of surveillance tools” despite needing the public trust to effectively respond to the crisis.