[guest post by Dana]
San Francisco’s Board of Supervisors voted 8-1 this week to ban what many police departments consider a useful investigative tool:
The ban is part of a broader anti-surveillance ordinance that the city’s Board of Supervisors approved on Tuesday. The ordinance, which outlaws the use of facial-recognition technology by police and other government departments, could also spur other local governments to take similar action. Eight of the board’s 11 supervisors voted in favor of it; one voted against it, and two who support it were absent.
[…]
San Francisco’s new rule, which is set to go into effect in a month, forbids the use of facial-recognition technology by the city’s 53 departments — including the San Francisco Police Department, which doesn’t currently use such technology but did test it out between 2013 and 2017. However, the ordinance carves out an exception for federally controlled facilities at San Francisco International Airport and the Port of San Francisco. The ordinance doesn’t prevent businesses or residents from using facial recognition or surveillance technology in general — such as on their own security cameras. And it also doesn’t do anything to limit police from, say, using footage from a person’s Nest camera to assist in a criminal case.
“We all support good policing but none of us want to live in a police state,” San Francisco Supervisor Aaron Peskin, who introduced the bill earlier this year, told CNN Business ahead of the vote.
Another stated problem with the technology is built-in bias, which can lead to misidentifying certain groups:
There are concerns that they are not as effective at correctly recognizing people of color and women. One reason for this issue is that the datasets used to train the software may be disproportionately male and white.
The decision will also make it more difficult for city agencies to continue using any surveillance programs:
Any city department that wants to use surveillance technology or services (such as the police department if it were interested in buying new license-plate readers, for example) must first get approval from the Board of Supervisors. That process will include submitting information about the technology and how it will be used, and presenting it at a public hearing. With the new rule, any city department that already uses surveillance tech will need to tell the board how it is being used.
The ordinance also states that the city will need to report to the Board of Supervisors each year on whether surveillance equipment and services are being used in the ways for which they were approved, and include details like what data was kept, shared or erased.
Yet one must also consider the benefits of law enforcement (at large) using the technology to fight crime:
Perhaps the most compelling argument for FRS is that it can make law enforcement more efficient. FRS allows a law enforcement agency to run a photograph of someone just arrested through its databases to identify the person and see if he or she is wanted for other offenses. It also can help law enforcement officers who are out on patrol or monitoring a heavily populated event identify wanted criminals if and as they encounter them.
Imagine a law enforcement officer wearing a body camera with FRS software identifying, from within a huge crowd, a person suspected of planning to detonate a bomb. The ambient presence of FRS applied to a feed from stadium cameras would allow law enforcement to identify dangerous attendees in cooperation with the company managing event security.
There are many contexts in which this law enforcement technology has already been brought to bear. Beyond spotting threats in a crowd, facial recognition software can be used to quickly suss out perpetrators of identity fraud; the New York Department of Motor Vehicles’ Facial Recognition Technology Program has been doing just that, with 21,000 possible identity fraud cases identified since 2010. The U.S. Department of Homeland Security is also experimenting with FRS to assist in identifying abducted and exploited children. Just this month, U.S. Customs and Border Protection has started teaming up with airlines in Boston, Atlanta, Washington, and New York to use FRS for boarding pass screening. In a safety context, some American schools are installing cameras with FRS to identify the presence of gang members, fired employees, and sex offenders on school grounds.
In spite of the Fourth Amendment and the cited benefits of the technology, those who support banning law enforcement's use of facial recognition are worried about a larger, more threatening picture:
Civil libertarians worry that the technology could morph into pervasive automated authoritarianism in which individuals can be tracked everywhere, in real time, similar to the version being developed by the Chinese government. The Chinese government reportedly aims, as part of its Skynet surveillance system, to add an additional 400 million video cameras to its existing 170 million over the next three years. The cameras employ real time facial recognition technology.
Where do you draw the line: ban it altogether, or closely regulate its use because of the risks associated with it? Which reminds me, the New York City Police Department – an agency where facial recognition has reportedly helped break open more than 2,500 cases – used the face of a well-known celebrity (apparently without his knowledge) in order to help identify a suspect:
The New York Police Department used a photo of Woody Harrelson in its facial recognition program in an attempt to identify a beer thief who looked like the actor, according to a report published Thursday.
Georgetown University’s Center on Privacy and Technology highlighted the April 2017 episode in “Garbage In, Garbage Out,” a report on what it says are flawed practices in law enforcement’s use of facial recognition.
The report says security footage of the thief was too pixelated and produced no matches while high-quality images of Harrelson, a three-time Oscar nominee, returned several possible matches and led to one arrest.
(Cross-posted at The Jury Talks Back.)
–Dana