No other city in the U.S. is as closely associated with technological advancement as San Francisco, which, for the most part, has embraced both tech companies and their services in civic life. But on Tuesday, San Francisco passed an ordinance that would ban the use of one very specific technology, facial recognition, by local police departments, making it the country's first major city to do so.
Cited in the anti-surveillance ordinance were concerns about furthering racial bias and increasing the government's ability to surveil citizens.
"The propensity for facial recognition technology to endanger civil rights and civil liberties substantially outweighs its purported benefits," reads the San Francisco "Stop Secret Surveillance" ordinance, which passed on an 8-1 vote by the San Francisco Board of Supervisors, adding, "the technology will exacerbate racial injustice and threaten our ability to live free of continuous government monitoring."
According to CNN, the facial recognition ban is part of a larger anti-surveillance ordinance approved by the board on Tuesday. All 53 of San Francisco's city departments, including the police, would be forbidden from using facial recognition technology, and use of any surveillance technology by the city would require board approval.
"This is really about saying: 'We can have security without being a security state. We can have good policing without being a police state,'" Supervisor Aaron Peskin, one of the authors of the new legislation, told the Associated Press. "And part of that is building trust with the community based on good community information, not on Big Brother technology."
But there are carve-outs: federally controlled facilities like San Francisco International Airport and the Port of San Francisco can still use these surveillance technologies, as can businesses and residents. "It also doesn't do anything to limit police from, say, using footage from a person's Nest camera to assist in a criminal case," writes CNN.
The legislation is groundbreaking, considering San Francisco's size and influence; the ordinance may serve as an example for other cities similarly concerned with protecting citizens' privacy in the face of ever-expanding surveillance tools. Neighboring Oakland and Santa Clara County, home of social media behemoth Facebook, have passed surveillance technology laws of their own, with Oakland also considering a ban on facial recognition tech.
But police usage of facial recognition is especially problematic for black Americans, for whom the technology is significantly less accurate.
Just last year, the American Civil Liberties Union (ACLU) tested Amazon's facial recognition software and found it misidentified 28 black members of Congress as criminals. (Amazon currently sells this technology to police departments across the country.)
It gets worse. From Vox:
Researchers at MIT found that, overall, the software returned worse results for women and darker-skinned individuals (in both cases, Amazon has disputed the findings). And in places like Maryland, police agencies have been accused of generally using facial recognition technology more heavily in black communities and to target activists; for example, police in Baltimore used it to identify and arrest protesters of Freddie Gray's death at the hands of law enforcement.
Police departments across the country use facial recognition software for forensic purposes: taking pictures from driver's licenses and mugshots and matching them against criminal databases. The San Francisco ordinance is more of a preemptive measure, however, since the San Francisco Police Department doesn't currently use the technology. But as Vox notes, this kind of tech has been used elsewhere by police and private citizens to monitor crowds at protests, shopping malls and concerts.
While some data suggests black people are more likely to be misidentified as criminals by this technology, the accuracy of the software isn't the only concern civil liberties advocates have. Given how police surveillance has historically targeted, and continues to target, black people at disproportionate rates, it's nearly impossible to imagine a scenario in which black communities aren't caught in a high-tech dragnet by local police departments.
Writer Zoé Samudzi expanded upon this in a piece published earlier this year in the Daily Beast, asking, "in a country where crime prevention already associates blackness with inherent criminality, why would we fight to make our faces more legible to a system designed to police us?"
"This is not simply a problem of 'racist code' that can be fixed with diversity initiatives to add more black coders and software engineers to correct technical deficiencies within an unchanged structure of technology," Samudzi continued. "The problem is that black people are simply not human enough to be recognized by the racist technology created by racist people imbued with racist worldviews."
It remains to be seen how many other cities and states will be willing to follow the Bay Area's lead. According to CNN, there are presently no federal regulations dictating how AI should be used by the government.