February 16, 2022

Facial Recognition Cameras In New York Are Reinforcing Racist Policing, According To New Research By Amnesty

People in the US city of New York are subject to “shocking” mass surveillance through facial recognition technology cameras, with the invasive technology especially trained on areas of the city with greater concentrations of non-white residents, new research by Amnesty International and partners has revealed today.

The new analysis – published as part of a global Ban The Scan campaign – shows how the New York Police Department’s vast surveillance operation particularly affects people already targeted for stop-and-frisk across the city’s five boroughs. 

In the Bronx, Brooklyn, and Queens, the research shows that the higher the proportion of non-white residents, the higher the concentration of facial recognition-compatible CCTV cameras.

The findings are based on crowdsourced data gathered by thousands of digital volunteers as part of the Decode Surveillance NYC project. Volunteers mapped more than 25,500 CCTV cameras across New York City, and Amnesty worked with data scientists to compare the camera locations with stop-and-frisk statistics and demographic data.
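
The article does not describe the statistical methods behind the comparison, but a minimal sketch of the kind of area-level analysis it points to, using invented tract figures rather than the Decode Surveillance NYC data, might look like this:

```python
# A hypothetical sketch (not Amnesty's actual pipeline) of comparing
# camera counts per area with the share of non-white residents.
# All figures below are invented for illustration only.
import pandas as pd

# Hypothetical per-tract data: camera counts and demographic shares.
tracts = pd.DataFrame({
    "tract_id": ["A", "B", "C", "D", "E"],
    "camera_count": [12, 45, 30, 8, 52],
    "population": [3200, 4100, 3800, 2900, 4500],
    "nonwhite_share": [0.35, 0.82, 0.64, 0.28, 0.88],
})

# Normalize camera counts to a density per 1,000 residents so that
# tracts of different sizes are comparable.
tracts["cameras_per_1k"] = tracts["camera_count"] / tracts["population"] * 1000

# A simple Pearson correlation between camera density and demographic share.
corr = tracts["cameras_per_1k"].corr(tracts["nonwhite_share"])
print(f"Correlation between camera density and non-white share: {corr:.2f}")
```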
 
Facial recognition technologies for identification are systems of mass surveillance that violate the right to privacy and threaten the rights to freedom of assembly, equality and non-discrimination. 

The NYPD used facial recognition technologies in at least 22,000 cases between 2016 and 2019. Data on incidents of stop-and-frisk by the NYPD since 2002 shows Black and Latinx communities have been the overwhelming target of such tactics.   
 
Last year, Amnesty sued the NYPD after it refused to disclose public records regarding its acquisition of facial recognition technologies and other surveillance tools. The case is ongoing. Amnesty is calling for a total ban on the use, development, production, sale and export of facial recognition technologies for mass surveillance purposes by both states and the private sector.

Matt Mahmoudi, Amnesty International’s Artificial Intelligence and Human Rights Researcher, said: 

“Our analysis shows that the NYPD’s use of facial recognition technology helps to reinforce discriminatory policing against minority communities in New York City.
 
“We have long known that stop-and-frisk in New York is a racist policing tactic. We now know that the communities most targeted with stop-and-frisk are also at greater risk of discriminatory policing through invasive surveillance.
 
“The shocking reach of facial recognition technology in the city leaves entire neighborhoods exposed to mass surveillance. The NYPD must now disclose exactly how this invasive technology is used. 
 
“Banning facial recognition for mass surveillance is a much-needed first step towards dismantling racist policing, and the New York City Council must now immediately move towards a comprehensive ban.”

Abbianca Makoni

Abbianca Makoni is a content executive and writer at POCIT! She has years of experience reporting on critical issues affecting diverse communities around the globe.