Last Updated: Feb 15, 2022, 6:00 AM
A new map of NYC’s cameras shows more surveillance in Black and brown neighborhoods


Areas of New York City with higher rates of “stop-and-frisk” police searches have more closed-circuit TV cameras, according to a new report from Amnesty International’s Decode Surveillance NYC project.

Beginning in April 2021, more than 7,000 volunteers surveyed New York City’s streets through Google Street View to document the locations of cameras; they assessed 45,000 intersections three times each and identified over 25,500 cameras. The report estimates that around 3,300 of these cameras are publicly owned and in use by government and law enforcement. With the help of BetaNYC, a civic organization focused on technology, and contracted data scientists, the project used this data to create a map marking the coordinates of all 25,500 cameras.

Analysis of this data showed that in the Bronx, Brooklyn, and Queens, there were more publicly owned cameras in census tracts with higher concentrations of people of color.

To work out how the camera network correlated with police searches, Amnesty researchers and partner data scientists determined the frequency of stop-and-frisk searches per 1,000 residents in 2019 in each census tract (a geographic section smaller than a zip code), using street-address data originally from the NYPD. “Stop-and-frisk” policies allow officers to do random checks of citizens on the basis of “reasonable suspicion.” NYPD data cited in the report showed that stop-and-frisk incidents have happened more than 5 million times in New York City since 2002, with the large majority of searches conducted on people of color. Most people subjected to these searches have been innocent, according to the New York ACLU.
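The rate calculation itself is straightforward to reproduce. Here is a minimal Python sketch; the file and column names are hypothetical, since the project has not published its code:

```python
import pandas as pd

# Hypothetical inputs: one row per geocoded 2019 stop-and-frisk incident,
# and one row per census tract with its resident population.
stops = pd.read_csv("stops_2019.csv")   # columns: tract_id, ...
tracts = pd.read_csv("tracts.csv")      # columns: tract_id, population

# Count stops in each tract, then normalize to a rate per 1,000 residents.
stop_counts = stops.groupby("tract_id").size().rename("stops")
tracts = tracts.set_index("tract_id").join(stop_counts).fillna({"stops": 0})
tracts["stops_per_1k"] = tracts["stops"] / tracts["population"] * 1000
```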

Each census tract was assigned a “surveillance level” according to the number of publicly owned cameras per 1,000 residents within 200 meters of its borders. Areas with higher frequencies of stop-and-frisk searches also had higher surveillance levels. One half-mile route in Brooklyn’s East Flatbush, for example, saw six such searches in 2019 and had 60% public-camera coverage.
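In geospatial terms, that metric amounts to buffering each tract polygon by 200 meters and counting the cameras that fall inside. One plausible implementation, sketched with GeoPandas (the report does not specify its tooling, and the layer and column names here are assumptions):

```python
import geopandas as gpd

# Hypothetical layers; Decode Surveillance NYC's actual files aren't published here.
# Reproject to UTM zone 18N so distances are in meters.
tracts = gpd.read_file("tracts.geojson").to_crs(epsg=32618)    # tract_id, population, geometry
cameras = gpd.read_file("cameras.geojson").to_crs(epsg=32618)  # camera points

# Expand each tract by 200 m so cameras just outside its border still count.
buffered = tracts.copy()
buffered["geometry"] = tracts.geometry.buffer(200)

# Attach each camera to every buffered tract that contains it, then count.
joined = gpd.sjoin(cameras, buffered, predicate="within")
counts = joined.groupby("tract_id").size().rename("cameras")

tracts = tracts.set_index("tract_id").join(counts).fillna({"cameras": 0})
tracts["surveillance_level"] = tracts["cameras"] / tracts["population"] * 1000
```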

Experts fear that law enforcement will use facial recognition technology on feeds from these cameras, disproportionately targeting people of color in the process. According to documents obtained through public records requests by the Surveillance Technology Oversight Project (STOP), the New York Police Department used facial recognition, including the controversial Clearview AI system, in at least 22,000 cases between 2016 and 2019.

“Our analysis shows that the NYPD’s use of facial recognition technology helps to reinforce discriminatory policing against minority communities in New York City,” said Matt Mahmoudi, a researcher from Amnesty International who worked on the report. 

The report also details how participants in last year’s Black Lives Matter protests were exposed to facial recognition technology, by overlaying the surveillance map on march routes. It found “nearly total surveillance coverage” of the routes, according to Mahmoudi. Though it’s unclear exactly how facial recognition technology was used during the protests, the NYPD has already used it in at least one investigation of a protester.
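The overlay is a standard geospatial operation. Below is a sketch of one way to estimate camera exposure along a march route; the 200 m camera radius and all input names are assumptions, not the report’s published method:

```python
import geopandas as gpd

# Hypothetical inputs; reprojected to UTM 18N so lengths are in meters.
route = gpd.read_file("march_route.geojson").to_crs(epsg=32618)   # LineString(s)
cameras = gpd.read_file("cameras.geojson").to_crs(epsg=32618)     # camera points

# Assume each camera covers a 200 m radius, merge the circles into one
# coverage area, and measure what share of the route falls inside it.
coverage = cameras.geometry.buffer(200).unary_union
covered = route.geometry.intersection(coverage).length.sum()
total = route.geometry.length.sum()
print(f"{covered / total:.0%} of the route is within camera coverage")
```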

On August 7, 2020, dozens of New York City police officers, some in riot gear, knocked on the door of Derrick Ingram, a 28-year-old Black Lives Matter activist. Ingram was suspected of assaulting a police officer by shouting into the officer’s ear with a bullhorn during a march. Police on the scene were spotted examining a document titled “Facial Identification Section Informational Lead Report,” which included what appeared to be a social media photo of Ingram. The NYPD confirmed that it had used facial recognition to search for him. 

Eric Adams, the city’s new mayor, is considering expanding the use of facial recognition technology, even though many US cities have banned it over concerns about accuracy and bias.

Jameson Spivack, an associate at Georgetown Law’s Center on Privacy and Technology, says Amnesty’s project “gives us an idea of how broad surveillance is—particularly in majority non-white neighborhoods—and just how many public places are recorded on footage that police could use face recognition on.”
