Detroit Takes Important Step in Curbing the Harms of Face Recognition Technology

By: Tori Noble
July 15, 2024 at 20:54

In a first-of-its-kind agreement, the Detroit Police Department recently agreed to adopt strict limits on its officers’ use of face recognition technology as part of a settlement in a lawsuit brought by a victim of this faulty technology.  

Robert Williams, a Black resident of a Detroit suburb, filed suit against the Detroit Police Department after officers arrested him at his home in front of his wife, daughters, and neighbors for a crime he did not commit. After a shoplifting incident at a watch store, police used a blurry still taken from surveillance footage and ran it through face recognition technology—which incorrectly identified Williams as the perpetrator. 

Under the terms of the agreement, the Detroit Police can no longer substitute face recognition technology (FRT) for reliable policework. Simply put: Face recognition matches can no longer be the only evidence police use to justify an arrest. 

FRT creates an “imprint” from an image of a face, then compares that imprint to other images, often those in a law enforcement database made up of mugshots, driver’s license images, or even images scraped from the internet. The technology itself is fraught with issues, including that it is highly inaccurate for certain demographics, particularly Black men and women. The Detroit Police Department makes face recognition queries using DataWorks Plus software against the Statewide Network of Agency Photos (SNAP), a database operated by the Michigan State Police. According to data obtained by EFF through a public records request, roughly 580 local, state, and federal agencies and their sub-divisions have desktop access to SNAP.
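To make the matching step concrete, here is a minimal, illustrative sketch of how this kind of one-to-many search works: a probe "imprint" (an embedding vector) is scored against every enrolled embedding, and the highest-scoring entries come back as candidates. The identity names, vector size, similarity threshold, and random vectors below are all invented for the example; this is not a description of how DataWorks Plus or SNAP is actually implemented.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_candidates(probe, gallery, threshold=0.6, top_k=5):
    """Rank gallery identities by similarity to the probe embedding.

    Returns up to top_k (identity, score) pairs whose similarity clears the
    threshold. Each result means "looks similar", not "is the same person".
    """
    scores = [(name, cosine_similarity(probe, emb)) for name, emb in gallery.items()]
    candidates = [(name, s) for name, s in scores if s >= threshold]
    return sorted(candidates, key=lambda pair: pair[1], reverse=True)[:top_k]

# Toy data: random 128-dimensional vectors stand in for embeddings that a
# real system would compute from face images with a trained neural network.
rng = np.random.default_rng(0)
gallery = {f"person_{i:04d}": rng.normal(size=128) for i in range(1000)}
probe = gallery["person_0042"] + rng.normal(scale=0.8, size=128)  # degraded probe image
print(rank_candidates(probe, gallery))
```

Even in this toy version, the output is nothing more than a ranked list of the most similar enrolled entries, which is why treating the top result as proof of identity, rather than as one lead among many, is so dangerous.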

Among other achievements, the settlement agreement’s new rules bar arrests based solely on face recognition results, or on the results of the ensuing photo lineup—a common police procedure in which a witness is asked to identify the perpetrator from a “lineup” of images—conducted immediately after FRT identifies a suspect. This dangerous simplification has meant that, based on partial matches combined with other unreliable evidence such as eyewitness identifications, police have ended up arresting people who clearly could not have committed the crime. Such was the case with Robert Williams, who had been out of the state on the day the crime occurred. Because face recognition finds people who look similar to the suspect, putting that person directly into a police lineup will likely result in the witness picking the person who looks most like the suspect they saw—all but ensuring the person falsely accused by the technology will receive the bulk of the suspicion.

Under Detroit’s new rules, if police use face recognition technology at all during any investigation, they must record detailed information about their use of the technology, such as photo quality and the number of photos of the same suspect not identified by FRT. If charges are ever filed as a result of the investigation, prosecutors and defense attorneys will have access to the information about any uses of FRT in the case.  

The Detroit Police Department’s new face recognition rules are among the strictest restrictions adopted anywhere in the country—short of the full bans on the technology passed by San Francisco, Boston, and at least 15 other municipalities. Detroit’s new regulations are an important step in the right direction, but only a full ban on government use of face recognition can fully protect against this technology’s many dangers. FRT jeopardizes every person’s right to protest government misconduct free from retribution and reprisals for exercising their right to free speech. Giving police the ability to fly a drone over a protest and identify every protester undermines every person’s right to freely associate with dissenting groups or criticize government officials without fear of retaliation from those in power. 

Moreover, FRT undermines racial justice and threatens civil rights. Study after study after study has found that these tools cannot reliably identify people of color.  According to Detroit’s own data, roughly 97 percent of queries in 2023 involved Black suspects; when asked during a public meeting in 2020, then-police Chief James Craig estimated the technology would misidentify people 96 percent of the time. 

Williams was one of the first victims of this technology—but he was by no means the last. In Detroit alone, police wrongfully arrested at least two other people based on erroneous face recognition matches: Porcha Woodruff, a pregnant Black woman, and Michael Oliver, a Black man who lost his job due to his arrest.  

Many other innocent people have been arrested elsewhere, and in some cases, have served jail time as a result. The consequences can be life-altering; one man was sexually assaulted while incarcerated due to an FRT misidentification. Police and the government have proven time and time again that they cannot be trusted to use this technology responsibly. Although many departments already acknowledge that FRT results alone cannot justify an arrest, that is cold comfort to people like Williams, who are still being harmed despite the reassurances police give the public.

It is time to take FRT out of law enforcement’s hands altogether. 
