Facial Recognition Technology Wrongfully Accuses Person of Crime

From Privacy Wiki


Facial Recognition Technology Wrongfully Accuses Person of Crime
Short Title Wrongful Arrest Shows Racial Bias in Facial Recognition Technology
Location Michigan
Date November 2020

Solove Harm Distortion, Exclusion
Information Identifying, Physical Characteristics
Threat Actors Law Enforcement

Individuals
Affected A man, potentially anyone
High Risk Groups Ethnic Minority
Tangible Harms Anxiety, Incarceration

The wrongful arrest of a Black man in Michigan was based on a faulty match produced by facial recognition technology used by the Detroit Police Department.

Description

The man was on his way to work when he was stopped by police and informed that there was a felony warrant out for his arrest. He was taken into Detroit police custody and charged with larceny. The evidence against him was a single screen-grab from a video of the incident, taken on the accuser's cellphone. The accused man, who had a number of tattoos, shared few physical characteristics with the person in the photo; the person in the video did not even have tattoos. Distortion The judge agreed that it was not him, and the case was promptly dismissed.

The arrest was based on an erroneous match produced by controversial facial recognition technology. Police took a still image from a video of the incident and ran it through a software program manufactured by a company called DataWorks Plus. The software then checks for possible matches in a database of photos; the system can access thousands of mugshots as well as the Michigan state database of driver's license photos.

The man's face came up as a match, but it wasn't him, and there was ample evidence to prove it.

Many view facial recognition as a flawed technology with the potential to cause serious harm. It misidentifies Black and Brown faces at substantially higher rates than White faces, in some cases nearly 100% of the time. That is especially relevant in a city like Detroit, where nearly 80% of residents are Black.

The fact that this man was stopped on his way to work and initially thought his arrest was a joke shows that people have no way of knowing that their photos are being used by the algorithm in a search. Exclusion

Breakdown

Threat: Facial recognition software wrongfully identifying a man as a criminal from a photo
At-Risk group: A man
Harm: Distortion
Secondary Consequences: Anxiety, Incarceration

Threat: Police running crime-scene images through facial recognition software to try to identify criminals, without people knowing that their images are in the software's database
At-Risk group: Potentially anyone
Harm: Exclusion
Secondary Consequences: Not known

Laws and Regulations

Sources

https://www.cbsnews.com/news/detroit-facial-recognition-surveillance-camera-racial-bias-crime/