AI Accuses Wrong Person of Police Brutality in Belarus
| | |
|---|---|
| Short Title | Belarusian Software Developed for Deanonymisation of Police Officers Beating Up Protesters Identified the Wrong Person |
| Location | Belarus |
| Date | September 2020 |
| Solove Harm | Distortion |
| Information | Identifying, Physical Characteristics, Professional |
| Threat Actors | Software developers |
| Individuals Affected | An ex-police officer |
| High Risk Groups | Law Enforcement Officer |
| Tangible Harms | Anxiety, Loss of Trust |
An AI system identified the wrong person as the one beating a protester on the streets of Minsk, Belarus.
Description
In late summer and fall 2020, after the Central Election Commission announced the victory of President Lukashenko (in power since 1994) with allegedly more than 80% of the vote, hundreds of thousands of Belarusians took to the streets to protest election falsification and police brutality.
Secret and riot police responsible for unlawful arrests, detentions, and brutality against protesters avoided any identification: they beat protesters without introducing themselves or showing documents, wore uniforms without police ID numbers, and covered their faces with masks. This prompted the development of software that can allegedly recognize the face of a person involved in the suppression of protests even when it is hidden behind a mask, cap, or balaclava.
The developers published a video on YouTube demonstrating how the software works and showing a few examples. The algorithm, whose interface is shown in the video, highlights a riot policeman's face and compares it with a photo database of law enforcement officers. The video displays personal details of some of the identified policemen, including their full names, dates of birth, information about family members, police departments, and contact information.
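The published material does not reveal the system's internals, but the misidentification described below is consistent with how face-matching tools are commonly built: each face is reduced to an embedding vector, compared against every entry in a reference database, and the closest match is reported, even if the true person is not in the database or the database is out of date. The following is a minimal sketch of that general pattern in Python; the names, embeddings, and threshold are hypothetical illustrations, not details of the actual software.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two face embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(query_embedding, database, threshold=0.6):
    """Return the name of the database entry most similar to the query face.

    Relevant failure mode: if the person in the query image is absent from
    the database, or the database is stale (e.g. still contains former
    officers), the closest remaining entry is returned as long as it clears
    the threshold.
    """
    best_name, best_score = None, -1.0
    for name, embedding in database.items():
        score = cosine_similarity(query_embedding, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

# Illustrative usage with random vectors standing in for real embeddings.
rng = np.random.default_rng(0)
db = {
    "officer_a": rng.normal(size=128),
    "former_officer_b": rng.normal(size=128),  # stale entry, no longer serving
}
# A query face that merely resembles the stale entry still gets matched to it.
query = db["former_officer_b"] + 0.1 * rng.normal(size=128)
print(identify(query, db))  # -> "former_officer_b"
```

The sketch illustrates why a database that is never purged of former officers can produce exactly the kind of misattribution described in this case.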
Among the identified people was a man who no longer serves in the armed forces and now owns a shopping mall. He was confused with a security guard who beat a protester, and his personal details were shown in the video because he was still in the database due to his past career in the police. This is an example of Distortion: the person in the video was someone else, and the identified man has proof that he was at work in the shopping mall at the time.
He received at least a hundred threatening calls and messages. Some of the senders apologized after information about the mistake was published; most did not.
Breakdown
Threat: Software developers publish the personal details of a person after their AI confuses him with a security guard who beat a protester
At-Risk Group: People in the database of Belarusian law enforcement officers
Harm: Distortion
Secondary Consequences: Loss of Trust, Anxiety