Clearview's Facial Recognition App

From Privacy Wiki


Clearview's Facial Recognition App
Short Title Clearview AI Helps Governmental Agencies and Private Organisations with Facial Recognition
Location Global
Date 2020

Solove Harm Aggregation, Identification, Secondary Use, Surveillance
Information Identifying, Computer Device, Contact, Location, Physical Characteristics
Threat Actors Clearview AI, governmental and private organisations using Clearview AI, privileged individuals with access to the Clearview app, Immigration and Customs Enforcement, Law Enforcement

Individuals
Affected Potentially anybody who has their pictures online
High Risk Groups Minors, Crime Victims
Tangible Harms

The facial recognition app Clearview AI maintains a database of more than 3 billion pictures scraped from millions of websites, which it uses to identify a person from any picture by comparing it to the photos in the database.

Description

Clearview AI is a New York startup which in early 2020 was found to have built a database of more than 3 billion pictures scraped from Facebook, Instagram, YouTube and millions of other websites, in order to help law enforcement match photos of unknown people to their online images and identify suspects.

It later turned out that the technology is available not only to law enforcement agencies but also to multiple other governmental organizations, as well as private organizations. Several privacy violations can be identified here.

One of the concerning issues is Surveillance: searches for people of interest are run secretly, and individuals have no way of knowing that they are being searched for.

Another privacy violation is Identification: face recognition technology makes it possible to link identifying information to any picture run through the app, even if the picture is low quality and the face is hardly recognizable to a human eye (e.g. when the person wears a hat or glasses). The app is known to be used in law enforcement not only to identify criminals but also to identify victims, including, for example, children who are victims of sexual abuse.

There is also a problem with Aggregation: the technology combines a picture of an individual with the photos from the database along with links to where those photos appeared, which not only identifies the person but also reveals other personal information about them, including contact details, location, etc.
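The matching process described above can be illustrated with a generic face-recognition pipeline: each photo is reduced to a numeric "embedding" vector, and a probe photo is identified by finding the most similar vector in the database, together with the source URL where that photo appeared. This is a minimal sketch of that general technique, not Clearview's actual (proprietary) system; the toy 4-dimensional vectors, names and URLs are invented for illustration, and real systems use embeddings with hundreds of dimensions produced by a neural network.

```python
import math

def cosine_similarity(a, b):
    """Similarity of two embedding vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def match_face(probe, database, threshold=0.8):
    """Return the best-matching record above the threshold, or None.

    Each record links the face embedding to identity data and the web page
    the photo was scraped from -- this linkage is the Aggregation concern.
    """
    best, best_score = None, threshold
    for record in database:
        score = cosine_similarity(probe, record["embedding"])
        if score >= best_score:
            best, best_score = record, score
    return best

# Hypothetical scraped database; names and URLs are placeholders.
database = [
    {"name": "person_a", "source_url": "https://example.com/a",
     "embedding": [0.9, 0.1, 0.2, 0.3]},
    {"name": "person_b", "source_url": "https://example.com/b",
     "embedding": [0.1, 0.8, 0.7, 0.2]},
]

# Embedding computed from an uploaded probe photo (toy values).
probe = [0.88, 0.12, 0.22, 0.28]
result = match_face(probe, database)
```

Here the probe vector is nearly identical to person_a's stored embedding, so the lookup returns that record and, with it, the page the photo came from, even though the probe itself contained no identity information.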

The creators of the app claimed that the product was created to be used strictly by law enforcement, but documentation (e.g. Clearview's client lists) proves that the solution was also sold to and used by thousands of private organizations in retail, banking, education and entertainment. This can be seen as Secondary Use. Among those companies are major corporations such as Walmart, T-Mobile and Macy's, as well as some high schools, gaming companies and many more. Some individuals were also found to be freely using the app, such as the company's investors and friends. According to people familiar with the company's fundraising process, access to the app was offered to potential investors as a perk; for example, venture capitalist and actor Ashton Kutcher publicly talked about having the app on his phone during an interview on YouTube.

The technology expanded into at least 26 countries outside the USA, including Canada, multiple European countries, India, Saudi Arabia and the United Arab Emirates. The creators of the app acknowledged designing a prototype for use with augmented-reality glasses, which might potentially allow the wearer to identify any person they saw.

Twitter sent Clearview an official letter accusing the company of privacy violations and demanding that it stop using the site's photos.

Some United States officials addressed the issue. A Senator from Massachusetts sent a letter to Clearview, addressed to its co-founder and chief executive. New Jersey's attorney general told state prosecutors that police officers should stop using Clearview.

In July 2020, with sales already halted in Canada due to an ongoing investigation by the country's Office of the Privacy Commissioner, Clearview AI faced additional global pressure as the United Kingdom and Australia opened a joint privacy investigation into the facial recognition software provider.

In August 2020, Immigration and Customs Enforcement (ICE) signed a contract with Clearview AI for “mission support”.

That same summer, law enforcement in several cities, including New York and Miami, reportedly used the software to track down and arrest individuals who allegedly participated in criminal activity during Black Lives Matter protests, months after the fact.

Laws and Regulations

Sources

https://themarkup.org/ask-the-markup/2020/03/12/photos-privacy
https://www.nytimes.com/2020/01/18/technology/clearview-privacy-facial-recognition.html
https://www.buzzfeednews.com/article/ryanmac/clearview-ai-fbi-ice-global-law-enforcement
https://www.nytimes.com/2020/01/22/technology/clearview-ai-twitter-letter.html
https://www.nytimes.com/2020/03/05/technology/clearview-investors.html
https://www.nytimes.com/2020/02/07/business/clearview-facial-recognition-child-sexual-abuse.html
https://www.cpomagazine.com/data-privacy/leading-law-enforcement-facial-recognition-provider-clearview-ai-faces-joint-international-privacy-investigation/
https://www.theverge.com/2020/8/14/21368930/clearview-ai-ice-contract-privacy-immigration
https://arstechnica.com/tech-policy/2020/08/cops-in-miami-nyc-arrest-protesters-from-facial-recognition-matches/