An investigation revealing that 42 Minnesota law enforcement agencies have used a controversial facial recognition tool in recent years has prompted civil liberties advocates to call for a ban on the technology.
BuzzFeed News obtained a list of more than 1,800 publicly funded agencies that used the Clearview AI app between 2018 and February 2020. It released the list this week after matching the data with information obtained through public records requests, and after asking each agency included in the data for comment.
Clearview AI launched in 2018 and was offered as a free trial to law enforcement agencies across the country. The New York Times reported that users can upload a single photo, which is then cross-checked against a huge database of images that Clearview "claims to have scraped from Facebook, YouTube, Venmo and millions of other websites."
At more than 3 billion images, this database is far bigger than those used by individual law enforcement agencies and even the FBI.
BuzzFeed's extensive report says the technology is often used by American law enforcement employees with little to no oversight or transparency, and its list includes more than 40 agencies based in Minnesota.
These include police departments in Minneapolis, St. Paul, Eden Prairie, St. Louis Park, Stillwater, Shakopee, and Woodbury, among others; numerous sheriff's offices across Minnesota; and state agencies such as the Minnesota Bureau of Criminal Apprehension and the Minnesota Commerce Fraud Bureau.
Per the report, the biggest users of the technology in Minnesota were the Burnsville and Prior Lake police departments, each found to have carried out between 1,001 and 5,000 searches.
Other big users were police departments in Cottage Grove, Minneapolis, Eagan, Maplewood, and Woodbury, which were in the 101-500 search category along with the sheriff's offices in Dakota County, Hennepin County, Stearns County, and Washington County.
You can find a full list of Minnesota agencies that allegedly used the technology here.
Many of the agencies named in the list didn't respond to BuzzFeed's requests for comment, including the Minneapolis and St. Paul police departments and the Hennepin County Sheriff's Office.
But some did provide comments, and in several cases said they only tried out Clearview AI as part of the free trial and didn't purchase it, with some stating that the information gleaned from the tech was not used to make any arrests.
The Blaine Police Department, meanwhile, claimed it had never used the technology, despite the documents obtained by BuzzFeed naming it as having conducted 11-50 searches.
In some cases, BuzzFeed says, law enforcement employees signed up without the knowledge of their superiors; it wasn't until March 2020 that Clearview AI began requiring officers to get approval from a superior and agencies to appoint an administrator to monitor the app's use.
Calls for statewide ban in Minnesota
The use of facial recognition software is not new in Minnesota. In December, the Star Tribune reported police have run almost 1,000 searches through the Hennepin County Sheriff's Office's facial recognition system since 2018, of which more than half happened in 2020 alone.
The American Civil Liberties Union of Minnesota believes law enforcement used facial recognition software during the widespread protests and rioting that broke out following the death of George Floyd in May 2020, which it says has the potential to infringe on individuals' privacy and their First Amendment rights. Policy associate Munira Mohamed told Bring Me The News the technology could have a "chilling effect" on democracy.
In February, the Minneapolis City Council approved a ban that bars the city's police department from using facial recognition software and prevents it from knowingly using data others have obtained with such technology.
The ACLU says despite the ban in Minneapolis, the lack of transparency means it's impossible to know the extent to which jurisdictions in Minnesota are sharing data from facial recognition technology, which is why it is pushing for a statewide ban on its use.
Mohamed said another of the ACLU's concerns is that facial recognition technology in general isn't 100% accurate, particularly when it comes to identifying people of color.
The continued and secretive use of such technology by police in Minnesota, Mohamed says, has the potential to further erode trust and faith between civilians and law enforcement.
What's more, the potential for mistaken identity with such technology could inflict considerable trauma on anyone wrongfully arrested as a result, Mohamed added.
While facial recognition software isn't illegal in Minnesota, there remains little state regulation governing its use, whether by law enforcement or for commercial purposes.
Minnesota Attorney General Keith Ellison said the state should be cautious about its use, and ask questions about its necessity.
"Technology should be pursued and implemented in service to a better world that enables people to live with dignity and respect," he said.
"Facial recognition technology has far-reaching consequences for society, and we need to be deliberate about its use. Not least among the concerns are that a series of studies have shown that face recognition technology is consistently less accurate in identifying the faces of African-Americans and women as compared to Caucasians and men."
Bring Me The News has reached out to Clearview AI for comment on this week's revelations, but its founder Hoan Ton-That has previously defended his product and its use by police, telling CBS last May: "This is only used for investigations after the fact, this is not a 24/7 surveillance system."
The company has been threatened with legal action by social media companies including Twitter for "scraping" photos shared on their sites, but Ton-That argues that there is a First Amendment right to publicly available information.
"The way we have built our system is to only take publicly available information and index it that way," he said.