Use of facial recognition software raises eyebrows

Digital tools and spy gadgets like open-source intelligence, AI and surveillance equipment have been a game changer for investigators worldwide.

But one new technology, facial recognition software, is raising eyebrows, with some states, including Colorado, seeking to regulate it.

That’s because civil rights groups and defense lawyers have concerns about privacy, perceived racial bias and the fact that the technology makes frequent errors.

For dark-skinned women, for example, the technology had an error rate of 34.7%, compared to 0.8% for fair-skinned men, according to a 2018 study by the Massachusetts Institute of Technology.

Similarly, a federal study in 2019 found that Asian and African American people were up to 100 times more likely than white men to be misidentified by facial recognition technology.

This hasn’t stopped some PI firms from using the technology, as it can enhance the way investigations are conducted, allowing swift identification and tracking of persons of interest.

For example, one product, FaceMRI, is marketed to PIs as a tool for surveillance and monitoring activities.

“By deploying its facial recognition systems in public places or at target locations, FaceMRI can automatically identify and track persons of interest, monitor their movements, and gather valuable intelligence for your investigations,” says the company website.

However, the Washington Post recently published an investigation into the use of the software by law enforcement agencies and found that hundreds of Americans have been arrested after being connected to a crime by these eyes in the sky.

But many of the arrestees didn’t know they’d been plucked out of the crowd by the technology because, said The Post, police departments rarely disclosed they’d used facial recognition to identify suspects.

The newspaper reported that misidentification by this type of software played a role in the wrongful arrests of at least seven innocent Americans, six of whom were Black, according to police and court records.

Charges were later dismissed against all of them. Some were told during interrogations or in documents provided to their criminal defense lawyers that they had been identified by AI. Others learned about the software’s use only after officers mentioned in passing that “the computer” had found them, or that they had been a “positive match.”

Facial recognition software works by comparing an image from a crime scene, often captured by a video surveillance camera, against a database of photos, often from mug shots and driver’s licenses.

The software uses artificial intelligence to compare the face of the person in the “probe image” to the faces in the database. It then returns photos of people it has identified who are similar in appearance.

Because there is no scientific consensus on what constitutes a match, software makers vary widely in how many results they show and how closely each result resembles the probe photo.
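The matching process described above can be sketched in code. This is a simplified, hypothetical illustration, not any vendor's actual algorithm: real systems reduce each face to a numeric "embedding" vector with hundreds of dimensions, then rank database faces by similarity to the probe image. The embedding values, names and the 0.9 threshold below are all invented for illustration, and the threshold in particular is arbitrary, reflecting the lack of consensus on what score counts as a "match."

```python
import math

def cosine_similarity(a, b):
    """Similarity between two embedding vectors: 1.0 means identical
    direction, values near 0 mean dissimilar."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def rank_candidates(probe, database, threshold=0.9):
    """Return database entries whose similarity to the probe exceeds the
    threshold, most similar first. The threshold is a vendor choice;
    there is no scientific standard for what constitutes a match."""
    scored = [(name, cosine_similarity(probe, emb)) for name, emb in database]
    hits = [(name, s) for name, s in scored if s >= threshold]
    return sorted(hits, key=lambda t: t[1], reverse=True)

# Toy 4-dimensional embeddings (real systems use far larger vectors).
probe = [0.9, 0.1, 0.3, 0.5]
database = [
    ("person_a", [0.88, 0.12, 0.31, 0.49]),  # very similar to the probe
    ("person_b", [0.10, 0.90, 0.50, 0.20]),  # dissimilar
    ("person_c", [0.70, 0.20, 0.40, 0.60]),  # somewhat similar
]

for name, score in rank_candidates(probe, database):
    print(name, round(score, 3))
```

Note that lowering the threshold returns more candidates, including weaker lookalikes, which is one way the same probe photo can yield very different result sets from different vendors.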

Clearview AI, a popular maker of facial recognition software for police, compares probe images to its database of billions of images scraped from social media and public websites — which means that anybody with a photo anywhere on the web could be pulled into any criminal investigation if they happen to resemble the culprit.

Clearview search results produced as evidence in one Cuyahoga County, Ohio, assault case included a photo of basketball legend Michael Jordan and a cartoon of a Black man.

In Colorado, Senate Bill 113 established several limitations and regulations on the use of artificial intelligence facial recognition by government and law enforcement agencies. It also completely prohibits the use of facial recognition technology in public and charter schools until 2025, when a body created to investigate the technology will report back to the legislature.

“This is a really critical moment with facial recognition technology,” said bill sponsor Sen. Chris Hansen, D-Denver. “We need to make sure we’re not having high error rates and putting the appropriate safeguards in the use of facial recognition technology, particularly because of the high error rate for people of color.”

However, law enforcement agencies that use AI say it is just another tool in the box to help them identify suspects, solve crimes, find runaways and other missing people, and help figure out when someone may be lying about their name.

“So often people go to big-box stores and shoplift, and there are photographs in there or they use fraudulent credit cards. It has been very difficult as we’ve worked through the challenges with the state legislature to allow us to run the faces through these databases,” said David Shipley, a former Adams County detective and commander who now runs the Colorado Information Sharing Consortium, which coordinates the statewide database of faces, mostly from mug shots.

“It’s worth the effort to run a photograph of an unknown subject to give us leads on who this might be.”

In recent years, as artificial intelligence has evolved, making the software quicker and more accurate, some agencies in Colorado have started using it in compliance with state law.

Colorado cops are required to subject any AI facial recognition result to a “human review” — particularly if a result produces a match that could lead to an arrest.

Law enforcement agencies also must fully disclose their use of facial recognition technology to the community and to the governing body that oversees them, be it a city council or county commissioners. 

“We couldn’t get an arrest warrant for somebody simply based on this one factor,” said Det. Dave Snelling, a spokesman for the Arvada Police Department, which is one of the larger agencies using facial recognition software. “And I don’t know any jurisdictions that would, based on this.”

Several of the state’s largest law enforcement agencies — the Denver Police Department, the El Paso County Sheriff’s Office, the Colorado Springs Police Department and the Jefferson County Sheriff’s Office — say they are not using the technology at this time.

Ryan Ross