
Flaws of Eyewitness ID Magnified by Facial Recognition Software: Researcher



As courts and police have turned to facial recognition software to identify criminals, identification by machines is rivaling eyewitness identification by humans as a criminal-legal tactic.

But both crime-fighting tools are flawed, particularly when they are employed together, asserts Valena E. Beety, a law professor at Arizona State University’s Sandra Day O’Connor College of Law.

The two techniques are equally susceptible to producing wrongful convictions when police use either “without precautions,” according to Beety.

“Contextual information is vital to whether a factfinder correctly interprets either type of evidence,” Beety wrote in an article published in the Duquesne University Law Review.

But a growing tendency to combine machine and human identification methods without careful precautions further increases the possibility of error, Beety argued in her essay, entitled “Considering ‘Machine Testimony’: The Impact of Facial Recognition Software on Eyewitness Identifications.”

Recognizing that facial recognition software has a “cascading influence” on eyewitness identification, Beety suggested that professional associations, such as the Organization of Scientific Area Committees, include eyewitness identification in their reviews of facial recognition software.

Such foresight, Beety maintained, may produce a “more robust examination and consideration of [the] software and its usage,” because the flaws of both methods are intertwined.

Eyewitness identification, in fact, remains a relatively unreliable identification method.

Prosecutions may depend on eyewitness identification to secure a conviction, but the method lacks effective standards — the result of a spate of court cases that culminated in an uncritical legal embrace of eyewitness identification.

Facial recognition software is also faulty, in ways both comparable to eyewitness identification and unique to its software-specific dangers. The technology — which compares two images and determines whether the same person appears in each — relies on photo-matching software with “fundamental accuracy problems.”

Additionally, “the use of facial recognition software is not always disclosed to the person ultimately charged with the offense,” Beety wrote.

“This failure to disclose can be problematic, given the known inaccuracy of facial recognition software when used to identify people of color.”

Research has consistently demonstrated that racial bias is embedded in certain machine-based algorithms, leading to wrongful convictions; eyewitness identification is similarly marred by “cross-racial misidentification,” in which eyewitnesses struggle to identify people of a different race than their own.

Such flaws can have life-altering implications for people of color, because “white people have greater difficulty identifying people of color than vice versa,” Beety wrote.

“Police use of facial recognition software disproportionately affects Black Americans, Asian Americans, and Native Americans,” the article reads.

“While advocates of technology may claim these systems ‘do not see race,’ research now shows the incorrect identifications of people of color by these programs. Indeed, facial recognition is the least accurate for Black women, even misidentifying their gender.”

Cascading Influence

The cascading influence that facial recognition technology has on eyewitnesses — the placing of facial recognition photos in a traditional photo lineup, for example — calls for interconnected solutions.

Beety suggested, for example, that police departments implement “neutralizing procedures” for show-ups or line-ups, including those that incorporate facial recognition software findings.

“[The] National Academy of Sciences, [in its report] ‘Identifying the Culprit: Assessing Eyewitness Identification,’ recommended that law enforcement agencies implement protocols such as using double-blind lineup and photo array procedures, developing and using standardized witness instructions, documenting witness statements, and recording the witness identification,” Beety continued.

Ultimately, advocates of a more just criminal-legal system should remain attuned to the intersections between identification by machines and identification by humans; such awareness may precipitate badly needed checks on both.

“By recognizing the connections between machine and human identifications, we can work to enhance the reliability of both,” Beety concludes.


Eva Herscowitz is a TCR contributing writer.
