
The Dangers of Artificial Intelligence (“AI”) in Criminal Investigations and Trials

The use of AI by law enforcement in criminal investigations is becoming increasingly common.  In particular, police around the country are turning to facial recognition technology supported by AI to identify criminal suspects.  AI can be a very useful tool for criminal investigations, but it also carries significant risks of misidentification and wrongful arrest.

This article will examine the risks of misidentification from the use of facial recognition technology.  It will then explain why facial recognition technology’s proper role is as an initial investigative tool, to be followed by investigation beyond photo lineups to determine whether the identified person actually is the perpetrator.

What Is Facial Recognition Technology?

Facial recognition technology (“FRT”) is a way to attempt to identify the perpetrator of a crime using surveillance photographs or video from the crime scene.  FRT compares an image of the perpetrator’s face from surveillance photographs or video to the images of faces in a database of known people, such as a database of criminals maintained by law enforcement.

FRT uses a software algorithm to compare the image of the perpetrator’s face to those in the database.  The algorithm uses AI to “learn” to make better comparisons by being trained on a large number of varied facial images in a data set.  See Lux, “Facing the Future: Facial Recognition Technology Under the Confrontation Clause,” 57 Am. Crim. L. Rev. at 21 (2020).

Here is how a New Jersey appellate court recently described the FRT process:       

The process begins when the person operating the software selects a probe image captured from surveillance footage that features the perpetrator’s face….  The technology then breaks down the image into component features and distills them into a “faceprint”—a “map written in code that measures the distance between features, lines, and facial elements.”….  It then compares this “faceprint” against others in the database, assigning scores to each based on the extent to which corresponding features line up, and returning a list of those with the highest scores ordered by rank.

State v. Arteaga, No. A-3078-21 (App. Div. June 7, 2023) (slip op. at 29) (internal citations omitted).    
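To make the process the court describes more concrete, here is a minimal, simplified sketch of the matching and ranking step: a probe “faceprint” is scored against every faceprint in a database, and the highest-scoring candidates are returned in ranked order.  The sketch is illustrative only and is not any vendor’s actual algorithm; the 128-number “faceprints” below are random stand-ins for the proprietary measurements a real system’s trained model would produce.

```python
# Illustrative sketch only: compare a probe "faceprint" to a database of
# faceprints, score each comparison, and return a ranked candidate list.
# The random vectors stand in for a real vendor's proprietary face
# measurements; no actual FRT product computes faceprints this way.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Higher scores mean the two faceprints are more alike.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_candidates(probe: np.ndarray,
                    database: dict[str, np.ndarray],
                    top_k: int = 5) -> list[tuple[str, float]]:
    """Score every database faceprint against the probe faceprint and
    return the highest-scoring candidates, ordered by rank."""
    scores = [(name, cosine_similarity(probe, faceprint))
              for name, faceprint in database.items()]
    scores.sort(key=lambda pair: pair[1], reverse=True)
    return scores[:top_k]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical 128-dimensional faceprints for 1,000 known people.
    database = {f"person_{i}": rng.normal(size=128) for i in range(1000)}
    probe = rng.normal(size=128)  # faceprint from the surveillance image
    for name, score in rank_candidates(probe, database):
        print(f"{name}: similarity {score:.3f}")
```

The point to take away is that the system’s output is a ranked list of similarity scores, not a definitive identification.  Even the top-ranked candidate may simply be the closest look-alike in the database.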

The quality of the image of the perpetrator obviously affects FRT’s ability to make an accurate match.  A grainy video, poor lighting, or an incomplete view of the perpetrator’s face, among other things, makes FRT less accurate.  U.S. Dep’t of Homeland Sec., DHS/ICE/PIA-054, Privacy Impact Assessment for the Use of Facial Recognition Services 26 (2020).

Additionally, not all of the hundreds of FRT algorithms are created equal.  For one thing, the number and variety of images in the initial data set from which the system “learns” to recognize faces make a big difference.  See Perkowitz, “The Bias in the Machine: Why Facial Recognition Has Led to False Arrests,” Nautilus, Aug. 19, 2020.

As one researcher has explained, the data sets used by FRT have created a systemic bias against African-Americans:

Recent studies from the National Institute of Standards and Technology and the Massachusetts Institute of Technology…have confirmed that computer facial recognition is less accurate at matching African-American faces than Caucasian ones. One reason for the discrepancy is the lack of non-Caucasian faces in datasets from which computer algorithms form a match. The poor representation of people of color from around the world, and their range of facial features and skin shades, creates what researchers have called a “demographic bias” built into the technology.

 Id. (footnote omitted).  
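The accuracy gap these studies describe is typically quantified by comparing error rates across demographic groups, such as the rate at which a system declares a “match” between images of two different people.  The short sketch below is illustrative only; the groups and match results are invented values meant to show the kind of group-by-group calculation researchers perform, not real study data.

```python
# Illustrative only: compute a false-match rate per demographic group from
# hypothetical match results.  The records below are invented; real figures
# come from large-scale evaluations of actual FRT systems.
from collections import defaultdict

# Each record: (demographic_group, system_declared_match, truly_same_person)
results = [
    ("group_a", True, False), ("group_a", False, False), ("group_a", True, True),
    ("group_b", True, False), ("group_b", True, False), ("group_b", True, True),
]

false_matches = defaultdict(int)
different_person_pairs = defaultdict(int)
for group, declared_match, same_person in results:
    if not same_person:                      # only pairs of different people
        different_person_pairs[group] += 1   # can produce a false match
        if declared_match:
            false_matches[group] += 1

for group in sorted(different_person_pairs):
    rate = false_matches[group] / different_person_pairs[group]
    print(f"{group}: false match rate {rate:.0%}")
```

The studies quoted above report this kind of group-by-group comparison and found meaningfully higher error rates for some demographic groups than for others, which is what the researchers mean by a “demographic bias” built into the technology.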

How Is Facial Recognition Technology Used in Criminal Investigations?

FRT is ordinarily a starting point when investigators have no leads about the identity of a perpetrator other than a photograph or video of the perpetrator’s face.  For example, a person who robs a store may be completely unknown to everyone in the store, but the person’s face may be caught on camera.  FRT identifies known persons who closely resemble the perpetrator and indicates how close each potential match is.

FRT-generated matches give investigators potential suspects to investigate through other means, such as obtaining the potential suspect’s cell phone records to determine where they were when the crime was committed.  Unfortunately, in some cases, law enforcement has made arrests based on simply showing a victim or witness an FRT match in a photo lineup.  

Eyewitness identification using photo lineups is notoriously problematic.  Even if a photo lineup is fair and not manipulative, people’s observational skills and memories are not nearly as good as they think.  In Texas, for example, our state’s highest criminal court, the Texas Court of Criminal Appeals, has allowed expert witness trial testimony on the unreliability of eyewitness identification.  See Tillman v. State, 354 S.W.3d 425, 442-43 (Tex. Crim. App. 2011).  The Texas Code of Criminal Procedure also includes detailed procedures governing the admissibility of photo lineup identifications at trial.  See Tex. Code Crim. P., Art. 38.20.

If FRT produces an incorrect match that law enforcement includes in a photo lineup, is it any wonder that witnesses will identify that erroneous match?  By design, the match should look like the perpetrator.  It is not surprising that a witness with an imperfect memory who had a brief encounter with the perpetrator would think the erroneous match from FRT was the perpetrator.   

Identifying a match with FRT, and then showing a witness the match as part of a photo lineup without further investigation, has resulted in a number of demonstrably false arrests in which people have been held in custody for extended periods of time, only to be cleared later.  In Detroit alone, at least three people have been falsely arrested or detained by law enforcement based on FRT and later cleared.  See K. Hill, “Eight Months Pregnant and Arrested After False Facial Recognition Match,” New York Times, Aug. 6, 2023.  Similar false arrests based on initial FRT matches have occurred in Maryland, Louisiana, and New Jersey.  See E. Press, “Does A.I. Lead Police to Ignore Contradictory Evidence?,” The New Yorker, Nov. 13, 2023; Parks v. McCormac, No. 2:21-CV-04021 (D.N.J.).

In a troubling recent case, a Houston man was arrested for allegedly committing an armed robbery at a Houston Sunglass Hut store.  The company that owns Sunglass Hut reached out to Macy’s and asked it to use its FRT system to help identify the robber from surveillance video.  The Macy’s system identified a man who was arrested by Houston police and held in custody for two weeks.  That man claims that, while he was in custody, he was attacked and raped by three men.  He was ultimately cleared after he was able to prove that he was not even in the State of Texas at the time of the robbery.  See A. Robledo, “Texas Man Says Facial Recognition Led to His False Arrest, Imprisonment, Rape in Jail,” USA Today, January 24, 2024.  He recently filed a lawsuit against the parent company of Sunglass Hut and against Macy’s.  See Murphy v. Essilorluxottica U.S.A., Inc., No. 2024-03265, in the 125th District Court of Harris County, Texas.


These incidents demonstrate that using FRT, followed only by a photo lineup using the match photograph, creates a serious danger of false arrests.  This is why law enforcement should use FRT only as a starting point for an investigation.  After obtaining a match, law enforcement should fully investigate whether other evidence, besides eyewitness identification, proves the person’s involvement in the crime.

Can Facial Recognition Technology Be Used In Court?

Courts that have commented on the issue almost uniformly say that FRT is not reliable enough to be used as identification evidence in a trial.  See, e.g., People v. Reyes, 2020 NY Slip Op 20258, *3 (Sup Ct, NY County 2020) (“There is no agreement in a relevant community of technological experts that matches are sufficiently reliable to be used in court as identification evidence.”); People v. Collins, 15 N.Y.S.3d 564, 575 (Sup Ct, Kings County 2015) (“In that regard, the results of some other techniques—polygraphs and facial recognition software, for example—likewise can aid an investigation, but are not considered sufficiently reliable to be admissible at a trial.”).  

Under this view, the use of FRT does not taint an investigation so that the case must be dismissed.  Instead, the prosecution must prove the defendant’s guilt beyond a reasonable doubt with other evidence besides FRT.  See Reyes, 2020 NY Slip Op 20258 at *3 (“No reason appears for the judicial invention of a suppression doctrine in these circumstances.  Nor is there any reason for discovery about facial recognition software that was used as a simple trigger for investigation and will presumably not be the basis for testimony at a trial, except as might otherwise be required by [New York’s initial discovery rules].”).

On the other hand, a recent New Jersey appellate opinion suggests that the defense may be able to tell the jury that the defendant was initially identified using FRT and to try to show that FRT is unreliable and can result in an incorrect identification.  See State v. Arteaga, 476 N.J. Super. 36, 62-63, 296 A.3d 542, 557 (App. Div. 2023).  Based on this reasoning, the New Jersey Appellate Division held that a criminal defendant was entitled to discovery about the FRT process used to identify the defendant, including the source code for the FRT algorithm.  See id.; but see Reyes, 2020 NY Slip Op 20258, *3 (denying discovery about FRT because the use of FRT would not be admissible at trial, and the State would have to prove guilt without it).

The current state of the law is that the prosecution should not be allowed to tell the jury that law enforcement became aware of the defendant by using FRT.  It is at least possible that a defendant may be entitled to discovery about the use of FRT to identify the defendant and, at trial, may be able to introduce the idea that FRT is unreliable.  Whether either happens will depend on the facts of the case, the judge’s approach to discovery, and the relevance and importance of FRT to the case.

Developers are constantly improving FRT.  The use of AI to help algorithms “learn” should only accelerate its development and improve its accuracy.  It may be that, in the near future, FRT will become reliable enough to be used in court.  But we are not there yet.

What Can Criminal Defense Lawyers Do To Protect The Accused From Misidentification By Facial Recognition Technology?

As we have seen, FRT can result in misidentifications and false arrests.  If a criminal defense lawyer suspects that the defendant has been falsely arrested and was not the perpetrator, whether or not FRT was involved, the lawyer should quickly gather alibi or other evidence showing that the defendant could not have been the perpetrator.  The lawyer should promptly present this evidence to the prosecutor and demand that the case be dismissed.  If the prosecutor refuses, the defense lawyer should seek a bond reduction, if the defendant is in custody, based on the evidence that the defendant was misidentified.  The defense lawyer can also attack the sufficiency of the arrest warrant to try to obtain the defendant’s release.

If the defense lawyer knows or suspects that FRT was involved and that it produced a misidentification, the defense lawyer can request discovery concerning whether FRT was used and, if so, the FRT algorithm and the techniques used to identify the defendant.  See K. Jackson, “Challenging Facial Recognition Software in Criminal Court,” The Champion, July 2019.

If the case goes to trial, the defense lawyer will have to make a strategic decision about whether to introduce the fact that FRT was used and attack it, or to try to keep it out of the trial altogether.  That decision will depend on case-specific factors, including the strength of the prosecution’s case without FRT, how effectively the defense’s expert witness can attack the FRT, and how unreliable the specific FRT system is.

If you or a loved one has been charged with a crime, and you believe the perpetrator has been misidentified, you should contact an experienced criminal defense lawyer to develop the best defense strategy for the case.  Specifically, you should look for a criminal defense lawyer with expertise in facial recognition technology and eyewitness identification issues.  These are highly specialized issues that require a lawyer who knows how to handle them.