Blog

Report Says Facial Recognition Software Has a Racial Bias

The idea of facial recognition software may sound like something out of a futuristic movie, but police departments are already using it. The software is especially popular in China, where police departments around the country use facial recognition tech powered by artificial intelligence to do everything from displaying the faces of people caught jaywalking to predicting which individuals are likely to commit crimes in the future.

In fact, China currently has the most advanced facial recognition capability in the world. According to one report, police in Guangzhou, a city northwest of Hong Kong, have used the technology to identify over 2,000 suspects, make more than 800 arrests, and solve about 100 cases in 2018 alone.  

However, some experts have raised concerns about police use of facial recognition software. For example, many wonder what happens when the technology misidentifies someone. Others question whether it’s a mistake to attempt to predict who might carry out a crime in the future.

In a recent report, scientists stated that facial recognition software created by Amazon and other companies has demonstrated a racial bias, particularly when it comes to women’s faces.

How Does Facial Recognition Software Work?

Facial recognition software works through the use of computer algorithms, which focus on the distinctive, unique features in an individual’s face. For example, the software might pick out the distance between a person’s eyes or the shape of their nose. From there, these details are changed into a form of data, which is then compared and contrasted to existing facial data that is already stored in the system’s database.

In some facial recognition systems, the software analyzes an individual’s face and then returns a statistical probability of a match between the face being scanned and other faces in the database. The system will display possible matches, ranking them in order of most likely match to least likely match.
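The matching process described above can be sketched in a few lines of code. This is a simplified illustration, not how any commercial system actually works: the feature vectors, names, and distance measure are all made up for the example, and real systems compare hundreds of learned features rather than three.

```python
import math

# Hypothetical "faceprints": each face reduced to a short feature vector
# (for example, normalized distances between facial landmarks). Three
# features are shown purely for illustration.
database = {
    "person_a": [0.62, 0.36, 0.85],
    "person_b": [0.10, 0.92, 0.44],
    "person_c": [0.60, 0.33, 0.79],
}

def euclidean(v1, v2):
    """Distance between two feature vectors; smaller = more similar."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(v1, v2)))

def rank_matches(probe):
    """Rank database identities from most likely to least likely match."""
    scores = [(name, euclidean(probe, vec)) for name, vec in database.items()]
    return sorted(scores, key=lambda pair: pair[1])

# A face scanned by a camera, converted to the same feature format.
probe_face = [0.61, 0.34, 0.80]
for name, distance in rank_matches(probe_face):
    print(name, round(distance, 3))
```

The key point is that the system never says "this is the person"; it only reports which stored faces are numerically closest to the scanned one.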

Facial recognition systems are not perfect, and they can return inaccurate results for a variety of reasons. In some cases, poor lighting, a bad camera angle, or even a low-quality image can cause the system to make an incorrect match or to miss a match altogether.

When a system returns a false negative, it fails to make a match even though the person’s face is actually in the database. When it returns a false positive, the system incorrectly makes a match with a face in the database, misidentifying the face being analyzed. This type of error can lead to false arrests, which can be traumatic for people who are misidentified. If you have been involved in a false arrest, you should contact criminal defense attorney John Helms to discuss your case. 
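Because the system only produces a similarity score, the difference between a false positive and a false negative usually comes down to a cutoff threshold. The sketch below uses invented scores and an invented threshold to show how the two error types arise; it is an illustration of the concept, not any vendor's actual logic.

```python
# Hypothetical cutoff: declare a match when the similarity score is at
# or above this value. Real systems tune this trade-off carefully.
THRESHOLD = 0.80

def classify(score, same_person):
    """Label one face comparison as a true/false positive/negative."""
    declared_match = score >= THRESHOLD
    if declared_match and same_person:
        return "true positive"
    if declared_match and not same_person:
        return "false positive"   # wrong person flagged -> risk of false arrest
    if not declared_match and same_person:
        return "false negative"   # real match missed entirely
    return "true negative"

# A genuine match that scores too low slips through undetected...
print(classify(0.74, same_person=True))    # false negative
# ...while a lookalike who scores too high gets misidentified.
print(classify(0.88, same_person=False))   # false positive
```

Raising the threshold reduces false positives but produces more false negatives, and vice versa, which is why no single setting eliminates both kinds of error.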

MIT Report Says Amazon Facial Recognition Tech Is Flawed

According to a report published by researchers at MIT, several law enforcement departments around the country have started using Rekognition, which is a type of facial recognition software developed by Amazon.

When the software was put to the test, it recognized men’s faces with no problem. However, when the software tried to identify women, it was wrong 19 percent of the time. The problem was even worse with respect to women of color. When asked to identify women with darker skin tones, the software was wrong 31 percent of the time. In both cases, the facial recognition software misidentified women as men.

Researchers have also tested facial recognition software from other companies, including Microsoft and IBM. In those cases, experiments also revealed flaws that required the companies to update their products.

One researcher from MIT stated that it is “irresponsible” for Amazon to continue selling its facial recognition technology to law enforcement departments until it fixes the problems researchers uncovered.

According to one report, some tech companies have expressed their own concerns about the problems with facial recognition technology, and some have taken steps to make changes to their algorithms. For example, IBM says it has published a curated dataset that is designed to improve its accuracy. Additionally, Microsoft has pushed for regulations that would hold manufacturers of facial recognition software to certain standards.

Racial Bias in Facial Recognition Software Overseas

The United States and China aren’t the only countries using facial recognition software. Police in both the United Kingdom and Australia have started implementing the technology. In Australia, opponents of facial recognition software have warned that racial biases in the software will lead to both false positives and false negatives.

Experts at the Human Rights Law Centre, which opposes the use of facial recognition software in Australia, say that the technology poses “a significant threat to freedoms of expression, association and assembly.”

Facial recognition software may have known flaws, but a growing number of police departments have started using it. Unfortunately, this may lead to an increase in false arrests, which can cause an innocent person to suffer emotional, financial and personal harm. In extreme cases, an individual may even face a malicious prosecution due to a mistake in facial recognition technology. If you’ve been mistakenly identified by facial recognition technology and charged with a crime, you should contact a criminal defense attorney today.


Media Contact:

Attorney John Helms

T: (214) 666-8010

https://johnhelms.attorney/

Sources:

  1. https://www.newsweek.com/racial-bias-found-amazon-facial-recognition-software-used-law-enforcement-1306407
  2. http://fortune.com/2018/10/28/in-china-facial-recognition-tech-is-watching-you/
  3. https://www.theverge.com/2019/1/25/18197137/amazon-rekognition-facial-recognition-bias-race-gender
  4. https://www.theguardian.com/technology/2018/may/30/facial-matching-system-is-racist-human-rights-law-centre-warns