A new research paper from MIT shows that some facial recognition algorithms available on the market are biased along skin-color and gender lines, performing best on white men.
MIT researchers found that three facial recognition programs had an error rate of up to 0.8 percent when white men were involved. By contrast, the algorithms' error rates jumped to 20 to 34 percent when it came to recognizing the faces of black women.
- The differences may be caused by the way these computer programs are designed in the first place.
- For example, one team boasted that its facial recognition tech had a 97 percent accuracy rate.
- However, the data used to train the system was 83 percent white and 77 percent male.
Lead researcher Joy Buolamwini stressed that the choice of training data makes all the difference. Buolamwini is concerned that such subtle biases may find their way into software that helps law enforcement search for criminal suspects in their databases.
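To make the point concrete, here is a minimal Python sketch, not the researchers' actual code, of the kind of disaggregated evaluation described above; the group labels and example records are hypothetical. It shows how a single overall accuracy figure can hide a large error rate on a subgroup that is underrepresented in the data.

```python
from collections import defaultdict

def error_rates_by_group(results):
    """Compute the error rate separately for each demographic group.

    results: list of dicts with 'group', 'predicted', and 'actual' keys
    (hypothetical field names for illustration).
    """
    totals = defaultdict(int)
    errors = defaultdict(int)
    for r in results:
        totals[r["group"]] += 1
        if r["predicted"] != r["actual"]:
            errors[r["group"]] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical classifier outputs: 110 predictions, 106 correct,
# so overall accuracy looks high at roughly 96 percent...
results = (
    [{"group": "lighter-skinned male", "predicted": "male", "actual": "male"}] * 99
    + [{"group": "lighter-skinned male", "predicted": "female", "actual": "male"}] * 1
    + [{"group": "darker-skinned female", "predicted": "female", "actual": "female"}] * 7
    + [{"group": "darker-skinned female", "predicted": "male", "actual": "female"}] * 3
)

# ...but breaking the errors down by group tells a different story.
for group, rate in error_rates_by_group(results).items():
    print(f"{group}: {rate:.1%} error rate")
# lighter-skinned male: 1.0% error rate
# darker-skinned female: 30.0% error rate
```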
Facial Recognition Tech Affected by Subtle Bias
The research team focused on facial-analysis algorithms that are commonly used in modern applications. The software could match faces and distinguish between male and female, black and white, and young and old.
All three algorithms exhibited a racial bias, but the differences were not easy to spot.
Several years ago, Buolamwini, who led the research, designed Upbeat Walls, an app that could track users' head movements and translate them into different color patterns on a reflective surface. The app relied on a commercial facial recognition program.
At the time, the team found that the system couldn't properly recognize the faces of darker-skinned subjects. Buolamwini, who is African American, wanted to know whether the system was affected by a bias.
After testing the system with several photos of herself, she realized it couldn't even recognize her face as a human face at all. Plus, it misclassified her gender multiple times.