UK Passport Photo Checker Shows Bias Against Dark-Skinned Women

According to an investigation by the BBC, photos of women with darker skin are more than twice as likely to be rejected as those of fair-skinned men when submitted through the UK government’s automated online passport photo checker.

The United Kingdom offers an online service that lets applicants submit their own photos for use on passports, which in theory speeds up getting a passport. Applicants who have the means to photograph themselves at home and follow a set of guidelines can also avoid paying to have a photo taken. Those guidelines include having a neutral expression, keeping the mouth closed, and looking directly at the camera. If a submitted photo does not meet all of the criteria, it is rejected as “poor quality.”
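
For illustration only, here is a minimal sketch of the kind of pass/fail logic those guidelines imply. The criteria names and the function are hypothetical assumptions; the government has not disclosed how its checker actually works.

```python
# Hypothetical sketch of the pass/fail rule the published guidelines imply.
# The criterion names and structure are illustrative assumptions, not the
# actual (undisclosed) implementation.

REQUIRED_CRITERIA = [
    "neutral_expression",
    "mouth_closed",
    "looking_at_camera",
]

def check_photo(detected: dict) -> str:
    """Return 'accepted' only if every required criterion was detected.

    `detected` maps each criterion name to a boolean produced by some
    upstream face-analysis step (not modeled here).
    """
    if all(detected.get(criterion, False) for criterion in REQUIRED_CRITERIA):
        return "accepted"
    # Any single failed check yields the same generic rejection message.
    return "rejected: poor quality"

# Example: one false "open mouth" detection is enough to reject the photo.
print(check_photo({
    "neutral_expression": True,
    "mouth_closed": False,   # a misdetection like the one described below
    "looking_at_camera": True,
}))  # -> rejected: poor quality
```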

According to the BBC, a student named Elaine Owusu found that the automated online portal rejected her image for having an “open mouth,” which the image itself plainly shows was not the case. Owusu eventually managed to get the photo approved after challenging the verdict, but only by writing a note arguing that her mouth was indeed closed.

Though she did win, she wasn’t happy about it. “I shouldn’t have to celebrate overriding a system that wasn’t built for me,” she told the BBC.

To determine whether the problem was systemic, the BBC fed more than 1,000 photographs of politicians (a set based on the Gender Shades study) into the system and looked for patterns. Dark-skinned men were told their image was of poor quality 15% of the time, compared with 9% of the time for light-skinned men. For women the gap was wider: dark-skinned women’s images were rejected 22% of the time, while women with light skin were told their images were of poor quality 14% of the time.
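
To make the disparity concrete, the rates the BBC reported work out as follows. This is just arithmetic on the published percentages, not a re-analysis of the underlying photos:

```python
# Rejection rates ("poor quality" verdicts) as reported by the BBC.
rejection_rates = {
    ("dark", "men"): 0.15,
    ("light", "men"): 0.09,
    ("dark", "women"): 0.22,
    ("light", "women"): 0.14,
}

# Dark-skinned women vs. light-skinned men: the headline comparison.
ratio = rejection_rates[("dark", "women")] / rejection_rates[("light", "men")]
print(f"dark-skinned women are rejected {ratio:.1f}x as often "
      f"as light-skinned men")  # -> 2.4x, i.e. more than twice as likely

# Within each gender, the absolute gap between darker and lighter skin.
for gender in ("men", "women"):
    gap = rejection_rates[("dark", gender)] - rejection_rates[("light", gender)]
    print(f"{gender}: +{gap:.0%} rejection rate for darker skin")
```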

Automated systems like this one tend to inherit the biases of the data they are given. In 2019, The New York Times published a detailed article on the history of racial bias built into the basics of photography, and that issue continues to surface in newer technologies like the UK’s automated photo checker.

“The accuracy of face detection systems partly depends on the diversity of the data they were trained on,” David Leslie of the Alan Turing Institute wrote in response to the BBC investigation. “The labels we use to classify racial, ethnic and gender groups reflect cultural norms, and could lead to racism and prejudice being built into automated systems.”

When a system like this doesn’t work for everyone, the developer of the software would normally be asked to explain. Unfortunately, the government has declined to name the external company that provided the automated checker.

As a result, there is no immediately apparent solution to the problem the investigation uncovered: a system that fails for a disproportionate number of dark-skinned people.

(Via BBC)
