Multiple users tweeted about what they claimed was a racial bias in Twitter’s photo preview feature, prompting the company’s top officials to respond and say they would look into it.
Several Twitter users posted long vertical photographs with a white person’s face at one end and a black person’s face at the other to back the claim that the photo preview feature was more likely to show the white person’s face. Twitter user Colin Madland claimed that a facial recognition feature was not showing his black colleague’s face when the colleague used a virtual background.
After Madland’s tweet, many users carried out similar informal tests online to determine the cause. Some also experimented with different faces, including those of Republican senator Mitch McConnell and former president Barack Obama, to argue that the feature may be biased.
Here are some of the other tests that users conducted:
https://twitter.com/_jsimonovski/status/1307542747197239296
https://twitter.com/NotAFile/status/1307337294249103361
As the informal trials set Twitter abuzz, the platform’s chief technology officer, Parag Agrawal, said this was an important question.
“To address it, we did analysis on our model when we shipped it but needs continuous improvement. Love this public, open, and rigorous test — and eager to learn from this,” he tweeted.
Twitter’s chief design officer, Dantley Davis, also responded to the issue, assuring users that the company was examining the platform’s neural network to understand why this was happening.