Users discovered a problem with the neural network that crops photo previews
Illustration by Alex Castro / The Verge
Twitter said it was looking into why the neural network it uses to generate photo previews apparently chooses to show white people's faces more frequently than Black people's faces.
Several Twitter users demonstrated the issue over the weekend, posting example images that contained both a Black person's face and a white person's face. Twitter's preview showed the white faces more often.
The informal testing began after a Twitter user tried to post about a problem he noticed in Zoom's facial recognition, which was not showing the face of a Black colleague on calls. When he posted to Twitter, he noticed it too was favoring his white face over his Black colleague's face.
Users discovered the preview algorithm chose non-Black cartoon characters as well.
When Twitter first began using the neural network to automatically crop photo previews, machine learning researchers explained in a blog post how they started with facial recognition to crop images, but found it lacking, mainly because not all images have faces:
Previously, we used face detection to focus the view on the most prominent face we could find. While this is not an unreasonable heuristic, the approach has obvious limitations since not all images contain faces. Additionally, our face detector often missed faces and sometimes mistakenly detected faces when there were none. If no faces were found, we would focus the view on the center of the image. This could lead to awkwardly cropped preview images.
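To make that heuristic concrete, here is a minimal, hypothetical sketch (not Twitter's code) of the approach the blog post describes, using OpenCV's stock face detector as a stand-in: crop around the most prominent detected face, and fall back to a center crop when no face is found.

```python
import cv2

def crop_preview(image, target_w, target_h):
    """Crop a preview around the most prominent detected face,
    falling back to a center crop when no face is found."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    h, w = image.shape[:2]
    if len(faces) > 0:
        # "Most prominent" face taken here as the largest bounding box.
        x, y, fw, fh = max(faces, key=lambda f: f[2] * f[3])
        cx, cy = x + fw // 2, y + fh // 2
    else:
        # No face detected: fall back to the center of the image.
        cx, cy = w // 2, h // 2

    # Clamp the crop window to the image bounds.
    left = min(max(cx - target_w // 2, 0), max(w - target_w, 0))
    top = min(max(cy - target_h // 2, 0), max(h - target_h, 0))
    return image[top:top + target_h, left:left + target_w]
```

The limitations the blog post mentions are visible even in this toy version: missed or spurious detections shift the crop unpredictably, and the center-crop fallback can cut off the subject entirely, which is why Twitter moved to a neural network instead.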
Twitter chief design officer Dantley Davis tweeted that the company was investigating the neural network, as he conducted some unscientific experiments with images:
Here’s another example of what I’ve experimented with. It’s not a scientific test as it’s an isolated example, but it points to some variables that we need to look into. Both men now have the same suits and I covered their hands. We’re still investigating the NN. pic.twitter.com/06BhFgDkyA
Dantley (@dantley) September 20, 2020
Liz Kelley of the Twitter communications team tweeted Sunday that the company had tested for bias but hadn't found evidence of racial or gender bias in its testing. "It's clear that we've got more analysis to do," Kelley tweeted. "We'll open source our work so others can review and replicate."
Twitter didn't immediately reply to a request for comment Sunday.