Twitter is looking into why the network it uses to generate photo previews appears to choose Black faces less frequently than white faces. The disparity within the photo preview feature was brought to light over the weekend.
A Twitter user by the name of Colin Madland noticed that Zoom was not showing the face of his Black co-worker during calls. Madland then noticed the same issue when he posted the same photos to Twitter. From there, a Twitter user by the name of Jordan Simonovski ran into similar issues when working with cartoon faces.
These recent discoveries echo the findings of machine learning researchers at Twitter in 2018. In a blog post, the researchers explained the limitations of their neural network.
"Previously, we used face detection to focus the view on the most prominent face we could find. While this is not an unreasonable heuristic, the approach has obvious limitations since not all images contain faces. Additionally, our face detector often missed faces and sometimes mistakenly detected faces when there were none. If no faces were found, we would focus the view on the center of the image. This could lead to awkwardly cropped preview images," the blog post reads.
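The heuristic the post describes, cropping around the most prominent detected face and falling back to the image center when no face is found, can be sketched roughly as follows. This is a minimal illustration only; the function name, signature, and face-box format are assumptions, not Twitter's actual implementation:

```python
def preview_crop(img_w, img_h, crop_w, crop_h, faces=None):
    """Return (left, top) of a crop_w x crop_h preview window.

    `faces` is a list of (x, y, w, h) detected face boxes; the
    largest box stands in for "the most prominent face".
    (Hypothetical sketch, not Twitter's code.)
    """
    if faces:
        # Center the crop on the largest detected face.
        x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
        cx, cy = x + w / 2, y + h / 2
    else:
        # No faces detected: fall back to the image center,
        # which, as the post notes, can yield awkward crops.
        cx, cy = img_w / 2, img_h / 2
    # Clamp so the window stays inside the image bounds.
    left = min(max(int(cx - crop_w / 2), 0), img_w - crop_w)
    top = min(max(int(cy - crop_h / 2), 0), img_h - crop_h)
    return left, top
```

The bias question raised above is exactly about the first branch: which face the detector treats as "most prominent", and whose faces it misses entirely.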
Twitter CTO Parag Agrawal has responded to these concerns by tweeting that their model needs “continuous improvement.”
“It’s clear that we’ve got more analysis to do,” Liz Kelley of Twitter communications added. “We’ll open source our work so others can review and replicate.”