With today’s technology and social awareness, it is no longer acceptable for a product or service to show bias, whether intentional or not. That’s why fingers are being pointed at Twitter after its image-cropping algorithm appeared to show racial bias.
Does Twitter have racial biases?
Twitter is studying its image-cropping algorithm after users noticed that black faces are not always displayed in image previews on mobile when an image contains both a black face and a white face.
Twitter reported that it found no evidence of racial or gender bias when testing its algorithm, but it also acknowledged that more testing needs to be done.
The social network’s chief technology officer, Parag Agrawal, said the image-cropping model had been analyzed before it shipped, but he thanked users for helping test it live. “I love this public, open and rigorous test – and I look forward to learning from it,” he said.
A Vancouver university administrator, Colin Madland, started this saga by noticing that when he was on a Zoom call with a black colleague, the other man’s head would disappear. The software seemed to treat the darker face as part of the background and erased it.
Madland discovered a deeper issue when he tweeted about it. He posted on Twitter to ask if anyone knew why his colleague’s face kept disappearing on Zoom. But when he posted a screenshot containing both his own face and his colleague’s, Twitter cropped the preview to show only Madland, not the other man.
He found that on Zoom he could make his colleague’s face appear if a white globe was placed behind the man’s head, as if that was enough to separate him from the background. Yet Twitter cropped this image too: its preview cut out the man both in the version where his head was missing and in the version where his head was visible in front of the globe.
Twitter’s design director, Dantley Davis, suggested the problem might be tied to Madland’s hair and glasses, and that the crop would behave differently if they were removed.
“I know you think it’s fun to dunk on me – but I’m as pissed off about it as everyone else. However, I am able to fix this, and I will,” Davis said. “It’s 100% our fault. No one should say otherwise.”
Twitter users ran their own experiments to test the theory. They found that the algorithm preferred US Senate Majority Leader Mitch McConnell to former US President Barack Obama: given a photo containing both, the preview showed the white man rather than the black man.
Twitter not alone
This is not an isolated situation for Twitter. Similar problems have hit Microsoft, IBM and Amazon with their facial recognition systems, which were less accurate at identifying people of color than white people.
Microsoft said it took steps to correct the problem after realizing that its system had been trained mostly on white faces. The system hadn’t seen enough people of color to learn to identify them correctly. Microsoft later suggested that facial recognition be regulated to avoid bias.
IBM has said it will launch a new version of its service. Amazon’s Rekognition system sometimes even identifies black women as black men, a problem it does not have with white women.
The problem, as Madland pointed out on his Twitter account, is that the stakes are not always as low as a Zoom meeting. Law enforcement agencies use Rekognition. People could be misidentified by the system and charged with a crime. And that, in itself, is a crime.
Image Credit: Colin Madland’s Twitter and public domain