Twitter is learning first-hand about the challenges of eliminating racial bias in algorithms. The social network’s Liz Kelley said the company had “more analysis” to do after cryptographic engineer Tony Arcieri conducted an experiment suggesting Twitter’s photo-cropping algorithm was racially biased. When he attached photos of Barack Obama and Mitch McConnell to tweets, Twitter appeared to exclusively highlight McConnell’s face – Obama only appeared when Arcieri inverted the colors, making skin color a non-issue.
Others tried reversing the order of names and photos to no avail, although Intertheory’s Kim Sherrell found that a higher-contrast smile did work. Scientist Matt Blaze, meanwhile, noted that the cropping behavior appeared to vary depending on which official Twitter app was used – TweetDeck was more neutral, for example.
Kelley said that Twitter had checked for bias before shipping the current algorithm, but “didn’t find evidence” at that time. She added that Twitter would open source its algorithm studies to help others “review and replicate.”
There’s no guarantee this will fix Twitter’s problem. However, the experiment does show the real dangers of algorithmic bias regardless of intent. It can shove people out of the limelight even when they’re central to a social media post or linked news article. You may have to wait a long while before issues like this are exceptionally rare.
thanks to everyone who raised this. we tested for bias before shipping the model and didn’t find evidence of racial or gender bias in our testing, but it’s clear that we’ve got more analysis to do. we’ll open source our work so others can review and replicate. https://t.co/E6sZV3xboH
– liz kelley (@lizkelley) September 20, 2020