How Facebook’s Main Algorithm Fans Bigotry And Violence

Social media platforms are designed to amplify confirmation bias... even if that bias is bigoted. And that has had dire consequences.
Facebook CEO Mark Zuckerberg arrives before a joint hearing of the Commerce and Judiciary Committees on Capitol Hill in Washington, Tuesday, April 10, 2018. (AP Photo/Andrew Harnik)

According to a pair of studies, social media use was directly linked to violence against Muslims in the United States and to attacks on refugees in Germany, even after controlling for other factors. The higher the rate of social media usage in the places where the incidents took place, the greater the chance someone got hurt. Of course, the researchers point out that using social media is not, by itself, the cause of these attacks, just that there is a significant correlation between echo chambers on Facebook and Twitter and xenophobic violence. But they did offer a few insights that allow those of us who know how social media platforms work under the hood to connect the dots and come up with a causative mechanism.

Now, we’ve already covered why bigotry, toxicity, and outrage thrive on social networks, but that exploration was limited to the psychology of their users. Here, we’re going to look at the heart and soul of social media: the recommendation algorithm. This is the code responsible for suggesting whom to follow, which links you may want to click, and what comes up first in your timeline, and it does so by weighing data points about how you interact with the site and its users, and what your friends, followers, and the people you follow like. That works great when you’re looking for videos of puppies and stories about science and culture, but it works just as well when you like a racist page, read xenophobic screeds, or watch bigoted conspiracy videos.
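
To make that concrete, here’s a deliberately simplified sketch of engagement-weighted ranking in Python. The post fields, weights, and scoring formula are all invented for illustration; real ranking systems learn thousands of signals from engagement data, but the core logic of “predict what you’ll interact with and put it on top” looks something like this:

```python
# Toy engagement-weighted feed ranking (illustrative only, not Facebook's
# actual code). Every signal and weight here is a made-up stand-in.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    author: str
    friend_likes: int       # how many of the viewer's friends liked it
    author_affinity: float  # viewer's historical engagement with this author

def score(post: Post) -> float:
    # Hypothetical hand-set weights; real systems learn these from data.
    return 2.0 * post.author_affinity + 1.0 * post.friend_likes

def rank_feed(candidates: list[Post]) -> list[Post]:
    # Whatever is predicted to get the most engagement goes on top.
    return sorted(candidates, key=score, reverse=True)

feed = rank_feed([
    Post("p1", "science_page", friend_likes=3, author_affinity=0.2),
    Post("p2", "conspiracy_page", friend_likes=1, author_affinity=4.5),
])
print([p.post_id for p in feed])  # ['p2', 'p1']: affinity wins out
```

Note that the scorer has no idea what a post says; it only knows how much you and your friends have engaged with things like it.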

So, if you were to watch a video that blames Jews for all the world’s problems, Facebook or YouTube will essentially say “You just watched a video by xXx_GasTheJoos_xXx, would you like to join his descent into madness by watching another one of his videos? Also, since we guess you hate Jews now, would you like to join the Exposing the Zionist Conspiracy page?” The code behind it doesn’t know what you’re watching, or how it differs from a Daily Show monologue beyond the scale of engagement; it just tries to give you more of the same.
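
Here’s an equally bare-bones sketch of that “more of the same” logic, built purely on co-engagement counts, with hypothetical item names. The key point is that nothing in it ever inspects what the items actually contain:

```python
# Toy item-to-item recommender based only on which items the same accounts
# engaged with. Content never enters the picture; a hate video and a comedy
# monologue are both just IDs with watch histories. Illustrative only.
from collections import Counter
from itertools import combinations

watch_histories = [
    ["hate_video_1", "hate_video_2", "zionist_conspiracy_page"],
    ["hate_video_1", "zionist_conspiracy_page"],
    ["puppy_video", "science_doc"],
]

# Count how often two items show up in the same account's history.
co_engagement = Counter()
for history in watch_histories:
    for a, b in combinations(sorted(set(history)), 2):
        co_engagement[(a, b)] += 1

def recommend(just_watched):
    # Suggest whatever co-occurs most often with the item just watched.
    related = [(pair, n) for pair, n in co_engagement.items()
               if just_watched in pair]
    related.sort(key=lambda item: item[1], reverse=True)
    return [a if b == just_watched else b for (a, b), _ in related]

print(recommend("hate_video_1"))
# ['zionist_conspiracy_page', 'hate_video_2']: straight down the rabbit hole
```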

This means your dear aunt Mildred, who wouldn’t hurt a fly, is a few wrong clicks away from reading a polemic detailing how illegal immigrants aligned with Muslim terrorists are being snuck into America by liberals to kill her sons, rape her daughters, and sell her grandchildren into sex slavery. Then, rattled, she will wake up to story after story in her timeline detailing more and more apocalyptic xenophobia. If she engages too much with it or agrees with enough of the platform’s recommendations, the algorithm will escalate the bigoted content on her timeline, creating an airtight echo chamber in which she’s surrounded by racist hysterics and fake news everywhere she turns.
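
That escalation is easy to simulate. In this toy loop, with made-up topics and an arbitrary boost factor, the model bumps a topic’s weight every time the user engages with it, and a handful of clicks snowballs into a feed dominated by a single obsession:

```python
# Toy feedback-loop simulation (illustrative assumptions throughout):
# each engagement nudges the user's interest weight for that topic upward,
# which makes the topic more likely to be served, and so on.
import random

random.seed(1)
interests = {"news": 1.0, "pets": 1.0, "xenophobia": 1.0}

def serve_post(interests):
    # Sample a topic in proportion to the user's current interest weights.
    topics, weights = zip(*interests.items())
    return random.choices(topics, weights=weights)[0]

for _ in range(50):
    topic = serve_post(interests)
    # Assume outrage content reliably gets clicks: every time it's served
    # and engaged with, its weight grows, so it gets served more next time.
    if topic == "xenophobia":
        interests[topic] *= 1.5

print(interests)  # "xenophobia" now dwarfs every other topic in the feed
```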

If you were wondering how so many seemingly calm and reasonable parents and grandparents suddenly became so paranoid, angry, and hostile after joining Facebook, that’s why. Because they agreed with the recommendations and checked out the links, they believe they stumbled on some hidden truths and shocking insights through strenuous research, while in reality, a neural network was just giving them what it thought they wanted until their timelines looked like dispatches from bizarro land, in which Satanic pedophiles secretly rule the Earth and the West is being reduced to ashes by Muslims, liberals, and MS-13. With the recommendation engine’s help, they’ve convinced themselves they’re living through an existential crisis and that only far-right populists and bigots can help by taking extreme action.

Consider that precious few of the movements that turned violent ever justified it by saying “well, we’re just sociopathic monsters who want to hurt others.” No, for them, violence is a means to save a world falling into chaos, or to defend themselves from a supposedly clear and present danger. Locking yourself into a bubble in which the world is falling apart and you’re at the mercy of imaginary villains is a huge step toward that mindset. Constant, widespread access to a platform which can be quickly and unwittingly configured to blast incessant paranoia and bigotry means that those most prone to violence will find the fuel to turn their hate into action more often than they otherwise would.

And this isn’t just a Western thing. Just note the outsized role Facebook played in the genocidal violence against the Rohingya in Myanmar. Its responses to reports of hate speech and virulent anti-Rohingya propaganda were slower than those of a turtle on sodium pentothal, and while it’s significantly expanding its roster of Burmese-speaking content moderators, that’s a drop in the ocean of hate the company unwittingly helped foment and then neglected for years as ever more dire warnings poured in from journalists and human rights groups. By handing bigots a megaphone and not caring whether they used the platform for evil, Facebook set the stage for the country’s Buddhist majority to see a Muslim minority as an existential threat and respond with violence, fueled by fake news and xenophobic sermons from rabid monks.

The bottom line here is that social media recommendation algorithms were built with minimal thought about what could happen if they were used for evil, and now that they are, the companies that run the platforms, unable and unwilling to deal with the consequences, are looking for ways to pass the buck or sweep the problem under the rug. Meanwhile, angry and manipulative racists, misogynists, xenophobes, and bigots of all stripes are exploiting the core technology behind these apps to suck fellow travelers and uninformed bystanders into an echo chamber of their fear porn, inspiring them to vote for racist politicians, support dehumanizing policies, and yes, commit violent acts.

Facebook and other social media platforms moved fast and broke things. Too bad the last thing they broke just so happened to be civil society as we knew it, and that’s not something you can just roll back or debug.
