I’m a computer scientist, and I study bad things people do on the Internet. It’s not fun, but it lets me serve as a guide to the horrors there, and it helps us understand the impact of social media and hopefully improve it. Dark content has always flourished online: wherever there have been interactive spaces, people have made things up, traded ideas, and chatted with others to refine them.
As the online ecosystem evolved into the social media we know today, those activities never went away. While more mainstream themes dominate discussion, the moldy digital corners where fringe, extreme, and unpalatable ideas have always grown remain. The QAnon conspiracy theory began in those same murky online spaces.
The movement grew to envelop almost any anti-government, anti-science, far-right crackpot theory, but it was based on the idea that there is an international, cannibalistic, Satan-worshipping cabal of child sex traffickers embedded in Congress, the media, and the halls of power; that these people were taking a drug derived from the blood of children that gave them eternal youth; and that Donald Trump had been sent to save us all by bringing down the cabal through mass arrests and public executions. A Daily Kos/Civiqs poll from September 2020 found that a majority of Republicans believed the QAnon conspiracy theory was at least partly true, and that belief, along with Trump’s big lie about the election, led to the insurrection on January 6th. How does something that ridiculous, something that has been thoroughly debunked at every step along the way, come to change the course of American history?
The answer lies at the intersection where some deep social problems meet the algorithms and profit motivations of major social media companies. Almost all the big platforms personalize the content you see. Their goal is to optimize for engagement, that is, to show you the content that will keep you on the platform the longest, because that maximizes the amount of money the companies can make.
It gives them more data about you, it gives them more time to show you ads, and the two together allow them to show you more and better-targeted ads. The personalization systems that optimize for engagement have had dramatic and profound social side effects, because you know what keeps people engaged on social media? Conspiracy theories and extreme content. It doesn’t even have to be that people believe it.
If people just get upset at the illogic of it all and wade into comment-section debunking arguments, that counts as engagement too, and the algorithms learn that they should show more of that content to everyone. In addition to driving more engagement, this normalizes the content. When radical ideas are mixed in with the kinds of things somebody already believes, those ideas appear less extreme.
Then the algorithms come in with a push toward the fringes. An internal Facebook report from 2016 showed that 64 percent of people who joined extremist groups on Facebook did so because Facebook recommended those groups. Sure, the platforms have rules prohibiting hate speech, violence, illegal activities… But if that type of content gets a lot of interaction, the platforms’ own algorithms drive people toward it, and it makes money.
Why would the platforms enforce their rules against it? Now, let’s bring this back to QAnon. The conspiracy theory mostly lived on the discussion forum 8kun.
Extreme Trump supporters and other far-right conspiracy theorists who lurked there brought those ideas into slightly more popular forums, including extremist groups on Facebook and conspiracy-minded Twitter accounts, YouTube channels, and Reddit forums. From there, they reached further into the mainstream. Breaking onto the popular platforms matters in a few ways beyond the sheer number of people there.
That stereotypical racist uncle may not be trolling 8kun, but he’s probably on Facebook. Ideas reach him there from people he personally trusts, or who at least share his views. Algorithms can highlight that content for him because he’s likely to engage with it, and its presence on a mainstream platform offers a veneer of respectability.
8kun looks exactly like you’d expect a sketchy internet forum to look. Facebook looks sanitized and reputable, and those traits are reflected onto the content shared there, no matter how abhorrent. So QAnon was slowly leaking into more popular channels, and then the pandemic came along.
While the conspiracy theory had been around since 2017, in 2020 it evolved to embrace the anti-lockdown, anti-government-guidance, xenophobic conspiracy theories around COVID, making a Satan-worshipping, cannibalistic pedophile conspiracy theory suddenly palatable to more people. The anti-science and anti-vaccine elements of QAnon also made it appealing to alternative medicine and anti-vaccine groups that don’t traditionally align with the far right. All those people who say, “Do your own research,” and by that mean, “Watch some crazy dude’s YouTube videos,” were suddenly locked down, spending a lot of time on social media.
Even if they weren’t sure about the baby eating, they felt there might be some truth behind Q, and QAnon grew into a movement. When Trump again forcefully pushed the lies of widespread election fraud, he inflamed QAnon supporters, who believed him and saw it as another step in QAnon’s prophecies.
And it provoked thousands to take action. In the lead-up to January 6th, storming the Capitol was literally trending on every major social media platform. People going to the rally were sharing maps of the US Congress buildings labeled with the offices of their top targets.
They talked openly about committing violence, about breaking into the Capitol, and about executions. There was no lack of warning about what would happen on the 6th. Planning for the insurrection wasn’t hidden in dark corners of the Internet.
It was on Facebook, Twitter, and YouTube. It was everywhere, out in the open, for anyone to find. De-platforming Trump as the Capitol was being stormed was, I believe, the only thing that stopped that violence from escalating.
The insurrectionists’ online posts couldn’t have been clearer that they were following Trump’s command, and that they were participating in the promised “storm” of the QAnon conspiracy theory, when the cabal would be taken down. Just a few more incendiary tweets from Trump could easily have marshaled bigger and more violent mobs who would have taken more extreme and aggressive action to disrupt American democracy. So do I give the platforms credit for de-platforming Trump at that moment?
On one hand, sure, they get one point for stopping the destruction of American democracy. On the other hand, driven partially by a desire to keep profiting off conspiracy-fueled engagement, and partially by a fear of political criticism and government scrutiny, the platforms essentially promoted the insurrection. They performed an elaborate dance to avoid enforcing their own rules against Trump and his supporters, and their algorithms pushed people toward extreme ideas.
And now that the immediate danger has passed, we are not out of the woods. There are so many problems that need to be addressed to avoid a repeat of what we saw with QAnon, and many of them are deep social problems that I am not qualified to solve, but I will add a couple of thoughts here on how they relate to social media platforms. One common belief is that we’re made more extreme by echo chambers, where we only hear voices we agree with online.
The evidence doesn’t really bear that out. In fact, new research shows that exposing people from across the political spectrum to more views from the other side actually makes them more extreme and more entrenched in their existing positions. Simply exposing people to other ideas doesn’t solve the problem.
Fact-checking conspiracy theories also doesn’t help. New studies show that fact-checking a person can actually drive them to share lower-quality and more extreme information in the future. At the same time, there’s one straightforward solution that would likely have a significant impact: forcing social media companies to enforce their own terms of service.
They already have rules in place that prohibit hate speech, calls to violence, and misinformation around critical issues like vaccines and elections. Aggressively enforcing those rules against the worst offenders matters because we know there’s usually only a small handful of accounts that give rise to the majority of mis- and disinformation on social media platforms. For example, we’ve learned that out of the billions of users on Facebook, just 12 accounts were responsible for 65 percent of COVID-19 misinformation.
If the platforms had permanently blocked those accounts, blocked the new accounts those same creators spin up under different usernames, and removed the ability of other users to easily share content that comes from chronic disinformation spreaders, the ecosystem would dramatically improve. The fact that the platforms are not enthusiastically embracing such an obvious and straightforward solution on their own reveals the power of the underlying financial and political motivations to leave this content up. Their current inaction shows that without government intervention, hoping for change is simply making a wish that we know won’t come true.
This is a place where government regulation can be straightforward and the consequences can be significant, but it doesn’t cross any First Amendment lines. The government doesn’t have to say that misinformation, or calls to violence, or even hate speech is bad. It simply has to tell social media companies to enforce their own terms, or else face serious consequences.
But I don’t want to overlook the deep social issues that also gave rise to QAnon, as well as the profound, tragic impact it’s had on families. Like cult members, QAnon adherents have often cut friends and family out of their lives. Well-meaning loved ones find themselves either shut down or mired in conspiracy-theory word salad when they try to talk.
So what if you have a loved one who’s bought into QAnon or another extremist conspiracy theory? There’s no quick solution, but a big draw of these groups is the community that forms around them. Getting people out requires gentle debunking, support, and exposure to welcoming groups of people who have left those communities.
Those groups can be found on social media if you look. And I know loving support and patience may be too much to ask after enduring harassment and insults and watching the destruction that QAnon has brought to the country, but for those who can manage it, it may shine a light into that darkness and offer a path out. QAnon manifested vast offline destruction, born of unregulated algorithms, a total abdication of social responsibility, and unchecked corporate greed for money and power. I wish for us all that it also marks the end of that era, and that we are propelled to finally do the work necessary to reclaim that power and harness it for the social good.