How conspiracy theories spread online – it’s not just down to algorithms

Thumbnails from “Alt-Right” YouTube channels. Digital Methods Initiative, 2017, Author provided
The debate has grown in recent years over the role that social media algorithms play in spreading conspiracy theories and extreme political content online. YouTube’s recommender algorithm has come under particularly severe scrutiny. A number of exposés have detailed how it can take viewers down a radicalization rabbit hole.

While YouTube has certainly extended the reach of conspiracy theorists, it’s difficult to assess the objective role of algorithms in these radicalization processes. But my own research has observed the way certain radical communities that congregate at the fringes of the web have managed to essentially manufacture conspiracy theories. These have, in turn, trended on social media.

In 2019, YouTube dramatically cleaned up its platform after coming under pressure from journalists. It cut off lucrative ad revenue from offending channels and deleted others entirely – most notoriously Infowars, the channel of the US talk-radio host Alex Jones.

While a recent research paper on this topic noted a corresponding overall decrease in conspiracy theory videos on YouTube, it also observed that the platform continued to recommend conspiratorial videos to viewers who had previously consumed such material. The findings indicate that plenty of potentially objectionable content remains on YouTube. However, they don’t necessarily support the argument that viewers are guided by algorithms down rabbit holes of ever-more conspiratorial content.

By contrast, another recent study of YouTube's recommender algorithm found that conspiracy channels seemed to gain "zero traffic from recommendations". While this particular study's methodology prompted some debate, the fact remains that an accurate understanding of how these social media algorithms work is impossible from the outside. Their inner workings are a corporate secret known only to a few – and possibly to no humans at all, because the underlying mechanisms are so complex.

This article is part of the Expert guide to conspiracy theories, a series by The Conversation's The Anthill podcast. Listen on Apple Podcasts or Spotify, or search for The Anthill wherever you get your podcasts.

Not just cultural dopes

The presumption that audiences are the passive recipients of media messages – that they are “cultural dopes” easily subject to subliminal manipulation – has a long popular history in the field of media and communications studies. It’s an argument that’s often popped up in conservative reactions to heavy metal music and video game violence.

But by focusing on audiences as active participants rather than passive recipients we arguably gain greater insights into the complex media ecosystem within which conspiracy theories develop and propagate online. Often these move from the subcultural fringes of the deep web to a more mainstream audience.

Conspiracy theories are an increasingly important method of indoctrination and extremist radicalization. At the same time, their adversarial logic also maps onto a populist style of political rhetoric that pits the general will of the people against a corrupt and aging establishment elite.

A much more extreme version of this dynamic is also characteristic of right-wing anger against the perceived dominance of a “globalist liberal elite”. Such anger galvanized parts of the trolling subculture associated with certain forums, message boards, and microblogging social networks, in support of the presidential candidacy of Donald Trump.

A common rhetorical technique used on the far-right political discussion forum of the anonymous message board 4chan has been to lump together all manifestations of this liberal, globalist elite into a singular nebulous “other”. Whether a perfidious individual, a shadowy organization, or a suspect way of thinking, this conspiracy is imagined as something which undermines the interests of the ultra-nationalist community. These interests also tend to coincide with those of Trump as well as of the white race in general.

This far-right online community has an established record of propagating hatred, and it has also produced two bizarre yet extremely successful pro-Trump conspiracy theories: Pizzagate and QAnon.


Unlike the black boxes of corporate social media algorithms, 4chan datasets are easily captured and analyzed, which has allowed us to study these conspiracy theories in order to identify the processes that brought them about. In both cases, these conspiracy theories can be understood as the product of collective labor by amateur researchers congregating within these fringe communities who build up a theory by a process of referencing and citation.
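As a minimal illustration of the kind of analysis this makes possible, the sketch below runs a simple term-frequency pass over a handful of invented forum posts – the sort of first step a researcher might take over an archived 4chan dataset to trace how a theory's vocabulary recurs across posts. The posts and tracked terms here are hypothetical, not drawn from any real dataset.

```python
from collections import Counter
import re

# Hypothetical sample posts standing in for an archived forum dataset.
posts = [
    "look at this thread, the emails mention a pizza place",
    "anon dug into the emails again, same pizza place",
    "screencap of the original thread for reference",
]

# Terms whose recurrence we want to trace across the corpus.
terms = ["emails", "pizza", "thread"]

def term_frequencies(posts, terms):
    """Count how often each tracked term appears across all posts."""
    counts = Counter()
    for post in posts:
        # Lowercase and tokenize on word characters before counting.
        tokens = re.findall(r"[a-z']+", post.lower())
        for term in terms:
            counts[term] += tokens.count(term)
    return counts

print(term_frequencies(posts, terms))
# Counter({'emails': 2, 'pizza': 2, 'thread': 2})
```

In practice such counts would be computed per day or per thread, so that the build-up of a theory's shared vocabulary – the "referencing and citation" described above – can be tracked over time.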

Pizzagate was a bizarre theory connecting the presidential campaign of Hillary Clinton to a child sex ring supposedly run out of a pizza parlor in Washington DC. It developed on 4chan in the course of a single day, shortly before the November 2016 US presidential election.
