Conspiracy Theory Followers as Interpretive Communities

For the final (boo!) session at AoIR 2022 I’m in a panel on feminist approaches to disinformation, and Alice Marwick is already in full flight, discussing the followers of conspiracy theories as interpretive communities. These are social phenomena, communities connected by the Internet; their members are socialised into ways of knowledge-making and understanding over time, building a conspiratorial literacy that enables them to make connections between conspiracist factoids and produce counterfactual narratives.

Notably, there are a fair number of young people of colour involved in these conspiracy theories, well beyond the ‘Fox Mulder’ stereotype of the white, middle-aged man driving such theories. Indeed, people from particular identity categories are more likely to engage with such conspiracy theories, in part due to their communities’ long marginalisation by white governments. The present project explored this further by examining some 200 conspiracy videos on TikTok, coding them for their content and identity markers.

Preliminary findings from this are that government, science, the entertainment industry, Jews, the LGBTIQ+ community, pedophiles, and other groups are commonly positioned as the enemy, while everyone in general, Christians, children and parents, artists, and the Black community in the US are positioned as the victims. There are many creators of colour, mostly 20-somethings, with a mix of male and female creators; they often use visual ‘evidence’ in their videos, but almost never cite their sources. Often they draw on imagery sourced from searches on their phones and stitch this together in their videos.

Quite a number of them also use fictional media to make their points, however. This builds on theories of predictive programming (putting markers in fictional media to influence the public), and is similar to the deep lore about fictional texts like Game of Thrones, translated to the TikTok format. Many videos also encourage users to ‘do their own research’, and teach users to use specific Google search terms to find key sources of conspiracist ideas. This is a form of ‘keyword seeding’, peppering viewers with potential search terms and creating the IKEA effect of disinformation: when users compile these materials themselves they are more likely to believe them.

On TikTok, such conspiracy theories serve as entertainment, offering viewers the pleasure of learning and of putting together their own worldviews; they also provide members of minority groups with the opportunity to question dominant worldviews, though often while also including hateful content.