Telegram Conspiracy Theorists’ Understandings of Social Media Moderation Practices

The first full day at the ECREA 2024 conference begins for me with a panel on Telegram and politics. The first presenter is Corinna Peil, whose interest is in COVID-19 conspiracy narratives on Telegram. How do the people who disseminate such narratives understand content moderation interventions?

Content moderation is a fundamental service that social media platforms provide, but it also generates accusations of censorship; exactly how content moderation works is the subject of (sometimes conspiracist) ‘folk theories’ about the power and practices of social media platforms. The pandemic heightened this further, as it pushed platforms to implement stronger moderation practices in order to avoid serious harms from mis- and disinformation; in turn, this also pushed some groups towards more fringe social media platforms – in German-speaking countries, for instance, groups like the Querdenker conspiracy theorist movement.

This study examined some 87 Telegram channels run by Querdenker groups, with a focus on Austria, during three key periods: the COVID-19 protests in December 2021, the Russian attack on Ukraine in March 2022, and the lifting of COVID-19 measures in February 2023. From some 60,000 messages posted to these channels, it selected nearly 1,000 posts that addressed content moderation practices directly.

These messages portrayed Telegram as a platform free from censorship, but also expressed fears of German government regulation and of the removal of the Telegram app from the Apple and Google app stores; they encouraged censorship avoidance through VPNs, proxy servers, and direct app downloads from Telegram’s servers; and they discussed moderation practices on other platforms, especially fact-checking on Facebook (circulating conspiracy theories about fact-checking organisations’ supposed links to Big Pharma, and accusing Facebook of bias against Russians).

These latter claims in particular position Facebook as an active manipulator of public debate; the same applied to platforms like YouTube, which was similarly seen as suppressing ‘alternative’ views about COVID-19, allegedly under political, media, and other influences. This perception also led to evasion tactics, including the avoidance of notable keywords and the use of alternative video hosting platforms. More broadly, there were widespread criticisms of mainstream media and politics.

Platforms are thus pulled into the very conspiracy narratives they aim to combat: tech companies, media, and politics are seen as collaborators in moderation, and as common enemies of the Querdenker and related movements. Regulators therefore face the challenge of moderating problematic content without reinforcing the narratives that such conspiracist groups promote. This positions content moderation as a double-edged sword, and a broader approach that goes beyond content removal – focussing in particular also on infrastructures – may be needed. There is also a trust deficit around moderation, and transparency in the implementation of such measures is critical.