ECREA 2022

'Fake News' on Facebook: A Large-Scale, Longitudinal Study of Problematic Information Dissemination between 2016 and 2021

Axel Bruns, Daniel Angus, Xue Ying (Jane) Tan, Edward Hurcombe, Nadia Jude, Phoebe Matich, Stephen Harrington, Jennifer Stromer-Galley, Karin Wahl-Jorgensen, and Scott Wright

Abstract

This study investigates the dissemination of a wide variety of ‘fake news’, mis- and disinformation, and related content on Facebook. It develops a robust empirical base that enables us to distinguish analytically between different types and practices of problematic information dissemination, by observing systematic differences in their activity and engagement patterns.

Drawing on a masterlist of some 2,300 suspected sources of problematic information online, compiled from public lists in the literature (e.g. Shao et al., 2016; Allcott et al., 2018; Grinberg et al., 2019), we have used CrowdTangle to gather all posts on public Facebook pages, groups, and verified profiles that contained links to these sites between 2016 and 2021. Our full dataset contains some 42.6 million Facebook posts made between 1 January 2016 and 31 March 2021.
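To make this collection step concrete, the following Python sketch filters a hypothetical CrowdTangle CSV export against the domain masterlist. The file names and the "Link" column are illustrative assumptions, not a description of the study's actual pipeline (which queried CrowdTangle directly).

    from urllib.parse import urlparse
    import csv

    # Hypothetical inputs (not the study's actual files): a masterlist of suspect
    # domains, one per line, and a CrowdTangle CSV export with a "Link" column.
    MASTERLIST_FILE = "problematic_domains.txt"
    POSTS_FILE = "crowdtangle_export.csv"

    def load_domains(path):
        """Read the masterlist, normalising each entry to a bare domain (Python 3.9+ for removeprefix)."""
        with open(path, encoding="utf-8") as f:
            return {line.strip().lower().removeprefix("www.") for line in f if line.strip()}

    def domain_of(url):
        """Extract the host of a shared URL, dropping a leading 'www.' (no public-suffix handling)."""
        return urlparse(url).netloc.lower().removeprefix("www.")

    def matching_posts(posts_path, domains):
        """Yield posts whose shared link points to a domain on the masterlist."""
        with open(posts_path, encoding="utf-8", newline="") as f:
            for row in csv.DictReader(f):
                link = row.get("Link", "")
                if link and domain_of(link) in domains:
                    yield row

    if __name__ == "__main__":
        suspect_domains = load_domains(MASTERLIST_FILE)
        matched = list(matching_posts(POSTS_FILE, suspect_domains))
        print(f"{len(matched)} posts link to a domain on the masterlist")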

Our network analysis of linking patterns between Facebook spaces and sources of problematic information reveals large clusters of Facebook spaces that are broadly aligned with the conservative and progressive sides of domestic US politics. These are bridged by smaller clusters ranging from outright conspiracy theories through alternative health and medicine to esoteric beliefs in astrology, and accompanied by clusters of spaces in languages other than English.
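As a minimal sketch of how such a linking network can be constructed and clustered, the Python snippet below builds a weighted graph from (Facebook space, linked domain) pairs and applies community detection. The toy edge list and the greedy modularity algorithm are assumptions that stand in for the study's actual data and clustering method.

    import networkx as nx

    # Toy (Facebook space, linked domain) pairs extracted from matched posts.
    edges = [
        ("Page A", "example-conspiracy.com"),
        ("Group B", "example-conspiracy.com"),
        ("Page A", "example-hyperpartisan.net"),
    ]

    G = nx.Graph()
    for space, domain in edges:
        # Weight each edge by how often the space linked to the domain.
        if G.has_edge(space, domain):
            G[space][domain]["weight"] += 1
        else:
            G.add_edge(space, domain, weight=1)

    # Community detection groups spaces and domains into clusters, analogous to
    # the broadly conservative and progressive formations described above.
    communities = nx.algorithms.community.greedy_modularity_communities(G, weight="weight")
    for i, community in enumerate(communities):
        print(f"Cluster {i}: {sorted(community)}")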

This paper advances this line of inquiry by extending the analysis beyond posts that share content from our initial list of problematic information sources. From our dataset, we have identified the 1,000 most prominent, active, and influential Facebook spaces, and for these we have gathered their entire public posting activity during the 2016 to 2021 period. This extended dataset shows whether these spaces also link to more mainstream information sources; uncovers additional sources of problematic information; and identifies posting activity that does not engage with outside content at all, but consists purely of text, image, and video content native to Facebook itself.
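One simple way to approximate the selection of these 1,000 spaces is sketched below: each space is scored by its number of matched posts and its total engagement. The field names and the two-key ordering are illustrative assumptions, not the study's actual prominence metric.

    from collections import defaultdict

    def top_spaces(posts, n=1000):
        """Rank Facebook spaces by matched-post count, breaking ties by total engagement."""
        activity = defaultdict(int)      # number of matched posts per space
        engagement = defaultdict(int)    # summed reactions, comments, and shares per space

        for post in posts:
            space = post["space"]
            activity[space] += 1
            engagement[space] += post.get("engagement", 0)

        ranked = sorted(activity, key=lambda s: (activity[s], engagement[s]), reverse=True)
        return ranked[:n]

    sample_posts = [
        {"space": "Page A", "engagement": 120},
        {"space": "Page A", "engagement": 45},
        {"space": "Group B", "engagement": 300},
    ]
    print(top_spaces(sample_posts, n=2))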

We use this dataset to investigate practices of “counter-ideological linking” (Toepfl & Piwoni, 2018), “information laundering” (Klein, 2012), and “white propaganda” (Puschmann et al., 2016), analysing the discursive and link-sharing practices that work to build support for and legitimise problematic sources. This approach also builds on Starbird et al.’s (2019) conception of disinformation as collaborative work by tracing patterns of activity across multiple Facebook spaces. In our mixed-methods, qualitative and quantitative analysis of the discursive strategies of the 1,000 most prominent Facebook pages and groups captured in our dataset, we also pay particular attention to how these actors attempt to engage ordinary human participants in furthering their hyperpartisan worldviews.
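As a rough quantitative companion to this qualitative reading, each space's posting behaviour could be profiled by its mix of masterlist links, other outbound links, and Facebook-native posts, as in the sketch below; the field names are assumptions about the extended dataset rather than its actual schema.

    from urllib.parse import urlparse

    def link_profile(posts, suspect_domains):
        """Tally, per space, masterlist links, other outbound links, and posts with no link at all."""
        profile = {}
        for post in posts:
            space = post["space"]
            counts = profile.setdefault(space, {"masterlist": 0, "other": 0, "native": 0})
            link = post.get("link")
            if not link:
                counts["native"] += 1  # Facebook-native text, image, or video content
                continue
            host = urlparse(link).netloc.lower().removeprefix("www.")
            counts["masterlist" if host in suspect_domains else "other"] += 1
        return profile

    example = link_profile(
        [
            {"space": "Page A", "link": "https://example-conspiracy.com/story"},
            {"space": "Page A", "link": "https://www.nytimes.com/article"},
            {"space": "Page A"},
        ],
        suspect_domains={"example-conspiracy.com"},
    )
    print(example)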

Allcott et al. (2018). Trends in the Diffusion of Misinformation on Social Media. https://arxiv.org/abs/1809.05901v1

Grinberg et al. (2019). Fake news on Twitter during the 2016 U.S. presidential election. Science, 363(6425), 374–378. https://doi.org/10.1126/science.aau2706

Klein, A. (2012). Slipping Racism into the Mainstream: A Theory of Information Laundering. Communication Theory, 22(4), 427–448.

Puschmann et al. (2016, April). Information Laundering and Counter-Publics: The News Sources of Islamophobic Groups on Twitter. AAAI Conference on Web and Social Media. https://www.aaai.org/ocs/index.php/ICWSM/ICWSM16/paper/view/13224/12858

Shao et al. (2016). Hoaxy: A Platform for Tracking Online Misinformation. World Wide Web Conference, 745–750. https://doi.org/10.1145/2872518.2890098

Starbird et al. (2019). Disinformation as Collaborative Work: Surfacing the Participatory Nature of Strategic Information Operations. Proceedings of the ACM on Human-Computer Interaction, 3(CSCW), 1–26. https://doi.org/10.1145/3359229

Toepfl, F., & Piwoni, E. (2018). Targeting Dominant Publics: How Counterpublic Commenters Align Their Efforts with Mainstream News. New Media & Society, 20(5), 2011–2027. https://doi.org/10.1177/1461444817712085