
AoIR 2021

‘Fake News’ and Other Problematic Information: Studying Dissemination and Discourse Patterns

Presenters: Daniel Angus (1), Axel Bruns (1), Edward Hurcombe (1), Stephen Harrington (1), Sofya Glazunova (1), Sílvia Ximena Montaña-Niño (1), Abdul Obeid (1), Souleymane Coulibaly (1), Simon Copland (2), Timothy Graham (1), Scott Wright (3), Ehsan Dehghan (1)
Affiliations: Digital Media Research Centre, Queensland University of Technology (1), School of Sociology, Australian National University (2), Monash University (3)

Panel Abstract

Encompassed by the disputed term ‘fake news’, a variety of overtly or covertly biased, skewed, or falsified reports claiming to present factual information are now seen to constitute a critical challenge to the effective dissemination of news and information across established and emerging democratic societies. Such content – variously classifiable as propaganda, selective reporting, conspiracy theory, inadvertent misinformation, or deliberate disinformation – is not new in itself; however, contemporary digital and social media networks enable its global dissemination and amplification by human and algorithmic actors (Woolley & Howard 2017) and by ordinary users and professional agents, operating outside of, in opposition to, or sometimes in collusion with, the mainstream media (Shao et al. 2017; Vargo et al. 2017).

Various political, commercial, and state actors are suspected of exploiting this ‘fake news’ ecosystem to influence public opinion in major votes ranging from the Brexit referendum to national elections, and of using discourse around ‘fake news’ to undermine trust in media, political, and state institutions more generally.

However, ‘fake news’ and associated phenomena remain “underresearched and overhyped” (Dutton 2017): in spite of considerable attention in mainstream and scholarly debate, much of the focus on ‘fake news’ in its various forms remains superficial, spectacular, anecdotal, and conceptual; it draws only on a limited evidence base and is difficult to fully disconnect from ideological disputes. Leading projects such as Hamilton 68 (GMF 2017) and Hoaxy (Indiana University Network Science Institute 2017) attempt to visualise the distribution of ‘fake news’ (and the role of social bots therein); the University of Oxford’s Computational Propaganda Project (Woolley & Howard 2017) offers a number of major country-specific analyses of the dissemination of mis- and disinformation through social media; Bounegru et al. (2017) outline a collection of methodological approaches to researching ‘fake news’; and major reports for the online security centre TrendLabs (Gu et al. 2017), the Council of Europe (Wardle & Derakhshan 2017), and the NATO Strategic Communications Centre of Excellence (2017) highlight the potential threat from ‘fake news’.

Supported by a major project funded by the Australian Research Council, this panel brings together perspectives that combine systematic, large-scale, mixed-methods analysis of the empirical evidence for the global dissemination of, engagement with, and visibility of problematic information in public debate with the study of public discourse about ‘fake news’, and of how politicians and other societal actors operationalise this concept to downplay inconvenient facts or reject critical questions. In combination, these five papers produce a new and more comprehensive picture of the overall impact of ‘fake news’, in all its forms, on contemporary societies.

The first paper in this panel presents the results of a major study investigating the sharing of links to 2,314 suspected sources of ‘fake news’ and other problematic information in public Facebook spaces from 2016 to 2020. It examines the content-sharing networks that emerge between these public pages and groups and their sources, and studies the longitudinal dynamics of these networks as interests and allegiances shift and new developments (such as the COVID-19 pandemic or the US presidential elections) drive the emergence or decline of dominant themes in mis- and disinformation.
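
As a rough illustration of the kind of network construction involved (a hedged sketch under stated assumptions, not the study's actual pipeline), the following Python snippet builds a yearly bipartite sharing network between public Facebook accounts and watchlisted domains. The file names and column names (facebook_link_shares.csv, suspected_sources.csv, account_name, shared_url, share_date) are hypothetical.

```python
# Hedged sketch only: not the study's actual pipeline or data format.
# Assumes a CSV of Facebook link-sharing records with hypothetical columns
#   account_name, shared_url, share_date
# and a watchlist CSV of suspected problematic-information domains.
from urllib.parse import urlparse

import pandas as pd
import networkx as nx

shares = pd.read_csv("facebook_link_shares.csv", parse_dates=["share_date"])

# Reduce each shared URL to its bare domain so it can be matched
# against the watchlist of suspected sources.
shares["domain"] = shares["shared_url"].map(
    lambda u: urlparse(u).netloc.lower().removeprefix("www.")  # Python 3.9+
)

watchlist = set(pd.read_csv("suspected_sources.csv")["domain"])
shares = shares[shares["domain"].isin(watchlist)]

# One bipartite account-to-domain graph per year, so that longitudinal
# shifts (e.g. the rise of COVID-19 or US election themes) become visible.
for year, chunk in shares.groupby(shares["share_date"].dt.year):
    g = nx.Graph()
    for (account, domain), weight in chunk.groupby(["account_name", "domain"]).size().items():
        g.add_node(account, kind="account")
        g.add_node(domain, kind="source")
        g.add_edge(account, domain, weight=int(weight))
    print(year, g.number_of_nodes(), "nodes,", g.number_of_edges(), "edges")
```

Comparing these yearly graphs (for example, via changes in edge weights or in the set of most-shared domains) is one straightforward way to trace the shifting interests and allegiances the paper describes.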

The second paper maintains the focus on Facebook, but concentrates on one particular source of problematic information: the Kremlin-backed outlet RT (formerly known as Russia Today). Examining the sharing of links to RT’s six major language editions, the paper investigates RT’s positioning within these diverse language communities and finds that the outlet variously forms alliances with left- as well as right-wing outsiders in order to disrupt the political status quo.

The third paper presents another single-source study, but shifts attention to the conservative news channel Sky News Australia. Previously a little-watched pay-TV news operation, Sky News Australia has recently pivoted to an aggressive and highly successful digital influence strategy that has positioned it as an important source of alt-right propaganda and conspiracy theories, reaching well beyond (and no longer predominantly targeting) a domestic Australian audience.

The remaining two papers in this panel examine the discursive operationalisation of the term ‘fake news’, rather than the dissemination of problematic information itself. The fourth paper investigates how the label ‘fake news’ is used in Australian political debate, by whom, and in what contexts. It finds that Donald Trump’s use of the term to attack critical media coverage in the US has found an echo in Australia, too, especially amongst populist and far-right political actors.

The final paper also examines the broader discourse surrounding the ‘fake news’ concept, shifting our attention to the use of this term (in its various translations) in Russian and Iranian public debate. Drawing on Twitter data, it shows that Russian- and Farsi-language debates predominantly operationalise the term ‘fake news’ to criticise the existing regime, but also segment into a number of distinct discourse communities that are allied in their opposition to the regime yet distinct in their own political agendas.
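
As a hedged illustration of how such discourse communities might be identified (the paper's own method and data format may differ), the sketch below builds a weighted retweet network from tweets using the term and clusters it with Louvain modularity; the input file and column names are assumptions.

```python
# Hedged sketch only: the study's actual data and method may differ.
# Assumes a CSV of tweets mentioning 'fake news' (or Russian/Farsi equivalents)
# with hypothetical columns: user, retweeted_user, lang
import pandas as pd
import networkx as nx

tweets = pd.read_csv("fake_news_tweets.csv")

# Retweets carry the endorsement ties along which the debate segments.
retweets = tweets.dropna(subset=["retweeted_user"])

g = nx.Graph()
for (src, dst), weight in retweets.groupby(["user", "retweeted_user"]).size().items():
    g.add_edge(src, dst, weight=int(weight))

# Louvain modularity clustering as a generic stand-in for community detection;
# each cluster approximates one discourse community in the 'fake news' debate.
communities = nx.community.louvain_communities(g, weight="weight", seed=42)
for i, members in enumerate(sorted(communities, key=len, reverse=True)[:5]):
    print(f"community {i}: {len(members)} accounts")
```

Inspecting the most retweeted accounts and most frequent hashtags within each cluster is then one way to characterise what distinguishes the communities' political agendas.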

In combination, then, these five papers present a substantive collection of innovative approaches to the ‘fake news’ concept, exploring the dissemination of problematic information itself at larger and smaller scales as well as examining the operationalisation of the idea of ‘fake news’ in pursuit of specific ideological aims.

References

Bounegru, L., Gray, J., Venturini, T., & Mauri, M. (2017). A Field Guide to Fake News: A Collection of Recipes for Those Who Love to Cook with Digital Methods. Public Data Lab. http://fakenews.publicdatalab.org/

Dutton, W. H. (2017, May 6). Fake News, Echo Chambers and Filter Bubbles: Underresearched and Overhyped. The Conversation. http://theconversation.com/fake-news-echo-chambers-and-filter-bubbles-underresearched-and-overhyped-76688

GMF: Alliance for Securing Democracy. (2017). Hamilton 68: A Dashboard Tracking Russian Propaganda on Twitter. http://dashboard.securingdemocracy.org/

Gu, L., Kropotov, V., & Yarochkin, F. (2017). The Fake News Machine: How Propagandists Abuse the Internet and Manipulate the Public. TrendLabs. https://www.a51.nl/sites/default/files/pdf/wp-fake-news-machine-how-propagandists-abuse-the-internet.pdf

Indiana University Network Science Institute. (2017). Hoaxy: How Claims Spread Online. http://hoaxy.iuni.iu.edu/

NATO Strategic Communications Centre of Excellence. (2017). Digital Hydra: Security Implications of False Information Online. NATO Strategic Communications Centre of Excellence. https://www.stratcomcoe.org/digital-hydra-security-implications-false-information-online

Shao, C., Ciampaglia, G. L., Varol, O., Flammini, A., & Menczer, F. (2017). The Spread of Fake News by Social Bots. arXiv preprint arXiv:1707.07592. https://arxiv.org/abs/1707.07592

Vargo, C. J., Guo, L., & Amazeen, M. A. (2017). The Agenda-Setting Power of Fake News: A Big Data Analysis of the Online Media Landscape from 2014 to 2016. New Media & Society, 1–22. https://doi.org/10.1177/1461444817712086

Wardle, C., & Derakhshan, H. (2017). Information Disorder: Toward an Interdisciplinary Framework for Research and Policy Making (DGI(2017)09). Council of Europe. https://shorensteincenter.org/wp-content/uploads/2017/10/Information-Disorder-Toward-an-interdisciplinary-framework.pdf

Woolley, S. C., & Howard, P. N. (2017). Computational Propaganda Worldwide: Executive Summary (Working Paper 2017.11). Computational Propaganda Research Project. http://comprop.oii.ox.ac.uk/wp-content/uploads/sites/89/2017/06/Casestudies-ExecutiveSummary.pdf