Snurblog — Axel Bruns

Using Digital Trace Data to Study Content Moderation

Snurb — Saturday 21 October 2023 07:52
Politics | ‘Fake News’ | 'Big Data' | Social Media | Streaming Media | AoIR 2023 |

The final session on this second full day at AoIR 2023 is on deplatforming, and starts with Richard Rogers and Emilie de Keulenaar. Richard begins by outlining the idea of trace research – using the ‘exhaust’ of the Web to study societal trends unobtrusively, not least also with the help of computational social science methods.

This approach understood platforms as mere intermediaries that carry content; yet more forceful interventions by platforms to shape communication practices – e.g. by deplatforming unacceptable speech acts and actors – have shown that platforms are themselves active and self-interested stakeholders, whose algorithmic interventions complicate the study of societal trends through trace data.

Content moderation actions are themselves often untraceable, in fact: moderation often leaves no publicly accessible traces, or at least none that are accessible through the public APIs provided by platforms. To study content moderation, a different kind of trace forensics is therefore needed: new techniques to reconstruct the scene of disappearance of moderated data – the why, what, how, and when of content that disappeared or was edited.

Some such moderation or censorship focusses on unacceptable words or phrases, and might be identified through the dynamic archiving of content that may be susceptible to censorship: censorship actions may be revealed by the differences between archived content versions, or by the effects of and responses to acts of moderation and censorship.
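The basic diffing step here can be sketched in a few lines – this is an illustrative example, not the presenters' actual tooling, and the two snapshot strings are hypothetical:

```python
import difflib

def removed_terms(old_version: str, new_version: str) -> list[str]:
    """Return words present in an earlier archived version of a page but
    absent from a later one -- candidate targets of moderation or censorship."""
    diff = difflib.ndiff(old_version.split(), new_version.split())
    # ndiff prefixes words deleted from the first sequence with "- "
    return [token[2:] for token in diff if token.startswith("- ")]

# Hypothetical archived snapshots of the same page, before and after moderation:
before = "vaccines cause 5G mind control say experts"
after = "say experts"
print(removed_terms(before, after))  # → ['vaccines', 'cause', '5G', 'mind', 'control']
```

In practice the "versions" would be successive captures of the same URL pulled from a web archive, with the word-level diff pointing to the terms that were edited out.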

Taking the example of COVID-19 conspiracy theories on YouTube, it is possible to capture evidence of YouTube’s content moderation practices in the archives of the Wayback Machine; the archives may point, for instance, to the terms and topics that YouTube defined as unacceptable at various points, and content including such terms could be captured through the YouTube API, and actions taken against it documented from these data.
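Retrieving the list of archived captures of a given page is straightforward via the Wayback Machine's public CDX API; a minimal sketch (the video URL is a made-up placeholder):

```python
from urllib.parse import urlencode

CDX_ENDPOINT = "http://web.archive.org/cdx/search/cdx"

def cdx_query_url(page_url: str, limit: int = 50) -> str:
    """Build a Wayback Machine CDX API query listing archived snapshots of a
    page (e.g. a YouTube video page), so that successive captures can be
    compared for signs of deletion or editing."""
    params = urlencode({
        "url": page_url,
        "output": "json",      # one JSON row per capture
        "limit": limit,
        "fl": "timestamp,statuscode",  # capture time and HTTP status
    })
    return f"{CDX_ENDPOINT}?{params}"

# Hypothetical video URL; fetching this query returns one row per capture,
# and a shift to 404s can date the moment a video disappeared.
print(cdx_query_url("youtube.com/watch?v=EXAMPLE"))
```

Comparing the HTTP status codes across captures (200 giving way to 404, say) helps pin down when a piece of content was removed.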

This might produce a history of content moderation policies on a given platform, or document mass content deletions when they occur; it can also distinguish between deletion reasons if the platform provides these. A more limited form of moderation – demotion rather than deletion – might also be captured by analysing the ranking of problematic content in the search results provided by a platform like YouTube. Further, user reactions to such moderation actions may also be detected in the comments left by participants on whatever videos remain on a platform.
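Detecting demotion amounts to tracking a video's rank in repeated captures of the same search query; a minimal sketch with invented result lists:

```python
def rank_shift(before: list[str], after: list[str], video_id: str):
    """Compare a video's position in two captured search-result lists.
    A drop in rank -- or disappearance from the list entirely -- suggests
    demotion rather than outright deletion."""
    def rank(results: list[str]):
        return results.index(video_id) + 1 if video_id in results else None
    return rank(before), rank(after)

# Hypothetical result lists captured on two dates for the same query:
day1 = ["vidA", "vidB", "vidC", "vidD"]
day2 = ["vidB", "vidD", "vidA", "vidC"]
print(rank_shift(day1, day2, "vidA"))  # → (1, 3): vidA fell from rank 1 to 3
```

Run at scale across many queries and dates, systematic downward movement of flagged content would be the signature of a demotion policy.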

Finally, the replatforming of deleted content on alternative platforms may also be analysed, for example by searching for the titles of videos that were deleted from YouTube. This points especially to popular unmoderated content sharing platforms like BitChute.
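Since reuploaders often tweak titles slightly (adding "MIRROR", changing case), a fuzzy string match works better than exact lookup; a sketch using invented titles:

```python
from difflib import SequenceMatcher

def likely_reuploads(deleted_title: str, candidate_titles: list[str],
                     threshold: float = 0.8) -> list[str]:
    """Flag titles on an alternative platform that closely match a title
    deleted from YouTube -- candidate replatformed copies."""
    return [
        title for title in candidate_titles
        if SequenceMatcher(None, deleted_title.lower(), title.lower()).ratio() >= threshold
    ]

# Hypothetical deleted YouTube title and candidate titles found elsewhere:
deleted = "The Truth About 5G and COVID"
candidates = [
    "the truth about 5g and covid (MIRROR)",
    "Cooking with cast iron",
]
print(likely_reuploads(deleted, candidates))  # matches only the first title
```

The threshold is a tunable assumption; too low and unrelated titles match, too high and lightly edited reuploads slip through.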
