The next speaker in this panel at the AANZCA 2025 conference is Jee Young Lee, whose focus is on a content analysis of mis- and disinformation examples from the 2025 Australian federal election. Australian voters remain highly concerned about such problematic information, but fewer than one third of voters actively engage in fact-checking themselves; instead, they rely on their gut feeling about the veracity of information rather than on concrete evidence of its truthfulness.
In that light: what do audiences regard as mis- and disinformation; how do they determine this; and what do they do about it? This project used a digital diary method with some 38 participants who were prompted each day, via WhatsApp, for examples of problematic information they had come across during the seven days preceding the election, and for how they responded to it. These cases were then coded for key attributes by the research team, and all reported cases of problematic information were assessed by a professional fact-checker.
Most examples came from social media; a smaller number came from online news, print newspapers, leaflets or letters, TV, SMS, websites, or outdoor advertising. Dominant creators of such content were politicians or parties, followed by mainstream media, unidentified sources, influencers, and other sources. The narratives of this content focussed overwhelmingly on specific policies or on particular candidates or parties. One third of this content came in the form of political advertising.
Content was flagged by participants because it contradicted their existing knowledge or beliefs, was biased, provided insufficient evidence or context, was overly emotional, or was implausible. Very few participants reported content because it had already been externally verified as incorrect. Participants mostly indicated that they were sure or very sure about their assessment of such content, especially for content they flagged as contradicting existing knowledge or highly emotional, and much less so for supposedly biased information.
However, only 10% of the content flagged by participants as mis- or disinformation was actually classified as outright false by the fact-checker; 36% was true, 24% was misleading, and the remainder lacked context or took information out of context. Participants largely responded to encounters with such information by doing nothing, sharing or discussing it with others, or searching for more information.
These observations point to the limits of debunking: if distrust and emotional responses shape perceptions of mis- and disinformation, then factual verification will not help. Nor will it address the political burnout and gut-feeling judgments that stem from information overload. Instead, effective interventions will need to address the emotional and narrative dimensions of political mis- and disinformation.











