Following the surprise victories of Brexit and Trump in 2016, even Barack Obama (2017) warned that “it’s become safer to retreat into our own bubbles”, thereby linking increased electoral volatility and polarisation with concepts such as “echo chambers” (Sunstein 2001) and “filter bubbles” (Pariser 2011). The politicians, journalists, and scholars who subscribe to these concepts suggest that, with online and social media as the key sources of information for an ever-growing percentage of the public (Newman et al. 2016), echo chambers and filter bubbles are chiefly responsible for the emergence of communities that espouse contrarian and counterfactual perspectives and ideologies, and for their disconnect from the mainstream.
Echo chambers are said to enable these groups to reinforce their views by connecting with like-minded others; filter bubbles, to shield them from encountering contrary perspectives. Such disconnection from and ignorance of alternative perspectives is assumed to result from a combination of individual choice, in selecting the news sources to consult or the social media accounts to follow, and the algorithmic shaping of such choices, as search engines, news portals, and social media platforms highlight and recommend some sources over others. As platform algorithms learn from users’ choices, and users make those choices predominantly from the options promoted by the algorithms, a self-reinforcing feedback loop gradually curtails choice to an increasingly narrow and homogeneous set of options.
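As an illustrative aside, this hypothesised loop can be sketched as a toy simulation. The minimal Python sketch below is not drawn from any of the studies discussed here, and all of its names and parameters (N_SOURCES, EXPLORE_PROB, and so on) are invented for illustration: a simple rich-get-richer recommender promotes sources in proportion to past clicks, a simulated user mostly follows its recommendations, and the diversity of sources actually consumed narrows over successive rounds.

```python
import random
from collections import Counter

random.seed(42)

N_SOURCES = 20       # size of a hypothetical pool of news sources
N_ROUNDS = 500       # user/recommender interactions to simulate
EXPLORE_PROB = 0.05  # chance the user ignores the recommendation
WINDOW = 100         # rounds per diversity measurement

sources = list(range(N_SOURCES))
clicks = Counter({s: 1 for s in sources})  # uniform prior: no history yet
history = []

for _ in range(N_ROUNDS):
    # The algorithm promotes sources in proportion to past engagement ...
    weights = [clicks[s] for s in sources]
    recommended = random.choices(sources, weights=weights, k=1)[0]
    # ... and the user mostly chooses from what the algorithm promotes.
    if random.random() < EXPLORE_PROB:
        choice = random.choice(sources)
    else:
        choice = recommended
    clicks[choice] += 1  # the algorithm then learns from that choice
    history.append(choice)

# Diversity of sources actually consumed, per window of rounds: under
# these assumptions it typically declines noticeably across windows.
for i in range(0, N_ROUNDS, WINDOW):
    window = history[i:i + WINDOW]
    print(f"rounds {i:3d}-{i + WINDOW - 1:3d}: "
          f"{len(set(window))} of {N_SOURCES} sources consumed")
```

Notably, even this toy model narrows choice only when the simulated user follows the algorithm’s recommendations almost exclusively; it is precisely this behavioural assumption that the empirical studies cited below call into question.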
Rigorous empirical evidence for the operation of such processes is sorely lacking, however. Building on empirical studies that show no significant evidence of filter bubbles or echo chambers in search (e.g. Haim et al. 2018; Krafft et al. 2018; Nechushtai & Lewis 2018) or social media (e.g. Beam et al. 2018; Bruns 2017), this paper argues that echo chambers and filter bubbles principally constitute an unfounded moral panic that presents a convenient technological scapegoat (search and social platforms and their affordances and algorithms) for a much more critical, fundamentally human-made problem: growing social and political polarisation. This is a problem that cannot be solved by technological means.
Research shows that even – indeed, perhaps especially – the most hyperpartisan users still encounter material that challenges their perspectives, and engage with users who represent opposing views (e.g. Garrett et al. 2013; Weeks et al. 2016). The central question is what they do with such information when they encounter it: do they dismiss it immediately as running counter to their own views? Do they engage in a critical reading, turning it into material to support their own worldview, perhaps as evidence for their own conspiracy theories? Do they respond by offering counter-arguments, by vocally and even violently disagreeing, by making ad hominem attacks, or by knowingly disseminating all-out lies as ‘alternative facts’? More importantly still, why do they do so? What is it that has so entrenched and cemented their beliefs that they are no longer open to contestation? This is the debate we need to have: not a proxy argument about the impact of platforms and algorithms, but a meaningful discussion about the complex and compound causes of political and societal polarisation. The ‘echo chamber’ and ‘filter bubble’ concepts have seriously distracted us from that debate, and must now be put to rest.
Beam, M. A., Child, J. T., Hutchens, M. J., & Hmielowski, J. D. (2018). Context Collapse and Privacy Management: Diversity in Facebook Friends Increases Online News Reading and Sharing. New Media & Society, 20(7), 2296–2314.
Bruns, A. (2017). Echo Chamber? What Echo Chamber? Reviewing the Evidence. Presented at the Future of Journalism 2017, Cardiff.
Garrett, R. K., Carnahan, D., & Lynch, E. K. (2013). A Turn toward Avoidance? Selective Exposure to Online Political Information, 2004–2008. Political Behavior, 35(1), 113–134.
Haim, M., Graefe, A., & Brosius, H.-B. (2018). Burst of the Filter Bubble? Effects of Personalization on the Diversity of Google News. Digital Journalism, 6(3), 330–343.
Krafft, T. D., Gamer, M., & Zweig, K. A. (2018). Wer sieht was? Personalisierung, Regionalisierung und die Frage nach der Filterblase in Googles Suchmaschine [Who sees what? Personalisation, regionalisation, and the question of the filter bubble in Google’s search engine]. Kaiserslautern: Algorithm Watch.
Nechushtai, E., & Lewis, S. C. (2018). What Kind of News Gatekeepers Do We Want Machines to Be? Filter Bubbles, Fragmentation, and the Normative Dimensions of Algorithmic Recommendations. Computers in Human Behavior. Advance online publication, August.
Newman, N., Fletcher, R., Levy, D. A. L., & Nielsen, R. K. (2016). Reuters Institute Digital News Report 2016. Oxford: Reuters Institute for the Study of Journalism, University of Oxford.
Obama, B. (2017, January 10). President Obama’s Farewell Address: Full Video and Text. New York Times.
Pariser, E. (2011). The Filter Bubble: What the Internet Is Hiding from You. London: Penguin.
Sunstein, C. R. (2001). Echo Chambers: Bush v. Gore, Impeachment, and Beyond. Princeton, N.J.: Princeton University Press.
Weeks, B. E., Ksiazek, T. B., & Holbert, R. L. (2016). Partisan Enclaves or Shared Media Experiences? A Network Approach to Understanding Citizens’ Political News Environments. Journal of Broadcasting & Electronic Media, 60(2), 248–268.