The first paper in the final session at AoIR 2018 today is by SeongJae Min, who is interested in the role of algorithms in determining what we are exposed to on social media; the major finding from his research is that people’s choices matter at least as much as algorithmic shaping.
Concepts such as ‘echo chambers’ and ‘filter bubbles’ have become popularised in recent times, but there is a significant lack of empirical evidence for such phenomena; if anything, they are more prevalent in localised offline contexts than global online networks, where cross-cutting exposure is considerably more likely to occur. But to what extent do social media users experience cross-cutting exposure, and under what circumstances?
The present project surveyed some 271 Facebook users in a first phase, and then broadened this to over 3,000 social media users in a second phase. In the first phase, some 87% reported experiencing some degree of cross-cutting exposure; this was driven more by the weak ties than by the strong ties in their social networks. Users with large networks and high cultural diversity within those networks experienced particularly strong cross-cutting exposure.
Some people facilitated this by actively trying to confuse the Facebook algorithm, clicking on random content and following pages with divergent interests; some also felt a need to monitor the opinions of friends in order to remain aware of the divergent ideologies they espoused.