The first full session at the IAMCR 2025 conference in Singapore starts for me with a session on algorithmic media, which kicks off with Susanne Eichner. She notes the impact of digitisation on the fragmentation, individualisation, personalisation, and automation of media and users; this has led to research in critical data studies (focussing on the datafication of users and the surveillance capitalism that results from it), as well as in more user-oriented approaches that also acknowledge users’ datafied agency and resistance to such datafication.
How might we bridge these two seemingly opposed logics that variously see audiences as helpless or empowered? Simple binaries of agency versus structure cannot help us here; agency is fluid, without a stable nexus of power, and happens at the macro, meso, and micro level, with different strategies and tactics. We need to work with a politics of the ordinary here.
The idea of data loops may be useful here. Users and companies mutually influence each other through datafication; user data are generated, collected, and analysed by companies, and fed back to users, creating a data loop over which users themselves also have some degree of control; agency on both sides can be formative. Reflexivity is also crucial here: user agency can only unfold if users have knowledge of and reflexivity about the backend of digital media technologies and systems.
Netflix is a useful case to illustrate this. Its main business is to provide a highly attractive content library that generates and maximises profits; it promises users agency and control over the content while underlying interests and power dynamics are obscured. Audience datafication is a competitive advantage here, but users also have influence via their subscription choices, as do brand image considerations, stock market valuations, government regulation, and other forces.
The Netflix recommender engine is central to this; it has defined some 2,000 ‘taste communities’ catering to different user preferences, yet this is hidden behind the interface design, and the efficiency of this categorisation system is questionable. Users collaborate with the algorithm if they accept its recommendations, yet they might also be irritated by those recommendations or push back against its functionality. (For instance, people of colour are often served show thumbnails featuring people of colour, even if they play only a very minor role in the show itself.)
Agentic dynamics thus operate circularly as both sociocultural user practices and institutional practices of media companies. This reinforces power asymmetries, but is also vulnerable to user agency; algorithmic structures show breaks and unintended affordances, and this can trigger users’ algorithmic imaginaries and lead them to develop resistant practices. This requires continuous, deep engagement and reflexivity on the part of users, though.