Why – and How – to Conduct Publicly Engaged Research?

There are duelling keynotes at ECREA 2022 this afternoon, and while I love Eli Skogerbø’s work, I’ve gone to Mirko Tobias Schäfer’s keynote instead. Picking up again on the conference theme of ‘Rethink Impact’, Mirko begins by asking why we should conduct publicly engaged research. Universities are increasingly paying attention to societally engaged research across a range of fields, and the social sciences and humanities in particular tend to take a defensive stance that continuously seeks to explain how and why they are useful to society. The pandemic has hopefully made this easier to see by now.

It is now clear that our expertise is needed. Publicly engaged research, then, expands our ability to conduct research, provides learning resources, and enables us to contribute to shaping our societies. This increasingly needs cross-institutional, cross-disciplinary, and cross-occupational collaborations, given that many of our current challenges are so substantial. And our work is confronted by other stakeholders (politicians, lobbyists, other interest groups) – so we need to make ourselves visible in these debates.

The pandemic has made it quite clear that technocentric solutions alone are insufficient; COVID apps by themselves, for instance, clearly weren’t enough. Understanding how actual people engage with the crisis is critical, and the distinct expertise, methods, and teaching of the humanities are critically relevant here, for instance in the area of critical data studies. The challenge, of course, lies in the knowledge transfer and implementation of such insights: how can such engagement and knowledge transfer be facilitated?

Privileged access to organisations and data can be crucial for this; it connects to practitioners’ knowledge and expertise, and builds on and generates empirical evidence in much the same way as is common in the STEM disciplines. Such projects are often local, just like all data are local, and this local empirical work sometimes stands in stark contrast with more abstracted, often US-centric critical data studies perspectives. And engaging with these sectors also produces real social data and analytics that are not available from digital trace data sources.

This must also be connected to teaching; students should be involved in such real-world projects, and thereby develop an understanding of the experiences and needs of the practitioners in the partner organisations, as well as of the personal and interpersonal aspects that may affect the speed of change. It also enables educators to align their activities more closely with real-life situations, of course.

There is also a more normative, impact-oriented aspect to this, however, which is more complicated. Universities have plenty of lofty ambitions to change the world, but this is considerably more difficult to achieve than merely describing it.

Mirko’s own institution, the Utrecht Data School, is one amongst a number of research organisations that pursue these aims; he cites organisations like Data & Society and AI Now as other examples. It began by offering data skills to a variety of organisations, for example in local government, and involved its students in the data assignments that came from these partners; the findings from this work directly helped those external partners, but also provided new insights into the workings of these institutions to the researchers, and into the limitations of the available data. This was a combination of applied and basic research.

The Data School is now an interfaculty platform between the Faculties of Science and Humanities, and its three premises are to be where change manifests; to connect teaching and research; and to ensure that the research yields both valuable academic insights and effective societal impact. Its key domains are AI and big data, where it conducts mainly qualitative research with a focus on governance, accountability, data literacies, and representation; and public debates and media, where it works more quantitatively on platforms, fragmented audiences, and the dynamics of communication and media practices.

Examples of such work include the Data Ethics Decision Aid (DEDA), a dialogic process for evaluating data projects that draws on value-sensitive design, develops digital-ethical literacies, promotes accountability, and involves participatory observation. This also led to a Dutch government commission to develop a Fundamental Rights and Algorithms Impact Assessment (FRAIA), in collaboration with legal researchers and the government sector, which in turn also opens up new avenues for education programmes, further research, and societal impact.

The Data School also conducted a partnership project that examined gender bias in job Websites, and identified search algorithms that produced different results for differently gendered job title formulations; this was highlighted in its publications, but without shaming the platform operator, and the problem was fixed within weeks of being identified. Similarly, it worked with a publisher to review the quality of the publisher’s own recommender system, and from this developed a set of values that such systems should embrace.
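Mirko doesn’t go into implementation detail here, but as a minimal sketch of what such a paired-query audit could look like – with a purely hypothetical endpoint and made-up job-title pairs, not the actual platform or queries from the project – one might compare the result sets returned for each gendered formulation:

```python
# Illustrative sketch only: a paired-query audit comparing the vacancies a
# job platform returns for differently gendered formulations of the same
# job title. The endpoint, response shape, and title pairs are hypothetical.
import requests

SEARCH_URL = "https://example-jobsite.test/api/search"  # placeholder endpoint

# Example pairs of masculine/feminine job-title formulations (made up).
TITLE_PAIRS = [
    ("medewerker", "medewerkster"),
    ("waiter", "waitress"),
]

def result_ids(query: str, limit: int = 50) -> set[str]:
    """Return the set of vacancy IDs the site serves for a given query."""
    response = requests.get(SEARCH_URL, params={"q": query, "limit": limit}, timeout=10)
    response.raise_for_status()
    return {item["id"] for item in response.json()["results"]}

for masculine, feminine in TITLE_PAIRS:
    ids_m, ids_f = result_ids(masculine), result_ids(feminine)
    overlap = len(ids_m & ids_f) / max(len(ids_m | ids_f), 1)
    print(f"{masculine!r} vs {feminine!r}: Jaccard overlap {overlap:.2f}")
    # A low overlap flags the query pair for closer qualitative inspection.
```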

Most organisations are of course also interested in examining the public debates surrounding them, and this presents a rich vein of prospective work. Such small assignments lend themselves well to student work, and enable better understandings of the continuing transformation of the public sphere and its consequences for open societies; a particular interest here is also the role of expertise in these debates. And this also provides a basis for longer-term collaborations with journalists, as some of these studies are of considerable news interest, and in turn produces new insights into how the media industry works. In the end, this led to the creation of a jointly hosted PhD position, too. It has now also grown into a larger project examining the correlations between parliamentary debates, social media conversations, and public incidents, in order to explore polarisation and radicalisation.
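Purely as an illustration of what exploring such correlations might involve in practice – the file and column names below are invented, not those of the actual project – one could align daily activity counts from the three data streams and check them against each other at various lags:

```python
# Illustrative sketch only: align daily counts of parliamentary mentions,
# social media posts, and recorded incidents on a topic, and see how strongly
# they correlate when one series is shifted against another.
import pandas as pd

frames = {
    "parliament": pd.read_csv("parliament_mentions.csv", parse_dates=["date"]),
    "social": pd.read_csv("social_posts.csv", parse_dates=["date"]),
    "incidents": pd.read_csv("incidents.csv", parse_dates=["date"]),
}

# One daily time series per source, aligned on a shared date index.
daily = pd.DataFrame({
    name: df.set_index("date")["count"].resample("D").sum()
    for name, df in frames.items()
}).fillna(0)

# Does social media activity lead or lag parliamentary attention?
for lag in range(-7, 8):
    corr = daily["parliament"].corr(daily["social"].shift(lag))
    print(f"lag {lag:+d} days: r = {corr:.2f}")
```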

Key to this overall approach is to go into projects without preconceived notions about what the project will do: it is driven by observations, interviews, and project acquisition, and the research plan emerges from this. This requires the development of strong interpersonal relations and an ongoing presence in the partner organisation, too. The research itself then continues to involve participatory observation, but also teaching and educational activities with the partner’s staff, as well as the applied and empirical research itself – and this also produces direct impact and knowledge transfer. And finally, dissemination involves scholarly as well as industry publications that deliver tangible, applicable results; implementation and optimisation then involve putting findings and deliverables into practice, further evaluation and optimisation, a review of existing theory in light of the new findings, iterative improvement of educational formats, and the identification of new and further research opportunities.

Measuring the impact of such work remains difficult, however: there is a great deal of invisible labour involved in this effort. Industry recognition and rewards help, of course. And the rules of engagement must also be defined: in the choice of partners, the suspension of judgment in engaging with them, and the values that such work pursues. Further, the autonomy, independence, and integrity of the scholarly work must also be safeguarded. And the infrastructure for this work is also difficult to establish: this includes research funding and legal support, of course. All of this, finally, also depends on effective teamwork, in which the distinctions between researchers, support staff, students, teachers, and practitioners blur.