Internet Technologies

Reclaiming Alternative Social Media from the Alt-Right

It’s the first full day of the first proper in-person AoIR conference since Brisbane 2019, and I’m starting with a session on hate speech. It opens with Robert Gehl, who points out how alternative social media as a whole is being reduced to right-wing social media – this ignores other forms of alternative, citizens’ social media, and even studies by reputable centres like the Pew Research Center are guilty of such oversimplification. Alternative social media is much bigger than just a handful of fascist sites.

Decolonising the Internet

It’s Wednesday, I think, so I’m in Dublin for the first face-to-face AoIR conference since AoIR 2019 in Brisbane. It’s genuinely delightful to be amongst this wonderful community again at last. As usual, the conference starts with the conference keynote by Nanjala Nyabola, addressing the conference theme of Decolonising the Internet. She begins by noting that the vast majority of people experience the Internet in a foreign tongue; and it is appropriate to address this issue in Ireland, which has had its own history of having its national identity and language suppressed for so long.

Nanjala’s keynote is based on research which worked to translate keywords from Internet research into Kiswahili, and the assumption from others has always been that this was an AI and natural language processing project; but it was not, and the real question is what it means to be human in the digital age. This was also her first academic paper in Kiswahili; it was a project in decolonisation. How does this even happen? Ultimately, as so often, the story begins with the arrival of the British: the colonisers. And too many people in the world still don’t know what it means to be colonised – the damaging, scarring disruption of history and culture; one of the darkest and bloodiest chapters in human history, which reorganised societies for the economic benefit of imperial powers, and a form of bureaucratised murder, systematised rape, and legitimised robbery.

In Kenya this lasted less than a century, but in the last decade of colonisation alone tens of thousands were killed, often simply for speaking out against oppression. In addition, lives were disrupted by introduced pests and diseases. This form of structural violence was documented in the files of the occupiers, but the larger loss of culture is less measurable, and the patterns of colonial administration often still continue. And the intention of the violence was to reorganise society to make money; to create ideal labourers – including by rooting out local languages by force in schools. That legacy still endures, and the trophies of this violence still remain in British museums, while culture is still being reclaimed and relearned.

Mobile Technologies on the Frontline in Ukraine

It’s a very foggy Friday morning at ECREA 2022, and I’m chairing a morning session on protests, politics, and the digital that begins with a paper by Roman Horbyk, on mobile communication on the frontline in Eastern Ukraine. This is a project that was launched well before the 2022 invasion of Ukraine by Russia, also covering the ongoing hostilities predating it.

News Recommender Systems: Integrating Supply and Demand Perspectives

Up next in this ECREA 2022 session is my temporary University of Zürich colleague Sina Blassnig, whose focus is on news recommender systems. Such systems are algorithms that provide users with personalised recommendations for news content based on past interactions by them or similar users, overall popularity metrics, and other features.
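The two signal types mentioned here – the behaviour of similar users and overall popularity – can be illustrated with a toy sketch (this is my own illustration, not Sina Blassnig’s actual system; all names and data are hypothetical):

```python
# Toy sketch of a news recommender combining similar-user behaviour
# (user-based collaborative filtering) with overall popularity.
from collections import Counter

# Hypothetical click logs: user -> set of article IDs they have read
clicks = {
    "alice": {"a1", "a2", "a3"},
    "bob":   {"a2", "a3", "a4"},
    "carol": {"a1", "a5"},
}

def popularity_scores(clicks):
    """Score each article by how many users clicked it (popularity metric)."""
    return Counter(a for articles in clicks.values() for a in articles)

def jaccard(s, t):
    """Similarity between two users' reading histories."""
    return len(s & t) / len(s | t) if s | t else 0.0

def recommend(user, clicks, k=2):
    """Rank articles the user hasn't seen by how similar their readers are."""
    seen = clicks[user]
    scores = Counter()
    for other, articles in clicks.items():
        if other == user:
            continue
        similarity = jaccard(seen, articles)
        for article in articles - seen:
            scores[article] += similarity
    return [article for article, _ in scores.most_common(k)]

print(recommend("alice", clicks))  # -> ['a4', 'a5']: bob is most similar
                                   #    to alice, so his unseen article wins
```

Real systems add many more features (recency, content similarity, editorial weighting), but the basic logic of scoring unseen items via similar users is the same.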

Towards Global Impact for Scholarly Impact: The Case of Global Kids Online

After a very enjoyable pre-conference on social media election campaigns, it’s now time for the main event to start: Sonia Livingstone’s keynote will open the ECREA 2022 conference, the first in-person ECREA conference since 2018, and the first in a Nordic country. Sonia’s focus, and indeed that of the conference overall (the overall theme is “Rethinking Impact”), is on the pathways to impact for scholarly research, with particular focus on scholarly engagement with the United Nations.

The UN buildings in Geneva are impressive, intimidating, and often empty. Entering the UN compound remains unusual for researchers; yet the UN Committee on the Rights of the Child had recognised the impact of digital media on children’s lives, and in 2014 sought scholarly advice on its further research agenda. This also involves consultation with children – a task that is both fascinating and demanding. But what do we as media and communication scholars know about digital media that is of value to the UN and its policy-makers?

The UN process works through a set of documents that are called “General Comments”, which set out the current situation; this is informed by a consultation process involving the various stakeholders. The General Comment addressing the impact of digital environments on the rights of children took a substantial amount of time to evolve, and was published only in 2021.

The Dangers of Datafication

The sessions at this Norwegian Media Researcher Conference are organised in the form of particularly constructive feedback on work-in-progress papers – which is great as a format, but doesn’t lend itself particularly well to liveblogging. So, I’ll skip forward right to the next keynote by Raul Ferrer-Conill, whose focus is on the datafication of everyday life. This is something of a departure from his previous work on the gamification of the news.

He begins by outlining the datafication of the mundane: the way people’s social action – as well as non-human action, in fact – is being transformed into quantifiable data, especially online, and that such data therefore become a resource that can be utilised, operationalised, and exploited. Indeed, the sense in the industry is now that ‘everything starts with data’, which reveals a particular, peculiar kind of mindset. Over the past years, the Internet of Things has moved from an idea to a reality, and this has fuelled the “smart” delusion: the belief that more datapoints mean smarter decision-making processes (they usually don’t).

Introducing the ADM+S Australian Search Experience Project

I’ve not yet had the chance to write much about one of the major new projects I’m involved with: the ARC Centre of Excellence for Automated Decision-Making and Society (ADM+S), a large-scale, multi-institutional, seven-year research centre that investigates the impact of automated decision-making technologies (including algorithms, artificial intelligence, and other such technologies) on all aspects of our personal and professional lives. In particular, for the first year of the Centre I’ve led the News & Media Focus Area, which recently held its inaugural symposium to take stock of current research projects and plan for the future. (This was also the time for me to hand that leadership over to my colleagues Jean Burgess (QUT) and James Meese (RMIT), as I step back from that role to concentrate on another major project – more on this in a future update.)

Within News & Media, I’ve also led a major research project which we launched publicly in late July, and which is now producing first research outcomes: the Australian Search Experience. Inspired by an earlier project by our ADM+S partner organisation AlgorithmWatch in Germany, this project investigates the extent to which the search results Australian users encounter as they query search engines like Google are personalised and therefore differ from user to user; if they are, this would leave open the possibility of users being placed in so-called ‘filter bubbles’ – a concept which I’ve questioned in my recent book Are Filter Bubbles Real? We even have a promo video:


Investigating such personalisation is difficult: since every user is assumed to see a personalised set of search results, we need to compare these results across a large number of users in order to determine whether there is any significant personalisation, and what aspects of these users’ identities might drive such personalisation. While some studies approach this challenge by setting up a large number of ‘fake’ user accounts that are given a particular user persona by making them search repeatedly for specific topics that are expected to contribute to the search engine’s profile for the account, AlgorithmWatch’s earlier, German study took a different approach and invited a large number of real users to contribute as citizen scientists to the study. To do so, they were asked to install a browser plugin that regularly searched for a predefined set of keywords and reported the results back to AlgorithmWatch’s server.
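One simple way to operationalise this comparison – sketched here as my own rough illustration, not the project’s actual analysis code – is to measure how much the ranked result lists that different users received for the same query overlap:

```python
# Rough sketch of detecting personalisation: compare the top-n result
# lists that two users received for the same query at the same time.
def top_n_overlap(results_a, results_b, n=10):
    """Share of results (0.0-1.0) appearing in both users' top-n lists."""
    a, b = set(results_a[:n]), set(results_b[:n])
    return len(a & b) / n

# Hypothetical result URLs two users got for the same query
user1 = ["u1", "u2", "u3", "u4", "u5"]
user2 = ["u1", "u3", "u2", "u6", "u7"]

print(top_n_overlap(user1, user2, n=5))  # -> 0.6: three of five results
                                         #    are shared between the users
```

Consistently low overlap across many user pairs – after controlling for time, location, and other confounds – would point towards significant personalisation; consistently high overlap would suggest largely uniform results.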

Our ADM+S project uses this same data donation approach, but extends it further: we query four major search engines (Google Search, Google News, Google Video, and YouTube), and we are able to vary our search terms over the duration of the project. Like the earlier project, we also ask users to provide some basic demographic information (in order to link any systemic personalisation patterns we may encounter with those demographics), but never access any of our participants’ own search histories. Our browser plugin is available for the desktop versions of Google Chrome, Mozilla Firefox, and Microsoft Edge, and I’m pleased to say that more than 1,000 citizen scientists have now installed the plugin.

If you’re based in Australia, and you’d like to contribute to the project, please go to the project Website to install the browser plugin. We’d love to get to 1,500 citizen scientists before the end of 2021.

For more background, I spoke to QUT’s Real World News earlier this year to explain the approach we’ve taken in developing this project:

Different Perceptions of Algorithmic Recommender Systems

For the final (wow) session of AoIR 2019 I’m in a session on news automation, which starts with Marijn Martens. He begins by describing algorithms (for instance, news recommender algorithms) as a form of culture, as well as a form of technical construct – and by highlighting as well how algorithms are being imagined, perceived, and experienced through the mental models that users construct for them.

Building Chatbots to Support University Students

The next speaker in this AoIR 2019 session is Indra Mckie, who shifts our focus to chatbots – which to date have often been found to be somewhat disappointing in their performance. One type of chatbot is the dialogue system used to complete bookings or make purchases and speed up customer interaction; another is the chat(ter)bot, like the famous Eliza, designed to mimic unstructured human conversation.
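The Eliza approach is surprisingly simple: pattern-matching rules plus first-/second-person reflection. A toy illustration (my own sketch, not part of the talk, and far cruder than Weizenbaum’s original):

```python
# Toy Eliza-style chatterbot: match patterns, reflect pronouns, respond.
import re

REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

# Ordered (pattern, response template) rules; first match wins
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {}?"),
]

def reflect(fragment):
    """Swap first-person words for second-person ones."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(utterance):
    for pattern, template in RULES:
        match = pattern.match(utterance)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please tell me more."  # fallback keeps the conversation going

print(respond("I feel ignored by my computer"))
# -> Why do you feel ignored by your computer?
```

The illusion of understanding rests entirely on such surface transformations – which is precisely why chatbots built this way so often disappoint once conversations move beyond the scripted patterns.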

Towards Social Journalism: Rediscovering the Conversation

The very final session at IAMCR 2019 features a keynote by Jeff Jarvis, who begins by describing himself as ‘not a real academic, but just a journalism professor’. His interest here is in looking past mass media, past media, indeed past text, past stories, and past explanations.
