Tonight is the night of the AoIR 2016 public plenary, and while it's a panel discussion which I won't blog in full, we begin with a few short statements from the panellists. First up is Kate Crawford, who notes the contribution of so many AoIRists to our understanding of the Internet as more than a utopian cyberspace, and instead as a complex stack of network protocol, platform, infrastructural, connectivity, Internet of Things, and other Internet governance layers.
But we have a new problem: more and more artificial intelligence backend systems are being deployed now to ingest and process the data that we constantly generate. These are also moving beyond what we have traditionally called 'the Internet', operating in a wide range of institutions and influencing our perceptions of the world around us. The processes are not particularly intelligent, after all: the famous image of a naked girl fleeing napalm bombings in the Vietnam war was recently censored by Facebook, for instance, and while this image was reinstated, it is only the tip of the iceberg of the many arbitrary and opaque decisions that are being made by such algorithms. But the decisions being made by humans are not necessarily all that much better – there is, overall, an irrational confidence in the calculability of reality.
Some of this is related to the question of transparency: increasingly, algorithms are being used because they seem to work well for particular tasks, but it is not understood exactly why they work as they do – not only by ordinary users, but not even by their very creators. There is a need to reverse-engineer and thereby audit these processes, but there are also more and more laws that prevent such reverse-engineering, and thereby restrict accountability. It is tempting to create a kind of 'police AI' that watches the other AIs, but this is a deeply fraught idea – there are important limits to reducing human decision-making to the limits of computer logic. This is a slow-acting poison, and we should not be too seduced by AI governance – we need critical conceptual frameworks instead.
The second panellist tonight is Carolin Gerlitz, and she begins by suggesting that it is platforms that rule the Internet. They set up strict rules for what users can be and do, which are strictly defined in form but can be quite open to interpretation. Before Facebook introduced its new 'reactions', for instance, the like was already used in a variety of literal and ironic ways; the reactions options simply gave these uses their own icons that can now again be repurposed and reinterpreted.
Platforms also create rules about how their data may be accessed, enabling the programmability of platforms. Such rules incentivised developers to engage in innovation, but also served to limit the interpretability of the platforms; changes in the rules have led to considerable changes in what research could be done on and with those platforms, for instance, or have made some forms of research fundamentally impossible.
Further, the engagement with platforms is also mediated by other clients and third-party platforms that set their own rules – this results in a cascading of rules across these different tools. And this also extends into the open Web itself, within which these platforms are an increasingly central infrastructure. This also means that the platforms are creating rules for the wider Web, whose providers are increasingly focussed on providing content that is ready to be used across these platforms.
And the algorithmic processing of content; the legal frameworks that determine how content is governed (or not); the placement of commercial content in fast-paced feeds – these are also each governed by their own rules. All of this sets up a tension between openness and closure, and these rules are often opaque and interrelated. Rules in relation to platforms cannot be discussed without talking about modes of valuation: what counts as value, and who makes these judgments? The platforms slice all of this into fine-grained, datafied items, and in doing so set the rules for the mass production of everyday life. Platforms earn money from affect, creativity, culture, and symbolic production: the data points that they create can be judged against how they measure up against these criteria.
The final panellist tonight is Fieke Jansen from Tactical Tech, who points to the Big Five of the Internet: Apple, Google, Facebook, Microsoft, and Amazon. These are hidden to some extent by an understanding of the Internet as 'the cloud', where ownership structures are sometimes obscure and appear not to matter much to end users (until they do). Fieke has worked with artists and activists to problematise some of these issues: one project set up an algorithm to randomly buy things (including drugs) from the darknet and have them shipped to a gallery; another used a LinkedIn network scraper to identify people working for the secret service.
Such projects trigger significant discussions in their own right, but they also point to the fact that these creative interventions only make visible what is the bread and butter of the Big Five of the Internet. Google, for instance, began as the alternative search engine, but by 2014 was worth US$400b, essentially because of its ability to process the world's data. Its services, and the services of its competitors, are what we use every day, and it is becoming increasingly difficult to escape their lock-in. Facebook is powerful, on the other hand, because of its 1.4 billion monthly active users, which give it considerable political influence. It has a strong desire to conquer the global south by setting up Facebook Zero in developing nations as 'the Internet'.
Financial backing for these leading companies is also highly centralised in a very small number of investors: Sequoia and Peter Thiel are amongst this exclusive group. How have we come to this point? How did we allow this to develop? Why do we question the ethics of small activist groups, but not of these leading companies and organisations? Why do we continue to transfer power to them? And most importantly, what can we do to fix it?