We're on to the next keynote (which we've delayed through our question time in the previous panel). Wendy Hui Kyong Chun from Brown University makes a start here. She talks of the tendency to take work at interface value - to fetishise new technology as cool rather than look beyond the interface itself. What conditions, what makes possible, an experience of use?
'User-friendly' interfaces conflate control with freedom and are emblematic of the post-Cold War conflation of paranoid control with freedom; she draws a parallel with the gated communities of southern California. This stems from an attempt to solve political problems with technology. But thinking like or with a machine makes you paranoid...
She uses her machine's packet sniffer to point out that computers constantly do more than what is apparent to the user - and making this visible even requires the installation of additional software (the sniffer itself). Packets are constantly being moved across the network, and network cards regularly read them all.
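(An aside from me, not part of Wendy's demo: the same point can be made in a few lines of code. The sketch below - my illustration, assuming a Linux machine and root privileges - opens a raw socket and prints the Ethernet frames the network card hands to the kernel. Even on an 'idle' machine, ARP chatter, broadcasts and other background traffic keep arriving without the user ever seeing any of it.)

```python
import socket

# ETH_P_ALL asks the kernel for frames of every protocol, not just those
# addressed to applications the user is consciously running.
ETH_P_ALL = 0x0003
sniffer = socket.socket(socket.AF_PACKET, socket.SOCK_RAW, socket.ntohs(ETH_P_ALL))

for _ in range(5):
    frame, _ = sniffer.recvfrom(65535)
    # The first 14 bytes of an Ethernet frame: destination MAC, source MAC, EtherType.
    dst, src, ethertype = frame[0:6], frame[6:12], frame[12:14]
    print(f"{src.hex(':')} -> {dst.hex(':')}  ethertype 0x{ethertype.hex()}  {len(frame)} bytes")
```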
"How is it that such a compromising form of communication has been bought and sold as empowering?", Wendy asks. She draws a parallel to racial issues - coloured people were used very prominently in US-based advertising for ISPs. Net advertising offered a racial utopia (which incidentally balanced out the 'Net as porn' trope). Also involved here is a kind of subtle threat - of whites being left behind by the 'others' in their use of the Net. An image of someone in Arab clothing using the Net would have been interpreted as positive (people of all kinds are coming together) before 911; now it is more likely to be perceived as a clear threat that 'they're overtaking us' and use the Net for terrorist purposes.
Back to interface design: she cites a number of early theorists for whom freedom of interaction with machines stems from control; further, in such systems the technology underlying the interaction is usually hidden from view. Brenda Laurel argues for strong causality in interaction design; this, for her, is the fundamental trait, and Wendy calls this 'causal pleasure'. Which metaphors are applied to this interaction (folders, etc.) is less important, and too much emphasis on metaphors is in fact counter-productive (it narrows down the possibilities).
In terms of metaphors, Wendy suggests that surveillance of interaction isn't the right one - rather, capture is (it's non-visual and enables a decentralised heterogeneous organisation of the interface). This way, the screen might disappear - we might be able to 'jump the screen'.
We're taking a detour through the history of programming now - in the earliest times, teams (usually of women) physically made the wire connections to programme computers; these were later replaced by automatic programming environments: assemblers which translated from human- to machine-readable code. (This was incidentally resisted, or simply not envisioned, by some of the early computer developers because they were used to having an army of wire girls around them.) The move to assembler languages removed this visibility of the programming effort, and hid some of the inner workings of programming behind layers of (increasingly invisible) technology.
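(Another aside from me rather than from the talk: a toy sketch of what that translation layer does - turning human-readable mnemonics into the raw machine bytes that earlier programmers wired or keyed in by hand. The mnemonic names below are made up for the example; the opcode bytes are real x86-64 encodings.)

```python
import struct

def assemble(lines):
    """Translate a tiny, made-up mnemonic set into raw machine code bytes."""
    code = bytearray()
    for line in lines:
        op, _, arg = line.partition(" ")
        if op == "mov_eax":        # mov eax, imm32  ->  B8 xx xx xx xx
            code += b"\xb8" + struct.pack("<I", int(arg))
        elif op == "ret":          # ret             ->  C3
            code += b"\xc3"
        else:
            raise ValueError(f"unknown mnemonic: {op}")
    return bytes(code)

# The programmer writes the left-hand side; the machine only ever sees the right.
print(assemble(["mov_eax 42", "ret"]).hex(" "))   # -> b8 2a 00 00 00 c3
```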
"Computers understood as comprising software and hardware are ideology machines," Wendy says. Operating systems offer is an imaginary connection to our hardware, and have inscribed into them some sort of ideology itself (cf. the differences between Macs, Linux, Windows and the users they attract). Software produces users, which are defined to some extent by the softwares they use (as in "My Documents" folders and other direct interpellations of the user as an individual user).
This illusion of ideology exists not at the level of knowledge, but at the level of doing (drawing on Zizek here), of interaction. The rhetoric of interactivity might therefore be more obfuscatory than empowering. The immaterial emerges as a commodity - the goal of programming is no longer a functioning machine but a functioning programme.
But note that this understanding of ideology is also limited; nonetheless, software sustains notions of ideology and ideology critique. In the post-ideological age, we even attribute greater power to software than to ideology. And we need a better idea of freedom in this new environment - here, free software is a start (but only a start). Why has the opposition of private and public mutated into one between closed and open?
Of course, ultimately true freedom might be an existential nightmare - a freedom which is by definition without controls, and thus threatening, dangerous; a form of destruction that enables both good and evil. But this is necessary in order not to reduce freedom to a kind of gated community - we must explore the vulnerability that comes with open communication, and attempt to design open systems whose vulnerabilities we can live with.