Internet Technologies

Call for Applications: CCI Digital Methods Summer School, 15-19 Feb. 2016 (#cciss16)

We are now inviting applications for the 2016 CCI Digital Methods Summer School. The deadline for applications is Monday 21 Sep. 2015.

Hosted by the QUT Digital Media Research Centre (DMRC), the 2016 event will focus on digital methods for sociocultural research. It is designed for university researchers at all stages of their careers, from doctoral students, postdoctoral and mid-career academics to established scholars.

The week-long intensive program will focus on new quantitative, qualitative and data-driven digital methods and their research applications in the humanities and social sciences, with a particular focus on media, communication and cultural studies and their applications in the creative industries.

Participants will work with leading researchers, engage in hands-on workshop activities, and have the opportunity to present their own work and receive feedback on it.

The Summer School will offer a range of introductory hands-on workshops in topics such as:

  • Digital ethnography
  • Issue mapping
  • Social media data analytics
  • Software and mobile app studies
  • Analysing visual social media
  • Geo-spatial mapping
  • Data visualisation
  • Agent-based modelling
  • Web scraping

The program will be conceptually grounded in the problems of public communication and privacy, digital media production and consumption, and the ethical issues associated with big data and digital methods in the context of digital media environments. There will be talks on these topics in addition to the workshops.

Understanding New Media Rupture-Talk

The next speaker at AoIR 2015 is Michael Stevenson, whose focus is on what he calls new media "rupture-talk". The idea here is to take what we often refer to as "mere talk" more seriously. Michael points to John Perry Barlow's "A Declaration of the Independence of Cyberspace" as an example of this – new media framed as a radical break from the past.

The concept of cyberspace has been on the decline since its heyday in the mid-1990s, even in major booster publications like Wired. But other rupture-talk concepts, such as MOOCs or the "social graph", have emerged and are also used to signify a radical break from the immediate past. Such terms are often understandably criticised as hype (Evgeny Morozov's The Net Delusion is an obvious example).

The Challenges of Understanding Content Dissemination on Facebook

The final speakers in this Digital Methods plenary are Axel Maireder and Katrin Jungnickel, whose interest is in the uncertainties of the Facebook timeline. Facebook has been tinkering with how timeline content is selected and presented for several years now, and this affects the flow of communication on the platform; what, then, are the factors that determine this flow?

This study combined content analysis and user surveys, but both approaches have their drawbacks - it is impossible to track the content of users' timelines from the outside, for example, while surveys of users suffer from self-reporting biases. In the end, the researchers asked users to copy the links they received through their timelines into an online survey, and to discuss the content of the URLs and the Facebook friends from whom they received them. Privacy concerns as well as the tedious nature of this approach also affect the results, however. Some 550 users participated in the study.

Generating Representative Samples from Search Engine Results?

The next plenary speaker at Digital Methods is Martin Emmer, whose focus is on sampling methods in digital contexts. Online media are now important public fora, and conventional media are increasingly using digital channels to transmit their content as well; this also leads to a shift in media usage, of course, and some of that shift is also driven by generational change.

If we need to examine the digital space to understand current debates in the public sphere, then, how do we generate representative samples of online content and activities? With traditional mass media, it was possible to draw on comprehensive lists of media providers, with a small handful of alternative media; in the digital environment, channels and platforms have multiplied massively, and it is no longer trivial to select a small number of sites and spaces which represent all online activity.
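
To illustrate what has been lost, here is a minimal sketch of probability sampling from a known sampling frame - the situation that traditional mass media afforded, and which the proliferating online environment no longer provides. This is my illustration rather than part of Emmer's talk, and the outlet names are placeholders:

    import random

    # Hypothetical sampling frame: with traditional mass media it was feasible
    # to enumerate (nearly) all outlets; the names below are placeholders.
    media_outlets = [
        "Broadsheet A", "Broadsheet B", "Tabloid C", "Public Broadcaster D",
        "Commercial TV E", "Talk Radio F", "Alternative Weekly G",
    ]

    random.seed(2016)                           # fixed seed for a reproducible illustration
    sample = random.sample(media_outlets, k=3)  # simple random sample without replacement
    print(sample)

    # Online there is no comparable list of "all channels and platforms", so the
    # sampling frame itself - not the sampling algorithm - becomes the problem.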

The Impact of Social Sharing on Google Search Results

The next session at Digital Methods is a plenary panel which begins with Christina Schumann, whose focus is on Google and other search engines as technological actors on the Internet. Search engines are especially important as they now serve as a kind of gatekeeper on the Net - but the criteria they use for ranking and structuring information are often far from transparent.

The basic approach of search engines is to crawl or otherwise gather Internet data, which are then indexed and processed into a database; this database is queried whenever a user enters a search term. Factors in returning search results include on-page information (the content, programming, and design of Web pages) as well as off-page metadata (especially the link networks surrounding each page, relative to the theme of the query).
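
As a rough illustration of that crawl-index-query pipeline, the following toy sketch builds a small inverted index from page text and ranks results purely by on-page term counts; it is my simplification rather than anything Schumann presented, and real engines additionally weigh off-page signals such as link networks:

    from collections import defaultdict

    # Toy "crawled" corpus: URL -> page text (illustrative data only).
    pages = {
        "example.org/home": "digital methods for internet research",
        "example.org/blog": "search engines rank pages by links and content",
        "example.org/docs": "content analysis of search results and rankings",
    }

    # Indexing: map each term to the pages it occurs on, with term counts.
    index = defaultdict(dict)
    for url, text in pages.items():
        for term in text.lower().split():
            index[term][url] = index[term].get(url, 0) + 1

    def search(query):
        """Rank pages by summed term counts for the query terms (on-page signals only)."""
        scores = defaultdict(int)
        for term in query.lower().split():
            for url, count in index.get(term, {}).items():
                scores[url] += count
        return sorted(scores.items(), key=lambda item: item[1], reverse=True)

    print(search("search content"))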

The Problematic Rise of Read Receipts in Social Media

The final presenter at "Compromised Data" is Kamilla Pietrzyk, whose interest is in the user experience of social media platforms that provide read receipts - as in Facebook chat, iMessage, or Snapchat. Very little research has been done on this so far, but there is growing unease about this functionality, which notifies the sender of a message that the message has been opened and (presumably) read.

Email offers this functionality as well, but here the read receipt is a per-message opt-in facility; recipients can choose not to send a read receipt when they open the email. Separately, email also supports delivery notifications, which confirm that a message was delivered to the recipient's mailserver, although this does not guarantee that the recipient themselves has read it.
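
For context, the email read-receipt request is simply a message header - Disposition-Notification-To, defined in RFC 8098 - which the recipient's mail client may honour or ignore. The minimal sketch below, using Python's standard email library and placeholder addresses, is my illustration rather than part of the talk:

    from email.message import EmailMessage

    msg = EmailMessage()
    msg["From"] = "sender@example.org"        # placeholder addresses
    msg["To"] = "recipient@example.org"
    msg["Subject"] = "Read receipt example"
    # Request a Message Disposition Notification (RFC 8098); whether a receipt
    # is actually sent back is entirely up to the recipient's mail client.
    msg["Disposition-Notification-To"] = "sender@example.org"
    msg.set_content("The receipt request is only a header, not a guarantee.")

    print(msg)    # actually sending would require an SMTP connection, omitted here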

Bottom-Up Measurements of Network Performance

The next session at "Compromised Data" starts with Fenwick McKelvey, who begins with a reference to the emergence of digitised methods for the study of the Web during the mid-2000s. This was also the time when the latest generation of social media emerged; this enabled us to begin thinking about society through the study of the Internet, and required the development of new research methods that repurpose computer science techniques for social science research.

In Toronto, Infoscape Labs developed a number of tools for the exploration of political discourse in Web 2.0, including the Blogometer. This marked the emergence of platform studies, which pays attention to the platform itself - but this also introduces challenges about how to study the platform, as the core object of research itself intervenes in its own study, e.g. through the politics of APIs. This work also required compromises around data access and utilisation, and a growing bifurcation between scholarly and commercial research activities emerged.

Studying the Processes of Media Production

The final speaker in this AoIR 2013 plenary is Gina Neff, who notes that the study of online practices and texts can only provide a limited perspective on resistance to capitalism. The political and economic affordances of the Internet are less open to resisting capitalist models than we might have thought; the Net tends to subsume resistant practices into online capitalism in the end.

This leads Gina to suggest that the era of the amateur is over. Capitalist dynamics privilege the platform developers, policy makers, proprietors and others over users; the Net is a tool for, and a symbol of, the reproduction of this set of power relations. Through it, proto-, pseudo-, and not-quite-yet-professional media makers are subsumed into the system.

The Emancipatory Potential of Tech Activism

The final speaker in this first AoIR 2013 plenary is Christina Dunbar-Hester, whose focus is on activist technical projects - such as micropower radio stations or community wifi networks. The activists describe such activities using the Amish term "barnraising", highlighting the community empowerment and self-sufficiency aspects of these initiatives. The hope is to demystify technology and generate political engagement through further hands-on knowledge sharing.

Significant here is how technical expertise is seen as empowering (through sharing) rather than disempowering (through the emergence of knowledge elites). But there remains a strong white middle-class basis to this work - such sharing continues to speak largely to a white male addressee, and the involvement of women and minorities in these initiatives remains rare.

Online Racism Isn't Just a Glitch

Next up in this plenary at AoIR 2013 is Lisa Nakamura, whose interest is in racism online - an issue which is often downplayed as a minor problem or an irrelevant distraction. But what drives online racism - is it a product of the greater levels of anonymity online (and thus an inevitable, natural, normal effect of the Net)? Does this mean that humans are fundamentally, inherently driven to racism, which the Net enables us to live out? Does the Net enable us to indulge in glitchy behaviour, in other words?

Yet the machine of the Internet is not a separate, animate entity with its own agency; it is co-created with or by us. The idea that the Net has its own, separate nature is merely a convenient excuse - as in Ian Bogost's statement that it's not gamer culture that's racist, but the Internet itself. If online racism is seen as a glitch in the system, this places it alongside other (e.g. hacker) exploits of glitches - it legitimises and excuses racism as merely off-topic, a failure of the system's protective mechanisms.
