Bottom-Up Measurements of Network Performance

The next session at "Compromised Data" starts with Fenwick McKelvey, who begins with the emergence of digital methods for the study of the Web during the mid-2000s. This was also the time when the latest generation of social media platforms emerged, enabling us to begin thinking about society through the study of the Internet; doing so required the development of new research methods that repurposed computer science techniques for social science research.

In Toronto, Infoscape Labs developed a number of tools for the exploration of political discourse in Web 2.0, including the Blogometer. This marks the emergence of platform studies, which pays attention to the platform itself; but it also introduces methodological challenges, as the core object of research intervenes in its own study, for example through the politics of APIs. This work also required compromises around data access and utilisation, and a growing bifurcation between scholarly and commercial research activities emerged.

All of these were points of complication for the development and study of digital methods. And matters are further complicated by the growing intelligence of the network itself, e.g. through more sophisticated Internet traffic management with the advent of deep packet inspection and the threatened dismantling of network neutrality. Monitoring of Internet usage has become more and more commonplace; the recent revelations about NSA surveillance activities are only the latest stage in this trend.

How, then, may we extend platform studies to a greater variety of digital media, which now rely on a range of algorithms for low-level and core functions? These technologies are diverse and variable, and the perception of their impact is often very user-specific. The Vuze Bad ISP project is one such initiative: it asked users to install a small piece of software on their machines which monitored ISP-level packet inspection as a form of countersurveillance, and combined the reported results into a more global perspective.
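The general pattern behind such tools might be sketched as follows: many clients report local interference observations, and an aggregator pools them by ISP. This is a minimal illustrative sketch only; the Report structure, its field names, and the RST-abort heuristic are my assumptions for illustration, not the Vuze plugin's actual protocol.

```python
# Illustrative sketch of client-report aggregation for ISP interference
# monitoring. All names and the report format are assumptions, not the
# Vuze Bad ISP plugin's real implementation.
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class Report:
    isp: str              # ISP the reporting client resolves to
    connections: int      # connections attempted in the sample window
    aborted_by_rst: int   # connections killed by unexpected TCP resets


def aggregate(reports: list[Report]) -> dict[str, float]:
    """Estimate per-ISP interference as the pooled RST-abort rate."""
    totals: dict[str, list[int]] = defaultdict(lambda: [0, 0])
    for r in reports:
        totals[r.isp][0] += r.aborted_by_rst
        totals[r.isp][1] += r.connections
    return {isp: aborted / conns
            for isp, (aborted, conns) in totals.items() if conns}


if __name__ == "__main__":
    # Toy data standing in for reports from many distributed clients.
    sample = [
        Report("ISP A", connections=200, aborted_by_rst=48),
        Report("ISP A", connections=150, aborted_by_rst=31),
        Report("ISP B", connections=300, aborted_by_rst=2),
    ]
    for isp, rate in sorted(aggregate(sample).items()):
        print(f"{isp}: {rate:.1%} of connections reset")
```

The point of the design is that no single client's observation is conclusive, but pooling many small local measurements makes systematic interference by a given ISP statistically visible.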

Fenwick therefore points to Measurement Lab, an initiative dedicated to Internet measurement, which has been working with the Canadian Internet Registration Authority to develop standard and reliable metrics for studying the performance of the Internet as such and of specific ISPs in particular. A number of tools for doing this now exist, measuring access speeds, packet inspection lags, point-to-point paths (showing whether traffic is routed through the USA or not), etc.
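The simplest of these metrics, access speed, can be approximated by timing the download of a known test file, as in the sketch below. The URL is a placeholder assumption; real measurement tools use dedicated, well-provisioned test servers and far more careful methodology (multiple streams, warm-up phases, TCP-level diagnostics).

```python
# Minimal sketch of a downstream throughput measurement: time the
# download of a test file and convert bytes/second to Mbit/s.
# TEST_URL is a placeholder, not a real measurement endpoint.
import time
import urllib.request

TEST_URL = "https://example.org/testfile.bin"  # placeholder test server


def measure_download_mbps(url: str = TEST_URL) -> float:
    """Time a single download and return throughput in megabits/second."""
    start = time.monotonic()
    with urllib.request.urlopen(url) as response:
        size_bytes = len(response.read())
    elapsed = time.monotonic() - start
    return (size_bytes * 8) / (elapsed * 1_000_000)


if __name__ == "__main__":
    print(f"Approximate downstream throughput: "
          f"{measure_download_mbps():.2f} Mbit/s")
```

A single run like this measures one path at one moment; the research value comes from many users running such tests repeatedly, so that per-ISP and per-route patterns emerge from the aggregate.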

Is this a form of crowdsourcing? Public Internet research needs to take seriously the creation of a commons through its deployment of digital methods (who owns and pays for the digital measurement infrastructure?), the accessibility of tools and data (where do the data reside, and how may they be accessed?), and public participation in digital methods (how do we make sure that commitments extend beyond merely building a common infrastructure?).