Mapping on the Internet

Tim Mackey of Geoscience Australia is next, speaking on maps and the impact of the Internet. His organisation holds a significant amount of data (some 500 terabytes, growing by 150 TB per year), and aims to make a significant quantity of this material available online. Historically, of course, such spatial data was available in hardcopy and updated on a yearly basis; now it is electronic, Internet-accessible, and updated almost daily. Archive and version management therefore become crucial issues: every client of Geoscience Australia will, in effect, receive a different map.

The organisation began using the Net in 1994, and now serves some 400,000 maps per month. Clients are able to view customised images, download prepackaged or customised data, and access incremental updates; access is dynamic or query-based, which makes it impossible to archive through traditional crawler-based content discovery mechanisms. Web services need to use standard, ubiquitous protocols, which are now largely XML-based and defined by the Open Geospatial Consortium; there are formats for map images, vector data, and raster data.
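The OGC standards behind these three formats are WMS (map images), WFS (vector features) and WCS (raster coverages). As an illustrative sketch of how such query-based access works (the endpoint URL and layer names below are hypothetical, not Geoscience Australia's actual service), a client might construct a WMS GetMap request like this:

```python
from urllib.parse import urlencode

def build_getmap_url(base_url, layers, bbox, width, height,
                     crs="EPSG:4326", fmt="image/png"):
    """Build an OGC WMS 1.3.0 GetMap request URL.

    bbox is (min_x, min_y, max_x, max_y) in the given CRS.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": ",".join(layers),
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and layer names, for illustration only.
url = build_getmap_url(
    "https://services.example.gov.au/wms",
    ["Roads", "Rivers"],
    (112.0, -44.0, 154.0, -10.0),  # rough bounding box for Australia
    width=800, height=600,
)
```

Because every parameter combination yields a different response, a crawler that only follows hyperlinks will never enumerate these URLs - which is exactly the archiving problem described above.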

Applications used to process the maps often interface directly with live maps available through the Geoservices site rather than using locally stored copies. As a result, the client doesn't need its own local storage, and is always able to work with the latest version of the data (there is no need for versioned releases of data, and errors can be fixed instantly). This constitutes a paradigm shift for institutions working with mapping data.

It also presents significant problems for institutions like the NLA attempting to archive such data: map data here is presented as a best fit for the estimated audience, but does not necessarily take into account how that audience may change over time, and therefore may not be future-proof. Dynamic access can respond to changing needs, of course - but such needs must first be identified. Further, the customised nature of the maps makes archiving especially problematic; query histories and the various versions produced in response to queries need to be archived for reasons of accountability and verifiability.
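A minimal sketch of what archiving a query history might involve (this is an assumption about one workable record structure, not a description of any actual NLA or Geoscience Australia system): each customised request could be logged together with the dataset version it ran against, so the response can later be reproduced or at least accounted for.

```python
import hashlib
import json
from datetime import datetime, timezone

def archive_record(query_params, dataset_version):
    """Build a minimal archival record for a customised map request.

    Storing the exact query plus the dataset version it ran against
    supports later accountability and verifiability.
    """
    canonical = json.dumps(query_params, sort_keys=True)
    return {
        "query": query_params,
        "query_hash": hashlib.sha256(canonical.encode()).hexdigest(),
        "dataset_version": dataset_version,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

# Hypothetical query and version label, for illustration.
record = archive_record(
    {"layers": ["Roads"], "bbox": [112, -44, 154, -10]},
    dataset_version="2006-03-14",
)
```

Hashing a canonicalised form of the query gives a stable identifier for "the same request", independent of parameter ordering.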

Geoscience Australia has now converted its maps from the previous tiles (500 defined geographic regions across the country) to a seamless mapping database which is scalable from the very large to the very small; there are 96 layers in this database (roads, rivers, vegetation, etc.) which themselves group into 19 overall themes (natural features, man-made features, etc.). From these layers, digital data can be extracted on demand and then distributed in a variety of media. Different historical versions of this data are also being archived.
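The layer/theme organisation can be sketched as a simple grouping structure; the theme and layer names below are a made-up illustrative subset of the real 96 layers and 19 themes.

```python
# Illustrative subset of the layer/theme grouping; the actual seamless
# database holds 96 layers organised into 19 themes.
THEMES = {
    "natural features": ["rivers", "vegetation", "coastline"],
    "man-made features": ["roads", "railways", "built-up areas"],
}

def extract_layers(requested_layers, themes=THEMES):
    """Return the themes touched by a set of requested layers,
    mimicking on-demand extraction from the seamless database."""
    hits = {}
    for theme, layers in themes.items():
        selected = [layer for layer in layers if layer in requested_layers]
        if selected:
            hits[theme] = selected
    return hits

result = extract_layers({"roads", "rivers"})
```

The point of the seamless model is that any such selection can be cut at any scale and extent, rather than being bound to one of the 500 fixed tiles.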

Overall, then, for Geoscience Australia Web delivery is the key to its future. Archiving this continuously changing product remains a significant problem, however. The hope is that current pilot projects like the seamless database will prove to be the way forward, and can be expanded to cover a still wider range of detail.