Democracy meets GIS

The rise of powerful consumer-focused mapping tools is shifting the centre of gravity in the once-rarefied spatial information market. But does that convenience come at a cost?

Governments have relied on geographic information since the time of the ancients, for everything from town planning to sending explorers to the edges of the earth. But while the techniques for gathering geographic data have advanced considerably, its utilisation has remained something of an arcane art.

At least until recently. In the 1990s the Internet accelerated the evolution of maps from static publications to accessible and flexible tools. The release of Australian-developed Google Maps in 2005 cemented geographic information as a familiar element of our Web experience.

Putting powerful yet simple-to-use mapping tools into the hands of consumers showed them the usefulness of maps for displaying all manner of information. But it has also created headaches. For instance, geographic information resides
in a plethora of spatial datasets, whose quality, currency and formatting vary greatly. Making these datasets widely available through Google and Bing also means there is more inaccurate data in the market.

The arrival of consumer tools, despite their lack of sophistication, has also shaken up the geographic information systems (GIS) technology market in other ways. Francisco Urbina, manager for alliances and partnerships at GIS supplier ESRI Australia, says the proliferation of consumer mapping tools has removed some of the mystique from the role of GIS professionals.

“The ‘IS’ component is going into the IT department, because these systems are becoming critical and they need to be robust, scalable, and perform,” Urbina says. “So the ‘G’ people who are producing the maps, and are specialists in their fields, are being relieved of looking after the ‘IS’ side.”

Urbina says ESRI’s response has been to focus on embedding native integration of its tools into business applications such as SAP. Gartner reports that database makers such as Microsoft and Oracle have also improved their spatial data handling, while new tools from Alteryx, Space-Time Insight and IBM’s Netezza are spatially-enabling business data for real-time decision making.

But according to Jeff Vining, a US-based research vice president at analyst firm Gartner, it is Google and Bing that loom large in the plans of public sector organisations, despite their limitations.

He cites the City of Boston as an example. Its New Urban Mechanics program has offered seed money to Google’s developers to create new GIS tools, such as for permit allocation or pothole monitoring.

“Google and Bing mapping are location-based services with context and place, and push mapping data, but they are not traditional cartography,” Vining says. “It’s not everything that can be sent to you, but it is enough to make a decision, and you don’t have to spend a lot of money doing the traditional GIS architecture.”

While much of the appeal of consumer tools is driven by cost, their wider adoption is being held back by the fees that GIS providers charge for data conversion.

“That is a huge disincentive to switch platforms at the moment,” he says. “But I do know that people in the US and UK have an economic necessity to ask if they can get away with using Google or Bing.”

Good maps, bad maps. These new competitors have led traditional GIS suppliers to investigate scalable and cost-effective ways of delivering services, including cloud computing. Vining says ESRI has been hired by the United States Environmental Protection Agency and the Department of Agriculture’s Forest Service to assemble a portal of various datasets hosted on Amazon Web Services and publish these to the general public.

Similar projects are afoot closer to home. In South East Queensland, a group of councils is banding together to create the One Source One Queensland project, which aims to make all underground infrastructure locations accessible through a single cloud data store and displayed using Google Earth.

These consumer tools are also evolving quickly. Google in particular is working hard to claim new territory, with 1 billion copies of Google Earth downloaded. Its Google Map Maker tool, available since April 2011, crowdsources the addition of new landmarks and information to the map – and was only recently launched in Australia.

Dylan Lorimer, product manager for Google’s Maps Engine, says numerous US states have used its tools to create their own maps and globes. In the State of Alabama, for example, the project started by publishing data from one county.

“Anyone could log in and look at all of the accurate current relevant data for this county,” Lorimer says. “Over the course of a year, a complete state-wide view was built where each of the counties retained the ability to update the data at any given time.”

While Google may be the hero of the desktop cartographer, it is also inadvertently the biggest culprit in terms of geographic inaccuracies.

According to Bruce Thompson, chief information officer at the Victorian Department of Sustainability and Environment, Google Maps uses data from non-authoritative sources of address information.

For example, data from Telstra’s advertising division Sensis may be sourced from Telstra’s own infrastructure rollout – but in the case of new subdivisions this data may be logged before construction is completed, and contain lot numbers rather than residential addresses. Hence lot numbers end up masquerading as addresses in Google Maps.

Thompson knows of one street in Kew, in inner-east Melbourne, which has 187 dwellings that are addressed as ‘Lot 1’. “The problem there is they have a singular indifference to data quality, although they are starting to come around,” he says.

Google, for its part, pledges to provide the most up-to-date maps possible, and encourages users to report errors.

This problem has been felt acutely by NBN Co, which has relied heavily on geographic data to plan its nationwide fibre rollout to what will eventually be more than 10 million premises. In May, chief executive Mike Quigley told a Senate estimates hearing that as much as 30 per cent of data for multi-dwelling units was not consistent (see case study below).

Scaling for clean data. As Thompson explains it, while spatial classification systems such as GPS coordinates are strongly defined, address (cadastral) and administrative boundaries can vary remarkably. The digitisation of information from paper maps took place in the 1980s and 1990s, but those maps often varied greatly in scale. Hence some boundaries in North West Queensland are kilometres out of alignment.

“Spatial is one of the fundamental indices of our environment,” Thompson says. “But we still let people make up their own addresses. You wouldn’t allow that to happen in the Internet space with IP addresses.”

Thompson is working in conjunction with numerous parties to create a unified, standardised marketplace for spatial information that will reduce data conflict. Also involved are the research organisation CRCSI, data supplier PSMA Australia, the ANZLIC Spatial Information Council and the Spatial Industry Business Association. Their goal is to create a federated marketplace that spans state and national boundaries across Australia and New Zealand.

“We want the marketplace to be something where there are hundreds of thousands of resources, be they data sets or products or services or processes,” Thompson says.

The marketplace would make previously undiscoverable spatial datasets available to anyone who wanted them, with an appropriate remuneration mechanism for data owners.

A demonstrator project involving 1100 datasets will be evaluated over the next six months, but full implementation is at least a year away. While commercial alternatives also exist or are being proposed, Thompson says these remain proprietary and cannot be federated.

The marketplace is a key initiative for the CRCSI. CEO Peter Woodgate says it faces a significant challenge in meeting the ease-of-use expectations created by Google and Bing.

“You won’t have to have a four-year degree in geospatial science to utilise these systems,” he says. “You can be an ordinary citizen and get access to them.”

Woodgate is also wrestling with another ramification of the democratisation of GIS: the desire of consumers to add their own content.

“Verifying the quality of information is a critical aspect of the work that will need to take place over the coming year or two in order to ensure that we do have some quality control filtering mechanism,” he says.

Virtually yours. While the marketplace is an audacious project, it pales in comparison to the mission of the Virtual Australia and New Zealand Initiative (VANZI), which aims to recreate the entire geography of both countries in virtual form.

VANZI chief executive officer Michael Haines says the idea originated from his background in logistics and his realisation that infrastructure planning could be improved if projects were modelled and trialled virtually. An initial presentation at the spatial@gov conference in 2010 garnered support from the CRCSI, the Municipal Association of Victoria, the Victorian Partnership for Advanced Computing, and NICTA.

VANZI has identified a range of key technologies, including GIS, building information modelling (BIM) tools and precinct modelling tools, needed to represent complete environments virtually in 3D.

“It is those technologies together that create the virtual world, but in order to create it you need data,” Haines says.

But even if VANZI gains the necessary support, the project may take decades to unfold. “It will take us 20 years to build a virtual world that has taken us 200 years to build physically,” Haines says.

All the while, spatial data volumes are growing rapidly. The advent of robotic drones, for instance, is making the gathering of aerial data cheaper, and encouraging its collection on a more frequent basis.

Other initiatives, such as NICTA’s AutoMap, are creating new ways of efficiently and accurately gathering ground-based data. AutoMap worked with the City of Greater Geelong to inventory road signs across 2500km of urban and rural roads. It used the vehicle-mounted Earthmine system from Geomatic Technologies to collect 3D panoramic imagery, then analysed this imagery to identify and locate more than 20,000 signs.

According to NICTA commercialisation manager Paul Stapleton, the long-term goal is to attach this technology to fleets of vehicles that will constantly update roadside infrastructure data for use in mapping.

Further initiatives, such as various environmental sensor deployments being trialled by the CSIRO in the Murray Darling Basin, promise to increase the volumes of live data made available.

Ultimately, CRCSI’s Woodgate says GIS consumers may come to enjoy the same real-time access to data they expect from everyday web searches for news or shopping. This serves to strengthen the case for the proposed spatial marketplace.

“The more current the information,” he explains, “the more valuable it becomes to an end user, so the more likely people are to want to use the marketplace and grow the marketplace.”

– Brad Howarth

This feature originally ran in the June/July 2012 issue of Government Technology Review.
