Maps in iOS 6

With Apple’s iDevice lineup, vertical integration has been a constant theme. In iOS 6 this trend continues, even if the motivations are a bit more contrived than a simple desire to bring more of the iOS experience under Apple’s control. The drive this time is a souring relationship between Apple and Google which has been in the making for some time now. To a broader extent, the Maps.app experience on iOS has been aging for a while with no clear party to blame. iOS 5 lacked turn-by-turn navigation, 3D buildings, high-resolution map tiles, vector roads, and other modernizations that have been part of the Android mapping experience for quite a while. That is, until now.

With iOS 6, Apple moves to a solution completely detached from Google, and for the most part brings functionality back up to parity with the competition. The features I touched on already are the main highlights of the new Maps.app experience: iOS 6 now includes turn-by-turn navigation with voice guidance, fully textured 3D buildings, and vector roads. Most of the things users have come to expect from Maps.app remain as well, including three separate views: standard, satellite, and hybrid. These almost don’t need explanation: standard is the familiar solid-colored region mapping paradigm, now with vector roads plus points and regions of interest; satellite is purely aerial or satellite imagery; and hybrid overlays vector roads and points of interest on the satellite view. Nevertheless, there are some things missing from the new Maps.app which we will touch on later.

Left to Right: Standard, Hybrid, Satellite

The amount of complexity in moving to a different mapping data source is not something to underestimate. Over the years, Google has built out an extraordinary mapping product through the combination of its own Street View fleet (which collects location data, driving conditions data, and a full spherical (4π steradian) panorama of imagery) and an even larger set of data contributed by individual handsets reporting back driving conditions, which is aggregated into traffic data. Google also made some acquisitions of its own back in the day to catalyze the Google Maps experience, and at present it is the 9000 pound gorilla of mapping.

Most users don’t realize this, but this traffic data is often statistically derived and crowdsourced using data from handsets in areas where in-road inductive sensors aren’t installed. The mobile space has already seen one major battle over who gets WiFi location trilateration data (Skyhook, Apple, or Google) and this traffic data often comes back with WiFi location data as well if the interface is turned on. For traffic data, Apple has begun building out its own dataset with the same approach – by crowdsourcing the velocity vector from iOS 6 devices that are in motion. If you plug your handset into a car charger and have the setting ticked under System Services, you’ll notice “Traffic” showing a purple location services icon while rolling around town. It’s obvious to me that Apple is aggressively collecting data to build its traffic database and get close to parity with Google.
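The crowdsourcing approach described above can be sketched in just a few lines. This is a minimal illustration, not Apple's or Google's actual pipeline: it assumes each probe report is an anonymized (segment_id, speed) pair, and the free-flow speeds and congestion thresholds are made-up values for the example.

```python
from collections import defaultdict

def congestion_levels(reports, free_flow_kmh):
    """Classify road segments from crowdsourced speed samples.

    reports: iterable of (segment_id, speed_kmh) probe samples from handsets
    free_flow_kmh: dict mapping segment_id -> uncongested speed for that road
    """
    # Accumulate total speed and sample count per road segment.
    sums = defaultdict(lambda: [0.0, 0])
    for segment, speed in reports:
        sums[segment][0] += speed
        sums[segment][1] += 1

    levels = {}
    for segment, (total, count) in sums.items():
        if segment not in free_flow_kmh:
            continue  # no baseline for this road, skip it
        # Compare the average reported speed to the free-flow speed.
        ratio = (total / count) / free_flow_kmh[segment]
        if ratio < 0.25:
            levels[segment] = "heavy"
        elif ratio < 0.6:
            levels[segment] = "moderate"
        else:
            levels[segment] = "free-flowing"
    return levels
```

In practice the hard parts are everything this sketch omits: map-matching noisy GPS fixes to the correct road segment, filtering out pedestrians and parked cars, and getting enough probe density on any given road, which is exactly why Apple needs a large installed base reporting in.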

For iOS 6 maps, Apple moves to a variety of data sources beyond its in-house collection, and for the most part they’re disclosed on the acknowledgements page under settings in iOS 6. There are obviously a few different components to the whole maps product: the road data itself; aerial, satellite, and building textures; business and point of interest listings; and finally reviews. If you look at that acknowledgements page, you can back out a mapping between data sources and the components they contribute. The names have already been pretty well discussed: TomTom, Acxiom, AND, CoreLogic, DigitalGlobe, DMTI, Intermap, Urban Mapping, Waze, Yelp, Flickr, NASA, OpenStreetMap, US Census, US Geological Survey, and the US National Mapping Agency. There are 24 different sources whose data is aggregated together into iOS 6 maps, at least in my region with my acknowledgements page. Apple has also acquired Placebase, Poly9, and C3 Technologies to build up its own in-house maps team.

The other component is the 3D polygons and textures for terrain and buildings. Terrain elevation data comes from regional government sources such as the USGS, but for cities and higher-resolution terrain data there’s one more key player: C3 Technologies, a company Apple acquired which was an offshoot of SAAB AB. The polygons and textures are generated computationally from many aerial photos taken from small aircraft with a small swath size. There are undoubtedly some buildings whose geometry and textures Apple has hand-massaged; it’s the vast majority of otherwise ordinary buildings that we can use to gauge quality.

This is an interesting contrast to Google’s 3D dataset, which started out supporting 3D buildings with user-contributed SketchUp models that were manually built and submitted to Google Earth. After that, Google also began computationally deriving buildings from oblique aerial imagery, and further hand-massaging trouble buildings.

If that all sounds complicated, it’s because it is. GIS (Geographic Information Systems) is a science in and of itself, and the complexity and investment involved in Apple completely managing its own product is not to be underestimated. That Apple is willing not only to move to, but also to maintain, its own GIS shows to what lengths it will go to distance itself from reliance on Google.

What iOS 6.0 maps does away with are the few things that Apple and its data partners can’t yet replicate: street view and routing on public transportation. For street view, you’re basically out of luck, and users will have to turn to a standalone Google Maps application in the future which hopefully includes this functionality. I’m told that Apple views 3D building support as the replacement for street view's absence. For routing, Apple is going to rely on third-party application support to provide public transportation routing data. Searching for directions and selecting the bus or public transport icon will bring you to a routing apps screen, which is divided into installed routing applications and suggestions from the App Store based on your region. These apps then hand the relevant routing data back to Maps.app.

So enough about the theory and the framework: how does the new iOS 6 Maps application actually work? Evaluating that whole experience requires us to break the features down into a few different areas.

The overall maps interface changes dramatically from iOS 5. The new iOS 6 maps interface includes a very different base-layer appearance, along with the aforementioned vector roads and street labels. In iOS 5, maps were served as texture tiles and scaled appropriately for different zoom levels. While the Google Maps for Android interface includes a combination of vector maps at some zoom levels and texture tiles at others, iOS 6 moves to an entirely vector approach for roads and graphics in the standard map layout at every zoom level. In hybrid mode, vector roads are drawn atop texture tiles, and in satellite mode you get only the aerial tiles. Apple has also changed the unloaded-region texture to a black grid with white lines, which likewise scales.

Overall the appearance of Maps in iOS 6 is clean and unique enough to not be a direct ripoff of Google Maps. The jump to vector roads and labels comes with almost no performance hit on the iPhone 4S, where it remains smooth as butter when zooming and panning around. I played with Maps in the standard and hybrid modes on a 3GS and 4 as well, and didn’t see any unacceptable slowdowns. Performance has drastically improved since the first betas.

The quality of aerial textures at this point is perhaps my only major complaint with this new version, as the level of detail can be quite low at some zoom levels and varies wildly by region. Internationally I saw texture resolution jump between completely unacceptable and quite good. Apple has clearly prioritized home soil, as things are better in the USA. Obviously the regions that have 3D building data offer very high-resolution textures as a result of custom aerial capture.

This is as far as you can zoom into Seoul, South Korea in Hybrid or Satellite mode, due to low resolution aerial texture assets at present.

There’s no option to include or exclude point of interest labels; instead, they appear automatically at appropriate zoom levels in the standard view. About the only options besides changing map modes under the in-app settings fold are a traffic overlay for roads and an option to print.

New iOS 6 map assets working in Tweetbot for iPhone

From a developer perspective, I'm told and have read that the switch to Apple-supplied maps is transparent. All the same APIs are implemented with the new maps as in the past, including pins, overlays, the three views already mentioned (standard, satellite, and hybrid), traffic, and points of interest. The apps I’ve played with that use the maps API work flawlessly so far on iOS 6.
