Augmented Reality with Swift and ARToolKit for iPhone

aug·men·ted re·al·i·ty
noun
a technology that superimposes a computer-generated image on a user’s view of the real world, thus providing a composite view.

Background

Augmented Reality (AR) has been a hot topic for the past couple of years, and it is only going to get bigger. There are already several mobile applications for iOS and Android devices, such as Pokémon GO (free), Ink Hunter (free), WallaMe (free), Google Translate (free), and many more. Each of these applications uses the phone's camera to view the real world and superimposes computer-generated data (an image, text, a blurb with detailed information, and so on) to provide a composite view.

AR can bring a different dimension to your mobile application. You can either write your own code to implement the AR view in your app, or integrate one of the few open source and commercial libraries available.

What are we building?

Today we are going to create a simple Augmented Reality application for iOS devices using an open source library called ARToolKit. The Swift-based mobile application displays all the flights in your vicinity, captured by a Raspberry Pi-powered ADS-B ground station that decodes the ADS-B signals into JSON and publishes them as MQTT messages to an IoT platform. This particular implementation uses IBM's IoT platform.
The mobile application also provides an AR view which, when pointed towards a flight, displays a callout view with flight information such as altitude, distance, and speed. It also displays the weather at the flight's location, integrated using IBM's Weather Data API.
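The exact JSON payload will depend on how your ground station decodes the ADS-B signals, but a minimal sketch of decoding one flight message in Swift could look like this (the struct and its field names are hypothetical; adjust them to match what your station actually publishes):

```swift
import Foundation

// Hypothetical shape of one decoded ADS-B message arriving over MQTT.
// Rename the fields to match your ground station's output.
struct Flight: Codable {
    let callsign: String
    let latitude: Double
    let longitude: Double
    let altitude: Double   // feet
    let speed: Double      // knots
}

let json = """
{"callsign": "BAW123", "latitude": 51.47, "longitude": -0.45,
 "altitude": 12000, "speed": 320}
""".data(using: .utf8)!

let flight = try JSONDecoder().decode(Flight.self, from: json)
print(flight.callsign) // prints "BAW123"
```

Each decoded message like this becomes one point of interest in the AR view.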

This blog won't cover setting up the ground station.

What is ARToolKit for iPhone?

ARToolKit for iPhone is an Objective-C based framework that runs on all modern iOS devices, including iPhone, iPad, and iPod touch. It follows the Model-View-Controller pattern, and provides features for using and calibrating your camera, configuring video capture, moving textures on iOS, and more. It abstracts away all the mathematical calculations needed to project computer-generated data onto the real-world view and map the two together, for example, to correctly position the callout view with flight information right next to the flight in the camera view.

Integration and Configuring ARToolKit for iPhone

You can download the SDK from the ARToolKit GitHub page. Copy the ARKit folder into your Swift project. Since the library is implemented in Objective-C, you need a bridging header that includes all the header files from the ARKit folder, which makes the Objective-C code available to your Swift code.

```objc
#import "ARKit.h"
#import "ARKitConfig.h"
#import "ARKitEngine.h"
#import "ARViewDelegate.h"
#import "LocalizationDelegate.h"
#import "LocalizationHelper.h"
#import "ARGeoCoordinate.h"
#import "ARObjectView.h"
#import "RadarView.h"
```

To instantiate the AR view, initialise the ARKitEngine. It ships with a default configuration, which you can override to suit your application.

The main configuration for the Swift-based application is to make use of altitude, add the coordinates, and start listening. The coordinates are geo-coordinates created from the longitude and latitude of each flight.
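Putting those steps together, creating a geo-coordinate per flight and starting the engine, might look roughly like the following. The method and property names here (the config flag, the coordinate factory, `addCoordinate`, `startListening`) are assumptions; verify them against `ARKitEngine.h` and `ARGeoCoordinate.h` in your copy of the SDK.

```swift
import CoreLocation

// Sketch only: API names below are assumptions — check the SDK headers.
let engine = ARKitEngine()
engine.config.useAltitude = true  // hypothetical config flag

for flight in flights {
    let location = CLLocation(
        coordinate: CLLocationCoordinate2D(latitude: flight.latitude,
                                           longitude: flight.longitude),
        altitude: flight.altitude,
        horizontalAccuracy: kCLLocationAccuracyBest,
        verticalAccuracy: kCLLocationAccuracyBest,
        timestamp: Date())

    let coordinate = ARGeoCoordinate.coordinate(with: location) // hypothetical factory
    coordinate.title = flight.callsign
    engine.addCoordinate(coordinate)
}

engine.startListening() // hypothetical; begins camera + sensor updates
```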

Each geo-coordinate carries information such as location, altitude, and a calibration accuracy derived from the user's location and altitude. When a geo-coordinate is initialized, it automatically calculates where its view should be superimposed in the camera view.
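For intuition, the core of that calculation is the compass bearing (azimuth) and distance from the user to the flight. ARToolKit computes these for you, but a standalone version of the standard great-circle formulas looks like this:

```swift
import Foundation

// Great-circle bearing and distance from the user's position to a flight.
// Bearing in degrees clockwise from north; distance in metres.
func bearingAndDistance(fromLat lat1: Double, fromLon lon1: Double,
                        toLat lat2: Double, toLon lon2: Double)
        -> (bearing: Double, distance: Double) {
    let p1 = lat1 * .pi / 180, p2 = lat2 * .pi / 180
    let dLon = (lon2 - lon1) * .pi / 180

    // Initial bearing (forward azimuth)
    let y = sin(dLon) * cos(p2)
    let x = cos(p1) * sin(p2) - sin(p1) * cos(p2) * cos(dLon)
    let bearing = (atan2(y, x) * 180 / .pi + 360)
        .truncatingRemainder(dividingBy: 360)

    // Haversine distance, mean Earth radius ≈ 6371 km
    let dLat = (lat2 - lat1) * .pi / 180
    let a = sin(dLat / 2) * sin(dLat / 2)
          + cos(p1) * cos(p2) * sin(dLon / 2) * sin(dLon / 2)
    let distance = 6_371_000 * 2 * atan2(sqrt(a), sqrt(1 - a))

    return (bearing, distance)
}
```

The bearing, compared against the device's heading, tells the library where on screen the callout belongs; the distance can be shown in the callout itself.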

Custom View in Augmented Reality mode

To display the callout view for each geo-coordinate, the view controller that implements the delegate overrides the view method. In the view method, you create the callout view, using the geo-coordinate as an annotation, with all the information you would like to display for the flight. The annotation is then added to the ARObjectView as a subview.
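A sketch of such a delegate method is below; the delegate method signature and the `FlightsViewController` name are assumptions, so match them to `ARViewDelegate.h` and your own view controller:

```swift
import UIKit

// Sketch: builds a simple callout for one flight. The delegate method
// name/signature below is an assumption — match it to ARViewDelegate.h.
extension FlightsViewController: ARViewDelegate {
    func view(for coordinate: ARGeoCoordinate) -> UIView {
        let callout = UIView(frame: CGRect(x: 0, y: 0, width: 200, height: 60))
        callout.backgroundColor = UIColor(white: 0, alpha: 0.6)
        callout.layer.cornerRadius = 8

        let label = UILabel(frame: callout.bounds.insetBy(dx: 8, dy: 4))
        label.numberOfLines = 0
        label.textColor = .white
        label.font = .systemFont(ofSize: 12)
        // Pull altitude/distance/speed from wherever you stored the
        // flight data for this coordinate (hypothetical title property):
        label.text = coordinate.title
        callout.addSubview(label)
        return callout
    }
}
```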

IBM Weather Data API

The callout view also includes weather data at the flight's current location. Displaying weather data is as simple as calling a RESTful API once you have created a Weather Data API service with the proper credentials. To do that, go to IBM Bluemix, select the Weather Data API service, and create a free-tier instance to get the credentials.
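A minimal sketch of the call with URLSession follows. The host and path mimic the Weather Company Data for IBM Bluemix observations endpoint, but verify both (and the credentials) against your own service instance:

```swift
import Foundation

// Current observations at the flight's position. Host, path, and
// credentials are placeholders — copy the real values from your
// Bluemix service's credentials page.
func fetchWeather(lat: Double, lon: Double,
                  completion: @escaping (Data?) -> Void) {
    let user = "YOUR_SERVICE_USERNAME"
    let pass = "YOUR_SERVICE_PASSWORD"
    let urlString = "https://twcservice.mybluemix.net" +
                    "/api/weather/v1/geocode/\(lat)/\(lon)/observations.json"
    guard let url = URL(string: urlString) else { return completion(nil) }

    var request = URLRequest(url: url)
    let token = Data("\(user):\(pass)".utf8).base64EncodedString()
    request.setValue("Basic \(token)", forHTTPHeaderField: "Authorization")

    URLSession.shared.dataTask(with: request) { data, _, error in
        if let error = error { print("Weather request failed: \(error)") }
        completion(data) // raw JSON; decode only the fields you display
    }.resume()
}
```

The response JSON can then be decoded with the same Codable approach used for the flight messages and rendered into the callout.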


Sanjeev Ghimire is passionate about problem-solving through technology. With more than a decade of software engineering experience ranging from fin-tech to healthcare, Sanjeev was also the CTO and co-founder of onTarget. These days, Sanjeev is a Developer Advocate with IBM focused on emerging technologies such as blockchain, IoT, and AR/VR. When not cranking away in Java, Swift, Python, or Scala, you can find Sanjeev at the gym, playing the drums, or catching an Arsenal F.C. (a.k.a. Gunners) soccer match.