Using a pair of ping pong balls that Joel, our System Administrator, found in a cupboard at home, a couple of LED torches from down Tottenham Court Road, and a touch of superglue, we cobbled together our interpretation of the Atlas Glove controllers (pictured right).

For our test we used a projector displaying Google Earth and the control software in a blacked-out lecture theatre. This allowed a clear view of the lights, which are turned on and off in combination with various hand gestures to remotely control Google Earth. The movie below demonstrates the trial; we were going to leave the movie audio free but couldn't resist dubbing in The Sorcerer's Apprentice conducted by Stokowski. The controller is Joel, who quickly became a master of the technique.

Linking Google Earth to a remote vision-based control interface is impressive, and the fact it worked the first time is testament to the team's clear instructions and software.

Of note for users experiencing a 'Grey Screen' when loading the software: through trial and error we found that you also need to install WinVDIG version 1.1.1 (not the current 1.5 release). This enables the control software to communicate with the webcam.

QRCodes, Augmented Reality, ARK, Unity and the iPad

Stuart Eve here in CASA has been working away in the Unity gaming engine, building Augmented Reality applications for the iPhone and iPad. As Stuart himself notes, it is surprisingly successful, with at least three different ways of getting 3D content to overlay on the iOS video feed (Qualcomm, StringAR and UART). He has been attempting to load 3D content at runtime, so that dynamic situations can be created as a result of user interaction, rather than having to have all of the resources (3D models, etc.) pre-loaded into the app. This not only reduces the app's file size, it also means that the app can pull real-time information and data that can be changed by many people at once.
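The general pattern of fetching models at runtime rather than bundling them can be sketched as follows. This is a minimal illustration in Python, not Stuart's actual Unity code; the URL and cache directory names are assumptions for the example.

```python
# Hedged sketch: fetch a 3D model at runtime instead of shipping it
# inside the app bundle. Models are cached locally so the network is
# only hit on first use; replacing the file server-side updates every
# client on its next fetch.
import os
import urllib.request

def load_model(url, cache_dir="model_cache"):
    """Return the local path of a model, downloading it on first use."""
    os.makedirs(cache_dir, exist_ok=True)
    local_path = os.path.join(cache_dir, os.path.basename(url))
    if not os.path.exists(local_path):  # download only if not cached
        urllib.request.urlretrieve(url, local_path)
    return local_path
```

Because the content lives on a server rather than in the app, many users can see changes made by others without an app update, which is the dynamic behaviour described above.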

The preliminary results, linked to the Archaeological Recording Kit (ARK), can be seen in the video below.

This example uses the Qualcomm AR API and ARK v1.0. Obviously at the moment it is marker-based AR (or at least image-recognition based); the next task is to incorporate the iDevices' gyroscope to enable the AR experience to continue even when the QR code is not visible.
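The gyroscope fallback described above amounts to re-anchoring the camera pose whenever the marker is visible and dead-reckoning from gyroscope deltas when it is not. A minimal one-axis sketch, assuming a simple yaw-only tracker (the class and method names are illustrative, not from the actual app):

```python
# Hedged sketch of marker-plus-gyroscope pose tracking: the marker
# gives an absolute orientation fix; between fixes, gyroscope rotation
# deltas keep the AR overlay roughly in place.
class PoseTracker:
    def __init__(self):
        self.yaw = 0.0  # camera yaw in degrees
        self.marker_visible = False

    def update(self, marker_yaw=None, gyro_delta=0.0):
        """Update yaw from a marker fix if available, else from the gyro."""
        if marker_yaw is not None:
            self.yaw = marker_yaw       # absolute re-anchor from the marker
            self.marker_visible = True
        else:
            self.yaw += gyro_delta      # integrate gyroscope rotation
            self.marker_visible = False
        return self.yaw
```

Gyro integration drifts over time, which is why re-anchoring whenever the marker reappears matters; a real implementation would track full 3D orientation rather than a single yaw angle.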

It is well worth keeping a watch on Stuart's blog for further updates.