This is an updated version of the iOS Installer application I blogged about a couple of weeks ago.

The biggest change: it no longer installs just iOS applications, but Android ones as well. There’s a big iOS/Android switch to manually select the platform.

I did consider automatic platform selection based on the type of attached device, but if you do a lot of cross-platform development you tend to have both kinds of devices attached, so manual selection of the target platform is best.

The other big change: you no longer need to drag the application manifest XML file onto the app. Just the application file: .ipa for iOS and .apk for Android.

I unpack the application file in memory, find the bundled application manifest, and read the app ID information from it.
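For the curious, the in-memory unpacking can be sketched roughly like this. This is an illustration rather than my exact code: it assumes the open-source FZip library for reading zip archives in AS3, and it only shows the iOS (.ipa) side. The .apk case is the same unzip step, but AndroidManifest.xml inside an .apk is stored as binary XML and needs an extra decoding pass.

```actionscript
import deng.fzip.FZip;
import deng.fzip.FZipFile;

// appBytes is the ByteArray of the dropped .ipa file
var zip:FZip = new FZip();
zip.loadBytes(appBytes);

// An .ipa is a zip archive with the manifest at Payload/<AppName>.app/Info.plist;
// the app name isn't known in advance, so scan the entries
for (var i:uint = 0; i < zip.getFileCount(); i++) {
    var entry:FZipFile = zip.getFileAt(i);
    if (/^Payload\/[^\/]+\.app\/Info\.plist$/.test(entry.filename)) {
        // entry.content holds the inflated bytes; Info.plist can be
        // plain XML or a binary plist, so it may need further parsing
        parseInfoPlist(entry.content); // hypothetical helper
        break;
    }
}
```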

Now with AIR 2.6 iOS camera support and the unrestricted camera feed access developers get in iOS 4, we can finally build some cool AR apps on mobiles, right? Well… in theory at least!

The Setup :
To see how it would perform, I made a quick AR demo using FLARToolkit 2.5.5 for motion tracking and pattern recognition, and Papervision3D 2.1 for the 3D scene. The scene consists of a simple flat-shaded 3DS model plus 4 interactive buttons, fewer than 170 polygons in total: a very lightweight, low-poly scene. To make the conditions fair for both devices I kept the AR scene dimensions the same (hence it looks smaller on the HTC Desire HD). Both apps use GPU rendering.
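The wiring follows the standard FLARToolkit recipe: feed each camera frame into a raster, run the detector, and if a marker is found with enough confidence, apply the resulting transform to the Papervision3D container. The sketch below mirrors the common FLARToolkit 2.5.x examples; treat the exact method names as approximate, and cameraBitmap, video, trans and markerContainer as placeholders for my actual objects.

```actionscript
// Camera calibration and marker pattern (names per the common FLARToolkit examples)
var param:FLARParam = new FLARParam();
param.loadARParam(cameraParamBytes);           // camera_para.dat as a ByteArray
var code:FLARCode = new FLARCode(16, 16);
code.loadARPatt(patternFileData);              // the marker .pat data

// The raster wraps the BitmapData we draw the camera feed into each frame
var raster:FLARRgbRaster_BitmapData = new FLARRgbRaster_BitmapData(cameraBitmap);
var detector:FLARSingleMarkerDetector =
    new FLARSingleMarkerDetector(param, code, 80);

// Per frame: grab the camera image, detect the marker, drive the PV3D container
function onEnterFrame(e:Event):void {
    cameraBitmap.draw(video);                  // current camera frame
    if (detector.detectMarkerLite(raster, 80) && detector.getConfidence() > 0.5) {
        detector.getTransformMatrix(trans);    // FLARTransMatResult
        markerContainer.visible = true;        // PV3D DisplayObject3D holding the model
        // ...apply trans to markerContainer, then render the Papervision scene
    } else {
        markerContainer.visible = false;
    }
}
```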

I exported the app to iPhone format using the latest available AIR SDK (2.6). For comparison I also converted the Flash project to Android format. I tested the iPhone version on an iPhone 4 and the Android version on an HTC Desire HD.

The demo has 2 buttons that let the user turn off the motion tracking/pattern recognition (FLAR ON/OFF) and the 3D scene rendering. By turning off the 3D rendering and watching the frame rate change, we can see how many resources are consumed by rendering the 3D scene alone. By turning off the motion tracking, we can see how many resources are spent on FLAR.

Here’s a quick video showing the iPhone and Android versions.

Framerate results comparison:

(The SWF frame rate was set to 25 fps.)

And here are the lessons learned :

Lesson 1: Augmented reality can be implemented on mobile devices via Flash conversion, just not in any usable manner.

Although the app still runs and functions, the poor frame rate renders it unusable. The problem is not just visual (stuttering screen playback), it’s also functional. If the scene contains interactive elements such as buttons, the low frame rate makes click/touch detection difficult, and the user may have to try multiple times before a touch event registers.

Lesson 2: iOS conversion performance is superior to the Android one.
Although both devices run on similar hardware (both have a 1 GHz processor) and the apps were converted from the identical SWF file, the Android performance was worse on average. Also, while the frame rate on the iPhone seemed pretty stable, on the Android it was generally erratic and fluctuating.

Why is this happening?

I personally think it’s due to the way the apps are converted and run on the device. While the iPhone app is generated by converting the SWF file into a native app, the Android app is converted into an AIR-compatible Android file and run by the AIR runtime wrapper, which has to be installed separately on the device.
That, in my opinion, creates a performance decrease, as it requires an extra hardware abstraction layer between the code and the device hardware. Think of it as a Flash file running inside the Flash plugin in the browser, versus an OS-native app. By design, the SWF app will never achieve the same performance as the native one.

The question here is why Adobe approaches Android conversion in this way. For iOS, it is quite clear why the iPhone apps are converted to native format: Apple wouldn’t let them use the AIR wrapper approach. For Android they don’t need to, yet they choose to do it. From a marketing point of view it makes perfect sense, of course, in the same way the spread of the Flash plugin helps to spread the entire platform and the tools (mainly the Creative Suite).
From the developer’s perspective it’s not that great, though.

It means that every Android user, in order to run your app, will have to download the AIR for Android runtime. Although this helps keep the file size relatively small (451 KB for the Android version vs 5.3 MB for the iPhone version), the AIR runtime itself is 17 MB!
I would really like to hear a justification from Adobe. They were very conscious about increasing the file size of the Flash plugin (still less than 3 MB) when almost everybody has broadband at home, yet they’re OK with forcing people to download 17 MB over 2G/3G on mobiles? I think they might be shooting themselves (and developers) in the foot here.

The extra AIR runtime download is bad enough, but add the inferior and erratic performance to the mix and there is really no reason to prefer AIR conversion over a native app.

I am aware that performance in AIR 2.7 is better (there’s no Android AIR runtime available at this moment, so I can’t test this personally), and with Flash Player 11 and Molehill we might see an even bigger improvement when it comes to GPU rendering.

However, there will always be applications that rely heavily on the CPU, and for those, native app conversion will always be the superior option.

Recently I worked on a mobile app for the company I work for, and instead of just showing the company’s address or displaying it on a map, I thought: wouldn’t it be more helpful to give the user directions from wherever they are to where the company is?
To do this you, of course, need to utilise a few APIs:

1. The Google Maps API to download map tiles and draw a polyline from the user’s location to the destination,
2. AIR’s Geolocation API to determine the user’s current location,
3. AIR’s Multitouch API to let the user pan, zoom and rotate the map.
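The multitouch part is the most straightforward of the three. A minimal sketch, assuming the map sits inside a hypothetical mapHolder sprite: switch the runtime into gesture mode and listen for the built-in transform gestures.

```actionscript
import flash.ui.Multitouch;
import flash.ui.MultitouchInputMode;
import flash.events.TransformGestureEvent;

// Let the runtime deliver high-level gestures instead of raw touch points
Multitouch.inputMode = MultitouchInputMode.GESTURE;

mapHolder.addEventListener(TransformGestureEvent.GESTURE_PAN, onPan);
mapHolder.addEventListener(TransformGestureEvent.GESTURE_ZOOM, onZoom);
mapHolder.addEventListener(TransformGestureEvent.GESTURE_ROTATE, onRotate);

function onPan(e:TransformGestureEvent):void {
    mapHolder.x += e.offsetX;             // pixels moved since the last pan event
    mapHolder.y += e.offsetY;
}

function onZoom(e:TransformGestureEvent):void {
    mapHolder.scaleX *= e.scaleX;         // incremental scale factors
    mapHolder.scaleY *= e.scaleY;
}

function onRotate(e:TransformGestureEvent):void {
    mapHolder.rotation += e.rotation;     // degrees since the last rotate event
}
```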

Getting Google Maps to show on your display list is pretty straightforward; there are a couple of caveats, though. It is now compulsory to initialise the map with a sensor parameter (just a Boolean switch, expressed in string format, indicating whether the map is used on a device with a geolocation sensor).
Also, to get it working in an AIR app, you have to pass the ‘url’ parameter. I used localhost for this.
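Put together, the map setup looks roughly like this. This is a sketch of the Google Maps API for Flash as I used it; YOUR_API_KEY is a placeholder for your own key, and the url value is the localhost trick mentioned above.

```actionscript
import flash.geom.Point;
import com.google.maps.Map;
import com.google.maps.MapEvent;

var map:Map = new Map();
map.key = "YOUR_API_KEY";            // your Maps API for Flash key
map.url = "http://localhost";        // required in AIR: the URL the key is tied to
map.sensor = "true";                 // compulsory: "true" since we use the GPS sensor
map.setSize(new Point(stage.stageWidth, stage.stageHeight));
map.addEventListener(MapEvent.MAP_READY, onMapReady);
addChild(map);

function onMapReady(event:MapEvent):void {
    // safe to set the centre, add overlays and request directions from here
}
```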

Once the GeolocationEvent fires, we can determine the user’s location from event.latitude and event.longitude.
However, Google Maps’ Directions.load normally works with an address-format query, e.g. “from 25 Apple Street, New York to 22 Boulevard Street, San Francisco”. To get it working with LatLng coordinates you have to format the query differently (thanks to Barry Hunter from the Google Maps forums for this tip).
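In outline, the flow from geolocation fix to route request looks like this. A sketch, assuming the Flash Maps API’s Directions service and a “from: lat,lng to: lat,lng” query shape (which is how I recall the tip); COMPANY_LAT and COMPANY_LNG are hypothetical constants for the destination, and map is the Map instance created earlier.

```actionscript
import flash.sensors.Geolocation;
import flash.events.GeolocationEvent;
import com.google.maps.services.Directions;
import com.google.maps.services.ServicesEvent;

var geo:Geolocation = new Geolocation();
var dir:Directions;
geo.addEventListener(GeolocationEvent.UPDATE, onGeoUpdate);

function onGeoUpdate(event:GeolocationEvent):void {
    geo.removeEventListener(GeolocationEvent.UPDATE, onGeoUpdate); // one fix is enough
    dir = new Directions();
    dir.addEventListener(ServicesEvent.SERVICES_SUCCESS, onDirLoad);
    dir.addEventListener(ServicesEvent.SERVICES_FAILURE, onDirFail);
    // Coordinate query instead of street addresses:
    dir.load("from: " + event.latitude + "," + event.longitude +
             " to: " + COMPANY_LAT + "," + COMPANY_LNG);
}

function onDirLoad(event:ServicesEvent):void {
    map.addOverlay(dir.createPolyline(0)); // draw the route on the map
    // then centre and zoom so the whole journey is visible
}

function onDirFail(event:ServicesEvent):void {
    // couldn't resolve the coordinates: fall back to manual address input
}
```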

It’s important to centre and zoom the map so that the whole journey is visible.
OnDirFail is called when the given coordinates couldn’t be resolved; at that point it’s probably best to give the user a chance to input the address manually.
Tested on Android 2.2 on a Desire HD with the AIR 2.6 runtime.