Casting a hybrid IBM Worklight application to Chromecast

This blog post is contributed by Nathan Hazout, a web and mobile developer who does customer-oriented R&D for IBM Worklight.

Google Chromecast is an implementation of Google Cast, “a technology that enables multi-screen experiences and lets a user send and control content like video from a small computing device like a phone, tablet, or laptop to a large display device like a television.”

Google provides native APIs (Android and iOS) that allow your application to send media to a Chromecast dongle. If your application is native, there is nothing Worklight-specific to know here: follow the developer's guide and the user experience guidelines to enable your application for Google Cast use.

If your Worklight application is built using web technologies (HTML, JavaScript and so on), this short guide should help you start to bridge the gap between Google Cast’s native software development kit (SDK) and your JavaScript code.

Requirements

The example I provide in this blog post was written for iOS, although you can certainly apply the same principles to your Android applications. A basic understanding of Objective-C and Xcode is required to follow along, since this involves hands-on native coding.

At the time of writing, I have not found a good way to emulate a Google Cast device, so you will need an actual Chromecast dongle to test your work.

Hybrid project

In IBM Worklight, create a new hybrid application, adding your preferred environment(s). I chose iPhone for this example.

Google Cast is pretty flexible about what you can “cast” (that is, stream): you can stream videos or even run a complex HTML5 application on the television. Our application, however, will be pretty simple: the user taps one of the pictures presented on the device, and the image appears on the Chromecast screen.

To limit the scope of this article, I won’t dive into each specific method needed; rather, you should study the Google Cast documentation. You can begin by downloading this sample application.

For the Google Cast receiver (Chromecast) to be able to access those images, they need to have a public URL; they cannot simply be stored on the device (unless you upload them to a public URL when the user clicks to cast).

We’ll also need a Google Cast icon (available on the Google Cast website) to let the user connect to a device.
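The markup can be as simple as a handful of images plus the cast icon. This is a minimal hypothetical sketch, not the sample's actual HTML (the ids, class names, and file paths are my own):

```html
<!-- Minimal sketch: a cast button and a few images the user can tap.
     The image URLs must be public so the Chromecast can fetch them. -->
<img id="castButton" src="images/cast_off.png" alt="Cast" />
<div id="gallery">
  <img class="castable" src="http://example.com/photos/one.jpg" />
  <img class="castable" src="http://example.com/photos/two.jpg" />
</div>
```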

There is nothing special about the HTML itself. Create your own interface following the Google Cast design guidelines. In this example the images are static; in your own application you could present images or videos downloaded specifically for the current user, from a news feed or another source.

Setting up the Google Cast API library

In Eclipse, right-click the iPhone environment and choose Run As &gt; Xcode project.

Download the Google Cast Sender API library for iOS and link your Xcode project with the downloaded framework file. In my tests, I also had to link against the CoreText, MediaAccessibility, and CoreGraphics frameworks.

Send action to native

When the user taps on the Google Cast icon, we want to display a list of nearby Google Cast receivers. Since the scanning happens in native code, we need to send our intent to our native code.

To do so, we can use the new sendActionToNative API (Worklight 6.2 and above). This method allows us to send arbitrary action codes and data to native listeners. I used this feature (and its inverse, sendActionToJS) in several places in this sample application.
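A minimal sketch of the JavaScript side follows. The action name "castImage" and the URL are illustrative choices of mine, not taken from the sample; the only real API used is WL.App.sendActionToNative:

```javascript
// Runs in the Worklight web view. The image URL must be publicly
// reachable, because the Chromecast receiver fetches it itself.
// "castImage" is a hypothetical action name for this sketch.
function buildCastAction(imageUrl) {
  return { action: "castImage", data: { url: imageUrl } };
}

function castImage(imageUrl) {
  var msg = buildCastAction(imageUrl);
  // WL.App.sendActionToNative is available in Worklight 6.2+;
  // guard so the sketch also loads outside the Worklight runtime.
  if (typeof WL !== "undefined" && WL.App && WL.App.sendActionToNative) {
    WL.App.sendActionToNative(msg.action, msg.data);
  }
  return msg;
}
```

On the native side, a class implementing the WLActionReceiver protocol (registered with the Worklight runtime) receives the action and its data dictionary, and can kick off the Cast workflow.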

chooseDevice is a method I wrote to display a UIActionSheet and let the user choose a nearby device, using the results of the device scan (the devices property of the device scanner). Don't forget to run the actual display on the main queue, since this is a UI operation.
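A sketch of what chooseDevice could look like (my own reconstruction, in Objective-C, assuming a GCKDeviceScanner property named deviceScanner that is already scanning, and access to a host view):

```objc
// Sketch only: present nearby Cast devices in a UIActionSheet.
// UI work must happen on the main queue.
- (void)chooseDevice {
    dispatch_async(dispatch_get_main_queue(), ^{
        UIActionSheet *sheet = [[UIActionSheet alloc] initWithTitle:@"Connect to device"
                                                           delegate:self
                                                  cancelButtonTitle:nil
                                             destructiveButtonTitle:nil
                                                  otherButtonTitles:nil];
        for (GCKDevice *device in self.deviceScanner.devices) {
            [sheet addButtonWithTitle:device.friendlyName];
        }
        sheet.cancelButtonIndex = [sheet addButtonWithTitle:@"Cancel"];
        [sheet showInView:self.view];  // assumes a host view is available
    });
}
```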

Connect to device

Listen for the user's choice by implementing the UIActionSheetDelegate protocol. You can map the index of the action sheet's buttons to the index in the scanner's devices property.

```objc
self.selectedDevice = self.deviceScanner.devices[buttonIndex];
```
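That line belongs inside the UIActionSheetDelegate callback. A sketch, assuming the buttons were added in the same order as the scanner's devices array, with Cancel appended last (connectToSelectedDevice is a hypothetical helper name):

```objc
- (void)actionSheet:(UIActionSheet *)actionSheet clickedButtonAtIndex:(NSInteger)buttonIndex {
    if (buttonIndex == actionSheet.cancelButtonIndex) {
        return;  // user dismissed the sheet without choosing a device
    }
    self.selectedDevice = self.deviceScanner.devices[buttonIndex];
    [self connectToSelectedDevice];  // hypothetical helper that creates the GCKDeviceManager
}
```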

Instantiate a GCKDeviceManager, set its delegate so you are notified when the connection succeeds, and connect to the chosen device.

```objc
NSDictionary *info = [[NSBundle mainBundle] infoDictionary];
self.deviceManager =
    [[GCKDeviceManager alloc] initWithDevice:self.selectedDevice
                           clientPackageName:[info objectForKey:@"CFBundleIdentifier"]];
self.deviceManager.delegate = self;
[self.deviceManager connect];
```

Inside deviceManagerDidConnect:, you need to launch a receiver application. There are three types of receiver applications; for this example, I chose the simple Default Media Receiver because it does not require any registration.
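A sketch of the launch-and-load sequence under the Cast iOS SDK v2 (the mediaControlChannel and imageURL property names are my own assumptions; kGCKMediaDefaultReceiverApplicationID is supplied by the SDK):

```objc
- (void)deviceManagerDidConnect:(GCKDeviceManager *)deviceManager {
    // Launch the registration-free Default Media Receiver.
    [self.deviceManager launchApplication:kGCKMediaDefaultReceiverApplicationID];
}

- (void)deviceManager:(GCKDeviceManager *)deviceManager
    didConnectToCastApplication:(GCKApplicationMetadata *)applicationMetadata
                      sessionID:(NSString *)sessionID
            launchedApplication:(BOOL)launchedApplication {
    // Once the receiver is up, open a media channel and load the image
    // whose public URL was sent over from the JavaScript side.
    self.mediaControlChannel = [[GCKMediaControlChannel alloc] init];
    [self.deviceManager addChannel:self.mediaControlChannel];

    GCKMediaInformation *mediaInfo =
        [[GCKMediaInformation alloc] initWithContentID:self.imageURL
                                            streamType:GCKMediaStreamTypeNone
                                           contentType:@"image/jpeg"
                                              metadata:nil
                                        streamDuration:0
                                            customData:nil];
    [self.mediaControlChannel loadMedia:mediaInfo];
}
```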

In several places in the project, I also used Worklight's BusyIndicator API. In combination with sendActionToNative and sendActionToJS, I was able to show a busy indicator while the device was connecting or while the photo was being loaded by the Chromecast.
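A sketch of how the JavaScript side can pair the busy indicator with callbacks from native code. The action names ("connecting", "mediaLoaded", "error") and the receiver id are assumptions of mine, not taken from the sample:

```javascript
// Builds a handler for actions delivered by the native side via
// sendActionToJS. The busyIndicator argument only needs show()/hide().
function makeCastActionReceiver(busyIndicator) {
  return function (received) {
    if (received.action === "connecting") {
      busyIndicator.show();
    } else if (received.action === "mediaLoaded" || received.action === "error") {
      busyIndicator.hide();
    }
    return received.action;
  };
}

// In the real app the indicator would be Worklight's own:
//   var busy = new WL.BusyIndicator("content", { text: "Connecting..." });
// and the handler would be registered with:
//   WL.App.addActionReceiver("castReceiver", makeCastActionReceiver(busy));
```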

Conclusion

I am sure you can come up with a more creative way to use Google Cast within your application (please share with us!). This blog post and the attached sample project demonstrate that choosing hybrid development does not prevent you from benefiting from the Google Cast API.

Make sure to follow the Google Cast design guidelines and test thoroughly. This sample is just a proof of concept and was not tested for production use.