New iOS 7 APIs for the C# Developer

Jason Awbrey

Apple introduced the newest version of its flagship mobile operating system, iOS 7, in September 2013. The biggest change for users was a completely revamped UI, with brighter colors, flat design, and a decisive move away from the skeuomorphic interfaces that had existed since the introduction of the original iPhone in 2007.

For developers, iOS 7 includes hundreds of new and modified APIs, many tailored to take advantage of the hardware capabilities of the new iPhone 5s.

And if you are a C# developer, you don’t have to feel left out.

Xamarin provides a set of developer tools that let C# developers call native iOS APIs directly and build fully native apps, while still leveraging their deep knowledge of C# and the .NET framework.

I’ll examine five of these new APIs and how to use them in Xamarin:

Beacons

Text to Speech

Maps

Barcodes

Background Transfers

Beacons

Beacons are a new technology that relies on Bluetooth Low Energy transmitters to let devices announce their presence to other devices within range. One device acts as a beacon, transmitting a signal that nearby devices can detect and use to estimate their distance from it.

Imagine setting your iPhone up as a beacon on your nightstand and having your teenager’s device register its presence when they come in late, or using standalone beacons in a crowded convention center to help attendees find room locations.

Let’s look at an example of setting up a beacon and a receiver using Xamarin.

First, create a new CLBeaconRegion, assign it a UUID (a GUID, in .NET terms), and tell it what kinds of notifications you want to receive.

Next, ask your region to return a PeripheralData dictionary based on a measured power value (in dBm) that you’ll use to create the beacon. You’ll also create a delegate and a CBPeripheralManager object and use the StartAdvertising() method to start the beacon.
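Putting those two steps together, the broadcaster side might look like the following sketch. The UUID, identifier, and measured-power value are placeholders, and advertising can only begin once the delegate reports that Bluetooth is powered on:

```csharp
using CoreBluetooth;
using CoreFoundation;
using CoreLocation;
using Foundation;

public class BeaconBroadcaster : CBPeripheralManagerDelegate
{
    // The UUID and identifier are placeholders; generate your own UUID for a real app.
    readonly CLBeaconRegion region = new CLBeaconRegion(
        new NSUuid("E4C8A4FC-F68B-470D-959F-29382AF72CE7"), "com.example.beacon")
    {
        NotifyOnEntry = true,
        NotifyOnExit = true,
        NotifyEntryStateOnDisplay = true
    };

    CBPeripheralManager manager;

    public void Start()
    {
        // The delegate's StateUpdated callback fires once Bluetooth is ready.
        manager = new CBPeripheralManager(this, DispatchQueue.DefaultGlobalQueue);
    }

    public override void StateUpdated(CBPeripheralManager peripheral)
    {
        if (peripheral.State == CBPeripheralManagerState.PoweredOn)
        {
            // Measured power is the expected RSSI at one meter, in dBm.
            NSMutableDictionary peripheralData = region.GetPeripheralData(new NSNumber(-59));
            manager.StartAdvertising(peripheralData);
        }
    }
}
```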

Now that you’ve created and started the beacon, you need to create a client class that looks for the beacon.

To find the beacon, create a CLLocationManager object and assign a handler for the RegionEntered event. If the identifier of the region matches the one you used for the beacon, use a UILocalNotification to alert the user that they’ve entered the beacon’s region.
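A minimal monitoring client, assuming the same placeholder UUID and identifier as the beacon, could be sketched like this:

```csharp
using CoreLocation;
using Foundation;
using UIKit;

public class BeaconFinder
{
    readonly CLLocationManager locationManager = new CLLocationManager();

    // Must match the UUID and identifier the beacon advertises (placeholders here).
    readonly CLBeaconRegion region = new CLBeaconRegion(
        new NSUuid("E4C8A4FC-F68B-470D-959F-29382AF72CE7"), "com.example.beacon");

    public void StartLooking()
    {
        locationManager.RegionEntered += (sender, e) =>
        {
            if (e.Region.Identifier == "com.example.beacon")
            {
                // Alert the user that they've entered the beacon's region
                var notification = new UILocalNotification { AlertBody = "There is a beacon nearby!" };
                UIApplication.SharedApplication.PresentLocalNotificationNow(notification);
            }
        };

        locationManager.StartMonitoring(region);
    }
}
```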

You can also tell the user how far (in relative terms) they are from the beacon, which can help guide them in the correct direction if they are within range but heading the wrong way.

Finally, assign a handler to the DidRangeBeacons event. Based on the value of the CLProximity enum, you can use a visual indicator (in this case, the background color) to let the user know whether they’re getting closer to or farther away from the beacon.
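Inside a view controller that holds the CLLocationManager and CLBeaconRegion from the monitoring step, the ranging handler might look like the following fragment (the color choices are arbitrary):

```csharp
// Inside a UIViewController that holds the CLLocationManager and CLBeaconRegion
locationManager.DidRangeBeacons += (sender, e) =>
{
    if (e.Beacons.Length == 0)
        return;

    switch (e.Beacons[0].Proximity)
    {
        case CLProximity.Immediate:
            View.BackgroundColor = UIColor.Green;   // right next to the beacon
            break;
        case CLProximity.Near:
            View.BackgroundColor = UIColor.Yellow;
            break;
        case CLProximity.Far:
            View.BackgroundColor = UIColor.Red;
            break;
        default:                                    // CLProximity.Unknown
            View.BackgroundColor = UIColor.Gray;
            break;
    }
};

locationManager.StartRangingBeacons(region);
```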

Text to Speech

Like many developers of my generation, I first learned to use a computer and do rudimentary programming on the Commodore 64. One of the applications I toyed with was SAM (Software Automatic Mouth), the first commercial software-based speech synthesizer. Thirty years later, Apple has included APIs in iOS 7 that allow you to replicate the experience of making a computer talk to you.

Doing this takes only a couple of lines of code. First, create an AVSpeechSynthesizer object, and then an AVSpeechUtterance initialized with the text you want spoken. You can specify different voices based on locale strings, and also adjust the rate and pitch of the voice to match your preference. Once you have everything set up to your liking, call the SpeakUtterance method and pass in the AVSpeechUtterance object.
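A sketch of that sequence, with arbitrary text, locale, rate, and pitch values:

```csharp
using AVFoundation;

var synthesizer = new AVSpeechSynthesizer();

var utterance = new AVSpeechUtterance("Hello from iOS 7")
{
    // Pick a voice by locale string
    Voice = AVSpeechSynthesisVoice.FromLanguage("en-US"),
    Rate = AVSpeechUtterance.MaximumSpeechRate / 4,  // the default rate is quite fast
    PitchMultiplier = 1.0f
};

synthesizer.SpeakUtterance(utterance);
```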

Maps

The original version of iOS included mapping features built on Google’s Maps technology. Starting with iOS 6, Apple replaced Google’s Maps API with its own implementation. In iOS 7, Apple has enhanced Maps with several new features, including 3D projections that can display individual buildings on the map.

First, create a new MKMapView and define two coordinates: one for the center of the map, and one for the viewpoint. Then tell the MKMapView to show buildings by setting the ShowsBuildings property, and enable 3D display by setting the PitchEnabled property. Finally, create a new MKMapCamera, passing in the target and viewpoint coordinates, and a third argument that represents the elevation of the camera viewpoint.
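Those steps might look like the following inside a view controller’s ViewDidLoad; the coordinates and altitude are placeholder values:

```csharp
using CoreLocation;
using MapKit;

// In a UIViewController's ViewDidLoad
var mapView = new MKMapView(View.Bounds)
{
    ShowsBuildings = true,
    PitchEnabled = true
};

// Placeholder coordinates: looking across downtown San Francisco
var target = new CLLocationCoordinate2D(37.7952, -122.4028);
var viewPoint = new CLLocationCoordinate2D(37.8020, -122.4058);

// The third argument is the camera's altitude in meters
mapView.Camera = MKMapCamera.CameraLookingAtCenterCoordinate(target, viewPoint, 500);

View.AddSubview(mapView);
```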

Note that the buildings are only shown on a device and don’t display when running in the simulator.

Barcodes

Using the iOS camera to scan barcodes and QR codes has long been possible with the use of external libraries like RedLaser and ZXing. With iOS 7, Apple now includes barcode scanning capabilities in a native API. You can also easily generate QR Codes from within your app.

Barcode scanning with the iOS 7 API is not as straightforward as using some of the other new APIs, but it isn’t difficult.

To start scanning, create a new AVCaptureSession and define input and output options for it. Next, tell it what types of images you’re looking for using the AVMetadataObjectType enum. In this sample, you’ll look for QR codes and EAN13 barcodes, which are commonly used as UPC codes on consumer products. You’ll also assign a delegate to the session.

In the delegate class (Listing 1), you’ll process each AVMetadataObject found by the scanner and raise an event that the app can handle by displaying the data to the user.
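A sketch of both pieces, assuming a hypothetical ScannerDelegate class that raises a simple event for each code found:

```csharp
using System;
using AVFoundation;
using CoreFoundation;
using Foundation;

// Hypothetical delegate that raises an event for each code the scanner finds
public class ScannerDelegate : AVCaptureMetadataOutputObjectsDelegate
{
    public event Action<string> CodeFound;

    public override void DidOutputMetadataObjects(AVCaptureMetadataOutput captureOutput,
        AVMetadataObject[] metadataObjects, AVCaptureConnection connection)
    {
        foreach (var metadata in metadataObjects)
        {
            var code = metadata as AVMetadataMachineReadableCodeObject;
            if (code != null && CodeFound != null)
                CodeFound(code.StringValue);
        }
    }
}

public class BarcodeScanner
{
    public AVCaptureSession StartScanning(ScannerDelegate scannerDelegate)
    {
        var session = new AVCaptureSession();

        // Use the device camera as the session's input
        var device = AVCaptureDevice.DefaultDeviceWithMediaType(AVMediaType.Video);
        NSError error;
        session.AddInput(AVCaptureDeviceInput.FromDevice(device, out error));

        var output = new AVCaptureMetadataOutput();
        session.AddOutput(output);

        output.SetDelegate(scannerDelegate, DispatchQueue.MainQueue);
        // The metadata types must be set after the output is added to the session
        output.MetadataObjectTypes = AVMetadataObjectType.QRCode | AVMetadataObjectType.EAN13Code;

        session.StartRunning();
        return session;
    }
}
```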

QR Code generation is made possible by a new Core Image filter. To generate the image, create an instance of CIQRCodeGenerator and assign the text you want to encode to the Message property. The CorrectionLevel property tells the generator how much error-correction information to include when encoding. A higher level creates a larger image, but reduces the chances that someone scanning the image will encounter an error. The default level is “M,” with a 15% correction level. The other possible values are “H,” “Q,” and “L,” representing 30%, 25%, and 7%, respectively.

After you generate the code, you need to transform it from a CIImage into a UIImage so you can display it in a UIImageView.
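The generation and conversion steps might be sketched like this (the correction level shown is the default):

```csharp
using CoreImage;
using Foundation;
using UIKit;

public static class QrCodeHelper
{
    public static UIImage Generate(string text)
    {
        var generator = new CIQRCodeGenerator
        {
            Message = NSData.FromString(text),
            CorrectionLevel = "M"   // 15% error correction, the default
        };

        CIImage ciImage = generator.OutputImage;

        // Render the CIImage to a CGImage so UIKit can display it
        using (var context = CIContext.FromOptions(null))
        {
            var cgImage = context.CreateCGImage(ciImage, ciImage.Extent);
            return UIImage.FromImage(cgImage);
        }
    }
}
```

Note that the generated image is only a few points across (one point per QR module), so you’ll typically scale it up before displaying it in a UIImageView.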

Background Transfers

iOS 7 also introduces NSUrlSession, an API that lets apps hand large downloads and uploads off to the system so they can continue even while the app is suspended in the background.

To initiate a transfer, create an NSUrlSession, passing in a delegate class, and then call CreateDownloadTask, passing in a URL for the resource you want to download. Finally, call the newly created task’s Resume method.
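A minimal sketch of such a transfer; the session identifier and download URL are placeholders:

```csharp
using Foundation;

public class Downloader : NSUrlSessionDownloadDelegate
{
    public void Start()
    {
        // A background configuration lets the transfer continue if the app is suspended.
        var config = NSUrlSessionConfiguration.BackgroundSessionConfiguration("com.example.transfer");
        var session = NSUrlSession.FromConfiguration(config, (INSUrlSessionDelegate)this, null);

        // Placeholder URL
        var task = session.CreateDownloadTask(NSUrl.FromString("https://example.com/largefile.zip"));
        task.Resume();
    }

    public override void DidFinishDownloading(NSUrlSession session,
        NSUrlSessionDownloadTask downloadTask, NSUrl location)
    {
        // 'location' points to a temporary file; move it somewhere permanent here.
    }
}
```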

Skeuomorphic Design

Skeuomorphic design applies real-world artifacts to digital designs. Prior to iOS 7, skeuomorphic elements were common in many Apple apps. The stitched-leather appearance of the iPad calendar and the yellow-notepad background of the Notes app are both examples of skeuomorphism.

Bluetooth Low Energy

Bluetooth LE is a wireless networking technology developed by Nokia to provide lower power consumption and cost while maintaining a communication range similar to classic Bluetooth. BLE was merged into the core Bluetooth 4.0 standard in 2010. iOS devices that support Bluetooth LE are the iPhone 4s and newer, the iPod Touch 5th generation, and the iPad 3rd generation and newer, including the iPad Mini.