Swift Framework creation

Sometimes, you will find yourself using the same custom code in multiple projects. Simply copying the files into a new project works, but then you have different versions. So, a better way to reuse that code is to create a Framework.

This is my preferred way. There are other ways that The Google will show you. I had luck with none of them – especially at link time. One example has you dragging the framework from the simulator’s temp directory via the Finder. Yikes.

Without a workspace, iterative development on the framework and a potential client of it will be awkward. So, I suggest creating a workspace and adding the framework project to it along with at least one app project that uses the framework.

The above code compiled just fine when the scheme was set to an iPhone 6. I then plugged in my iPhone 4s and the problem raised its ugly head. If you don’t have an older 32-bit iOS device, just select an older device in the Xcode scheme.

To verify that this was the problem, I tried checking the arch and then calling separate init methods. The initial code for both was what you see in the first example here.
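For reference, that arch check can be done with Swift’s compile-time build configurations. A minimal sketch (the initFor32Bit/initFor64Bit names are made up for illustration):

```swift
// Build configurations are evaluated at compile time,
// so only one branch is actually compiled into the binary.
#if arch(arm64) || arch(x86_64)
    initFor64Bit()   // iPhone 5s and newer, and 64-bit simulators
#else
    initFor32Bit()   // 32-bit devices like the iPhone 4s
#endif
```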

I have a hardcopy of the book, so I cannot speak about the quality of the ebook versions. The same content of course, but I know from producing my own epub books that the formatting can be tedious and error prone.

Readers can download a zip of the code examples from the Deitel website. Unfortunately, you have to “register” on their site to get the download link as if we are still living in 1995.

First off, the audience for the book: it is aimed at experienced programmers, especially those with experience in an object-oriented language. If you are just starting out, this is probably not the book for you. If that is the case, I’d suggest Swift for Absolute Beginners, which is another brand new book.

As the title suggests, this is not a Swift tutorial. Instead, you are introduced to Swift’s features by writing several toy apps. That’s what “app-driven approach” means. I really hate books and course materials that are simple laundry lists of features. In fact, over 90% of the live courses I’ve taught over the past 25 years ignored the printed course materials (unless it was one I authored :)). Laundry lists are easy on the author but hard on the learner. This app-driven approach gets closer to enabling real learning. If the learner has a question in their head while working through the material, and then sees the answer a few pages later, that is excellent. I call that motivational seeding. So, you will get a decent foundation in Swift, but you will not see any advanced topics. The things that I’ve banged my head against the wall with, such as interfacing with legacy APIs like Core Audio or Core MIDI, are not touched upon. I don’t mean those APIs in particular, but interfacing with any of the legacy APIs. As is common with most iOS development books, unit testing is not covered.

The Apps

These are the Apps that the learner will build:

Welcome App

Tip Calculator App

Twitter Searches App

Flag Quiz App

Cannon Game App

Doodlz App

Address Book App

Each App introduces a new iOS and/or Swift feature. For example, the Cannon Game touches on Sprite Kit and the Address Book uses Core Data.

I like the format of each chapter. Each begins with a list of objectives followed by an outline. The header of each right-hand page shows the current outline title. I wonder if the ebook formats the outline items as links. This seems like a small thing, but after you’ve gone through a book, you might need to find something, and this helps a lot. It also sets your expectations for what is going to be accomplished in the chapter. Not surprisingly, the end of each chapter has a “wrap up” telling you what they just told you. That is also useful for answering “In what chapter was that thing on X covered?”

Sometimes, the author is a bit lazy. For example, section 4.3.13 talks about external parameter names. The paradigm is given but no code example. Thanks for the Amo, Amas, Amat, but where is the example sentence? Amo libri huius? Also, the Alert controller code on page 148 has a memory leak when you access the text fields in that manner. The Twitter app sidesteps Twitter’s RESTful API and uses a WebView instead. I guess NSURLSession would be too complicated or having to authenticate would be too much trouble.
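To fill that gap, here is a minimal made-up example of external parameter names: the external name appears at the call site, the internal name inside the function body.

```swift
// "width" and "height" are external names; "w" and "h" are internal.
func resize(width w: Double, height h: Double) -> Double {
    return w * h
}

// The call site must use the external names.
let area = resize(width: 4.0, height: 3.0)   // 12.0
```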

There are a decent number of technologies touched upon. iCloud, Sprite Kit, Social Framework, Core Data, etc.

The book ends with a chapter on the business end and the App Store. Most developers will tell you that the coding is easier than getting it onto the App Store. Useful information is provided here.

Summary

If you are an experienced programmer, this is a good book to get a decent foundation in iOS development and the Swift language.
The softcover book is around 40 bucks.

So, I’m playing around with the new Bluetooth LE MIDI capabilities.
In my build settings I include the CoreAudioKit framework in order to get
the new Core Audio Bluetooth MIDI (CABTMIDI) controllers CABTMIDILocalPeripheralViewController and CABTMIDICentralViewController.
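Presenting one of these controllers is straightforward. A sketch, assuming you have a navigation controller to push onto:

```swift
import UIKit
import CoreAudioKit

// Push the central controller to scan for and connect to Bluetooth
// MIDI peripherals. Use CABTMIDILocalPeripheralViewController instead
// if you want to advertise this device as a peripheral.
func showBluetoothMIDICentral(navigationController: UINavigationController) {
    let controller = CABTMIDICentralViewController()
    navigationController.pushViewController(controller, animated: true)
}
```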

You also get the Inter-App audio classes CAInterAppAudioSwitcherView and CAInterAppAudioTransportView with CoreAudioKit, but I’m not using them here.

I played around with it, then had to do other work. I came back to it a week later, and it wouldn’t even compile. I didn’t change anything (e.g. no Xcode updates). Yes, CoreAudioKit is indeed included, but the error was on the “import CoreAudioKit” line. The compiler didn’t know what that was, even though the framework is there and I can even see the headers in the Xcode UI tree under CoreAudioKit.framework.

It turns out that the build scheme needs to have a device selected, and not any of the simulator choices. Even if you are building and not running. The device does not need to be attached. You can just choose the first item: iOS Device. Then it will build.

D’Oh!

Apple even says so in a tech note (that I did not know existed). See the resources below.

I can’t count how many times I’ve created a project in my IDE then dropped to the terminal to create a bare git repository, then add that as a remote to my project. And also add/commit/push. So, I decided to make my life a bit easier by writing a small shell script to do all this nonsense. You might find this useful as is or for parts you can copy and paste.

BTW, if I’m the only one working on a project right now, I find that creating the bare repo on Dropbox is handy. This won’t work, of course, if multiple developers are pushing at the same time.

Asynchronous unit testing in Swift

Let’s stub out some typical code. Here is an API function that takes perhaps a REST endpoint and a delegate that receives a Thing instance. I use an NSURLSessionDataTask because I’m expecting, well, data (as JSON). I’m not showing the gory details of parsing the JSON since that’s not my point here. BTW, it’s not very difficult to parse. The idea is that a Thing is instantiated and the delegate is notified.

So, how do you write a unit test for this kind of code? The API call does not return anything to pass into XCTAssertTrue or its siblings. Wouldn’t it be nice if you could make the network API call and wait – with a timeout of course – for a response?

Previously, you’d have to use semaphores, a spin loop, or something similar. Since this is such a common scenario, Apple gave us XCTestExpectation in Xcode 6. (Actually, it’s a category in XCTestCase+AsynchronousTesting.)

Here is a simple usage example. I have an instance variable of type XCTestExpectation because I need it in the delegate callback in addition to the test function. I simply instantiate it, make the network call, then call one of the new wait functions. In this case waitForExpectationsWithTimeout. When the delegate is notified, I fulfill the expectation. If you don’t, the test will fail after the timeout.
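A sketch of that test, using the Xcode 6 era API names. ThingAPI, ThingDelegate, and didReceiveThing are placeholder names for the stubbed-out API described above:

```swift
import XCTest

class ThingAPITests: XCTestCase, ThingDelegate {

    // Ivar so the delegate callback can reach the expectation.
    var expectation: XCTestExpectation?

    func testFetchThing() {
        expectation = expectationWithDescription("delegate receives a Thing")
        ThingAPI().fetchThing("https://example.com/api/thing", delegate: self)
        // Blocks until fulfill() is called or the timeout expires.
        waitForExpectationsWithTimeout(10.0, handler: nil)
    }

    // ThingDelegate callback: fulfill the expectation so the wait returns.
    func didReceiveThing(thing: Thing) {
        XCTAssertNotNil(thing)
        expectation?.fulfill()
    }
}
```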

Swift and AVMIDIPlayer

Previously, I wrote about attaching a low level core audio AUGraph to a MusicSequence to hear something besides sine waves when played via a MusicPlayer. Here, I’ll show you how to use the new higher level AVMIDIPlayer. You can even play a MusicSequence by sticking your elbow in your ear.
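The basic usage is just a few lines. A sketch (the file names are placeholders; use your own MIDI file and SoundFont):

```swift
import AVFoundation

let midiURL = NSBundle.mainBundle().URLForResource("mysong", withExtension: "mid")
let bankURL = NSBundle.mainBundle().URLForResource("GeneralUser", withExtension: "sf2")

var error: NSError?
let player = AVMIDIPlayer(contentsOfURL: midiURL, soundBankURL: bankURL, error: &error)
if player == nil {
    println("could not create player: \(error)")
} else {
    player.prepareToPlay()
    player.play {
        println("playback finished")
    }
}
```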

Play your MIDI file in the simulator, and you’ll hear sine waves. Huh? A valid SoundFont was passed to the init function, and you hear sine waves? Yeah. After you spend a day verifying that your code is correct, install iOS 8 on your actual device and try it there. Yup, it works. Nice.

P.S. That slider thing is just some eye candy in the final project. A UISlider moves while playing.

Playing a MusicSequence

The hoary grizzled MusicSequence from the AudioToolbox is still the only way to create a MIDI Sequence on the fly. If you have an app where the user taps in notes, you can store them in a MusicSequence for example. But AVMIDIPlayer has no init function that takes a MusicSequence. Our choices are an NSURL or NSData.

An NSURL doesn’t make sense, but what about NSData? Can you turn a MusicSequence into NSData? Well, there’s MusicSequenceFileCreateData(). With this function, you pass in a data variable that will be initialized to the data that would be written to a standard MIDI file. You can then use that NSData in the player code from our previous example.
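A sketch of that conversion, assuming a musicSequence already exists (Swift 1 era signatures):

```swift
import AudioToolbox

// Flatten the MusicSequence into in-memory Standard MIDI File data.
var data: Unmanaged<CFData>?
let status = MusicSequenceFileCreateData(musicSequence,
    MusicSequenceFileTypeID(kMusicSequenceFile_MIDIType),
    MusicSequenceFileFlags(kMusicSequenceFileFlags_EraseFile),
    480,            // resolution: ticks per quarter note
    &data)

if status == OSStatus(noErr) {
    // See the discussion below about who releases this data.
    let midiData: NSData = data!.takeUnretainedValue()
    // midiData can now be handed to AVMIDIPlayer(data:soundBankURL:error:)
}
```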

I haven’t checked to see if there is a memory leak with the takeUnretainedValue call. I’ll check that out next.

update: I checked and there is indeed a small memory leak.
The docs for MusicSequenceFileCreateData say that the caller is responsible for releasing the CFData. That means takeRetainedValue() – which transfers ownership to ARC so the data is released for you – is actually the right call here, not takeUnretainedValue(). Before settling on that, I tried saving the data variable as an ivar, checking for nil when playing again, then calling release(). Crash. What about DisposeMusicSequence? OK, I tried saving the sequence as an ivar and calling that. No crash, but memory still leaked. CFRelease is simply unavailable from Swift.

Resources

Swift AUGraph and MusicSequence

The AudioToolbox MusicSequence remains the only way to create a MIDI Sequence programmatically. The AVFoundation class AVMIDIPlayer will play a MIDI file, but not a MusicSequence.

AVAudioEngine has a musicSequence property. It doesn’t seem to do anything yet (except crash when you set it). So the way to get a MusicSequence to play with instrument sounds is to create a low level core audio AUGraph and play the sequence with a MusicPlayer.

Apple is moving towards a higher level Audio API with AVFoundation. The AVAudioEngine looks promising, but it is incomplete. Right now there isn’t a way to associate an AudioToolbox MusicSequence with it. So, here I’ll use a low level Core Audio AUGraph for the sounds.

Now you need a MusicPlayer to hear it. Let’s make one and give it our MusicSequence.
Here, I “pre-roll” the player for fast startup when you hit a play button. You don’t have to do this,
but here is the way to do it.
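A sketch, assuming musicSequence already exists (error checking elided; each call returns an OSStatus you should inspect in real code):

```swift
import AudioToolbox

var musicPlayer = MusicPlayer()
NewMusicPlayer(&musicPlayer)
MusicPlayerSetSequence(musicPlayer, musicSequence)
MusicPlayerPreroll(musicPlayer)   // prime the player for fast startup
MusicPlayerStart(musicPlayer)
```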

Wonderful sine waves! What if you want to hear something that approximates actual instruments?

Well, you can load SoundFont or DLS banks – or even individual sound files. Here, I’ll load a SoundFont.
Load it into what? Well, here I’ll load it into a core audio sampler – an AudioUnit. That means I’ll need to create a core audio AUGraph.

The end of the story is this: you associate an AUGraph with the MusicSequence like this.
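A sketch of the whole graph setup: a sampler node feeding a RemoteIO output node, with the samplerUnit fetched so the SoundFont can be loaded into it later. Error checking elided:

```swift
import AudioToolbox

var processingGraph = AUGraph()
NewAUGraph(&processingGraph)

// The sampler AudioUnit node.
var samplerNode = AUNode()
var samplerDesc = AudioComponentDescription(
    componentType: OSType(kAudioUnitType_MusicDevice),
    componentSubType: OSType(kAudioUnitSubType_Sampler),
    componentManufacturer: OSType(kAudioUnitManufacturer_Apple),
    componentFlags: 0, componentFlagsMask: 0)
AUGraphAddNode(processingGraph, &samplerDesc, &samplerNode)

// The RemoteIO output node.
var ioNode = AUNode()
var ioDesc = AudioComponentDescription(
    componentType: OSType(kAudioUnitType_Output),
    componentSubType: OSType(kAudioUnitSubType_RemoteIO),
    componentManufacturer: OSType(kAudioUnitManufacturer_Apple),
    componentFlags: 0, componentFlagsMask: 0)
AUGraphAddNode(processingGraph, &ioDesc, &ioNode)

AUGraphOpen(processingGraph)
AUGraphConnectNodeInput(processingGraph, samplerNode, 0, ioNode, 0)

// Grab the sampler's AudioUnit so we can load a SoundFont into it.
var samplerUnit = AudioUnit()
AUGraphNodeInfo(processingGraph, samplerNode, nil, &samplerUnit)

AUGraphInitialize(processingGraph)
AUGraphStart(processingGraph)

// The punch line: associate the graph with the sequence.
MusicSequenceSetAUGraph(musicSequence, processingGraph)
```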

SoundFont

Let’s create a function to load a SoundFont, then use a “preset” from that font on the sampler unit. You need to fill out an AUSamplerInstrumentData struct. One thing that may trip you up is the fileURL field, which is an Unmanaged CFURL. Well, NSURL is automatically toll-free bridged to CFURL. Cool. But it is not Unmanaged, which is what is required. So, here I’m using Unmanaged.passUnretained. If you know a better way, please let me know.

Then we need to set kAUSamplerProperty_LoadInstrument on our samplerUnit. You do that with AudioUnitSetProperty. The preset numbers are General MIDI patch numbers. In the GitHub repo, I created a Dictionary of patches for ease of use and an example Picker.
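Putting those two steps together, a sketch of the loading function (Swift 1 era casts; relying on NSURL’s toll-free bridging for the fileURL field):

```swift
import AudioToolbox

// Load a SoundFont preset into the sampler unit.
// preset is a General MIDI patch number (0...127).
func loadSoundFont(samplerUnit: AudioUnit, bankURL: NSURL, preset: UInt8) {
    var instrumentData = AUSamplerInstrumentData(
        fileURL: Unmanaged.passUnretained(bankURL),   // needs an Unmanaged CFURL
        instrumentType: UInt8(kInstrumentType_SF2Preset),
        bankMSB: UInt8(kAUSampler_DefaultMelodicBankMSB),
        bankLSB: UInt8(kAUSampler_DefaultBankLSB),
        presetID: preset)

    AudioUnitSetProperty(samplerUnit,
        AudioUnitPropertyID(kAUSamplerProperty_LoadInstrument),
        AudioUnitScope(kAudioUnitScope_Global),
        0,
        &instrumentData,
        UInt32(sizeofValue(instrumentData)))
}
```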

I’m writing an app that uses standard music notation for input. Imagine a view with a staff and a tap inputs a note. Each “note view” represents a note model object. Then you decide that you do not want that note, so you need to delete it. You can get the note by pressing on it. Then that note needs to be deleted from a “notes array”.

So, you have the note, but not its index. If you had the index, Swift gives you zero trouble removing it from the array.

notes.removeAtIndex(2)

But you don’t have the index. You have the item in the array. Well just use “indexOf”, right? Sure. Where is that? I couldn’t find anything like that. Let me know if you know of one.

What I ended up doing is removing the note by filtering the array. Here is a simple filter that removes the item.

For that != operator to work, you need to implement the Equatable protocol. There is one requirement for this protocol: you provide an overload for the == operator at global scope. “Global scope” means outside of the class. When you overload the == operator, != will work too.
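A minimal sketch of both pieces, with Note as a stand-in for the real model class:

```swift
class Note: Equatable {
    let pitch: Int
    init(pitch: Int) { self.pitch = pitch }
}

// Equatable's one requirement: overload == at global scope.
// Once == is defined, != works too.
func == (lhs: Note, rhs: Note) -> Bool {
    return lhs.pitch == rhs.pitch
}

var notes = [Note(pitch: 60), Note(pitch: 64), Note(pitch: 67)]
let doomed = notes[1]

// Remove the item without knowing its index.
notes = notes.filter { $0 != doomed }   // notes now holds pitches 60 and 67
```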

You can remove an item from an array by writing a filter closure. But, your item must implement the Equatable protocol.
If there is a simpler way to remove an item from an array without having its index, please let me know.

Many people here and in the twitterverse have kindly pointed out that there is indeed such a function. But it is not named anything close to indexOf – it is the find(array, item) function.
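With find, the filter dance can be replaced by the direct approach (Swift 1 era names, assuming the notes array and doomed note from before):

```swift
// find returns the index of the first matching element, or nil.
if let index = find(notes, doomed) {
    notes.removeAtIndex(index)
}
```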

<soapbox>
There is a lesson in this for API writers on naming. IMHO, it is poorly named. (Is there any ambiguity in the name “indexOf”? What are the chances that a polyglot programmer would seek a method/function named indexOf vs find?). I wonder how many people are going to have indexOf in an Array extension?
</soapbox>

My other problem was finding find. In neither the Array documentation nor the Collection documentation do I see this function. Is it unreasonable for me to be looking there?
Note that filter is defined as a global function and as an Array function.