MIDI Driven Animation using CoreMIDI in Objective C

Submitted by bensmiley on Wed, 03/21/2012 - 23:27

So in this post I'm going to explain how to produce MIDI driven animation on OSX or iOS using the CoreMIDI and CoreAudio frameworks. When I first started trying to do this I thought it would be easy - just register a callback in the MIDI player which is called every time a MIDI message is played. Unfortunately this is not possible and I ended up spending three long days figuring it out from the limited documentation available. Hopefully this post will save someone some time!

Hopefully you will have heard a rather mechanical scale followed by a chromatic scale. It's basic but at least it's a start. The next step is to create an AU graph so that we can play our MIDI file with an instrument effect.

Creating an AUGraph

When I first started reading about AUGraphs I thought they sounded horribly incomprehensible and opaque. In reality it's not too bad, just a bit fiddly to set up.

An AUGraph is a container that holds a collection of AUNodes. AUNodes are audio units supplied by Apple: instruments, effects and I/O units. Really it's just like wiring up music hardware in real life. Say you have a MIDI keyboard and you want to hear it as a trumpet with an echo effect. You would plug the keyboard into a box that translates MIDI messages into trumpet sounds, plug that box into an echo unit, and plug the echo unit into the speakers.

Choosing your AUNodes

In CoreAudio you choose the type of AUNode you need using three properties (defined by enums):

componentManufacturer: The author of the AUNode. In this case we will be using audio units from Apple - kAudioUnitManufacturer_Apple

componentType: The unit type

componentSubType: The sub-unit type

The unit type and sub-unit type can be found in the Apple documentation or in the header file AUComponent.h. To find the audio unit you need, it's often easiest to use Google. But say I want a high-pass filter: I look in the AUComponent.h header file and find kAudioUnitSubType_HighPassFilter - this is the sub-type. I then count how many sub-type definitions come before this one - in this case two. I then look at the top of the file and find the third audio unit type defined: kAudioUnitType_MusicEffect. Now I have my manufacturer, type and sub-type, and I can use the audio unit.

For this example we will be using the following two Audio Units:

Sampler: This unit converts MIDI messages into sounds defined in a Sound Font or AUPreset. It's available from iOS 5 onwards.

RemoteIO: This unit allows us to output sounds to the device's speakers
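Filled in for these two units, the component descriptions look something like this (a sketch using the real CoreAudio constants; the variable names are mine):

```c
#import <AudioToolbox/AudioToolbox.h>

// Description of the Sampler unit (MIDI in, audio out)
AudioComponentDescription samplerDesc = {
    .componentType         = kAudioUnitType_MusicDevice,
    .componentSubType      = kAudioUnitSubType_Sampler,
    .componentManufacturer = kAudioUnitManufacturer_Apple,
};

// Description of the RemoteIO unit (audio out to the speakers)
AudioComponentDescription ioDesc = {
    .componentType         = kAudioUnitType_Output,
    .componentSubType      = kAudioUnitSubType_RemoteIO,
    .componentManufacturer = kAudioUnitManufacturer_Apple,
};
```

The unused fields (componentFlags and componentFlagsMask) are zeroed by the designated initializers, which is what the APIs expect.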

So here's the code - adapted from an example provided by Apple but with extra comments.

-(BOOL) createAUGraph {

    // Each Core Audio call returns an OSStatus so we
    // can see if there have been any errors in the setup
    OSStatus result = noErr;

    // Create 2 audio units: one sampler and one IO
    AUNode samplerNode, ioNode;

    // Specify the common portion of an audio unit's identity, used for both audio units

So, now we've created a new audio graph with a sampler and an output unit. We've connected the sampler unit to the output unit and we've started the graph. Finally we need to set up the instrument effect, connect the music sequence and play.
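For reference, the whole sequence of graph calls summarised in the paragraph above can be sketched like this (condensed, with error checking elided; the variable names are from my project):

```c
#import <AudioToolbox/AudioToolbox.h>

AUGraph   graph;
AUNode    samplerNode, ioNode;
AudioUnit samplerUnit;

NewAUGraph(&graph);

// Add the sampler and IO nodes using the component descriptions
AudioComponentDescription desc = { 0 };
desc.componentManufacturer = kAudioUnitManufacturer_Apple;

desc.componentType    = kAudioUnitType_MusicDevice;
desc.componentSubType = kAudioUnitSubType_Sampler;
AUGraphAddNode(graph, &desc, &samplerNode);

desc.componentType    = kAudioUnitType_Output;
desc.componentSubType = kAudioUnitSubType_RemoteIO;
AUGraphAddNode(graph, &desc, &ioNode);

// Open the graph (this instantiates the audio units) and fetch the sampler unit
AUGraphOpen(graph);
AUGraphNodeInfo(graph, samplerNode, NULL, &samplerUnit);

// Wire sampler output 0 into IO input 0, then initialise and start
AUGraphConnectNodeInput(graph, samplerNode, 0, ioNode, 0);
AUGraphInitialize(graph);
AUGraphStart(graph);
```

In production code you should check the OSStatus returned by each of these calls, as the surrounding example does.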

        @"Unable to set the preset property on the Sampler. Error code:%d '%.4s'",
        (int) result,
        (const char *)&result);

    return result;
}

This code takes a sound font NSURL and a preset number as input. The NSURL should point to the Sound Font file in your Resources directory. Sound Fonts can hold a number of instrument effects so the presetNumber defines which one should be used.
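The body of that function boils down to a single AudioUnitSetProperty call on the sampler. A sketch (samplerUnit, bankURL and presetNumber stand in for the real arguments):

```c
#import <AudioToolbox/AudioToolbox.h>

// Describe the instrument to load: an SF2 preset from the given file,
// taken from the default melodic bank
AUSamplerInstrumentData instrumentData = {
    .fileURL        = bankURL,                          // CFURLRef to the .SF2 file
    .instrumentType = kInstrumentType_SF2Preset,
    .bankMSB        = kAUSampler_DefaultMelodicBankMSB,
    .bankLSB        = kAUSampler_DefaultBankLSB,
    .presetID       = (UInt8)presetNumber,
};

// Ask the sampler to load that instrument; a non-zero result means failure
OSStatus result = AudioUnitSetProperty(samplerUnit,
                                       kAUSamplerProperty_LoadInstrument,
                                       kAudioUnitScope_Global, 0,
                                       &instrumentData, sizeof(instrumentData));
```

Passing a different bankMSB here is how you would select, say, the percussion bank instead of the melodic one.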

Now we just repeat what we did before but with a few added lines (marked by stars).

// Create a new music player
MusicPlayer p;

// Initialise the music player
NewMusicPlayer(&p);

// ************* Tell the music sequence to output through our new AUGraph
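That starred step and the play calls that follow it look roughly like this (a sketch; sequence is the MusicSequence loaded from the MIDI file, graph is the AUGraph built earlier):

```c
#import <AudioToolbox/AudioToolbox.h>

// ************* Tell the music sequence to output through our new AUGraph
MusicSequenceSetAUGraph(sequence, graph);

// Attach the sequence to the player, then preroll and start playback
MusicPlayerSetSequence(p, sequence);
MusicPlayerPreroll(p);
MusicPlayerStart(p);
```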

From the sample project you should understand how to play a MIDI file with a Sound Font effect. The final step is to get real-time access to the messages being parsed by the MusicPlayer. To do this we need to add an extra step to our chain. Currently it looks like this:

MIDI File -> Sequence -> Sampler -> IO Unit -> Speakers

We want it to look like this:

MIDI File -> Sequence -> callback function to read messages -> Sampler -> IO Unit -> Speakers

With this system we will receive the messages in real time before passing them on to the Sampler unit. This can be achieved by creating a new MIDI endpoint. A MIDI endpoint is a destination where MIDI messages can be sent. This could be another MIDI app on your iPhone, an external MIDI instrument or, in this case, a callback function.

Creating a new MIDI end point

In order to capture the MIDI messages we need a destination that they can be sent to. This can be done by creating a MIDI end point:

// Create a client
// This provides general information about the state of the MIDI engine to the callback MyMIDINotifyProc
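A sketch of the whole endpoint setup, including a minimal read proc that forwards each message to the sampler (MyMIDINotifyProc is assumed to be defined elsewhere, as above; the other names are mine):

```c
#import <CoreMIDI/CoreMIDI.h>
#import <AudioToolbox/AudioToolbox.h>

// Called for every packet list delivered to our virtual destination.
// refCon is the sampler AudioUnit we passed to MIDIDestinationCreate.
static void MyMIDIReadProc(const MIDIPacketList *pktlist,
                           void *refCon, void *connRefCon) {
    AudioUnit sampler = (AudioUnit)refCon;
    const MIDIPacket *packet = &pktlist->packet[0];
    for (UInt32 i = 0; i < pktlist->numPackets; i++) {
        UInt8 status   = packet->data[0];
        UInt8 note     = packet->data[1];
        UInt8 velocity = packet->data[2];
        // ... drive the animation from status/note/velocity here ...
        // Pass the message on so it still gets played
        MusicDeviceMIDIEvent(sampler, status, note, velocity, 0);
        packet = MIDIPacketNext(packet);
    }
}

static void createMIDIEndpoint(MusicSequence sequence, AudioUnit samplerUnit) {
    MIDIClientRef   client;
    MIDIEndpointRef endpoint;

    // Create a client, then a virtual destination backed by the read proc
    MIDIClientCreate(CFSTR("MidiClient"), MyMIDINotifyProc, NULL, &client);
    MIDIDestinationCreate(client, CFSTR("MidiDest"), MyMIDIReadProc,
                          (void *)samplerUnit, &endpoint);

    // Point the sequence at our endpoint instead of the default output
    MusicSequenceSetMIDIEndpoint(sequence, endpoint);
}
```

Note that the read proc runs on a high-priority MIDI thread, so any animation work should be handed off to the main thread rather than done in the callback itself.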

So there you have it! Play your MIDI file through a nice reedy SoundFont while collecting the messages to drive your animation! I hope this saves you the 3 days it took me to figure it out! Here's the link again to the project files in case you missed it at the top of the guide. Project Files.

Update:

It's been pointed out to me that several resource files are missing from the project - a midi file called simpletest.mid and a sound font file called Gorts_Filters.SF2. These files can be downloaded here. To add them to the project you need to right click on the resources folder in XCode and click "Add Files". As a side note, this code should work with any MIDI file and any Sound Font file. The only thing to watch with sound font files is that the preset/patch that you're requesting exists.

If you want to ask a general question about CoreAudio or discuss your CoreAudio issue please ask your questions in the CoreAudio section of the forum.

I just ran a quick test. I tried to load a SoundFont file which didn't exist. I get the same error as you. Double check the path. Check that it's as it should be - just the name of the SoundFont file and then the extension separately.

If you're sure the path's correct, have you added the SoundFont file to your XCode project? You have to right click the folder where you store your resources and click "Add Files to [name of project]...". You should be able to see the files in the browser to the left of the screen. After you've done this let me know if you're still having problems.

I don't get that error when I run the code. I would recommend profiling it with Instruments: XCode->Product->Profile, then choose the profile for leaks. It will give you a more detailed idea of where the leak is happening. Once you know that you can add an autorelease pool - for ARC-enabled code that's an @autoreleasepool { ... } block wrapped around the leaking code.

This is really good work and I think I'm beginning to understand it, but I can't get it to compile under iOS 5.0.
The error I get from compiling your downloaded project is:
Automatic Reference Counting Issue
No known instance method for selector 'initAudioTest'

Hi Chris,
To me this sounds like somewhere in the code there is a call which uses a selector i.e. @selector(initAudioTest) and that the initAudioTest method doesn't exist. I'd search the project for selectors and investigate any call like this. Or you could debug and see exactly where this error happens. I didn't have any problem running the code on iOS5 so it's not an ARC problem. If you're still having trouble let me know.
Ben

That looks fine - could you copy me a version of the interface file? It would throw a "no known selector" error if the method were not defined in the interface. If you email me a zipped version of the project it would be much easier for me to debug. My email is: bensmiley45@hotmail.com.

Hi Ben,
Found the issue and fixed it. I should have looked at the build settings more carefully. For some reason ARC was turned on and garbage collection off. I turned ARC off as I have never used it anyway. Also, despite my changing the target to iOS 5 there was still one spot with 5.1; changed that as well and it works fine.

Just trying out the code with a more complex midi file featuring multiple tracks. Without changing any code beyond file names, if I load in the file with the default SF2 font file provided, I can hear the lead guitar track fine, and another drum-specific font file lets me hear the drum track. I'm a bit of a MIDI newbie - how does the MIDI playback interact with the sound font? Is it just a case of it, say, requesting a 'drum' sample, and if we happen to have a valid one loaded then it'll use that? Does it fall back to a default if it can't find the exact instrument requested by the MIDI file?

Also, is there a way to load in multiple instruments so that I can hear both the guitar and drum tracks? Presumably I'd need to do something like create multiple sampler units in my graph and then call loadFromDLSOrSoundFont to initialise each?

I'm in the process of writing an advanced MIDI tutorial which will cover manually sequencing the MIDI track and playing multi-instrument tracks using different sound fonts.

In the meantime here's a brief description of how to achieve the result you're looking for. The sound gets played by the line:

MusicDeviceMIDIEvent (sampler, status, note, velocity, 0);

The sampler is a unit which converts a MIDI note into a sound using the sound font. To play a different instrument you would need to set up a number of samplers, each with a different font (I use a dictionary to store a list of pointers to the samplers), and then use a mixer unit to mix the outputs together. However, I'm not sure how you would find out which track a given message came from.
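A sketch of that layout, assuming two sampler nodes have already been added to the graph and loaded with different fonts (the mixer constants are the real CoreAudio ones; the node names are illustrative):

```c
#import <AudioToolbox/AudioToolbox.h>

AUNode mixerNode;

// Add a multichannel mixer node to the existing graph
AudioComponentDescription mixerDesc = {
    .componentType         = kAudioUnitType_Mixer,
    .componentSubType      = kAudioUnitSubType_MultiChannelMixer,
    .componentManufacturer = kAudioUnitManufacturer_Apple,
};
AUGraphAddNode(graph, &mixerDesc, &mixerNode);

// Each sampler feeds a separate mixer input; the mixer feeds the IO unit
AUGraphConnectNodeInput(graph, guitarSamplerNode, 0, mixerNode, 0);
AUGraphConnectNodeInput(graph, drumSamplerNode,   0, mixerNode, 1);
AUGraphConnectNodeInput(graph, mixerNode,         0, ioNode,    0);
```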

I wanted to do something similar, and tested your code. It worked great with some adaptations, thank you for that!
Unfortunately I couldn't get the C MIDI callback function to pass anything out to other Objective-C objects when a MIDI note arrives.

In the first paragraph of this blog you said about your initial idea: "Unfortunately this is not possible..."
I guess you were talking just about the above being not possible, right?

Does somebody know why this is so? I guess it has something to do with the C run loop, which is not accessible from Objective-C? Just guessing, as I'm relatively new to the "pro programming" world.

What we're talking about is the ability to register a callback with the music player to notify us when a MIDI message is parsed. This isn't possible simply because the API doesn't support it (i.e. the framework doesn't provide a way to register this callback)! There are two ways, however, to get around this. The first is what I've explained here using a virtual endpoint. The second is to write your own music player and parse the MIDI commands manually. This is more complex but provides maximum flexibility. I'm in the process of writing a tutorial on how to do this. If you want to be notified when it's ready, follow me on Twitter.

I've tried to comment many times but it doesn't show. Your project is very useful for me, but can you help me show a countdown clock (e.g. 3:20)? Also, when playing, I don't know how to show the lyrics of the MIDI file. In the function myMidiReadProc() I can't call any function or reference self. For example, if the MIDI file is 42,350 bytes, how do I get the current byte while playing and call a parser function to get the lyric at that byte? Thank you so much.

I like this project a lot. I wanted to use it within a Mac OS X project and with some small changes got it to the point where it would have been calling MusicDeviceMIDIEvent repeatedly except for the fact that I had temporarily commented it out since it was my one remaining link problem. I had included the necessary frameworks (AudioToolbox, CoreMIDI, CoreGraphics, CoreAudio), but this link could not be resolved. Perhaps it was the 64-bit environment. I tried to change to 32-bit, but that got other errors, perhaps because I wasn't doing it right.

I'm using Mountain Lion and I don't get any problems. Could you tell me which version of iOS you're using? Problems have been reported with iOS 6 where you have to set the UIBackgroundModes to audio. Also, could you tell me the exact error message you're getting?

Thank you for your tutorial, it seems that it could help me a lot.
I had the same problem as Ruben Zilibowitz, with the same error message. (*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'MIDIDestinationCreate failed. Error code: -10844 '§’ˇˇ'')
I added "Required background modes" and the entry "App plays audio", and now I have this error message when I try to run the App:
2013-05-11 22:25:06.693 MidiTest[20025:c07] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[NSURL initFileURLWithPath:]: nil string parameter'
***
I'm under Mountain Lion and the sdk is set to iOS 6.1

THANK YOU for this great writeup. I was able to get the project working when using your SF2 file, Gorts_Filters.SF2. However, when using any number of other SF2 files, downloaded from various places, the app crashes. The console says:

Let's look at how a Sound Font works. Essentially, a Sound Font is a way of storing music samples. Each sample contains all the information needed to produce all the notes supported by MIDI. A Sound Font can contain a large number of samples. Each sample is indexed and retrieved by its patch number. So, for example, patch 1 could be a piano sound and patch 2 could be a flute.

However, there's another level of organization with Sound Fonts in that patches can be arranged into banks. For example, you could have a piano bank which contained a grand piano, electric piano, harpsichord etc... The "load sound font function" I've written uses the default harmonic bank but you could set the parameter in the function to be the percussion bank or any other bank.

The reason you're getting these errors is that you're trying to access a patch that doesn't exist - for example, accessing patch 5 when there are only 2 patches.

To solve the issue, use another piece of software to analyze the Sound Font and then choose a patch which exists.

But I'm a bit puzzled by the fact that MyMIDIReadProc is called with two 'note on' events per note, where only one is expected. Can you explain that? And is there a work-around to fix this?

I want to say again that I really appreciate this thread. You pointed me in the right direction on the SoundFont issue, above, and I've got that fixed.

The problem now is how to make sure that playback occurs at the right tempo. I have two midi files of the same piece of Bach music (2-part Invention #1). In both files the time signature is identical (04 02 18 08). In File #1 the microseconds per beat are 500,000 and in File #2 the microseconds per beat are 722,892. Yet File #2 plays much *faster* than File #1. This is true when using your code (adapted) or using another OS X midi player called Rondo.

I now suspect that this problem is due to a different timebase/PPQN/tempo resolution. File #1 has a timebase of 384, but File #2 has a timebase of 192. I think that the MusicPlayer has a certain pulse rate and is using only 192 pulses per quarter note when playing File #2.

First, is it possible to arrange things so that the MusicPlayer plays the correct number of microseconds per quarter note; that is, so that adjusts properly for the file's timebase? I guess that to do this I'd have to set the pulse rate of the Music Player.

If this is not possible, is there a way to programmatically determine the timebase of a midi file (apart from parsing it separately myself) so that I can adjust the tempo accordingly? This seems like a hack but I'd take it if necessary.

I earlier asked about tempo differences in different midis of the same piece of music. Turns out that the "slower" version had twice as many bars in the file. So that explains it, not some far-out problem with ticks or timings.

However, playback is terrible on the actual device. It seems to be taking too many system resources, in that the audio is heard to be playing, but there are frequent brief pauses in the music and user interaction is essentially suspended (is extremely slow) while audio is playing. When playback ends, the system becomes responsive again.