A few months ago, when I started studying WPF, I noticed that there aren't many articles dealing with WPF and audio features. This is even more noticeable when it comes to Midi audio. So, the scarce material on the subject became the main motivation for this article. The secondary motivation, of course, is that it is a lot of fun to work with Midi, as I found out later.

Although the following article is not written by an expert on music and audio engineering, I hope to provide developers and users in general with at least the basic concepts involving C# and Midi.

To use the WPF Midi Band application provided with this article, Visual Studio 2010 is all you need. If you don't have it, you can download the following 100% free development tool directly from Microsoft:

At this point, you might be wondering why I decided to work with Midi. Wouldn't it be better to use the higher-quality music provided by MP3?

Well, to answer that, I must say that, as an IT professional, what made Midi files appealing to me is that they contain lots of data. Quality data. And this data can be processed in many ways, as you can see in this article. That being said, although Midi music often lacks the beauty and depth of traditional instruments, it is in many ways like a programming language (like C#), where we have commands, arguments and enumerations. In short, it is a program that plays music.

Midi stands for Musical Instrument Digital Interface. It is an electronic protocol established in 1982, aiming to standardize the growing electronic instruments industry of the time, and it remains widely adopted to this day. The Midi protocol does not carry an audio signal. Instead, it is based on event messages: it sends commands to electronic devices, telling them which instrument to play, on which channel, at what volume, pitch, tempo, and so on.

I must admit that, before working on this article, I didn't have much love for Midi music. This is because some Midi files have an instrument, pitch or volume that can be annoying. But there are also many quality Midis out there. It's up to you to find them.

This article would not have been possible without the priceless contribution of an article published here on The Code Project years ago by Leslie Sanford. Leslie's article, entitled C# Midi Toolkit, has become a must-read for C# developers wanting to get more out of Midi.

Leslie Sanford's C# Midi Toolkit

The core of the WPF Midi Band application uses Leslie's excellent toolkit, so if you are really interested in the subject, I strongly recommend reading C# Midi Toolkit.

Basically, you have to follow the steps below to play Midi with the C# Midi Toolkit:

1. Instantiate an OutputDevice
2. Instantiate a Sequence
3. Instantiate a Sequencer
4. Subscribe to the Sequencer.ChannelMessagePlayed event
5. In the Sequencer.ChannelMessagePlayed handler, call outDevice.Send, passing the message to the device as an argument
6. Attach the Sequence to the Sequencer
7. Call Sequence.LoadAsync, passing a file name as an argument
8. Listen to the music

There are other methods, of course, but these steps are the basic ones. The full implementation is as follows:
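A minimal sketch of those steps, using the public types of Leslie Sanford's toolkit (the field names and the file name here are illustrative, not the article's exact code):

```csharp
using System;
using Sanford.Multimedia.Midi;

public partial class MainWindow
{
    private OutputDevice outDevice;
    private Sequence sequence1;
    private Sequencer sequencer1;

    private void InitializeMidi()
    {
        // Steps 1-3: instantiate the device, the sequence and the sequencer.
        outDevice = new OutputDevice(0);   // 0 = default Midi output device
        sequence1 = new Sequence();
        sequencer1 = new Sequencer();

        // Steps 4-5: whenever the sequencer plays a channel message,
        // forward it to the output device so it actually makes sound.
        sequencer1.ChannelMessagePlayed += (sender, e) => outDevice.Send(e.Message);

        // Step 6: attach the sequence to the sequencer.
        sequencer1.Sequence = sequence1;

        // Step 7: load the Midi file asynchronously; start playing when done.
        sequence1.LoadCompleted += (sender, e) => sequencer1.Start();
        sequence1.LoadAsync("sweet_child_of_mine.mid");
    }
}
```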

This article will make use of Midi event messages to show how to play Midi music through a virtual band called WPF Midi Band. Please notice that we are dealing with the Midi data reactively: instead of opening the file and reading the data directly, we tell the Sequence to load the file asynchronously, and then all we have to do is wait and listen to Sequencer events. Whenever the Sequencer signals that a ChannelMessage has been played, we immediately send that message to the OutputDevice, which will cause a sound to be played, stopped, distorted, and so on.

Below we have the table containing the "Melodic Sounds" supported by the Midi protocol:

Midi Melodic Sounds Table

The instrument selection is done by sending a ProgramChange command in the event message to the output device. When you send ProgramChange to the device, you also define the channel number reserved for that particular instrument.

For example, the snippet below is an XML serialization of a single message, taken from the beginning of "Sweet Child of Mine" Midi:

Notice that the ProgramChange command selects the instrument with Id 35 (Data1 = 35, so the instrument is Electric Bass - pick) on Channel #3 (MidiChannel = 3). This single message prepares the output device to process any incoming note events arriving on Channel 3 as Electric Bass notes.
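The same ProgramChange message can be built in code with the toolkit's ChannelMessageBuilder. A sketch (only the values quoted above come from the original; the rest is illustrative):

```csharp
using Sanford.Multimedia.Midi;

// Build the ProgramChange message described above:
// instrument Id 35 (Electric Bass - pick) on channel 3.
var builder = new ChannelMessageBuilder();
builder.Command = ChannelCommand.ProgramChange;
builder.MidiChannel = 3;
builder.Data1 = 35;   // the instrument (program) number
builder.Build();

// Sending it prepares the device to play channel 3 as an electric bass.
outDevice.Send(builder.Result);
```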

As the messages sent to the output device also trigger the HandleChannelMessagePlayed handler back in our code-behind class, we have to decide which instrument (on the screen) should receive those messages.

The first thing we have to check is whether the ChannelCommand is a ProgramChange command. In that case, we store the instrument Id in a dictionary, so that future messages arriving on that particular channel can be directed to the correct instrument:
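A minimal sketch of that idea is shown below. The handler name follows the text; the dictionary name and the routing details are assumptions:

```csharp
// Maps each Midi channel to the instrument (program) Id assigned to it.
private readonly Dictionary<int, int> channelInstruments = new Dictionary<int, int>();

private void HandleChannelMessagePlayed(object sender, ChannelMessageEventArgs e)
{
    ChannelMessage message = e.Message;

    if (message.Command == ChannelCommand.ProgramChange)
    {
        // Remember which instrument now owns this channel, so that later
        // note messages on the same channel reach the right control.
        channelInstruments[message.MidiChannel] = message.Data1;
        return;
    }

    // Note messages are routed to the drums, piano, guitar or bass below.
}
```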

The second thing to be taken into consideration is whether a drum sound is being played or not. This is done by inspecting the MidiChannel and checking whether the message is arriving on the 10th channel. Since MidiChannel is 0-based, we look for MidiChannel = 9. This particular channel is reserved for drum sounds:
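In code, that check can look like the sketch below (the DrumsControl method name is an assumption):

```csharp
if (message.MidiChannel == 9)
{
    // Channel 10 (0-based index 9) is reserved for percussion.
    // The sequencer raises this event on a worker thread, so we marshal
    // the call back to the UI thread before touching the control.
    Dispatcher.BeginInvoke(new Action(() => drumsControl.Send(message)));
    return;
}
```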

Notice above how we send the incoming message to the DrumsControl. Notice also that the event arrives on a thread other than the UI thread, so we have to marshal the call back to the UI thread.

If the incoming message is not a drum sound, we check the MidiChannel number. Then, based on the Melodic Sounds table shown earlier in this article, we decide which control (on the screen) should receive that message.

The Piano Control resembles the original C# Midi Toolkit piano; the result is almost the same. The main difference is that I ported it from Windows Forms to a WPF interface. The Piano Control plays any keyboard-like instrument, such as Grand Piano, Keyboard, Clavinet, and so on. It also plays wind instruments such as flute, brass and saxophone. It might have been a good idea for WPF Midi Band to have one control for each instrument, but that would certainly cause visual confusion. Besides, in most classic rock bands (like WPF Midi Band), such instruments would probably be played on an electronic keyboard.

WPF Midi Band's Keyboard

When selecting the instrument, we send the incoming message to the PianoControl based on the following conditions:
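As a sketch of that kind of check, using the 1-based General Midi program numbering also used in the instrument table above (the exact ranges tested by the original code may differ):

```csharp
int instrument = channelInstruments[message.MidiChannel];

// 1..24  : pianos, chromatic percussion and organs (keyboard-like)
// 57..80 : brass, reeds (saxophone) and pipes (flute)
bool isKeyboardLike = instrument >= 1 && instrument <= 24;
bool isWind = instrument >= 57 && instrument <= 80;

if (isKeyboardLike || isWind)
{
    Dispatcher.BeginInvoke(new Action(() => pianoControl.Send(message)));
}
```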

The Guitar Control is a UserControl made of a guitar arm, containing 6 strings and a series of frets. It has the traditional guitar tuning (EADGBE), but you could replace it with another tuning if you like. This is not configurable by default, so you would have to put your hands on the code and recompile the application.

Each of the strings contains a sequence of notes. Each note falls between two frets on a specific string. The application calculates which interval the note belongs to, and then decides where the visual element representing the note will be shown.
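As a sketch of that calculation (the structure field names and the FretCount constant are assumptions, not the article's actual code):

```csharp
// Find the fret slot where the visual note marker should appear.
// Each fret raises the open-string pitch by one semitone, so the
// distance in semitones from the open-string note is the fret index.
private int GetFretPosition(int noteId, StringInfo stringInfo)
{
    int fret = noteId - stringInfo.MinNoteId;

    // Clamp to the visible fretboard.
    return Math.Max(0, Math.Min(fret, FretCount - 1));
}
```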

WPF Midi Band's Guitar

The messages are sent to the GuitarControl only if the instruments are compatible with the guitar arm:
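A sketch of that test, again using the 1-based General Midi numbering (control and dictionary names are illustrative):

```csharp
int instrument = channelInstruments[message.MidiChannel];

// 25..32 is the guitar family in the 1-based General Midi table:
// nylon, steel, jazz, clean, muted, overdriven and distortion guitars.
if (instrument >= 25 && instrument <= 32)
{
    Dispatcher.BeginInvoke(new Action(() => guitarControl.Send(message)));
}
```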

The string data is stored in an array of a structure named StringInfo, which holds information about each of the 6 strings: the minimum and maximum note Ids, the corresponding grid row, and the Rectangle which represents the visual string on the screen:
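A sketch of that structure (the field names are assumptions based on the description above):

```csharp
// Data for one visual guitar string.
public struct StringInfo
{
    public int MinNoteId;        // lowest note Id shown on this string
    public int MaxNoteId;        // highest note Id shown on this string
    public int GridRow;          // the Grid row where the string lives
    public Rectangle StringRect; // the visual string on the screen
}

// Standard tuning (EADGBE): one entry per string, low E first.
private StringInfo[] strings = new StringInfo[6];
```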

Finally, we must also handle the situations where a note is released. In this case, we remove the selected StringInfo from the dicNotesOn dictionary and remove the visual elements from the Grid:
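A sketch of the release handling (the helper that removes the visual element is hypothetical):

```csharp
// NoteOff, or NoteOn with velocity zero, which the Midi spec
// also treats as a release.
if (message.Command == ChannelCommand.NoteOff ||
    (message.Command == ChannelCommand.NoteOn && message.Data2 == 0))
{
    StringInfo stringInfo;
    if (dicNotesOn.TryGetValue(message.Data1, out stringInfo))
    {
        dicNotesOn.Remove(message.Data1);            // forget the note
        RemoveNoteVisual(message.Data1, stringInfo); // hypothetical helper that
                                                     // clears the Grid element
    }
}
```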

The Bass Control is much like the Guitar Control. The basic differences: it has two fewer strings (the high-pitched ones), and the positions on the instrument's arm correspond to a different range of note values.

WPF Midi Band's Bass

Here goes the code that decides if the message should be sent to the BassControl:
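A sketch of that decision, using the 1-based General Midi numbering (names are illustrative):

```csharp
int instrument = channelInstruments[message.MidiChannel];

// 33..40 is the bass family in the 1-based General Midi table:
// acoustic, electric (finger and pick), fretless, slap and synth basses.
if (instrument >= 33 && instrument <= 40)
{
    Dispatcher.BeginInvoke(new Action(() => bassControl.Send(message)));
}
```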

The Drums Control is probably the most appealing of all. At first, it seemed very difficult to represent a real drummer playing in real time. But then an idea came to me: I could create a "pulsating" animation for each individual part of the drums. Fortunately, this gave the application a very interesting result.

Bass Drum: In music, the bass drum is used to mark or keep time. In marches, it is used to project tempo (marching bands historically march to the beat of the bass). A basic rock and roll beat has the bass drum played on the first and third beats of a bar of common time, with the snare drum on the second and fourth beats, called "back beats".

Floor Tom: A floor tom is a double-headed tom-tom drum which usually stands on the floor on three legs. However, they can also be attached to a cymbal stand with a drum clamp.

Tom-toms: A wide variety of configurations are commonly available and in use at all levels from advanced student kits upwards. Most toms range in size between 6" and 20", though floor toms can go as large as 24". Two "power" depth tom-toms of 12x10 (12" diameter by 10" depth) and 13x11 are a common hanging tom configuration. Also popular is the "fusion" configuration of 10x8 and either 12x8 or 12x9, and the ever-popular "classic" configuration of 12x8 and 13x9, which is still used by some jazz drummers. A third hanging tom is often used instead of a floor tom.

Snare Drum: The snare drum is a drum with strands of snares made of curled metal wire, metal cable, plastic cable, or gut cords stretched across the drumhead, typically the bottom. Pipe and tabor and some military snare drums often have a second set of snares on the bottom (internal) side of the top (batter) head to make a "brighter" sound, and the Brazilian caixa commonly has snares on the top of the upper drumhead. The snare drum is considered one of the most important drums of the drum kit.

The 3 classes of drum instruments above were grouped in this section because they share the same kind of animations. Whenever a sound comes to one of those instruments, the drums control triggers an animation on its ScaleTransform, so the instrument appears to be "pulsating". Of course, real drums would never do that, but in the end the effect captures the feeling of the drumbeat well.

Besides the "pulsating" animation, these instruments also become more or less transparent when beaten.

Here goes a XAML code example showing how these animations are set up for one instrument:
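A sketch of such a storyboard (element and key names are illustrative, not the article's actual markup):

```xml
<!-- One drum part with a scale transform to animate. -->
<Ellipse x:Name="snare" RenderTransformOrigin="0.5,0.5">
    <Ellipse.RenderTransform>
        <ScaleTransform x:Name="snareScale" ScaleX="1" ScaleY="1"/>
    </Ellipse.RenderTransform>
</Ellipse>

<!-- The "pulse": grow suddenly, then settle back to normal size,
     while the opacity flashes back to fully opaque. -->
<Storyboard x:Key="SnarePulse">
    <DoubleAnimation Storyboard.TargetName="snareScale"
                     Storyboard.TargetProperty="ScaleX"
                     From="1.2" To="1" Duration="0:0:0.2"/>
    <DoubleAnimation Storyboard.TargetName="snareScale"
                     Storyboard.TargetProperty="ScaleY"
                     From="1.2" To="1" Duration="0:0:0.2"/>
    <DoubleAnimation Storyboard.TargetName="snare"
                     Storyboard.TargetProperty="Opacity"
                     From="0.5" To="1" Duration="0:0:0.2"/>
</Storyboard>
```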

Hi-Hat: The hi-hat consists of two cymbals that are mounted on a stand, one on top of the other, and clashed together using a pedal on the stand. A narrow metal shaft or rod runs through both cymbals into a hollow tube and connects to the pedal. The top cymbal is connected to the rod with a clutch, while the bottom cymbal remains stationary resting on the hollow tube. The height of the top cymbal (open position) is adjustable. When the foot plate of the pedal is pressed, the top cymbal crashes onto the bottom cymbal (closed hi-hat). When released, the top cymbal returns to its original position above the bottom cymbal (open hi-hat). A tension unit controls the amount of pressure required to lower the top cymbal, and how fast it returns to its open position.

Ride Cymbal: The ride cymbal is a type of cymbal that is a standard part of most drum kits. Its function, very similar to that of the hi-hat, is to maintain a steady rhythmic pattern, sometimes called a ride pattern, rather than to provide accents as with, for example, the crash cymbal. The ride can fulfil any function or rhythm the hi-hat does, with the exception of the open and closed sound. In rock and popular music, another percussion instrument such as a shaker or maraca may be substituted for the cymbal in a ride pattern, especially in quieter styles such as soft ballads or bossa nova.

These two drum instruments have different animations. While the others have that "pulsating" effect, in the Hi-Hat the top cymbal crashes onto the bottom cymbal when the pedal is pressed. This is done by a DoubleAnimation targeting the Y coordinate of the TranslateTransform of the top cymbal:
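A sketch of that animation in XAML (the transform name and the drop distance are assumptions):

```xml
<!-- Drops the top cymbal onto the bottom one, then lifts it back up. -->
<DoubleAnimation Storyboard.TargetName="topCymbalTranslate"
                 Storyboard.TargetProperty="Y"
                 From="0" To="12" AutoReverse="True" Duration="0:0:0.1"/>
```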

Thank you for reading the WPF Midi Band article. I hope you have enjoyed it as much as I did. I also hope the article and the application can be useful to you in some way. And please leave a comment! If you have any complaints, suggestions or doubts, your feedback will be very important, not just for this article, but also for future articles.


About the Author

Marcelo Ricardo de Oliveira is a senior software developer who lives with his lovely wife Luciana and his little buddy and stepson Kauê in Guarulhos, Brazil. He is co-founder of the Brazilian TV guide TV Map and currently works for ILang Educação.

He usually works on serious enterprise projects, but in his spare time he likes to write fun Code Project articles involving WPF, Silverlight, XNA, HTML5 canvas, Windows Phone app development, game development and music.

Best Web Dev article of March 2013
Best Web Dev article of August 2012
Best Web Dev article of May 2012
Best Mobile article of January 2012
Best Mobile article of December 2011
Best Mobile article of October 2011
Best Web Dev article of September 2011
Best Web Dev article of August 2011
HTML5 / CSS3 Competition - Second Prize
Best ASP.NET article of June 2011
Best ASP.NET article of May 2011
Best ASP.NET article of April 2011
Best C# article of November 2010
Best overall article of November 2010
Best C# article of October 2010
Best C# article of September 2010
Best overall article of September 2010
Best overall article of February 2010
Best C# article of November 2009
