Gryphondale – Visionary Sound (http://www.gryphondale.com)

Designing Music NOW (September 29, 2016)

As Managing Editor and Founder of Designing Music NOW, most of my writing can be found at Designing Music NOW and also on Gamasutra. Here are links to some of the articles I have written on various game audio topics. Reviews – sample libraries by companies such as Orchestral Tools, Spitfire, Output, Soundiron, Fluffy Audio, Virharmonic, […]

Composing Adaptive Music for Games with Elias – Part 3 – Stampede and Unity Integration (August 22, 2015)

“The history of the music industry is inevitably also the story of the development of technology.” – Edgar Bronfman, Jr.

“Any sufficiently advanced technology is indistinguishable from magic.” – Arthur C. Clarke

Introduction

Welcome to the third article in this series, “Composing Adaptive Music with Elias.” If you missed the first two articles, they […]

In the first article, I gave an overview of the Elias studio and introduced the importance of using adaptive music in video games. In the second article, I went into detail about the specifics of Elias and some of its advanced features, and how to get the most out of it when composing adaptive music. In this third article, I am going to show you a real-world example of how adaptive music is used in a game, show how simple it is for programmers to get Elias up and running in Unity, and discuss the performance and memory usage of Elias, which turns out to have a negligible impact on the CPU and audio memory (less than 0.2% CPU and only 0.2 MB of audio memory, no matter how large the theme is).

The game I am going to show you is called Stampede, a super fun virtual reality (VR) game being developed by Collin Parker of Black Matter Labs. Collin is an indie game developer based in Los Angeles, and Stampede is being built for the upcoming launch of the Oculus Rift and other VR headsets. A demo level is already available for you to check out for free if you have an Oculus headset here. Those who have played the game are having a blast with it, as you will see in some of the live gameplay videos. Stampede is being developed in Unity, and we are using FMOD for the SFX, ambiences, and VO, and Elias for the adaptive music. FMOD and Elias play very nicely together, as you will see in the implementation and performance video below.

Elias has just announced that the Elias Engine is FREE for development budgets under 150K USD, so there is no reason not to download it and try it out in your game. My experience with it has been incredibly positive, and I think you will be glad you did!

Black Matter Labs

I spoke with Collin, founder of Black Matter Labs, about the game’s development, how the music was implemented, and his experience working with Elias in this video interview:

Collin and I met at Indiecade in Los Angeles last year and have been working together on this game since then. He has been a pioneer in trying out new technologies and is an avid supporter of the Oculus Rift and other VR headsets such as the HTC Vive and the Sony Morpheus.

Stampede

Stampede is a VR tower defense game, inspired by the movie Jumanji, in which crazed animals are attacking your village. The game is set in Africa, so the animals are all pretty fierce when they get riled up – you will be battling elephants, hippos, zebras, and hawks, among other animals, on the first Savannah level. Other levels include a wide variety of animals from land, air, trees, and sea as you lead up to the final boss level, which is fought in a volcano.

Since this game involves waves of big game animals, I asked Collin if he has anything against elephants, zebras and hippos. He assured me that he is an avowed animal lover, and is not promoting hunting or killing of any animals in any way. In fact, he created this fictional world where the animals have become crazed by some unknown source, which you will have to discover and remove as part of the game’s backstory.

There are tons of great reviews of the demo level of the game, and I put together a video with excerpts from four of them to give you an idea of the gameplay and how much fun people are having with it:

Here is the official trailer:

Music in Stampede

When I first met Collin at Indiecade LA last year, I got to sit down and play an early version of the game. I was all smiles and sweating bullets by the end of my first attempt (it was hot that day too…). It was one of my favorite games from the show, and I was thrilled when he approached me soon after to work on the SFX, VO, and music for the game. I had just discovered Elias, and I sent him an early version of the music in Elias, which he loved. However, the Unity plugin had not been developed at that time, so we decided to implement the music in FMOD for the time being, since we were already using it for the SFX and VO. As soon as the Elias Unity plugin was available, we switched back to Elias, and we are now using it for all the music in the game.

This gave me a unique perspective to compare the difference between my FMOD implementation of the music and the Elias implementation.

Walkthrough of Elias Theme for Stampede:

FMOD, WWISE, Fabric and Elias

Elias works perfectly well with FMOD in Unity, as it would with any other audio middleware such as WWISE or Fabric, because they serve different purposes. The other popular audio middleware products were developed first as sound effect playback engines and now also support music, but Elias was developed by composers for composers and is a sophisticated music-only solution. Though the others can be used to play back music adaptively, they do not have any built-in templates; you need to learn the tool, then learn how to create adaptive music solutions from scratch, and then implement them yourself.

These other middleware solutions are best at playing back music horizontally, meaning that you can switch from one loop of music to the next depending on the requirements of the game, and you can fairly easily add things like stingers to smooth the transitions. However, if you want to attempt to do something as sophisticated as Elias can do out of the box, you are going to run into some severe limitations. These same limitations would apply equally to WWISE, Fabric or any other middleware that is designed primarily for sound effects. Elias uses the vertical resequencing method for playing back adaptive music, and as you will see, this is by far the smoothest and most advanced way to implement adaptive music in games.

Another huge benefit is the incredible variety that the vertical resequencing method gives over the simpler horizontal method. In Elias, an exploration theme with 6 tracks and 6 variations per track can result in 6 to the power of 6 combinations, or 46,656 variations! In the horizontal approach, if you have 6 loops, you have exactly 6 variations, and this can lead to boring, repetitive music and ear fatigue that usually results in the player just turning off the music.
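To make the counting argument concrete, here is a quick Python sketch (not Elias code, just the arithmetic behind vertical layering versus horizontal loop swapping; the function names are my own):

```python
def vertical_variations(tracks: int, variations_per_track: int) -> int:
    """Each track independently picks one of its variations, so the
    possible combinations multiply."""
    return variations_per_track ** tracks

def horizontal_variations(loops: int) -> int:
    """A horizontal setup swaps whole pre-rendered loops, so you get
    exactly one variation per loop."""
    return loops

print(vertical_variations(6, 6))  # 46656
print(horizontal_variations(6))   # 6
```

Even a modest theme explodes combinatorially, which is exactly why exploration themes stay fresh for so long.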

For example, my early FMOD music player implementation looked like this:

It was basically a horizontal approach which simply played pre-rendered musical scenes depending on the wave of animals in the Savannah level. There are 10 waves in the Savannah level and there were 6 “levels” of pre-rendered music, so Collin distributed these more or less evenly across the waves, with the most intense music looping during the most difficult final 2 waves. This simple implementation took about a week to set up in FMOD. There are other ways to implement adaptive music in FMOD and other middleware, but to do what Elias can do out of the box, it would take me weeks or months to build a similarly complex music player, and it probably would not have all the features explained in my second article. If you are concerned with performance, as you should be, you will see in the performance section that Elias offers huge benefits here as well.
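For illustration, a hypothetical Python version of that wave-to-level distribution might look like this (the function and the even-spread formula are my own, not Collin’s actual FMOD logic):

```python
def level_for_wave(wave: int, num_waves: int = 10, num_levels: int = 6) -> int:
    """Spread the pre-rendered music levels roughly evenly across the
    waves (0-indexed), pinning the most intense level to the final two."""
    if wave >= num_waves - 2:
        return num_levels
    # Map the earlier waves onto levels 1..num_levels-1.
    return 1 + (wave * (num_levels - 1)) // (num_waves - 2)

print([level_for_wave(w) for w in range(10)])  # [1, 1, 2, 2, 3, 4, 4, 5, 6, 6]
```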

Finally, if you want to use other middleware in conjunction with Elias and still be able to control the volume of the music playback, it is a simple matter to run the audio stream through that middleware so you can do things like sidechaining for dialog or sound effects.

Elias Theme for Stampede

The theme for the Savannah level has 15 levels, and can be heard here:

And here is what it looks like in Elias:

There are also a bunch of stingers, as you can see here:

Implementation

Integrating Elias into Unity is very simple; according to Collin, once you know where the music theme should go (the StreamingAssets folder by default), it only takes a matter of minutes. However, it does take the usual amount of time to set up the game logic to play back the music the way you want it to react in the game. For example, in Stampede, the first 3 levels are idle music which plays before and after a wave of animals starts “coming to destroy the village!” After that, the levels gradually increase and decrease depending on the number of animals attacking or remaining. In the early waves, the theme may reach up to level 5 or 6 of intensity and then slowly fall back to level 1. On the later waves, when animals are coming at you from every angle, even the air, the intensity can reach the top level before heading back down to level 1 as you defeat more and more of them.
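As a sketch of the kind of game logic involved (this is hypothetical Python, not Stampede’s actual code; the names and the scaling rule are my own assumptions):

```python
def music_level(animals_alive: int, wave_spawn_total: int,
                wave_max_level: int, idle_levels: int = 3) -> int:
    """Return an Elias intensity level: idle music when nothing is
    attacking, otherwise scale with how much of the wave remains."""
    if animals_alive == 0:
        return idle_levels  # settle back to idle between waves
    fraction = animals_alive / wave_spawn_total
    return idle_levels + max(1, round(fraction * (wave_max_level - idle_levels)))

print(music_level(0, 10, 6))   # 3  (idle between waves)
print(music_level(10, 10, 6))  # 6  (the full wave is attacking)
print(music_level(1, 10, 6))   # 4  (wave almost cleared)
```

The game would poll something like this each frame (or on spawn/kill events) and hand the result to the music engine.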

Though this is not intended to be a complete tutorial on implementing Elias in Unity, I did try it myself and found it incredibly easy. Just download the Elias Unity Plugin here, and then double-click on it (you have to have Unity installed on your computer, and it works with version 4.6 and above). Once you have done that, Elias is installed! They also include a demo theme that is automatically placed in the proper directory, the /Assets/StreamingAssets folder, as shown here:

You can put your own themes in the same directory following the instructions in the ReadMe, which state: “The folder structure for a theme called ‘demo’ would thus be:

Assets/StreamingAssets/ELIAS_Themes/demo, where the directory demo contains demo.epro and /audio“
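Assuming the layout from the ReadMe above, a tiny Python helper makes the expected paths explicit (the helper itself is mine, purely for illustration):

```python
def theme_paths(theme_name: str, assets_root: str = "Assets"):
    """Build the theme folder, .epro project file, and audio folder
    paths described in the Elias ReadMe."""
    theme_dir = f"{assets_root}/StreamingAssets/ELIAS_Themes/{theme_name}"
    return theme_dir, f"{theme_dir}/{theme_name}.epro", f"{theme_dir}/audio"

print(theme_paths("demo")[1])  # Assets/StreamingAssets/ELIAS_Themes/demo/demo.epro
```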

There are also a number of exquisitely composed themes available by searching for Elias on the Unity Asset Store, such as this gem that was first demoed at GDC 2015:

Elias Unity Integration and Performance

Here is a video where I interview Collin about the adaptive music engine, integration of Elias into Unity (at 4:15), the performance of Elias in Unity (7:20), and the target platforms (10:00):

Performance and Memory Usage

In the Stampede theme, there are 8 music tracks and 5 stinger tracks – 52 separate “stems” for the music and 25 stingers – and the total file size for the theme is 43 MB of OGG files. However, when the music is played back in Unity, as you can see in the Elias Unity Integration and Performance video, only 0.2 MB of audio memory is used at any time! In addition, only 0.2–0.3% additional CPU is used, so the impact on overall performance is negligible. This is because Elias uses a configurable streaming buffer which defaults to 400 ms, which means only a small slice of the entire theme is present in memory at any given time!
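Some back-of-the-envelope arithmetic shows why the memory stays flat no matter how large the theme grows: only the decoded streaming buffers live in RAM. This sketch is my own illustration, not Elias internals, and it assumes 44.1 kHz stereo 16-bit buffers:

```python
def buffer_memory_mb(active_stems: int, buffer_ms: int = 400,
                     sample_rate: int = 44100, channels: int = 2,
                     bytes_per_sample: int = 2) -> float:
    """Decoded audio resident in memory for the active streaming
    buffers; note that the total theme size never appears here."""
    samples = sample_rate * buffer_ms / 1000
    return active_stems * samples * channels * bytes_per_sample / (1024 * 1024)

print(round(buffer_memory_mb(3), 2))  # 0.2
```

Under these assumptions, about three concurrently decoded stems lands right around the 0.2 MB figure measured in the video, and doubling the theme to 86 MB of OGG files would change nothing.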

Elias Demo Player

I have recorded a video showing how to put your own theme into the Unity Demo player along with screen capture of the Demo Player working. I go into where to find the very well written documentation and other Elias assets in some detail here:

Conclusion

My thanks to Collin and to the amazing programmers and composers at Elias Software for letting me pick their brains in order to write this in-depth series of articles. It has been a long journey, and if you have read all three articles in full, pat yourself on the back! You are on your way to having some of the best-sounding adaptive music available in your game, and you won’t be sorry you tried it out!

Composing Adaptive Music with Elias – Part 2 – Advanced Concepts (August 22, 2015)

“The days of pure pencil-and-paper music composition are gone for good. If we wish to compose music in modern times, we must be able to navigate the wonderful world of computers and technology.” – Winifred Phillips, “A Composer’s Guide to Game Music”

Introduction

In my last article, I introduced some of the basic concepts of writing adaptive music using a middleware tool called Elias from Elias Software. In that article I promised to go into more detail and depth about the advanced features of Elias, so let’s dive in! If you have not had a chance to read it yet, it would be a good idea to familiarize yourself with it here:

Elias, like a good video game, is easy to use but difficult to master, so some detailed explanation is necessary to understand its advanced capabilities. This is a new type of tool with some new features, and therefore there are terms that need more explanation, such as Objective Mode, Exploration Mode, Agility, Rendezvous, and Urgency. I will go into reverb and reverb tails; fades and crossfades; agility and setting custom agility beat points; changing keys in Elias and understanding the Rendezvous setting; and the concept and usage of pickup beats for stingers. I will also discuss some optimal settings for reverb and fades.

I take a fairly deep dive into the inner workings of Elias, but I also have created some videos for you to watch that will help to show the principles with real world examples, including a song I wrote specifically for this blog called Blade Revisited. By the time you finish this article, you should understand all the finer points of Elias and be able to use it like a pro!

Let’s check out Blade Revisited and an overview of some of the features I will be going into in this article:

Objective and Exploration Modes – In Theory and Practice

Objective mode is used in scenarios where the music needs to go from one phase of gameplay to another. In the score I created for this article, imagine you are a futuristic bounty hunter wandering through the techno-slums of a world similar to The Fifth Element or Blade Runner, or games like Deus Ex. There is some tension building, but primarily you are just exploring the city. Then, out of the corner of your eye, you see your target, the evil slave runner Gothmong. Your heart quickens as you begin to chase Gothmong through the underbelly of the city. Eventually, you catch up to him; he is surrounded by his loyal goons, and a battle ensues.

In order to score this sequence of events, you would first need an exploration theme to play as you wander the city in search of Gothmong. Then you would switch to an objective mode for the chase leading into the battle mode. The battle would grow more and more intense. The intensity of the music could be correlated and controlled by game parameters such as your health and also Gothmong’s health or number of henchmen left. It would need to be able to play different outcomes based on your success or failure. Finally, if you succeed, then you would need the music to settle back down into the calmer, exploration mode theme.
Let’s take a listen to the three “scenes” that I created using Elias for this article:

Exploration mode plays the different layers in the tracks randomly. One of the most annoying things about looped music in video games is that if you spend a lot of time in a particular area, you will hear the same music over and over again until you are sick of it, so one of the big advantages of exploration mode is variety. If you had 6 tracks with 6 variations each (drums, bass, melody, harmonies, etc.), then you would have 6 to the power of 6 combinations, or 46,656 variations! One of the most important things to remember when writing exploration themes in Elias is to make sure all the layers on the tracks play well together. To accomplish this, it is wise not to have too much difference in dynamic range between the different layers. For example, you can use different instruments for the melody or bass, but make sure they are all played at around the same volume. Also, within a track, you would not want wild variations in the timbre or note ranges of the instruments. Try to keep them in similar ranges, as you would for vocalists of a similar range: bass, tenor, alto, soprano, etc.
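Conceptually, each pass through the loop is like drawing one variation per track at random. A minimal Python sketch of that idea (not how Elias actually weights its choices):

```python
import random

def pick_combination(num_tracks: int = 6, variations: int = 6, seed=None):
    """Independently pick one variation index per track; any of the
    6**6 = 46,656 combinations can come up."""
    rng = random.Random(seed)
    return [rng.randrange(variations) for _ in range(num_tracks)]

combo = pick_combination(seed=1)
print(len(combo))  # 6
```

Because every track draws independently, two consecutive passes almost never sound identical, which is the whole point of the mode.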

Objective mode is great for music that needs to go from one activity or emotion to another in the game. In our example, the chase theme and battle theme are similar, but vary in intensity and in the emotion characterized by the music. When the character is moving from one scene to the next, or one type of activity to the next, is when it is best to use objective mode. You have a lot more control over the type of music that will be played when, and Elias becomes a kind of conductor or arranger who follows the player around making sure the right music is played at the right time, just as you would hear in a film.

Here is a video showing Exploration mode for the Blade Revisited theme and comparing it to the same theme in Objective mode:

Reverb and Reverb Tails

The brain’s ability to detect reverb is an amazing thing. We evolved this ability to be able to tell the size of the space we are in and what objects are near us. Dolphins’ ability to echolocate is, in some sense, just a reverb superpower.

For example, if I were to play the sound of a snare drum in a closet and then again in a large cave, our ears would clearly perceive the difference. The brain would be able to detect the size of the room it was played in with amazing accuracy. Sound designers know this and most 3D games take reverb very seriously to ensure realism, and there are even some cool plugins developed by Impulsonic that do very accurate realtime modeling of rooms and spaces in games.

I find it useful to visualize these concepts, especially when writing about them, so here is an example of a simple snare drum hit without reverb (top) and with a long reverb tail (bottom).

Composers who ignore reverb do so at their own peril. Well, maybe not peril, but it is very important to understand and employ it correctly.

In games that do not employ audio middleware, composers have to pay attention to reverb tails and handle them manually: we have to copy the reverb tail from the end of the loop and mix it into the beginning of the loop. This is a lot of extra work, and it is not a perfect solution, since having an unexplained reverb source in the mix the first time the loop plays can sound very strange (a fade in can help here).

Elias handles this tricky situation beautifully by including a reverb feature that is applied to the audio at the end of the effects chain. What this means is that when the song loops back to the beginning, or even when it changes from one layer to the next, the reverb tail is added back in by the software engine, and you don’t have to do anything to the source files! Let’s check out an example of how that works in Elias in this next video:

Note that when you start a new theme, the default settings in the theme settings dialog have the reverb set to ON, as seen here:

If the mix is set to 0, the reverb is completely disabled irrespective of what the other settings are. This is probably a good idea IF you plan to set the reverb track by track, as shown here:

Note, however, that the more tracks there are with different reverbs applied, the greater the impact on CPU usage during playback. To help overcome this limitation, Elias automatically employs submixes for reverbs that have the same settings. For example, if you have 6 tracks in your theme but only two different reverb settings, then only 2 reverbs will be used for all the tracks.

Think of the mix as the percentage of the signal that is made up of reverb (if you had it set to 100, it would be almost all reverb) and the size as the size of the room. As for optimal settings, in my experience with Elias a pretty natural-sounding room reverb is Mix = 20, Size = 20, and Damp = 35. If you were to set size to 10, mix to 10, and damp to 60, it would be like playing in a small carpeted room. If you set size to 100, mix to 30, and damp to 60, it makes a very convincing cave reverb. I give several examples in the video that you can listen to, and I encourage you to experiment on your own with simple sounds, like a snare, to see how the parameters affect the output.

In general, it is best to import dry or fairly dry stems into Elias and let Elias handle the reverb for you. Also, I have found that the music tends to sound most natural when there is at least some reverb on the mix. Finally, try not to have too many different reverbs in the theme – it is more efficient to use no more than two or three different reverb settings, say one for the drums, one for choirs, and one for the other instruments.

Fades and Crossfades

In general fades should be so clean that they are not noticed at all. If a fade is too obvious, it really can destroy the immersiveness of the music and our sensitive hearing can easily pick up a bad fade.

In Elias, you could just use the default settings and possibly never have any issues with fades, but as you will see, there are some types of instruments for which you will definitely want to tweak the default settings to get the best results.

Another area of Elias that can be controlled with a great deal of specificity is fades. You can set ranges for fade ins, fade outs and crossfades.

To see how all this works, check out this short video here:

Here is the default setting in the global fade settings, in milliseconds:

The only time you will hear the fade ins is when a track starts playing, or when the track goes from a layer with silence to a layer with music. To see this visually, check out what happens to a kick drum with a 0 ms fade in (top) and a 100 ms fade in (bottom). On tracks that have a quick attack, like drums, pizzicato strings, or mallets, you might be tempted to turn the fade to 0; however, in practice this is not a good idea. It is better to set the fade between 5 and 10 ms in these cases in order to avoid any pops or DC offsets during a layer change.

Conversely, the only time you will hear the fade out is when the music on a layer goes to a layer with silence on it. To see this, check out the snare drum with reverb on it – the top track has no fade out and is not nearly as smooth as the bottom track, which has a 500 ms (half-second) fade out.

Crossfades, however, are happening all the time and deserve some looking into. Different types of instruments require different crossfade settings. Let’s look at this example of a kick drum crossfading with a snare hit. The first two beats are the kick, and the second two are the snare. There is reverb on the track, so you can also see the reverb tails here. In the top track, there is no crossfade, and the snare plays without any of the kick’s reverb tail. In the bottom track, the crossfade is set to 500 ms, and you can see how the reverb tail of the kick drum colors the sound of the snare in the third beat and is mostly gone by the fourth beat. This is a more natural effect, and in general you will want to use crossfades.
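The overlap itself is easy to picture in code. Here is a minimal linear crossfade over lists of samples (a sketch of the general technique; Elias’s actual fade curve is not documented here, so linear is an assumption):

```python
def crossfade(tail, head, fade_samples):
    """Ramp the last fade_samples of `tail` down while the first
    fade_samples of `head` ramp up, summing where they overlap."""
    out = list(tail[:-fade_samples]) if fade_samples else list(tail)
    for i in range(fade_samples):
        g = (i + 1) / fade_samples  # gain ramps 0 -> 1 across the fade
        out.append(tail[len(tail) - fade_samples + i] * (1 - g) + head[i] * g)
    out.extend(head[fade_samples:])
    return out

print(crossfade([1, 1, 1, 1], [0.5, 0.5, 0.5, 0.5], 2))
# [1, 1, 0.75, 0.5, 0.5, 0.5]
```

Note how the overlapping region carries a blend of both signals, which is exactly how the kick’s reverb tail bleeds into the snare in the example above.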

What are the minimum and maximum settings you see in the dialogs? I will go into those more when I discuss Urgency later on, but for now you can think of it like this: when Urgency is set to 100, or most urgent, the minimum fade setting is used. When Urgency is set to 0, or least urgent, the maximum fade setting will most likely be applied. Why do I say most likely? Because the engine uses heuristics to determine the best settings for each type of track depending on the track’s amplitude and transients.
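In other words, Urgency most likely interpolates between the two bounds, something like this hedged sketch (the real engine layers its amplitude and transient heuristics on top, so treat this as a mental model only):

```python
def fade_duration_ms(urgency: int, fade_min_ms: float, fade_max_ms: float) -> float:
    """Urgency 100 -> minimum fade, urgency 0 -> maximum fade,
    linear interpolation in between."""
    u = max(0, min(100, urgency)) / 100
    return fade_max_ms - u * (fade_max_ms - fade_min_ms)

print(fade_duration_ms(100, 50, 500))  # 50.0
print(fade_duration_ms(0, 50, 500))    # 500.0
```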

Now that you understand the three types of fades and how they are applied to the music, what are some optimal settings? Here you can see the default settings that Elias will use if you do not set them yourself.

For the fade in, you will want to set both min and max very low – 0 to 10 ms, as mentioned above – for kick drums and anything else that has a sharp attack. For legato instruments, choirs, and pads, you will want to set the min to at least 50. A 50 ms fade is not perceptible in a large mix and therefore is a very safe setting, but sometimes it is not enough, especially for music that is not tightly quantized. For those cases, you may want to set the min between 100 and 250. If it is too long, say 500 ms, then you will definitely notice it, and it probably should be avoided unless you want to apply that effect to the music.

For crossfades, the default settings are pretty good for most cases, but you can experiment with much quicker crossfades. You can add longer crossfades on slow legato passages or even on tracks that have a lot of dynamic range between each of the layers. For example, if you have some soft strings on one layer followed by loud horns on the next layer, you may want a longer and smoother transition there to keep from startling the listener. Stingers, which will be discussed in more detail below, can help to smooth these kinds of transitions as well, but are not required when you have the ability to fine tune fades so carefully.

For fade outs, you see they are defaulted to 500 ms. This half second fade usually works quite well and makes the transitions to silence (in that track) very smooth. The only time you would want to set it to 0 would be if you absolutely don’t want to hear any of that track when the next layer starts, but as I mentioned above, it would be better to set it to 5 or 10 ms in these cases to avoid pops. Longer fade outs can be useful in cases where you would want to have a lead instrument ring out into the start of the next layer, especially if it has reverb on it.

Agility and Custom Agility Beat Points

Agility is another key concept in Elias. It is what sets Elias apart from other middleware solutions I have used for music, and it requires some experimentation to understand how it works. Basically, it is the setting that tells Elias when it is OK to switch from one layer to the next. Usually, you want this to happen in time with the music, and very often you want it to happen on the first beat of a new measure. However, with Elias, you can set that to any beat of any measure you want! Moreover, by adjusting additional settings, you can give Elias more or less control over the times at which it will switch.

Let’s say, for example, you have a 4-bar melody line, but the harmony (the chords) is changing every two beats. You can set the melody track to change only every 4 bars, but tell the chords it is OK to change every 2 beats! If you have the melody played by a violin on one layer and a flute on another, you would not want the melody line to change in the middle of an important phrase. However, it would be great to have it change on the next phrase, and even to do a call-and-response-style interchange between several different instruments, all while the harmonies and rhythms can be as agile as they like underneath.
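You can think of agility as generating a grid of allowed switch points. Here is a small Python sketch of that idea, with the melody and chord examples above (my own illustration of the concept, not Elias internals):

```python
def switch_points(beats_per_bar: int, total_bars: int, agility_beats: int):
    """List the (bar, beat) positions where a layer change is allowed,
    given an agility measured in beats."""
    total_beats = beats_per_bar * total_bars
    return [(b // beats_per_bar + 1, b % beats_per_bar + 1)
            for b in range(0, total_beats, agility_beats)]

print(switch_points(4, 8, 16))  # melody, every 4 bars: [(1, 1), (5, 1)]
print(switch_points(4, 2, 2))   # chords, every 2 beats: [(1, 1), (1, 3), (2, 1), (2, 3)]
```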

There are two places where agility can be set, in the Theme Settings dialog, which will be the global settings for all tracks:

and the Settings dialog for each individual track:

In the image above, on the left-hand side you can see the default setting, which is 1 bar. This means that Elias will default to switching every bar, on the first beat of that bar, unless you override that setting. On the right in the image above, you can see that Agility has been turned on, and a custom agility input box is shown by selecting “Custom Setting” from the drop-down menu. Custom beat points can be added manually by clicking the “Add” button, but can also be added by starting playback (F5) and then just tapping the spacebar when you want to set a beat point.

This can be done in either the Theme Settings or the Track Settings, and here is an example of how it might look:

Check out the video to see this in action:

As you saw in the video, “strict” agility means that Elias will strictly adhere to the exact beat points you set in these dialogs, following the logic you indicate. For example, if you click the “Strict Agility On?” radio button in the Theme Settings dialog, then Elias will obey those settings. If you then, on a track-by-track basis, click the “Override Strict Agility in Theme Settings for this Track?” radio button, then for that track Elias will use the settings you made there. If “Strict” is never set, Elias will use its best judgment to adhere to the agility settings in those dialogs, but it may choose to switch at a musically better time. What is an example of this? Well, in the case of the kick drum with no reverb, there is a silence before each beat. Elias can determine this and know that it is OK to switch at any point where there is silence before the beat. Elias apparently uses many other factors when determining when it is appropriate to switch, and that is part of its magic special sauce.

Changing Keys and Rendezvous

It is very common in video game music to play the same themes in different keys. Each key has its own feel, and key changes are one way game music can keep the player from getting bored. Elias has a built-in feature that allows for key changes, provided you have exported your song in different keys. It does not do any warping or transposition of the music for you, but most modern DAWs have this capability. Rather than relying on the DAW to transpose, you can also just record in MIDI and then bounce to different audio files in different keys.

Once you have the new set of stems in the new key, just change the key in the Track Settings dialog (as shown in the bottom image) and then load them into Elias.

Then, in the player (F5) you can switch between the two keys very easily by selecting the key you want.

Notice the check box above Key which says Rendezvous? This is an important concept in Elias and it is directly applicable to key changes. When Rendezvous is checked, all the tracks in the theme will switch at the same rendezvous point on a level change. In the image below, I have just switched from Level 1 to Level 2 in the player, and the tracks on Level 2 are blinking. They will all switch at exactly the same time when Rendezvous is checked.

The reason this is important is that when you are switching from one key to another, you want all the tracks to change at the same time; otherwise, you would get some very dissonant-sounding music! Unless you are going for a free-jazz-style effect, synchronized switching is exactly what you want.

Now let’s take a look at this short video on Key Change and Rendezvous:

There is another place where you can set the Rendezvous which is the final tab in the Theme Settings dialog, as shown here:

In the example above, you can tell the theme exactly when it is OK for a rendezvous to take place. Here, the music will only change on the first beat of bars 1,2,5, and 6, or the third beat of bars 3,4,7 and 8. Note this is NOT the same as Agility in that this will only be used if you have the rendezvous button clicked on the player, or if you are doing a key change. One final note – do you see that Fade of 75 ms on the bottom of the Rendezvous dialog? That is a global setting and it will apply to all tracks anytime a rendezvous occurs, even if you have set the fade out to 0 in the track…so beware of this.

Stingers and Pickup Beats

One of the coolest things to add to video game music are called stingers. You hear them all the time in game soundtracks, and they are used in genres like EDM and more and more in pop music as well. Stingers can be of many types, but the most common are risers, cymbals, orchestra hits, drum hits or rolls (including timpani), and SFX. These are often used to help hide or smooth transitions, and can also be used to signal a big change in the music. Key changes are often accompanied by stingers as well. Here are some examples:

Here are some hits:

And here are three different risers, which culminate in a hit at different number of bars (I used a great program called Rise and Hit by Native Instruments which you can use to create some of these with specific crescendo points:

In the second example, above, you can see that it would be important for Elias to know when to start playing the stinger so that the hit occurs at just the right moment. Well, that is where they have implemented the concept of pickup beats, which is located on the stingers page under the Track Settings.

As you can see in the image above, the first track, Hits, has no pickup beats because the cymbal crash, for example, would play right when the transition happens. For the 1 Bar Risers, where the Hit occurs after the first four beats, the PickupBeats radio button is clicked and the number of beats is set to 4, or 1 bar of the 4/4 song. This works wonderfully, as you can hear in this video example:

Urgency

The final thing that ties together everything you have learned so far is the concept of Urgency. Unlike other settings, there is no global setting for this parameter. It’s default value is 100. Urgency takes two things into account, the fade settings and the agility settings. As I described earlier in the description of fades, if Urgency is set to 100, then the shortest fade will be used. If it is set to 0, then the longest fade will be used. But that is not all that Urgency does. If you have it set high, to 100, then the engine will try to make the nearest possible Agility setting that it can. So if you have Agility set to every beat for example, then Elias will try to make sure it switches on the nearest beat. However, if you have the urgency set to 0, then Elias can be more picky about when it will switch. It will eventually ALWAYS make the switch to a new level, basically as soon as it can find a good place to do so.

Here is how Urgency works in action:

Conclusion

As you can see from this article, the designers of Elias have put in a tremendous amount of thought and care into making sure that you have a great deal of control over the playback of the music. As composers, it is important to understand all these capabilities so that when you are writing music for the game, you will know what the engine can do to make the most out of your hard work.

Prior to Elias and other middleware, game composers had to settle for writing loops that got to be very monotonous for the players. This caused ear fatigue to the point where many of us would just turn the music off rather than listen to another loop through of the same theme, no matter how good it was.

Now, we can offer gamers a tremendous amount of variety and we can also score their actions just like they were in a film. This does take extra time and planning on the part of the composers and programmers, but in the games I have worked on using this tool the benefits are clearly apparent. In the next article, I will show you a real world example of how this works in the VR game Stampede being developed by Black Matter Labs. It will show how Elias has been integrated into a project using FMOD and UNITY, and I look forward to seeing you back here then.

“The days of pure pencil-and-paper music composition are gone for good. If we wish to compose music in modern times, we must be able to navigate the wonderful world of computers and technology.” – Winifred Phillips, “A Composer’s Guide to Game Music”

Introduction

In my last article, I introduced some of the basic concepts of writing adaptive music using a middleware tool called Elias from Elias Software. In that article I promised to go into more detail and depth about the advanced features of Elias, and so let’s dive in! If you did not have a chance to read it yet, it would be a good idea to familiarize yourself with it here.

Elias, like a good video game, is easy to use but difficult to master. That means some detailed explanation is necessary to understand some of the advanced capabilities of the tool. This is a new type of tool with some new features, and therefore there are some terms that need more explanation, such as Objective Mode, Exploration Mode, Agility, Rendezvous, and Urgency. I will also go into Reverb and Reverb Tails; Fades and Crossfades; Agility and setting Custom Agility Beatpoints; Changing Keys in Elias and understanding the Rendezvous setting; and the concept and usage of pickup beats for stingers. I will also discuss some of the optimal settings for reverb and fades.

I take a fairly deep dive into the inner workings of Elias, but I also have created some videos for you to watch that will help to show the principles with real world examples, including a song I wrote specifically for this blog called Blade Revisited. By the time you finish this article, you should understand all the finer points of Elias and be able to use it like a pro!

Let’s check out Blade Revisited and an overview of some of the features I will be going into in this article:

Objective and Exploration Modes – In Theory and Practice

Objective mode is used in scenarios where the music needs to go from one phase of gameplay to another. In the score I created for this article, imagine you are a futuristic bounty hunter wandering through the techno-slums of a world similar to The Fifth Element or Blade Runner, or games like Deus Ex. There is some tension building, but primarily you are just exploring the city. Then, out of the corner of your eye, you see your target, the evil slave runner Gothmong. Your heart quickens as you begin to chase Gothmong through the underbelly of the city. Eventually, you catch up to him, he is surrounded by his loyal goons, and a battle ensues.

In order to score this sequence of events, you would first need an exploration theme to play as you wander the city in search of Gothmong. Then you would switch to an objective mode for the chase leading into the battle mode. The battle would grow more and more intense. The intensity of the music could be correlated and controlled by game parameters such as your health and also Gothmong’s health or number of henchmen left. It would need to be able to play different outcomes based on your success or failure. Finally, if you succeed, then you would need the music to settle back down into the calmer, exploration mode theme.
Let’s take a listen to the three “scenes” that I created using Elias for this article:

Exploration mode plays different layers in the tracks randomly. One of the most annoying things about looped music in video games is that if you spend a lot of time in a particular area, you hear the same music over and over until you are sick of it. One of the big advantages of exploration mode is variety: if you had 6 tracks with 6 variations each (drums, bass, melody, harmonies, etc.), you would have 6 to the power of 6, or 46,656, possible combinations! One of the most important things to remember when writing exploration themes in Elias is to make sure all the layers on the tracks play well together. To accomplish this, it is wise not to have too much difference in dynamic range between the layers. For example, you can use different instruments for the melody or bass, but make sure they all play at around the same volume. Also, within a track, avoid wild variations in the timbre or note ranges of the instruments; try to keep each track in a consistent range, much as you would cast vocalists of a similar range: bass, tenor, alto, soprano, and so on.
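The variation count above is simple combinatorics, and easy to check for any project size. A quick sketch in plain Python (this is my own illustration, not part of Elias):

```python
def mix_count(tracks: int, variations_per_track: int) -> int:
    # Each track independently picks one of its layer variations,
    # so the number of distinct full mixes is variations ** tracks.
    return variations_per_track ** tracks

# The example from the text: 6 tracks, 6 variations each.
print(mix_count(6, 6))  # 46656
```

Even a modest project with 4 tracks and 3 variations each yields 81 distinct mixes, which is why exploration mode staves off repetition so effectively.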

Objective mode is great for music that needs to move from one activity or emotion to another in the game. In our example, the chase theme and battle theme are similar but vary in intensity and emotional character. Objective mode is best used when the character moves from one scene, or one type of activity, to the next. You have much more control over which music plays when, and Elias becomes a kind of conductor or arranger, following the player around and making sure the right music is played at the right time, just as you would hear in a film.

Here is a video showing Exploration mode for the Blade Revisited theme and comparing it to the same theme in Objective mode:

Reverb and Reverb Tails

The brain’s ability to detect reverb is an amazing thing. We evolved it to judge the size of the space we are in and what objects are near us. Dolphins’ ability to echolocate is, in some sense, just a reverb superpower.

For example, if I were to play the sound of a snare drum in a closet and then again in a large cave, our ears would clearly perceive the difference. The brain would be able to detect the size of the room it was played in with amazing accuracy. Sound designers know this and most 3D games take reverb very seriously to ensure realism, and there are even some cool plugins developed by Impulsonic that do very accurate realtime modeling of rooms and spaces in games.

I find it useful to visualize these concepts, especially when writing about them, so here is an example of a simple snare drum hit without reverb (top) and with a long reverb tail (bottom.)

Composers who ignore reverb do so at their own peril. Well, maybe not peril, but it is very important to understand and employ it correctly.

In games that do not employ audio middleware, composers have to pay attention to reverb tails and handle them manually. We have to copy the reverb tail from the end of the loop and mix it into the beginning of the loop. This is a lot of extra work, and it is not a perfect solution since it can sound very strange the first time the loop is played to have some unknown reverb source in the mix (a fade in can help here.)

Elias handles this tricky situation beautifully by including a reverb feature that is applied to the audio at the end of the effects chain. What this means is that when the song loops back to the beginning, or even when it changes from one layer to the next, the reverb tail is added back in by the software engine, and you don’t have to do anything to the source files! Let’s check out an example of how that works in Elias in this next video:

Note that when you start a new theme, the default settings in the Theme Settings dialog have the reverb set to ON, as seen here:

If the mix is set to 0, the reverb is completely disabled, irrespective of the other settings. Disabling the theme-wide reverb this way is probably a good idea if you plan to set the reverb track by track, as shown here:

Note, however, that every additional distinct reverb setting across your tracks increases CPU usage during playback. To mitigate this, Elias automatically routes tracks with identical reverb settings into shared submixes. For example, if you have 6 tracks in your theme but only two different reverb settings, then only 2 reverb instances will be used for all the tracks.

Think of the mix as the percentage of signal that is made up of reverb (If you had it set to 100 it would be almost all reverb) and the size as being the size of the room. As for optimal settings for reverb, in my experience with Elias, a pretty natural sounding room reverb is Mix = 20, Size = 20 and Damp = 35. If you were to set size to 10, mix to 10 and damp to 60, it would be like playing in a small carpeted room. If you set size to 100, mix to 30 and damp to 60 it makes a very convincing cave reverb. I give several examples in the video that you can listen to, and I encourage you to experiment on your own with simple sounds, like a snare, to see how the parameters affect the output.
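To keep these example settings in one place, here they are collected as a small Python table. The "mix", "size", and "damp" keys mirror the Elias dialog parameters described above; the preset names are my own labels, not anything built into Elias:

```python
# Example reverb settings from this section. Parameter values come
# from the text; the preset names are illustrative labels only.
REVERB_PRESETS = {
    "natural_room":  {"mix": 20, "size": 20,  "damp": 35},
    "carpeted_room": {"mix": 10, "size": 10,  "damp": 60},
    "cave":          {"mix": 30, "size": 100, "damp": 60},
}

def wet_fraction(preset: str) -> float:
    # Mix is roughly the fraction of the output signal that is reverb.
    return REVERB_PRESETS[preset]["mix"] / 100.0

print(wet_fraction("natural_room"))  # 0.2
```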

In general, it is best to import dry (or fairly dry) stems into Elias and let Elias handle the reverb for you. I have also found that the music tends to sound most natural with at least some reverb on the mix. Finally, try not to use too many different reverbs in one theme; it is more efficient to use no more than two or three settings, say one for the drums, one for the choirs, and one for the other instruments.

Fades and Crossfades

In general fades should be so clean that they are not noticed at all. If a fade is too obvious, it really can destroy the immersiveness of the music and our sensitive hearing can easily pick up a bad fade.

In Elias, you could just use the default settings and possibly never have any issues with fades, but as you will see, there are some types of instruments for which you will definitely want to tweak the defaults to get the best results.

Another area of Elias that can be controlled with a great deal of specificity is fades. You can set ranges for fade ins, fade outs and crossfades.

To see how all this works, check out this short video here:

Here is the default setting in the global fade settings, in milliseconds:

The only time you will hear a fade in is when a track starts playing, or when the track goes from a layer with silence to a layer with music. To see this visually, check out what happens to a kick drum with a 0 ms fade in (top) and a 100 ms fade in (bottom). On tracks with a quick attack, like drums, pizzicato strings, or mallets, you might be tempted to set the fade in to 0, but in practice this is not a good idea. It is better to use a fade of 5 to 10 ms in these cases to avoid any pops or clicks (from DC offsets or abrupt waveform starts) during a layer change.

Conversely, the only time you will hear the fade out is when the music on a layer gives way to a layer with silence. To see this, check out the snare drum with reverb on it: the top track has no fade out and is not nearly as smooth as the bottom track, which has a 500 ms (half-second) fade out.

Crossfades, however, happen all the time and deserve a closer look. Different types of instruments require different crossfade settings. Let’s look at this example of a kick drum crossfading with a snare hit. The first two beats are the kick, and the second two are the snare. There’s reverb on the track, so you can also see the reverb tails here. In the top track, there is no crossfade, and the snare plays without any of the kick’s reverb tail. In the bottom track, the crossfade is set to 500 ms, and you can see how the reverb tail of the kick drum colors the sound of the snare on the third beat and is mostly gone by the fourth. This is a more natural effect, and in general you will want to use crossfades.

What are the minimum and maximum settings you see in the dialogs? I will cover these further when I discuss Urgency later on, but for now you can think of it like this: when Urgency is set to 100, or most urgent, the minimum fade setting is used. When Urgency is set to 0, the maximum fade setting will most likely be applied. Why "most likely"? Because the engine uses heuristics, based on each track’s amplitude and transients, to determine the best settings for each type of track.
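Conceptually, the urgency-to-fade mapping can be pictured as a linear interpolation between the min and max fade settings. This is only a mental model I am offering, not Elias’s actual implementation, since the engine layers heuristics (track amplitude, transients) on top of whatever mapping it uses:

```python
def fade_ms(urgency: int, fade_min: float, fade_max: float) -> float:
    """Toy model: urgency 100 -> minimum fade, urgency 0 -> maximum
    fade, with a straight line in between. Illustrative only."""
    t = max(0, min(100, urgency)) / 100.0
    return fade_max - t * (fade_max - fade_min)

print(fade_ms(100, 10, 500))  # 10.0  (most urgent: shortest fade)
print(fade_ms(0, 10, 500))    # 500.0 (least urgent: longest fade)
print(fade_ms(50, 10, 500))   # 255.0
```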

Now that you understand the three types of fades and how they are applied to the music, what are some optimal settings? Here you can see the default settings that Elias will use if you do not set them yourself.

For the fade in, you will want to set both min and max to 0 for kick drums and anything else with a sharp attack, as mentioned above. For legato instruments, choirs, and pads, you will want to set the min to at least 50. A 50 ms fade is not perceptible in a large mix and is therefore a very safe setting, but sometimes it is not enough, especially for music that is not tightly quantized; in those cases, set the min between 100 and 250. If it is too long, say 500 ms, you will definitely notice it, so such values should be avoided unless you deliberately want that effect in the music.

For crossfades, the default settings are pretty good for most cases, but you can experiment with much quicker crossfades. You can add longer crossfades on slow legato passages or even on tracks that have a lot of dynamic range between each of the layers. For example, if you have some soft strings on one layer followed by loud horns on the next layer, you may want a longer and smoother transition there to keep from startling the listener. Stingers, which will be discussed in more detail below, can help to smooth these kinds of transitions as well, but are not required when you have the ability to fine tune fades so carefully.

For fade outs, you see they are defaulted to 500 ms. This half second fade usually works quite well and makes the transitions to silence (in that track) very smooth. The only time you would want to set it to 0 would be if you absolutely don’t want to hear any of that track when the next layer starts, but as I mentioned above, it would be better to set it to 5 or 10 ms in these cases to avoid pops. Longer fade outs can be useful in cases where you would want to have a lead instrument ring out into the start of the next layer, especially if it has reverb on it.

Agility and Custom Agility Beat Points

Agility is another key concept in Elias. It is what sets Elias apart from other middleware solutions I have used for music, and it takes some hands-on experimentation to understand. Basically, it is the setting that tells Elias when it is OK to switch from one layer to the next. Usually, you want this to happen in time with the music, and very often on the first beat of a new measure. With Elias, however, you can set it to any beat of any measure you want! Moreover, by adjusting additional settings, you can give Elias more or less discretion over the times at which it switches.

Let’s say, for example, you have a 4-bar melody line, but the harmonies (chords) change every two beats. You can set the melody track to change only every 4 bars, but tell the chord track it is OK to change every 2 beats! If the melody is played by a violin on one layer and a flute on another, you would not want the melody to switch instruments in the middle of an important phrase. It would be great, however, to have it change on the next phrase, and even to set up a call-and-response interchange between several different instruments, all while the harmonies and rhythms stay as agile as they like underneath.
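To make the timing concrete, here is a small helper (my own sketch, not an Elias API) that lists when a track is allowed to switch, given its agility interval in beats, the tempo, and the length of the section:

```python
def switch_times(agility_beats: float, bpm: float, total_beats: int):
    """Return the times (in seconds from the start) at which a track
    may switch, for an agility interval expressed in beats."""
    beat_len = 60.0 / bpm
    times = []
    b = 0.0
    while b < total_beats:
        times.append(round(b * beat_len, 3))
        b += agility_beats
    return times

# 120 BPM in 4/4 over a 16-bar (64-beat) section:
# melody may switch every 4 bars (16 beats), chords every 2 beats.
melody = switch_times(16, 120, 64)  # [0.0, 8.0, 16.0, 24.0]
chords = switch_times(2, 120, 64)   # one opportunity every 1.0 s
```

The melody gets only four switch opportunities in the whole section, while the chords get thirty-two, which is exactly the kind of per-track independence the text describes.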

There are two places where agility can be set, in the Theme Settings dialog, which will be the global settings for all tracks:

and the Settings dialog for each individual track:

In the image above, on the left-hand side, you can see the default setting, which is 1 bar. This means Elias will default to switching on the first beat of every bar unless you override that setting. Also in the image above, you can see that Agility has been turned on and that a custom agility input box appears when "Custom Setting" is selected from the drop-down menu. Custom beat points can be added manually by clicking the "Add" button, or by starting playback (F5) and then simply tapping the spacebar whenever you want to set a beat point.

This can be done in either the Theme Settings or the Track Settings, and here is an example of how it might look:

Check out the video to see this in action:

As you saw in the video, "strict" agility means that Elias will adhere exactly to the beat points you set in these dialogs, following the logic you indicate. For example, if you click the "Strict Agility On?" radio button in the Theme Settings dialog, Elias will obey those theme-wide settings. If you then, on a track-by-track basis, click the "Override Strict Agility in Theme Settings for this Track?" radio button, Elias will use that track’s own settings instead. If "Strict" is never set, Elias will use its best judgment to follow the agility settings in those dialogs, but it may choose to switch at a musically better time. What is an example of this? In the case of a kick drum with no reverb, there is silence before each beat. Elias can detect this and knows it is OK to switch during any of those silences. Elias apparently weighs many other factors when deciding when it is appropriate to switch, and that is part of its magic special sauce.

Changing Keys and Rendezvous

It is very common in video game music to play the same themes in different keys. Each key has its own feel, and key variety is one way game music can keep the player from getting bored. Elias has a built-in feature that allows for key changes, provided you have exported your song in the different keys. It does not warp or transpose the music for you, but most modern DAWs have this capability. Rather than relying on the DAW to transpose, you can also record in MIDI and then bounce to separate audio files in different keys.

Once you have the new set of stems in the new key, just change the key in the Track Settings dialog (as shown in the bottom image) and then load them into Elias.

Then, in the player (F5) you can switch between the two keys very easily by selecting the key you want.

Notice the check box above Key which says Rendezvous? This is an important concept in Elias and it is directly applicable to key changes. When Rendezvous is checked, all the tracks in the theme will switch at the same rendezvous point on a level change. In the image below, I have just switched from Level 1 to Level 2 in the player, and the tracks on Level 2 are blinking. They will all switch at exactly the same time when Rendezvous is checked.

The reason this is important is that when you are switching from one key to another, you want all the tracks to change at the same time; otherwise you would get some very dissonant music! Unless you are going for a free jazz effect, this synchronized switching is exactly what you want.

Now let’s take a look at this short video on Key Change and Rendezvous:

There is another place where you can set the Rendezvous: the final tab of the Theme Settings dialog, as shown here:

In the example above, you can tell the theme exactly when it is OK for a rendezvous to take place. Here, the music will only change on the first beat of bars 1, 2, 5, and 6, or on the third beat of bars 3, 4, 7, and 8. Note that this is NOT the same as Agility: these points are only used if you have the Rendezvous button clicked in the player, or if you are doing a key change. One final note: do you see the Fade of 75 ms at the bottom of the Rendezvous dialog? That is a global setting, and it will apply to all tracks any time a rendezvous occurs, even if you have set a track’s fade out to 0, so beware of this.
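The bar/beat list above translates to clock time straightforwardly. Here is a quick sketch (my own helper, assuming 4/4 time, not part of Elias) that converts 1-indexed (bar, beat) rendezvous points to seconds:

```python
def rendezvous_seconds(points, bpm, beats_per_bar=4):
    """Convert (bar, beat) rendezvous points, 1-indexed as in the
    dialog, to seconds from the start of the theme."""
    beat_len = 60.0 / bpm
    return [((bar - 1) * beats_per_bar + (beat - 1)) * beat_len
            for bar, beat in points]

# The settings described above, at an assumed 120 BPM:
# beat 1 of bars 1, 2, 5, 6 and beat 3 of bars 3, 4, 7, 8.
points = [(1, 1), (2, 1), (5, 1), (6, 1),
          (3, 3), (4, 3), (7, 3), (8, 3)]
print(sorted(rendezvous_seconds(points, 120)))
# [0.0, 2.0, 5.0, 7.0, 8.0, 10.0, 13.0, 15.0]
```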

Stingers and Pickup Beats

One of the coolest things you can add to video game music is the stinger. You hear stingers all the time in game soundtracks, and they are used in genres like EDM and, increasingly, in pop music as well. Stingers come in many types, but the most common are risers, cymbals, orchestra hits, drum hits or rolls (including timpani), and SFX. They are often used to hide or smooth transitions, and can also signal a big change in the music. Key changes are often accompanied by stingers as well. Here are some examples:

Here are some hits:

And here are three different risers, which culminate in a hit after different numbers of bars (I used a great program called Rise and Hit by Native Instruments, which lets you create risers with specific crescendo points):

In the second example above, you can see that it is important for Elias to know when to start playing the stinger so that the hit lands at just the right moment. This is where the concept of pickup beats comes in, found on the Stingers page under the Track Settings.

As you can see in the image above, the first track, Hits, has no pickup beats because the cymbal crash, for example, would play right when the transition happens. For the 1 Bar Risers, where the Hit occurs after the first four beats, the PickupBeats radio button is clicked and the number of beats is set to 4, or 1 bar of the 4/4 song. This works wonderfully, as you can hear in this video example:
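The arithmetic behind pickup beats is simple: the stinger must start early by the duration of its pickup so the hit lands exactly on the transition. A short sketch (illustrative only, not Elias’s API):

```python
def pickup_lead_seconds(pickup_beats: int, bpm: float) -> float:
    """How far ahead of the transition a stinger must start so that
    its hit lands exactly on the transition point."""
    return pickup_beats * 60.0 / bpm

# A 1-bar riser in 4/4 at an assumed 120 BPM:
# 4 pickup beats means the stinger starts 2 seconds early.
print(pickup_lead_seconds(4, 120))  # 2.0
```

At slower tempos the same 4-beat pickup needs more lead time (4 seconds at 60 BPM), which is why the setting is expressed in beats rather than milliseconds.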

Urgency

The final concept, which ties together everything you have learned so far, is Urgency. Unlike the other settings, there is no global setting for this parameter; its default value is 100. Urgency takes two things into account: the fade settings and the agility settings. As I described earlier in the section on fades, if Urgency is set to 100, the shortest fade is used; if it is set to 0, the longest fade is used. But that is not all Urgency does. Set high, at 100, the engine will try to hit the nearest possible agility point it can. So if you have Agility set to every beat, Elias will try to switch on the very next beat. With Urgency set to 0, Elias can be pickier about when it switches, although it will ALWAYS make the switch to the new level eventually, as soon as it finds a good place to do so.
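The agility side of Urgency can be modeled as choosing among allowed switch points. The following is a deliberately simplified toy: an urgent switch takes the very next allowed point, while a non-urgent one may defer. The real engine scores candidates heuristically, so this only captures the idea, not the implementation:

```python
import bisect

def next_switch(now: float, candidates: list, urgent: bool) -> float:
    """Toy model of Urgency: urgent -> the next allowed point;
    non-urgent -> possibly a later (musically better) one, modeled
    here as simply the point after next."""
    i = bisect.bisect_right(candidates, now)
    if i >= len(candidates):
        return candidates[-1]  # past the last point: settle for it
    if urgent or i + 1 >= len(candidates):
        return candidates[i]
    return candidates[i + 1]

# Agility set to every beat at 120 BPM (one point every 0.5 s).
beats = [0.0, 0.5, 1.0, 1.5, 2.0]
print(next_switch(0.6, beats, urgent=True))   # 1.0
print(next_switch(0.6, beats, urgent=False))  # 1.5
```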

Here is how Urgency works in action:

Conclusion

As you can see from this article, the designers of Elias have put a tremendous amount of thought and care into giving you a great deal of control over the playback of the music. As composers, it is important to understand all of these capabilities so that when you write music for a game, you know what the engine can do to make the most of your hard work.

Prior to Elias and other middleware, game composers had to settle for writing loops that became very monotonous for players. This caused ear fatigue to the point where many of us would simply turn the music off rather than listen to yet another pass of the same theme, no matter how good it was.

Now we can offer gamers a tremendous amount of variety, and we can score their actions just as if they were in a film. This does take extra time and planning on the part of composers and programmers, but in the games I have worked on with this tool, the benefits are clearly apparent. In the next article, I will show you a real-world example of how this works in the VR game Stampede, being developed by Black Matter Labs. It will show how Elias has been integrated into a project using FMOD and Unity, and I look forward to seeing you back here then.

Composing Adaptive Music With Elias – Part 1 – Introduction

“But I’m just a soul whose intentions are good / Oh Lord, please don’t let me be misunderstood” – Bennie Benjamin, Gloria Caldwell, and Sol Marcus

“To be great is to be misunderstood” – Ralph Waldo Emerson

Introduction

Imagine you are a proud adventurer walking through a forest. Tranquil forest-y music is playing. Then you spot a cave, and you decide to enter. The music changes to a tense tremolo string theme. Then, all of a sudden, out of the darkness jumps the biggest monster you have ever encountered and battle ensues. The music grows gradually more fierce and frantic until you are within an inch of your life. Being a proud adventurer, however, you succeed in slaying the beast and the music changes to a victory theme.

How is this kind of music written, and how is it implemented in video games today? Welcome to this first in a series of articles about composing adaptive music for video games. In this first article, I am going to talk about the two main approaches to writing adaptive music, horizontal and vertical. I discovered a really interesting new tool called Elias that will help to illustrate my points and allow me to show you some concrete examples.

To demonstrate Elias, I composed an adaptive score which I call “Elias Space Theme.” Here is an MP3 version of the theme, which was recorded directly from Elias:

Skipping Ahead

If you already know the basics of Elias and how it works, you can skip ahead to Parts 2 and 3 of this series here:

Sometimes something so innovative comes along that it is misunderstood. Such is the case with Elias Software’s Elias Engine and Elias Studio. I first started using Elias Studio last year, when it had just been announced and was in version 1.0. I immediately fell in love with its intuitive interface and powerful features: I was able to turn a linear composition into an adaptive score ready for a video game in less than an hour. With other middleware products I have used, it took me weeks to learn how to do what Elias can do right out of the box. Elias is not designed to replace middleware such as FMOD or Wwise, but rather to complement them.

The product has two main components: Elias Engine and Elias Studio. The best part: Elias Studio is free for composers to download and use, so we can start creating adaptive scores that sound rich, complex, and natural, as if we had a conductor and a full orchestra in front of us. Elias Engine can be licensed by game developers, with a very friendly fee structure for indie developers that scales up for larger projects. There is a new Unity plugin, and it runs on both Mac and Windows. It is the first audio middleware product I am aware of that was created for composers by composers.

Let’s pause for a moment and take a look at Elias in this overview video:

Elias is a tool that composers and game developers can use to manage complex scores that play the appropriate music at the appropriate time, based on the events and actions in the game. It is a win for composers, who put so much thought and creativity into their music; for game developers, who want to create great games; and for gamers, who get an experience just as immersive as a movie score.

Elias is designed to handle live music rather than programmed or procedural music. It is able to adjust seamlessly and adapt between parts of the musical score in a way that is imperceptible to the listener. Elias behaves like a conductor, and the game tells Elias when to move smoothly between parts of the score. Elias manages when and how the parts change. Some parts change immediately, and other parts change at musically appropriate bars and beats.

In my estimation, having worked with Elias for many months, it fully achieves this goal and surpasses it beautifully. It is lightweight, cross-platform, stable, and easy for both composers and game developers to use and implement.

Misunderstood

What about that misunderstanding? When most people look at Elias, they tend to think it is a loop engine similar to Ableton Live. Indeed, it has a similar, familiar layout, but that is where the similarity ends. There are two main ways of thinking about adaptive music composition for video games: horizontal and vertical. In horizontal composition, the music is often written in short 1-4 bar chunks that can be elegantly rearranged and interchanged. For example, if two-bar chunks are used, then every two bars a new, fully mixed two-bar chunk is played. A more common approach is vertical layering, which comes in two main flavors. Vertical layers are analogous to stems in a standard DAW, with the main difference being that not all tracks are meant to be played simultaneously. The two flavors are additive and interchange: in the additive approach, layers are added one on top of another, and in the interchange approach, layers are swapped out. Elias employs the latter approach, which turns out to be one of the most flexible and versatile ways to write game music.
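To make the additive/interchange distinction concrete, here is a minimal Python sketch. The function names and track names are purely illustrative and are not Elias's API; this only models which tracks end up playing.

```python
def additive_layers(layers, intensity):
    """Additive flavor: stack the first `intensity` layers on top of
    each other; higher intensity means more layers playing at once."""
    return layers[:intensity]

def interchange_layers(slots, choices):
    """Interchange flavor: each track slot plays exactly one of its
    recorded variations, and variations are swapped, not stacked."""
    return [variations[i] for variations, i in zip(slots, choices)]

layers = ["pad", "strings", "brass", "percussion"]
print(additive_layers(layers, 2))         # ['pad', 'strings']

slots = [["pad_a", "pad_b"], ["melody_a", "melody_b", "melody_c"]]
print(interchange_layers(slots, [1, 2]))  # ['pad_b', 'melody_c']
```

Note that in the additive sketch every active layer sounds at once, while in the interchange sketch the total number of sounding tracks stays constant and only their contents change.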

However, Elias goes one step further and lets you set how these layers are switched and the timing for each track. For example, if your chords change every 2 bars and your melody line is 8 bars long, you can switch the chords every 2 bars and the melody every 8. The secret sauce is in how these layers switch, which can be exact ("strict" in Elias terms) or fluid, determined heuristically by the Elias engine. I will explain this further in another article.
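The "strict" case above amounts to simple bar arithmetic: each track waits for its next period boundary. The fluid case is heuristic inside the engine and is not modeled here; this is an illustrative sketch, not Elias code.

```python
def next_switch_bar(current_bar, period_bars):
    """Earliest bar at or after `current_bar` where a track with the
    given switch period (in bars, counted from bar 0) may change."""
    return -(-current_bar // period_bars) * period_bars  # ceiling division

# Chords may switch every 2 bars; an 8-bar melody only every 8:
assert next_switch_bar(5, 2) == 6   # chords wait one bar
assert next_switch_bar(5, 8) == 8   # melody waits three bars
assert next_switch_bar(8, 8) == 8   # already on a boundary, switch now
```

Because each track computes its own boundary independently, a single "change now" request from the game resolves into several staggered, musically aligned switches.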

In addition, Elias currently has two modes: objective and exploration. In objective mode, the intensity typically builds from low to high, for example in a game situation where a battle grows gradually more chaotic and intense. In exploration mode, layers are played randomly and interchanged with one another, allowing a tremendous amount of variation from a relatively small number of "tracks."
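A rough sketch of why exploration mode yields so much variation: interchanging one variation per track slot multiplies the possibilities, so a handful of recordings covers many distinct mixes. The slot and track names below are hypothetical, not Elias's API.

```python
import random

def exploration_pick(slots, rng):
    """Exploration-mode sketch: independently pick one variation for
    each track slot, interchanging variations at random."""
    return [rng.choice(variations) for variations in slots]

slots = [
    ["drums_a", "drums_b", "drums_c"],
    ["bass_a", "bass_b"],
    ["lead_a", "lead_b", "lead_c", "lead_d"],
]

# Nine recorded tracks combine into 3 * 2 * 4 = 24 distinct mixes.
combinations = 1
for variations in slots:
    combinations *= len(variations)
print(combinations)  # 24

mix = exploration_pick(slots, random.Random(0))
```

Each re-roll produces a different but stylistically consistent mix, which is exactly the "small number of tracks, large amount of variation" effect described above.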

Interface

The interface is clean and elegant with a lot of features buried beneath the surface. Once you open Elias Studio, you can easily add tracks and then simply drag audio files into the interface and arrange them.

There is a global settings dialog that lets you modify the overall settings for the project, such as the reverb settings, whether you want an "Objective" or "Exploration" theme, and so on.

There is also a per-track settings dialog containing modifications such as fades and "agility," which tells Elias how and when to switch between layers. The reverb setting can likewise be different for each track if desired.

Also pictured here are the "stinger" tracks, which let you set up any number of stinger types and variations for your song.

There is a full set of tutorial videos on their website, and I encourage you to watch them, as they go into great detail about all the wonderful features of Elias Studio. There is also a special type of track called "Stingers," which lets you add stingers such as cymbal crashes, orchestra hits, or drum rolls at any point in the composition, such as on level changes.

Elias in Action

Let’s now take a deeper look at Elias in the next video:

The vision for this composition came from watching the TV program Cosmos and from playing video games such as Master of Orion, Elite Dangerous and Mass Effect. I wanted the theme to have an idée fixe, a recurring theme that would be played by various instruments and synths as the piece evolved over time, along with variations for the different scenes. I wanted it to be ethereal and infinite, and to accompany the beauty of flying through space and entering various solar systems. This was not designed to be battle music, or haunting in any way, so I decided on a major-chord harmony that evolved through the circle of fifths. This allowed the music to move through six different keys and return to the beginning, analogous to the Shepard scale, which is used frequently in video games. The Shepard scale is a kind of musical illusion that gives the infinite-staircase feeling you might be familiar with from Super Mario 64.
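For the curious, the Shepard illusion rests on an amplitude envelope over log-frequency: as the scale climbs, top components fade out while new bottom components fade in, so after a full cycle the audible spectrum is unchanged. Here is a small Python sketch with illustrative parameters (they are not taken from the composition):

```python
import math

def shepard_components(step, base=27.5, octaves=8, steps_per_octave=12):
    """Partial frequencies and loudness weights for one step of a
    Shepard scale. A raised-cosine envelope over the log-frequency
    range makes the spectrum repeat after a full cycle, producing
    the endless-rise illusion."""
    comps = []
    for o in range(octaves):
        position = (o + step / steps_per_octave) / octaves  # 0..1 across range
        freq = base * 2 ** (o + step / steps_per_octave)
        amp = 0.5 - 0.5 * math.cos(2 * math.pi * position)  # fade in, fade out
        if amp > 1e-9:  # drop the inaudible edge components
            comps.append((freq, amp))
    return comps

# After a full octave (12 steps) the component set is identical,
# so the scale can appear to rise forever.
assert shepard_components(0) == shepard_components(12)
```

The same trick works harmonically: a progression that cycles back to its starting key, as in this piece, gives the listener a comparable sense of endless motion.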

In terms of tonality, or timbre, I wanted the composition to be alive with full orchestra, but also to have an ambient evolving synth bass. So, imagine if you will a full orchestra jamming with synths and you’ll get the picture.

To learn more about the composition, let’s take a look at the Pro Tools Session that I used to mix the song:

Conclusion

This article is an overview of the two main ways composers write adaptive music for video games: horizontal and vertical. In the videos, I go into detail about the most advanced type of adaptive music writing, the vertical interchange method. One tool designed to help composers write and implement this type of music is Elias.

Elias is an elegant and powerful tool for creating complex adaptive scores for video games. It could be used for scoring videos and films as well. Other middleware solutions are very complicated to learn and even more complicated to master, and none of them currently offers an easy way to do the kinds of things Elias can do out of the box. Elias is lightweight and non-destructive; it can switch between musical ideas on the fly and make musical transitions seem effortless. It is designed to handle live, non-quantized music and play it back smoothly, with incredible variety.

It is free for composers to use and has a very flexible licensing plan for developers. There are opportunities for composers to create themes and then sell them through the Elias website, and there is also a collection of amazing themes available now on the website.

Elias is now at version 1.5, is constantly undergoing feature upgrades, and has evolved in many exciting ways in the time that I have been using it. The developers have recently received significant funding and are working around the clock to add better functionality and new features, such as the recent Unity integration. They are very approachable and friendly, willing to listen to new ideas, and considering new features all the time. Best of all, they are here to stay and have created an amazing tool that I have really enjoyed learning and using on my projects.

In the next article I will talk about some of the advanced features of Elias, and in Part 3, I will go into writing adaptive music for a virtual reality game called “Stampede” which is written in Unity for Oculus Rift, HTC Vive and the Sony Morpheus.
