Dan Reynolds is an award-winning composer and sound designer with a love for video games and interactive arts. For almost a decade, Dan has approached every new project as an opportunity to create compelling audio experiences. Whether cutting dialog, designing sounds, producing music, implementing in-engine audio, programming MIDI, or simply writing music, Dan embraces varied and new opportunities.

The Exiled
Game Kickstarter Trailer Cue
This was originally composed for a Kickstarter trailer for an indie game project. A classic three-act composition.

Embracing The Madness (Alt Version)
MAD Drums Demo Track
This was written as a demo cue for Handheld Sound's MAD Drum Virtual Instrument. Aside from a single synth patch and the drums, all other sounds were recorded live and heavily processed.

Beyond the Stars
Wonderland Adventures 3
This was originally composed for the afterlife zone in Wonderland Adventures 3. The objective was to create a spiritually resonant piece that did not speak to any specific Earthly religion.

Implementation Example:

Here is a recent example of interactive music composition, design, and implementation. I wanted to experiment with Unreal Engine 4's visual scripting system, Blueprints. I had a hypothesis about how to create a smooth crossfade between multiple layers of musical activity, so I built a level to test it.

The first thing I had to build was a case manager hooked into the Player Controller.

The interactivity concept required real-time fade movement based on player character movement, so I built a simple switch case that set a numerical target for a mathematical interpolation system.
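The switch-case idea can be sketched outside the engine. Here is a minimal C++ illustration; the state names and target values are my own illustrative stand-ins, not the actual Blueprint nodes from the project:

```cpp
// Hypothetical stand-in for the Blueprint switch case: map a player
// movement state to a numeric activity target for the fade system.
// States and target values are illustrative, not the real project's.
enum class EActivity { Idle, Walking, Running };

float ActivityTarget(EActivity State) {
    switch (State) {
        case EActivity::Idle:    return 0.0f; // music layer fades out
        case EActivity::Walking: return 0.5f; // partial intensity
        case EActivity::Running: return 1.0f; // full intensity
    }
    return 0.0f; // unreachable fallback
}
```

The interpolation system described next chases whichever target this function reports each tick.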

Next in line is the interpolation manager. This series of Blueprints manages which layer of music is fading in at any given moment and whether that fade is on its way in or out.

The MovementControlValue is essentially the current activity level, and it is constantly modified by the FInterp To function. It is always trying to reach the Movement Target value, which is passed in from the Activity Target set by the previous Blueprints. In short: when the player character is in motion, the Movement Target is high and the MovementControlValue climbs toward it; when the player stops, the Movement Target drops and the MovementControlValue falls.

I vary the interpolation speed based on whether it's going up or down. This way, the music ramps up faster than it ramps down.
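For readers curious about the math behind the node, here is a standalone C++ sketch of FInterpTo-style chasing combined with the asymmetric speeds; the speed constants are illustrative, not the project's actual values:

```cpp
#include <algorithm>

// Standalone sketch of UE4's FInterpTo idea: move Current toward
// Target at InterpSpeed, framerate-independent via DeltaTime.
float FInterpToSketch(float Current, float Target, float DeltaTime, float InterpSpeed) {
    if (InterpSpeed <= 0.0f) return Target;       // no smoothing: snap
    const float Dist = Target - Current;
    if (Dist * Dist < 1.0e-8f) return Target;     // close enough: snap
    const float Alpha = std::min(DeltaTime * InterpSpeed, 1.0f);
    return Current + Dist * Alpha;                // move a fraction of the gap
}

// Asymmetric ramp: interpolate faster when activity rises than when
// it falls, so the music swells quickly and relaxes slowly.
float TickMovementControl(float Current, float Target, float DeltaTime) {
    const float Speed = (Target > Current) ? 4.0f : 1.0f; // up fast, down slow
    return FInterpToSketch(Current, Target, DeltaTime, Speed);
}
```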

After this point, I set the Med Status and Act Status values (the volume levels of the two crossfading music layers) and feed them into a pointer called Current Track, which points to the currently playing audio component.
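One simple way to derive both layer volumes from a single control value is a linear crossfade. This is a sketch of that mapping, assuming a control value in [0, 1]; the struct and field names are stand-ins for the Med Status and Act Status variables, and an equal-power curve would be a common alternative:

```cpp
// Derive both layer volumes from one control value in [0, 1].
// Field names are hypothetical stand-ins for Med Status / Act Status.
struct LayerVolumes {
    float Med; // medium-activity layer volume
    float Act; // high-activity layer volume
};

LayerVolumes CrossfadeVolumes(float Control) {
    LayerVolumes V;
    V.Act = Control;        // active layer fades in as control rises
    V.Med = 1.0f - Control; // medium layer fades out symmetrically
    return V;
}
```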

Below, you can see my Music Segment: in this case, two bars of music, or a single vertical slice. This is where I set the play length, play the music cue, and then delay the tick for the duration of the segment.

By using a delay, I can start the next piece of music at an arbitrary point. This lets my music segments keep their reverb tails, which makes for a smoother transition between segments and a more musical stop if I branch off to another segment or some exit point.
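The effect of delaying by the musical length rather than the audio asset's full length can be sketched as a tiny scheduler. Segment and its fields are hypothetical names for illustration only:

```cpp
#include <vector>

// Sketch of delay-based sequencing: each segment's *musical* length
// drives when the next one starts, while the audio asset itself may
// run longer (the reverb tail), so consecutive segments overlap.
struct Segment {
    float MusicalLength; // seconds of actual musical content
    float AssetLength;   // seconds including the reverb tail
};

// Returns each segment's start time when scheduled by musical length.
std::vector<float> ScheduleStarts(const std::vector<Segment>& Segments) {
    std::vector<float> Starts;
    float Clock = 0.0f;
    for (const Segment& S : Segments) {
        Starts.push_back(Clock);
        Clock += S.MusicalLength; // delay only by the musical duration
    }
    return Starts;
}
```

With 4-second segments whose assets ring out for 5.5 seconds, each new segment starts while the previous tail is still sounding, which is exactly what makes the transition feel smooth.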

The Music Delay Duration is a custom function I built within Blueprints. It is essentially a music-math function: it determines the duration of a single beat from the BPM and then multiplies that by the number of beats reported.
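The underlying arithmetic is just seconds-per-beat times beat count. Here is a minimal re-creation of that idea in C++ (my guess at the function's internals, not the actual Blueprint graph):

```cpp
// Sketch of the Music Delay Duration idea: 60 seconds divided by the
// tempo gives the length of one beat; multiply by the beat count.
float MusicDelayDuration(float BPM, float Beats) {
    return (60.0f / BPM) * Beats;
}
```

For example, at 120 BPM an eight-beat segment (two bars of 4/4) would delay for 4 seconds.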

Sound Replacement Demo

I recently spotted a very cool student animation project and got permission from the artist to redo the sound design for my own demo purposes. I feel this recent work expresses my sense of mixing, editing, and naturalistic design well (approx. 20-30% originally sourced sounds).

This is a more cartoony, arts-and-crafts example of a sound design aesthetic. It comes from a casual physics puzzler I did sound and music for. The sound was a lot of fun to work on, and the sound sources are approx. 95% originally recorded.

I won’t lie: I spent a fair amount of my time trying to get an endless sludgy loop of leaking goo. The creature vocalizations are actually more varied than what is heard in the video, but the client was still tweaking an emotional-state system for the critters when he recorded it.