Overview

This page outlines an audio study in which I take an existing game, strip out all of its audio, and create a completely new audio profile from scratch. For this study I used the "Shooter Game" sample project in Unreal Engine.

Ramp up


Before doing anything else, I selected the "Audio" folder that came with the project and deleted it.

I made a few temporary cues to hook into all of the existing, now-empty audio hook locations.

This process lets me explore the project's bones and identify how the game's systems interact with each other on the back end.

I like to take notes during this time, listing the aspects of the current setup I plan to modify and improve.

Record Foley & gather raw audio

At this stage I have a good idea of the raw audio needed to create my cues, so I move on to gathering those resources. Some come from my own backlog, some from my DAW (for tonal sources), and some from recording fresh Foley.

Being creepy in my apartment, wondering if my neighbors hear all this.

Cue sheets

At this point, I've begun an extensive audio cue sheet to track the wave file assets I'll be creating. The sheet mirrors the file paths and naming conventions used in the project's "Content" folder.
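As a rough sketch of what that tracking looks like, a cue sheet row can be validated against the project's conventions. The `sfx_` prefix, the `Content/` root, and the helper below are hypothetical placeholders, not the actual sheet:

```python
import re

# Hypothetical naming convention: wave assets are prefixed "sfx_" and live
# under the project's Content folder, matching the cue sheet's path column.
WAVE_NAME = re.compile(r"^sfx_[a-z0-9_]+$")

def validate_row(asset_name: str, content_path: str) -> list[str]:
    """Return a list of problems with one cue-sheet row (empty list = OK)."""
    problems = []
    if not WAVE_NAME.match(asset_name):
        problems.append(f"bad asset name: {asset_name!r}")
    if not content_path.startswith("Content/"):
        problems.append(f"path not under Content/: {content_path!r}")
    if not content_path.endswith(asset_name + ".wav"):
        problems.append("path and asset name disagree")
    return problems
```

Catching naming drift this early is the whole point of the sheet: the name in the sheet, the file on disk, and the asset in the engine should never diverge.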

Master in Adobe Audition

Using my cue sheet, I knock out each asset one by one, staying organized at every step. I use Adobe Audition to edit all my raw sources; if a synthetic, tonal source is needed, I use FL Studio to generate it.

I took video grabs to reference for audio tied to animations.

Set up cues in Unreal Engine

Once I've created the individual sound effects and brought them into the engine, I set them up as cues to be called in the game. Note: this is a multiplayer game, so I included Attenuation nodes to handle each cue for either the local or a remote player.
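Conceptually, the local-versus-remote split works like this: the local player's own sounds play in 2D at full volume, while remote players' sounds fall off with distance. A minimal sketch, where the function name, radii, and linear falloff curve are illustrative assumptions rather than the cues' actual attenuation settings:

```python
def cue_gain(distance: float, is_local: bool,
             inner_radius: float = 400.0, falloff: float = 3600.0) -> float:
    """Distance attenuation for a cue, skipped for the locally controlled player.

    Mirrors the idea of an Attenuation node: full volume inside the inner
    radius, a linear falloff beyond it, silence past the outer edge.
    Radii are placeholder values in Unreal units (centimeters).
    """
    if is_local:
        return 1.0                      # 2D playback, no spatialization
    if distance <= inner_radius:
        return 1.0
    if distance >= inner_radius + falloff:
        return 0.0
    return 1.0 - (distance - inner_radius) / falloff
```

In the engine this branch is handled by which cue variant fires, not by runtime math like this, but the audible result is the same: your own rifle is always loud and dry, everyone else's is spatialized.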

Add cues to audio hook locations

In one of my first steps I named all my initial placeholder cues by their intended final names and added them to their hook locations. Most of those placeholders update to the final sound as soon as I finish a cue, but some instances, like animations, required me to wait until now to implement. This step sweeps everything up and double-checks that everything I wanted is in the game and functioning.

Sound Classes

So far in this process, I have not made any global balance passes to my audio based on playtesting. I prefer to master all my audio to a range of -12 dB to -6 dB outside of the engine, regardless of which systems it applies to, and then use Unreal's Sound Class hierarchy to make those sweeping, dynamic adjustments. Here are the classes I ended up with:
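The reason this works is that class volumes compose multiplicatively down the hierarchy, so one tweak at a parent rebalances everything beneath it. A hedged sketch of the arithmetic; the class names, volumes, and tree below are placeholders, not my actual class setup:

```python
def db_to_linear(db: float) -> float:
    """Convert a dB offset to a linear amplitude multiplier (e.g. -6 dB ~ 0.5)."""
    return 10 ** (db / 20.0)

# Hypothetical Sound Class tree: each class scales its children, so turning
# down "SFX" turns down "Weapons" with it, while "UI" is unaffected.
CLASS_VOLUME = {"Master": 1.0, "SFX": 0.8, "Weapons": 1.0, "UI": 0.9}
PARENT = {"SFX": "Master", "Weapons": "SFX", "UI": "Master"}

def effective_volume(sound_class: str) -> float:
    """Multiply volumes up the hierarchy, as Unreal's Sound Classes do."""
    vol = CLASS_VOLUME[sound_class]
    while sound_class in PARENT:
        sound_class = PARENT[sound_class]
        vol *= CLASS_VOLUME[sound_class]
    return vol
```

Mastering every source to the same -12 dB to -6 dB window first is what makes this safe: the class multipliers become pure mix decisions instead of corrections for inconsistent source levels.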

Passive Sound Mix

I didn't need to do much here, but there were a few Player Health systems tied to Player States, managed by the User Interface. I created a sound mix that muffled all SFX Class audio and turned its volume down, then added that mix as a passive effect triggered by the UI Sound Class. The effect: if the player has any UI systems present, like menus or the low-health HUD overlay, the sound classes dynamically handle the audio balancing.
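The passive-mix behavior boils down to a simple rule: while any cue in the UI Sound Class is playing, duck and low-pass the SFX class; otherwise leave it untouched. A minimal sketch, where the function and the duck/filter values are illustrative assumptions rather than the project's actual mix settings:

```python
def passive_duck(ui_active_cues: int,
                 duck_volume: float = 0.4,
                 lowpass_hz: float = 2000.0) -> dict:
    """Return the SFX-class settings while the passive mix is (in)active.

    Mimics a passive Sound Mix: whenever any cue in the UI Sound Class is
    audible (menu open, low-health overlay, etc.), SFX audio is turned down
    and low-pass filtered ("muffled"); otherwise it plays at full volume.
    """
    if ui_active_cues > 0:
        return {"volume": duck_volume, "lowpass_hz": lowpass_hz}
    return {"volume": 1.0, "lowpass_hz": None}  # None = filter bypassed
```

Because the mix is passive, there is no game code to write: the engine pushes and pops it automatically as UI-class cues start and stop.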

Reverb Settings

Acoustics are something that really bring an audio profile together. I identified two distinct locations within the map and created a reverb effect for each.
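The mechanism is simple to sketch: each location maps to its own reverb preset, and the listener's current location decides which one applies. The location names and settings below are hypothetical stand-ins, not the map's actual reverb values:

```python
# Hypothetical mapping of the two map locations to reverb presets:
# a large open space gets a long decay, a tight space gets a short one.
REVERB_BY_LOCATION = {
    "hangar":   {"decay_time_s": 2.5, "wet_level": 0.6},
    "corridor": {"decay_time_s": 0.8, "wet_level": 0.3},
}

def reverb_for(location_name=None):
    """Return the reverb settings for the listener's current location,
    or None when outside both (no reverb applied)."""
    if location_name is None:
        return None
    return REVERB_BY_LOCATION.get(location_name)
```

In the engine this lookup is handled by volumes placed in the level rather than by name, but the design choice is the same: a small number of hand-tuned presets tied to distinct spaces.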