A simple example scene

To get your hands dirty quickly, here is an example scene that will get you up and running within a minute. It demonstrates a very reverberant hall in which a listener and a sound source are placed, separated by a low wall. The sound emitted by the source is a short click, which makes the acoustic properties of the hall explicitly audible. The result below features some very distinct early reflections; over time, the response of the room fades into the more gradual reverberation tail. More detailed documentation for all of the features and settings is being worked on. You can download a .zip archive that contains all the necessary files from GitHub.
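A click is used because it is a short broadband impulse, so the room's response to it is essentially the impulse response itself. The example archive already contains a suitable sound file, but as an illustration, here is a minimal sketch of how such a click could be generated with Python's standard library (the filename `click.wav` is only an assumed example):

```python
import struct
import wave

# A click is a single full-scale sample followed by silence: it
# excites all frequencies at once, so what you hear afterwards is
# the room's response rather than the source sound.
SAMPLE_RATE = 44100          # samples per second, mono
DURATION_S = 0.25            # a quarter of a second is plenty
NUM_SAMPLES = int(SAMPLE_RATE * DURATION_S)

samples = [0] * NUM_SAMPLES
samples[0] = 32767           # one full-scale 16-bit sample = the click

with wave.open("click.wav", "wb") as wav:
    wav.setnchannels(1)      # mono
    wav.setsampwidth(2)      # 16-bit signed samples
    wav.setframerate(SAMPLE_RATE)
    wav.writeframes(struct.pack("<%dh" % NUM_SAMPLES, *samples))
```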

Result

Walkthrough

Several panels are available that allow you to configure settings for the auditory rendering

For meshes to reflect sound, they need to be explicitly marked to do so in the interface

The material properties define how the rays of sound interact with the meshes
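To give an intuition for what such a material property does, here is a toy sketch of a ray losing energy at each bounce according to an absorption coefficient. This is an illustrative model under assumed names, not EAR's actual implementation:

```python
def reflect_energy(energy, absorption):
    """Return the energy a ray retains after one surface bounce.

    `absorption` is the fraction of incident energy the material
    soaks up (0.0 = fully reflective, 1.0 = fully absorbing).
    Illustrative only; the real renderer's material model may
    differ, e.g. by working per frequency band.
    """
    return energy * (1.0 - absorption)

# A ray bouncing three times off walls that absorb 30% per hit
# retains 0.7 ** 3, roughly a third of its original energy:
e = 1.0
for _ in range(3):
    e = reflect_energy(e, 0.3)
```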

The sound source specifies which .wav file goes into the simulation

The Listener object specifies the file path of the output .wav file

Blender generates an intermediate .EAR file that contains all the scene data and is read by the rendering application

The rendering application is a console application, and the process consists of two steps. The complexity of the first step depends mostly on the amount of reflecting geometry; the second step depends mostly on the length of the rendered impulse response and the length of the input sound file
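The cost profile of the second step, depending on both the impulse response length and the input length, matches that of convolving the input sound with the rendered impulse response. A minimal pure-Python sketch of that operation (real renderers typically use much faster FFT-based convolution; the signal values here are only illustrative):

```python
def convolve(signal, impulse_response):
    """Directly convolve an input signal with an impulse response.

    The output has len(signal) + len(impulse_response) - 1 samples,
    and the work grows with the product of the two lengths, which is
    why this step depends on both the input sound and the rendered
    impulse response.
    """
    out = [0.0] * (len(signal) + len(impulse_response) - 1)
    for i, s in enumerate(signal):
        for j, h in enumerate(impulse_response):
            out[i + j] += s * h
    return out

# Convolving a click (a single unit sample) with any impulse
# response reproduces the impulse response itself, which is why a
# click makes the hall's acoustics directly audible:
ir = [1.0, 0.5, 0.25]            # toy echo, decaying by half each sample
clicked = convolve([1.0], ir)    # -> [1.0, 0.5, 0.25]
```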