I really loved that one, lots of content and really well done. A lot of real-life haunted houses could get some inspiration from it :)
Please do a final with proper sound syncing and a little more sophisticated/contrasty lighting

A quick note on the audio of this production. As already noted in the nfo, we used a slightly modified version of Oidos. Audio was rendered into 4 separate channels, allowing us to control each channel's volume as well as the amount of reverb on it; the 4th channel also allowed panning. All of these parameters were controlled from within Rocket. This of course meant that working with the track got quite complex, as I did not have that information in Renoise.

The general concept for the audio was this:
Use channel 1 for rooms/scenes
Use channel 2 for hallways connecting rooms/scenes
Use channel 3 for rooms/scenes
Use channel 4 for sound/direction-specific effects

So as you move from room A (channel 1) into the hallway (channel 2) leading to room B (channel 3), we would turn down the volume of channel 1 while maintaining its reverb. The volume of channel 2 was then faded in to bring up the sound/music of the hallway. About halfway through, the reverb from channel 1 was faded out while the reverb from channel 3 was faded in, making it appear as if you start to hear the sounds and music from the next room as you travel through the hallway. On entering room B, the volume of channel 3 was then turned up and the volumes of channels 1 and 2 faded out.
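For illustration, the crossfade above can be sketched as a function of travel progress through the hallway. This is just my reconstruction in Python; the actual curves lived in Rocket tracks, and the breakpoints (0.3, 0.4, 0.7) and linear ramps are assumptions:

```python
# Sketch of the room-A -> hallway -> room-B crossfade, with t = progress
# through the hallway (0.0 = just left room A, 1.0 = inside room B).
# Breakpoints and linear ramps are illustrative assumptions, not the
# production's real Rocket data.

def channel_levels(t):
    """Return {channel: (volume, reverb)} for channels 1-3 at progress t."""
    vol1 = max(0.0, 1.0 - t / 0.3)              # room A volume drops out early
    vol2 = min(t / 0.3, 1.0, (1.0 - t) / 0.3)   # hallway fades in, then out
    vol3 = min(1.0, max(0.0, (t - 0.7) / 0.3))  # room B volume comes up on entry
    # Reverb hand-over around the halfway point: room A's ambience lingers
    # after its volume is gone, then room B's ambience is faded in, so you
    # "hear" the next room before you reach it.
    rev1 = max(0.0, 1.0 - max(0.0, (t - 0.4) / 0.2))
    rev3 = min(1.0, max(0.0, (t - 0.4) / 0.2))
    # Hallway reverb is not described in the post, so it is left at zero here.
    return {1: (vol1, rev1), 2: (vol2, 0.0), 3: (vol3, rev3)}
```

Note how at t just past 0.3 room A's volume is already gone while its reverb is still at full strength, which is the "maintaining the reverb" step described above.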

Now, in theory, this was a super fantastic idea that could lead to some quite interesting effects and a brand new, more realistic sound experience. However, since all of this was controlled from Rocket, I did not have the channel volumes and reverb levels available while working on the track in Renoise, so there are more or less 3 different pieces/parts of music playing on top of each other at any given point in time.

To help me keep track of which instruments were playing in which channel, I created 4 tracks for each instrument and used the built-in Send track DSP in Renoise to direct the output of each track to one of the 4 dedicated main sends. These, in turn, all send their output to one master send, where I had the reverb.
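The routing topology can be modeled roughly like this (a toy Python sketch, not anything from the actual production or the Renoise API; names are illustrative):

```python
# Toy model of the send routing described above: tracks are grouped into
# four channel sends; each channel has a volume and a reverb-send amount,
# and all reverb sends feed one shared reverb on the master send.
# Function and parameter names are illustrative assumptions.

def mix_sample(channel_samples, channel_volumes, reverb_sends, reverb):
    """Mix one sample frame from the four channel sends.

    channel_samples: raw output of the four channel sends
    channel_volumes: per-channel volume (the Rocket-controlled fades)
    reverb_sends:    per-channel amount sent to the shared reverb
    reverb:          stand-in callable for the master reverb DSP
    """
    dry = sum(s * v for s, v in zip(channel_samples, channel_volumes))
    wet_input = sum(s * r for s, r in zip(channel_samples, reverb_sends))
    return dry + reverb(wet_input)
```

With this topology a channel's volume can be pulled all the way down while the channel keeps feeding the shared reverb, which is exactly what the room-to-hallway fades rely on.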

Feel free to behold the insanity (and of course to grab the instruments or do whatever you want with it): http://www.niebe.dk/fnuque/Haunted_v50.xrns — but please note that it will sound super weird due to the lack of channel volume control applied to it.

Since we didn't really have to shave off bytes to fit the 8k limit, the song data is in no way optimized for size. I did chop up a few of the longer note durations in the organ and choir instruments, though, to reduce the huge precalc time required.

I might be wrong here (and I hope I'll be corrected if I am), but I don't think it's that they couldn't find any more room for size optimization; after all, it hardly pushes new boundaries in the amount of content or, say, geometric complexity or post-processing. It feels more like they were short on time and/or did not have any concrete plans to improve it further. But more than that, I don't think the demo would necessarily be better off if it were on the visual level of that Dying Stars clown head model. The stunted visuals actually help the campy feel imo.