lsd's rants about games, music, linux, and technology

thinking inside the box

The computer has revolutionised the way we make music, but it also raises a question: how much work do you do “in the box”, using software sequencers, effects, and instruments, and how much do you do with hardware and traditional instruments? When I started making music again last year, having a powerful hardware synth was a huge enabler for me — I really do believe that it, as much as anything, is the reason I’m still making music with Linux now after so many abortive attempts over the years. Now that I have a few tracks under my belt, though, I’m as surprised as anyone to realise that I seem to be working “in the box” more and more.

That’s not to say that I started with a totally hardware-centric workflow, though. My work has always revolved around Ardour, not just for recording, but also for effects, and I’ve used Hydrogen for drums on just about everything I’ve recorded so far. However, on my first track (atlantis), the instrumentation was all Blofeld, and it was mostly played live, with just a few bits of piecemeal sequencing.

Just another day at the office: Qtractor, Ardour and Hydrogen running in unison

By comparison, my latest track (frozen summer) was made entirely in the box, though it’s perhaps not a fair comparison point since it was 100% sample-based and I don’t have a hardware sampler. A better comparison is my cover of Enjoy the Silence, where everything but the (Hydrogen) drums was sequenced from start to finish in Qtractor, using software synths as well as software effects.

Some of that was FluidSynth playing Soundfonts for the less “synthy” sounds (guitar, horns, etc.), but I also made extensive use of WhySynth, an analog-style synth. I even worked with WhySynth the way I’d work with my Blofeld, crafting my own patches instead of using presets for most sounds. Why did I ditch my hardware and embrace softsynths?

The box is convenient

The answer, perhaps unsurprisingly, was convenience. With a separate softsynth on each track, it’s very easy to create custom patches for each one, and then add custom effects chains that process the results on each channel, too. Using multi-mode on the Blofeld, I can run multiple patches at once and edit them individually, but the results come out of a single stereo output into my PC’s single stereo input, so I can’t add realtime effects to individual instruments unless I use the Blofeld’s (very limited) internal effects. Recall is also an issue — with softsynths, when you load your session, it’s exactly as you left it, but with a hardware synth, you usually have to set it back up yourself.

The other obvious convenience factor is portability. I spend a bit of time on buses, and working with softsynths means that I can do everything on the go. My laptop has a 2.4GHz Core 2 Duo, and it has easily handled everything I’ve thrown at it so far.

WhySynth doesn't inspire, but it's a good bread-and-butter synth

That’s not quite the end of the story though, because I did end up using the Blofeld on that track. After sequencing it all on my laptop, I moved it to my desktop to add some polish, and while all the FluidSynth sounds stayed in the final product, I replaced some of the WhySynth sounds. It’s a perfectly serviceable analog-style synth but, try as I might, I couldn’t get enough “oomph” in the bass part, or the right filter squelch in the “bieuuw” effect sound that comes in around the second chorus. With the Blofeld, I was able to nail both sounds very quickly.

There’s also a tactile element that softsynths are typically unable to capture. A good synth is an instrument in itself, with an interface that beckons the user to create new sounds and interact with them in realtime. A softsynth might be capable of making the same sounds, but I’ve yet to find one that inspires me like a good hardware synth can.

Finding a balance

Ultimately, a lot of the choices about when to use hardware or software come down to compromise. Using softsynth plugins inside Qtractor is very convenient, but the available synths have limited possibilities. Using hardware gives me better sounds that I can program more quickly, but it ties me to my home studio, and limits my effects options while sequencing.

There’s even a middle ground between the two extremes under Linux — standalone JACK softsynths like Specimen and PHASEX. I can run these on my laptop, and depending on how they’re set up, I can run separate realtime effects racks for each of them in Ardour, but they require setup each time you start them, so you lose the convenience of total recall that you get with softsynth plugins.

I think the answer to the question of when to use hardware or software for a job is “it depends” — each track is going to have its own ideal line in the sand, and that’ll vary from person to person, too. Sometimes I’m going to start with a killer sound or riff on the Blofeld and build a track around that, so it’s going to make sense to use hardware all the way through. Other times, I’ll be able to get away with softsynth plugins, even if they’re just guide sounds that I end up replacing later, or it could just be that a softsynth is the best tool I have for a particular job.

Perhaps the best answer is to use whatever gets me the best results with minimal fuss and maximum enjoyment. I don’t think softsynths will ever replace my hardware, but adding them to my toolkit has definitely made me more productive.

7 thoughts on “thinking inside the box”

Isn’t it complicated that, every time you want to continue a project, you have to start three apps and load the project in Ardour, Qtractor and Hydrogen, and then connect them in QjackCtl? Or do you have an easier way of doing that (LASH or something like that)?

Nope, you’re right, it’s a little involved right now — I have to launch the three apps and load their respective projects independently. Ardour remembers connections, though, so if I have Hydrogen hooked up to track inputs in Ardour for monitoring and recording, for instance, Ardour will remember that and automatically re-connect to Hydrogen when I load my project. For this to work, you just need to make sure you launch Ardour last, so that all the relevant JACK ports are already there for it to connect to.
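To make that a little more concrete, here’s a minimal session-startup sketch. The project paths and JACK port names are hypothetical examples, not from my actual setup — `jack_lsp` will list the real port names once everything is running. It builds a dry-run plan rather than executing anything, so the launch order is visible before you swap in the real commands:

```shell
#!/bin/sh
# Session-startup sketch. The file paths and port names below are
# hypothetical examples -- check your real JACK ports with `jack_lsp`.
# Launch order matters: Ardour goes last, so the ports it wants to
# reconnect to (e.g. Hydrogen's outputs) already exist.

PLAN=""
step() {
    # Collect each command into a dry-run plan instead of executing it;
    # swap `step` for direct invocation once everything matches your setup.
    PLAN="$PLAN$*
"
}

step hydrogen -s "$HOME/songs/mytrack.h2song"
step qtractor "$HOME/songs/mytrack.qtr"
step ardour2 "$HOME/songs/mytrack.ardour"

# If Ardour misses a connection, patch it by hand with jack_connect:
step jack_connect Hydrogen:out_L "ardour:Drums/in 1"
step jack_connect Hydrogen:out_R "ardour:Drums/in 2"

printf '%s' "$PLAN"
```

Once the plan looks right, replacing `step` with direct invocations (backgrounding the GUI apps with `&`) turns it into a real one-shot launcher.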

Where it becomes a lot more complex is when you involve other JACK apps, like external synth apps (PHASEX or Specimen, for instance). Then you have yet another thing to load and configure independently, and you may have to manually hook up your MIDI connections, too. Qtractor seems to remember ALSA MIDI connections in much the same way that Ardour remembers JACK connections, though, so if you launch your synths, then Qtractor, and then Ardour, you should be okay.
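For the ALSA MIDI side, the hookup itself is just one `aconnect` call per synth. The client names and port numbers below are hypothetical examples — `aconnect -l` shows the real ones on your system:

```shell
#!/bin/sh
# ALSA MIDI hookup sketch; the client names and port numbers here are
# hypothetical examples -- run `aconnect -l` to see your real ones.
# This builds the command as a string (a dry run) so the wiring is
# visible; run the command itself once the names match your setup.

SENDER="Qtractor:0"    # Qtractor's MIDI output port
RECEIVER="PHASEX:0"    # the external softsynth's MIDI input port
MIDI_CMD="aconnect $SENDER $RECEIVER"

echo "$MIDI_CMD"
```

To tear a connection down again, `aconnect -d` takes the same sender/receiver pair.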

While this works well enough for the most part, it’s definitely not ideal, and it’s certainly the main reason why I’m finding myself doing more work in Qtractor rather than relying on external synths (either hardware or JACK software). It’s a hard problem to solve, though — LASH was one attempt, but it seems to have died a quiet death. LADISH seems like the current favourite for session management — I’m not sure how widely supported it is right now, but it looks like it might succeed where LASH never quite managed to.

Yep, I’m really hoping that Ardour 3’s MIDI handling is mature enough to replace my need for Qtractor; it should also have some of the automation features that Qtractor currently lacks, too. I need to work with both audio and MIDI, and while I know Qtractor can handle both, it doesn’t feel like it’s quite up to the task with audio for my needs.