I'm attempting to make a Flanger plug-in for one of my University projects, and I'm a little stuck on how to implement the LFO to control the delay time parameter in C++ code. I was just wondering if anyone could suggest some example code of a very generic Flanger unit. Would I need to create a sine wave array via the sampling frequency, and then multiply each index by a sample in the buffer?

Using existing code for a university project, that's cheating!! Just think about how a flanger worked in old hardware...

You have a bucket brigade delay. Modulation influences the rate at which the buckets are processed.

Back to building that in software: you have a fixed-size buffer for the delay. The LFO changes the rate at which samples are put in and read out. So the first requirement is a mechanism to run the delay buffer at a varying rate, which is different from the sampling rate of the host. So some flexible form of resampling / sample-rate conversion is needed. Got that part already?

Now the LFO: it's trivial to make an oscillator. The waveform doesn't need to be a sine; it can also be triangular (more common in practice, I think), which is easier to implement. How often do you ask the LFO for its value? At each incoming sample?


Forget BBDs unless you're specifically trying to model a BBD and are prepared for the additional challenges of variable-rate processing. That's not how most digital flangers work anyway; it's a totally different principle.

Classic digital flangers are much easier to implement in software: you simply need a ring-buffer (or something equivalent) which stores a few milliseconds of previous input, and you "access backwards" using interpolated reads (start with linear interpolation, implement something better later if you want to improve quality). How much you access backwards depends on the LFO value.

Thanks very much guys. My thoughts at the moment are to generate some form of shifted/scaled sinusoid (with values ranging from 0 to 1), multiplied by the buffer size of the circular buffer and then attached to its 'read' index, producing the varying delay for the output. The sinusoid would be running at the same sampling frequency as the buffer, so it would be an array multiplication, outputting different values for each incoming sample. Would any of this work in practice? Apologies if this is all very basic stuff - I'm very much a beginner in this field (if you hadn't already worked that out). I'm in the process of understanding linear interpolation at the moment, but only for the purpose of achieving a fractional delay unit to avoid glitches. I hadn't really thought about it from that side...

and BertKoor: I'm not cheating, honest! I know it looks bad from here, but I really DO want to fully understand how they all work, rather than just submitting a copied project and learning nothing from it. Plus, I don't want to insult anyone's intelligence by assuming that they won't be able to spot obviously plagiarised code.

The purpose of the linear (or whatever) interpolation is exactly that: allow you to take any fractional value for the read index, since rounding to the nearest integer sample sounds remarkably horrible. Feel free to try, though.

Other than that, the high-level description sounds fine. Note though that you probably want to control the LFO rate, the max delay time, and the depth of the modulation. Flangers often also allow feedback (mixing the output from the delay back to the input; for stability keep the feedback gain strictly between -1 and 1 and be careful not to blow your ears when testing).

edit: Remember the delay ring-buffer can be larger than what you actually need. It's often easier to keep a constant size buffer even when modifying the actual delay, and some people might even round the delay size up to a power-of-two to allow bitmasking the indexes for cheap modulo.
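The power-of-two trick mentioned above is tiny in code; a sketch (sizes and names are illustrative):

```cpp
#include <cstddef>

// Power-of-two ring buffer so the wrap is a cheap bitmask (illustrative).
constexpr std::size_t kSize = 1 << 12; // 4096 samples, more than needed
float buffer[kSize] = {};
std::size_t writeIndex = 0;

void writeSample(float x) {
    buffer[writeIndex] = x;
    writeIndex = (writeIndex + 1) & (kSize - 1); // same as % kSize
}
```

The bitmask only works because kSize is a power of two; with an arbitrary buffer size you'd fall back to the modulo (or a compare-and-subtract).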

Thanks very much! I'm planning to allow a controllable parameter for the oscillation rate via the cos(2*pi*f*t) function when I create the waveform, and the max delay time will be controlled by changing the size of the buffer array. I'm hoping to add feedback and mix controls, but I'm just using a simple feedforward system for now so that I can check the functionality before moving on to more ambitious things.
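The shift/scale of cos(2*pi*f*t) into a delay value could be sketched like this (the function name and the minDelay/maxDelay parameters are my own; keeping a small minimum delay avoids reading the sample that's currently being written):

```cpp
#include <cmath>

// Map cos(2*pi*f*t) into [minDelay, maxDelay] samples (illustrative sketch).
double delayFromLFO(double rateHz, double t,
                    double minDelay, double maxDelay) {
    constexpr double kTwoPi = 6.283185307179586;
    // Shift/scale cos from [-1, 1] into [0, 1].
    double unipolar = 0.5 * (1.0 + std::cos(kTwoPi * rateHz * t));
    return minDelay + unipolar * (maxDelay - minDelay);
}
```

Note this keeps the buffer size constant and varies only the read offset, per the earlier advice, rather than resizing the buffer itself.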