Butterworth filter: "regular" and "zero phase"

The Butterworth filter is a well-known filter that is often used for online filtering (during the measurement). It introduces a typically undesired phase shift (delay) into the filtered data.

The delay grows as the cut-off frequency decreases and as the filter order increases. Towards higher frequencies (that is, higher frequency content of the signal) and higher filter orders, the phase shift also becomes non-linear, meaning the delay is not the same for all frequencies.
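These dependencies can be checked numerically. The sketch below uses SciPy's `butter` and `group_delay` functions to compare the delay (in samples) of low-pass Butterworth filters; the sampling rate of 1000 Hz and the probe frequencies are illustrative assumptions, not values from the text.

```python
import numpy as np
from scipy.signal import butter, group_delay

fs = 1000.0  # sampling rate in Hz (assumed for illustration)

def delay_at(freq_hz, order, cutoff_hz):
    """Group delay in samples of a low-pass Butterworth filter at one frequency."""
    b, a = butter(N=order, Wn=cutoff_hz, btype="low", fs=fs)
    _, gd = group_delay((b, a), w=np.array([freq_hz]), fs=fs)
    return gd[0]

# The delay grows with the filter order ...
d_order2 = delay_at(5.0, order=2, cutoff_hz=50.0)
d_order8 = delay_at(5.0, order=8, cutoff_hz=50.0)

# ... and with a lower cut-off frequency ...
d_cut100 = delay_at(5.0, order=8, cutoff_hz=100.0)
d_cut50 = delay_at(5.0, order=8, cutoff_hz=50.0)

# ... and it differs between frequencies (non-linear phase):
d_low = delay_at(5.0, order=8, cutoff_hz=50.0)
d_high = delay_at(45.0, order=8, cutoff_hz=50.0)
```

For the 8th-order filter with a 50 Hz cut-off, the delay near the cut-off is noticeably larger than at low frequencies, which is exactly the frequency-dependent delay described above.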

The red signal below has been filtered with a regular Butterworth filter at a cut-off frequency of 50 Hz and a filter order of 8. Compared to the original (black) signal, the phase shift is clearly visible. This phase shift can only be prevented if the complete signal is known in advance, which is impossible during a measurement.
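The causal ("regular") behaviour can be reproduced with SciPy's `lfilter`, which only uses past samples, just like an online filter. The filter parameters (order 8, 50 Hz cut-off) match the example above; the 1000 Hz sampling rate and the 10 Hz test tone are assumptions for illustration.

```python
import numpy as np
from scipy.signal import butter, lfilter

fs = 1000.0                            # sampling rate in Hz (assumed)
t = np.arange(0.0, 1.0, 1.0 / fs)
original = np.sin(2 * np.pi * 10 * t)  # 10 Hz test tone (assumed)

# 8th-order low-pass Butterworth at 50 Hz, as in the example above
b, a = butter(N=8, Wn=50.0, btype="low", fs=fs)

# Causal ("regular") filtering: each output sample can only use past
# input samples, so the filtered signal lags behind the original.
filtered = lfilter(b, a, original)

# Compare peak positions in the steady-state part of the signal;
# the difference is the phase shift in samples.
lag = np.argmax(filtered[500:700]) - np.argmax(original[500:700])
```

Here `lag` comes out positive: the filtered waveform trails the original by a fixed number of samples, which is the delay visible in the red curve.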

Offline, this restriction obviously does not apply: here the phase shift can be eliminated by running the filter forward and then backward over the signal. This is done when the setting zero phase is selected. You can see from the blue signal below that the phase shift is gone.
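The forward-backward approach corresponds to SciPy's `filtfilt`. The sketch below repeats the previous setup (same assumed 1000 Hz sampling rate and 10 Hz test tone) and shows that the two passes cancel each other's phase shift.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000.0                            # sampling rate in Hz (assumed)
t = np.arange(0.0, 1.0, 1.0 / fs)
original = np.sin(2 * np.pi * 10 * t)  # 10 Hz test tone (assumed)

b, a = butter(N=8, Wn=50.0, btype="low", fs=fs)

# filtfilt applies the filter forward and then backward over the
# complete signal; the two passes cancel each other's phase shift.
# This needs the whole signal, so it is only possible offline.
zero_phase = filtfilt(b, a, original)

# Peak positions in the steady-state part now coincide: no lag.
lag = np.argmax(zero_phase[500:700]) - np.argmax(original[500:700])
```

Note that the backward pass also applies the filter's magnitude response a second time, so the effective attenuation is squared; for a well-separated cut-off, as here, this makes little practical difference.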

We have kept the regular mode in the software, since this is how the Butterworth filter was designed; whether its use is still meaningful for offline processing is rather an academic discussion.

More detailed information on the Butterworth filter can be found here: