Take a look at the attached grc file:
As implemented, it does not work: changing the delay has no effect.
If I introduce the "fictitious" filter (1,0,0,0,0,...), it works as expected.
Am I doing something wrong in the first case?
Achilleas

It seems that the runtime machinery pays attention to d_history *only*
on block init, and at no other time, which leads to unexpected results.
But surely this must have worked at some point?
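A toy pure-Python model (not GNU Radio source; `DelayBlock` and its methods are hypothetical names) of what an init-only history buffer would mean in practice: the buffer is sized once at construction, so later calls to `set_delay()` change the parameter but not the observable delay.

```python
# Toy model (NOT GNU Radio code) of a block whose history buffer is
# allocated once, at construction, and never resized afterwards.

class DelayBlock:
    def __init__(self, delay):
        # history sized only at init -- analogous to d_history being
        # consulted only when the block starts
        self.delay = delay
        self.history = [0.0] * delay

    def set_delay(self, delay):
        # the parameter changes, but the buffer keeps its old size,
        # so the output delay does not change
        self.delay = delay

    def work(self, samples):
        buf = self.history + list(samples)
        out = buf[:len(samples)]           # output lags by len(history)
        self.history = buf[len(samples):]  # carry the tail forward
        return out
```

With `DelayBlock(2)`, calling `set_delay(3)` afterwards still yields a 2-sample lag, because the history length, not the stored parameter, is what actually delays the stream.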

I mean, I regularly use filters whose parameters I change dynamically,
and they apparently do what I want, although perhaps at the moment of
changing parameters the phasing isn't "right"; still, they seem to work.
Someone with more exposure to the gr-runtime stuff should comment here.
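One plausible explanation for the fictitious-filter workaround, sketched under the assumption that filter blocks reallocate their history whenever set_taps() is called (`FirBlock` below is a toy, not GNU Radio source). Taps of (1, 0, 0, ...) make an identity filter, but every taps change resizes the history, so the new length takes effect on the next work() call; the zeroed buffer also illustrates the momentary phasing glitch mentioned above.

```python
# Toy filter block (NOT GNU Radio code) whose set_taps() also
# reallocates the history buffer -- a guess at why dynamically
# retuned filters behave correctly while the bare delay does not.

class FirBlock:
    def __init__(self, taps):
        self.set_taps(taps)

    def set_taps(self, taps):
        self.taps = list(taps)
        # reallocating history whenever taps change means the new
        # length is honoured on the next work() call (at the cost of
        # a brief glitch, since old samples are discarded)
        self.history = [0.0] * (len(taps) - 1)

    def work(self, samples):
        h = len(self.history)
        buf = self.history + list(samples)
        # y[n] = sum_k taps[k] * x[n-k], with x drawn from buf
        out = [sum(self.taps[k] * buf[h + n - k]
                   for k in range(len(self.taps)))
               for n in range(len(samples))]
        self.history = buf[len(samples):]
        return out
```

Retuning from taps (0, 0, 1) to (0, 0, 0, 1) changes the delay from 2 to 3 samples immediately, but the freshly zeroed history drops the samples that were in flight, i.e. the phasing isn't "right" at the instant of the change.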