Recently, I've read several papers where zero-lag Butterworth filters are used (both low-pass and high-pass).
I have a fair understanding of (Butterworth) filters, but how can one be designed with zero lag? Can someone explain to me how this works?
From my understanding, any low-pass filter introduces an intrinsic delay.
Or does this involve bidirectional filtering?
Answer
As you suspect, this paper says that zero-lag Butterworth filtering is obtained by passing the signal through the filter twice: once in the forward direction and once in reverse. The reverse pass cancels the phase delay of the forward pass, giving zero phase shift at every frequency. Note that this makes the overall operation non-causal, so it can only be applied offline to a recorded signal, and the magnitude response is squared, effectively doubling the filter order.
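Here is a minimal sketch of the idea using SciPy's `butter` and `filtfilt` (which implements exactly this forward-backward scheme). The sampling rate, cutoff frequency, and test signal are made-up example values, not taken from the papers in question:

```python
import numpy as np
from scipy import signal

# Assumed example parameters (not from the original post).
fs = 1000.0      # sampling rate, Hz
cutoff = 50.0    # low-pass cutoff, Hz

# Design a 4th-order low-pass Butterworth filter.
b, a = signal.butter(4, cutoff, btype="low", fs=fs)

# Test signal: a slow sine buried in high-frequency noise.
t = np.arange(0, 1.0, 1.0 / fs)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.random.randn(t.size)

# Causal (single-pass) filtering: shows the usual phase lag.
y_causal = signal.lfilter(b, a, x)

# Zero-phase ("zero-lag") filtering: the filter is run forward,
# then backward over the result, so the phase shifts cancel.
# This is non-causal and only works on a complete recording.
y_zerophase = signal.filtfilt(b, a, x)
```

If you plot `y_causal` and `y_zerophase` against `x`, the single-pass output is visibly shifted to the right while the two-pass output lines up with the underlying sine. Keep in mind that because each pass applies the magnitude response once, the two-pass result rolls off twice as steeply, so the effective −3 dB point moves; some authors pre-warp the design cutoff to compensate.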