I was explaining to someone how Fourier series work in the context of constructing signals that are not everywhere differentiable, e.g. square waves, sawtooth waves, etc. When I mentioned the Gibbs phenomenon, however, I realized that I never really learned why it happens. In fact, as the story goes, not everyone even realized at first that it is an actual mathematical property of the partial sums of Fourier series rather than a computational fluke, and it turns out that most proofs are fairly laborious and elaborate.
After reading several of them, I started to see why such a phenomenon might occur, but I have a background in real and complex analysis, topology, and so on. The question is: can I fully explain and rigorously prove the Gibbs phenomenon to someone with only the basic undergraduate calculus courses in their arsenal (or any other typical prerequisites for an undergraduate signal processing course)? If so, then how?
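To make the target concrete, here is a quick numerical check (a minimal sketch of my own using NumPy; the function name `partial_sum` and the grid choices are mine, not from any particular proof). It shows that the peak of the partial sums of a square wave settles near 1.179 instead of shrinking toward 1:

```python
import numpy as np

# Maximum of the N-term partial Fourier sum of the square wave sgn(sin x):
#   S_N(x) = (4/pi) * sum_{k=1}^{N} sin((2k-1)x) / (2k-1).
# The Gibbs prediction: the peak value tends to (2/pi)*Si(pi) ~ 1.17898,
# not to 1, while the peak location pi/(2N) slides into the jump at x = 0.
def partial_sum(x, N):
    n = np.arange(1, 2 * N, 2)                       # odd harmonics 1, 3, ..., 2N-1
    return (4 / np.pi) * (np.sin(np.outer(x, n)) / n).sum(axis=1)

for N in (10, 100, 1000):
    x_peak = np.pi / (2 * N)                         # first critical point after the jump
    x = np.linspace(0.2 * x_peak, 3 * x_peak, 4001)  # fine grid around the predicted peak
    s = partial_sum(x, N)
    print(f"N = {N:4d}: max S_N = {s.max():.5f} at x = {x[s.argmax()]:.5f} "
          f"(predicted peak location {x_peak:.5f})")

# Known reference value of the Gibbs overshoot:
print("(2/pi) * Si(pi) = 1.17898...")
```

The peak value stays put while its location marches into the discontinuity, which is exactly the behavior a rigorous proof has to capture.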
Answer
The book "Dr. Euler's Fabulous Formula: Cures Many Mathematical Ills", by P. Nahin, Princeton University Press, leads up to and contains an explanation of the Gibbs phenomenon which might be suitable for someone with a good undergraduate-level math background.
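To give a flavor of what such an explanation looks like, here is a sketch of the standard calculus-level computation for the square wave (my own summary of the classical argument, not Nahin's exact presentation). The $N$-term partial sum of the Fourier series of $\operatorname{sgn}(\sin x)$ and its derivative (the cosine sum telescopes via $2\sin x\cos((2k-1)x)=\sin(2kx)-\sin((2k-2)x)$) are

$$S_N(x) = \frac{4}{\pi}\sum_{k=1}^{N}\frac{\sin\big((2k-1)x\big)}{2k-1}, \qquad S_N'(x) = \frac{2}{\pi}\,\frac{\sin(2Nx)}{\sin x},$$

so the first maximum to the right of the jump at $x=0$ sits at $x_N=\pi/(2N)$. Evaluating there and recognizing a midpoint Riemann sum with nodes $t_k=(2k-1)\pi/(2N)$ and step $\pi/N$,

$$S_N\!\left(\frac{\pi}{2N}\right) = \frac{2}{\pi}\sum_{k=1}^{N}\frac{\sin t_k}{t_k}\cdot\frac{\pi}{N} \;\longrightarrow\; \frac{2}{\pi}\int_0^{\pi}\frac{\sin t}{t}\,dt = \frac{2}{\pi}\,\operatorname{Si}(\pi) \approx 1.1790 \qquad (N\to\infty),$$

an overshoot of roughly 9% of the jump (the jump has size 2) that never shrinks as $N$ grows. Nothing beyond a telescoping trig sum, locating a critical point, and Riemann sums is needed, which suggests the answer to the question is yes.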