I'm designing a digital PID for control and stabilization of laser amplitude. The noise PSD of the laser intensity is around -140 dB/Hz, and I want to servo that between DC and 100 kHz. I think I've found a solution on the input side, but I would like to make sure my understanding of noise spectral density vs. SNR is correct.
I'm thinking about using an ADC at 5 MSPS with an SNR of 100 dB. My understanding is that this means the power spectral density for a full-scale signal will have a noise floor of about -164 dBFS/Hz, calculated from the SNR and the Nyquist bandwidth (-100 dB spread over fs/2 = 2.5 MHz). Since I only want to servo up to 100 kHz, I can use a digital FIR filter to maintain that noise floor so that I have a healthy amount of SNR between the noise that I want to servo and the intrinsic noise of the ADC.
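To keep the arithmetic straight, here it is as a few lines of Python. The sample rate and SNR are just the numbers above, and the headroom figure at the end assumes the photodiode signal is scaled to near full scale at the ADC input:

```python
import math

fs = 5e6          # ADC sample rate, Hz
snr_db = 100.0    # full-scale SNR over the Nyquist band (DC to fs/2), dB

# Noise spectral density relative to full scale:
#   NSD = -SNR - 10*log10(fs/2)
nsd_dbfs_hz = -snr_db - 10 * math.log10(fs / 2)
print(f"ADC noise floor: {nsd_dbfs_hz:.1f} dBFS/Hz")   # about -164.0 dBFS/Hz

# Margin against the ~ -140 dB/Hz laser intensity noise, assuming the
# signal of interest sits near the ADC's full scale:
print(f"headroom: {-140.0 - nsd_dbfs_hz:.1f} dB")      # about 24 dB
```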
What I'm a bit confused about is what the FIR filter does to the noise spectrum. I'm pretty sure that the filter will simply attenuate everything past the passband and leave the in-band floor at roughly -164 dBFS/Hz: the integrated SNR increases because the effective noise bandwidth decreases, not because the in-band noise density drops. Is this actually correct?
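To convince myself, I put together a quick NumPy/SciPy sketch (the filter length and cutoff are just placeholders, and I'm treating full scale as unit power). It generates white noise at the ADC's floor, low-passes it, and compares the in-band PSD and the integrated noise before and after:

```python
import numpy as np
from scipy import signal

fs = 5e6                       # ADC sample rate, Hz
n = 2**20                      # samples of simulated noise
rng = np.random.default_rng(0)

# White noise whose one-sided PSD sits at the ADC floor (~ -164 dBFS/Hz),
# taking "full scale" to mean unit power; total power = PSD * (fs/2).
nsd_db = -164.0
sigma = np.sqrt(10**(nsd_db / 10) * fs / 2)
noise = rng.normal(0.0, sigma, n)

# Low-pass FIR with a ~100 kHz cutoff; 511 taps is an arbitrary placeholder.
taps = signal.firwin(511, 100e3, fs=fs)
filtered = signal.lfilter(taps, 1.0, noise)

# In-band PSD before and after: the floor should stay at ~ -164 dBFS/Hz.
f, p_raw = signal.welch(noise, fs=fs, nperseg=8192)
_, p_filt = signal.welch(filtered, fs=fs, nperseg=8192)
in_band = f < 50e3
print("in-band PSD, raw:      %.1f dBFS/Hz" % (10 * np.log10(p_raw[in_band].mean())))
print("in-band PSD, filtered: %.1f dBFS/Hz" % (10 * np.log10(p_filt[in_band].mean())))

# Integrated noise power drops by ~14 dB (2.5 MHz -> ~100 kHz of bandwidth),
# which is where the SNR improvement comes from.
print("total noise, raw:      %.1f dBFS" % (10 * np.log10(noise.var())))
print("total noise, filtered: %.1f dBFS" % (10 * np.log10(filtered.var())))
```

If my picture is right, the two in-band PSD numbers should match and only the totals should differ.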