A magnetometer measures the time derivative of the magnetic field, dB/dt, with an output in millivolts (mV). The sampling rate is 128 Hz, so if we collect data for 2 minutes we get $2 \times 60 \times 128 = 15360$ points (discrete case). When I perform an FFT on this time series, what will the units of the amplitude density be after the transform?
Answer
In short: if you use the voltage output directly, your FFT amplitude will be in mV as well. If you have the sensor's calibration curve, the FFT amplitude will be in teslas per second (T/s), since you are looking at a derivative.
If you look at a power density spectrum (the squared magnitude), then the units above are squared as well (mV$^2$ or T$^2$/s$^2$).
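As a quick numerical illustration of the squaring, here is a minimal NumPy sketch (the test signal, its frequency, and its amplitude are arbitrary values chosen for the example):

```python
import numpy as np

fs = 128                                  # sampling rate, Hz
t = np.arange(fs) / fs                    # one second of samples
x = 2.0 * np.sin(2 * np.pi * 8.0 * t)    # test signal, units: mV

X = np.fft.fft(x)        # amplitude spectrum: same units as x, i.e. mV
P = np.abs(X) ** 2       # power spectrum: units squared, i.e. mV^2

# For a pure sine of amplitude A, the peak bin magnitude is A*N/2 (mV),
# so the power there is (A*N/2)^2 (mV^2).
print(np.abs(X[8]))      # ≈ 2.0 * 128 / 2 = 128 mV
print(P[8])              # ≈ 128^2 = 16384 mV^2
```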
In general, take a given unit U, be it volts (V), teslas (T), or whatever. The FFT sums samples in the original unit U multiplied by unitless complex values (due to discretization). The units after the FFT therefore remain the same as those of the original signal, i.e. U.
If you take the absolute value, the same holds. For instance, the $0$-frequency index gives you the DC, or average, value of your signal (or is at least proportional to it, with a factor of the number of samples or its square root, depending on the normalization).
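The DC-bin statement can be checked directly; a sketch using NumPy, with an arbitrary test signal (a constant offset plus a sine) at the sampling parameters from the question:

```python
import numpy as np

fs = 128                 # sampling rate, Hz
n = 2 * 60 * fs          # 15360 samples (2 minutes)
t = np.arange(n) / fs
x = 3.0 + 0.5 * np.sin(2 * np.pi * 5.0 * t)   # units: mV

# Default NumPy convention: no normalization on the forward transform,
# so the 0-frequency bin is the plain sum of the samples, i.e. n * mean.
X = np.fft.fft(x)
print(X[0].real / n, np.mean(x))     # both ≈ 3.0 mV

# With the "ortho" normalization the factor is sqrt(n) instead.
X_ortho = np.fft.fft(x, norm="ortho")
print(X_ortho[0].real / np.sqrt(n))  # ≈ 3.0 mV
```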
The continuous Fourier transform, in contrast, "sums" the signal times complex cisoids times a time differential (the $\mathrm{d}t$). Its units are therefore U.s (s for seconds).
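So if you want units consistent with the continuous transform (U.s, here mV.s), one option is to multiply the raw FFT by the sample spacing $\mathrm{d}t = 1/f_s$, treating the DFT as a Riemann-sum approximation of the integral. A sketch, again with an arbitrary test sine:

```python
import numpy as np

fs = 128                 # sampling rate, Hz
n = 2 * 60 * fs          # 15360 samples over T = 120 s
dt = 1.0 / fs
t = np.arange(n) * dt
x = 0.5 * np.sin(2 * np.pi * 5.0 * t)   # units: mV

# Multiplying by dt converts the unitless DFT sum into a Riemann sum,
# so X_cont carries units of mV * s, like the continuous transform.
X_cont = np.fft.fft(x) * dt
freqs = np.fft.fftfreq(n, d=dt)

# For a sine of amplitude A over duration T, the continuous transform
# has magnitude A*T/2 at the signal frequency.
peak = np.abs(X_cont[np.argmin(np.abs(freqs - 5.0))])
print(peak)   # ≈ 0.5 * 120 / 2 = 30 mV.s
```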