Tuesday, February 13, 2018

noise - Relationship between entropy and SNR


In general, any form of entropy is defined as uncertainty or randomness. In a noisy environment, as noise increases, I believe that entropy increases, since we are more uncertain about the information content of the desired signal. What is the relationship between entropy and SNR? As the signal-to-noise ratio increases, the noise power decreases, but this does not imply that the information content of the signal increases! The information content may remain the same, so does that mean entropy is unaffected?



Answer



When you say that the "information content may remain the same," do you mean the information in the total signal, or the information of the desired signal? Hopefully this will answer both cases. I know Shannon entropy much better than Kolmogorov complexity, so I'll use that, but hopefully the logic will translate.


Let's say $X = S + N$ is your total signal ($X$), the sum of your desired signal $S$ and your noise component $N$. Let's call entropy $H$. As you said, noise adds entropy to a system by increasing its complexity. However, it's not necessarily only because we're more uncertain about the information content of the desired signal, but because there's more uncertainty in the signal overall. If SNR roughly measures how certain we are of what $S$ is, then $H(X)$ roughly measures how well we can predict future states of $X$ based on its current state. Entropy is concerned with how complex the whole signal is, regardless of how much of it is noise vs. non-noise.


If you increase SNR by removing noise (attenuating $N$), you decrease the total signal $X$'s complexity and thus its entropy. You haven't lost any information carried by $S$, only (presumably meaningless) information carried by $N$. If $N$ is random noise, then obviously it doesn't carry meaningful information, but it takes a certain amount of information to describe $N$'s state, determined by the number of states that $N$ can be in and the probability of it being in each of those states. That's the entropy.
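In other words (this is just the standard definition, stated for concreteness): if $N$ can be in states $n_i$ with probabilities $p_i$, then

$$H(N) = -\sum_i p_i \log_2 p_i \ \text{bits},$$

which grows both with the number of states and with how evenly the probability is spread across them.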


We can look at two Gaussian distributions with different variances, say one with a variance of $1$ and the other with a variance of $100$. Just looking at the equation for a Gaussian density, we see that the peak of the $Var=100$ distribution is only $\frac{1}{10}$ the value of the peak of the $Var=1$ distribution. Equivalently, the $Var=100$ distribution is more likely to take values far from its mean, while the $Var=1$ distribution is more concentrated near its mean. So, the $Var=1$ distribution has a lower entropy than the $Var=100$ distribution.
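As a sanity check, here's a minimal sketch in Python using the standard closed-form differential entropy of a Gaussian, $h = \frac{1}{2}\ln(2\pi e \sigma^2)$ nats, with the variances $1$ and $100$ from the example above:

```python
import numpy as np

def gaussian_entropy(var):
    """Closed-form differential entropy (in nats) of a Gaussian with variance `var`."""
    return 0.5 * np.log(2 * np.pi * np.e * var)

print(gaussian_entropy(1.0))    # ~1.42 nats
print(gaussian_entropy(100.0))  # ~3.72 nats -- higher variance, higher entropy
```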



We established that higher variance implies higher entropy. From basic error propagation, $Var(X+Y) = Var(X) + Var(Y) + 2\,Cov(X,Y)$, which reduces to $Var(X) + Var(Y)$ when $X$ and $Y$ are independent. If $X = S + N$, then for entropy $H$, $H(X) = H(S+N)$. Since $H$ is (indirectly) a function of variance, we can fudge things a little and write $H(Var[X]) = H(Var[S+N])$. To simplify, we say $S$ and $N$ are independent, so $H(Var[X]) = H(Var[S] + Var[N])$. Improving SNR often means attenuating the noise. The new signal with higher SNR is then $X = S + \frac{1}{k}N$, for $k>1$. The entropy becomes $H(Var[X]) = H(Var[S] + \frac{1}{k^2}Var[N])$. Since $k > 1$, the noise contribution $\frac{1}{k^2}Var[N]$ is smaller than $Var[N]$. So $Var[S+N]$, and therefore $Var[X]$, decreases, resulting in a decrease in $H(X)$.
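If you want to see that numerically, here is a minimal sketch under the same assumptions: $S$ and $N$ are independent zero-mean Gaussians, so $X$ is Gaussian too and its entropy follows the closed form $\frac{1}{2}\ln(2\pi e\,Var[X])$. The powers $Var[S]=4$ and $Var[N]=1$ are just made-up numbers for illustration.

```python
import numpy as np

def gaussian_entropy(var):
    # Differential entropy (nats) of a Gaussian with the given variance.
    return 0.5 * np.log(2 * np.pi * np.e * var)

var_s, var_n = 4.0, 1.0            # assumed signal and noise powers
for k in [1, 2, 4, 8]:             # larger k = stronger noise attenuation = higher SNR
    var_x = var_s + var_n / k**2   # Var[X] = Var[S] + (1/k)^2 Var[N], by independence
    snr_db = 10 * np.log10(var_s / (var_n / k**2))
    print(f"k={k}: SNR = {snr_db:5.1f} dB, H(X) = {gaussian_entropy(var_x):.4f} nats")
```

As $k$ grows, $H(X)$ falls toward $H(S)$ while $S$ itself is untouched.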


Not very concise, sorry. In short, $X$'s entropy decreases if you increase SNR, but you've done nothing to $S$'s information. I can't find the sources right now, but there are methods to calculate SNR and mutual information (a bivariate measure related to entropy) from each other. Maybe the main takeaway is that SNR and entropy don't measure the same thing.
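One standard result of that flavor (for an additive white Gaussian noise channel with a Gaussian signal; this may or may not be the exact form those missing sources use) is

$$I(S;X) = \frac{1}{2}\log_2\!\left(1 + \frac{Var[S]}{Var[N]}\right)\ \text{bits per sample},$$

so SNR tells you how much information about $S$ survives in $X$, which is a different question from how complex $X$ is overall.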

