Can somebody mathematically prove why $$\mathrm dq = T\,\mathrm dS$$ for a reversible process?
I realize I am supposed to provide some research, but I can't find anything. The best I found is $\mathrm dE = T\,\mathrm dS$, which is basically $\mathrm dG = \mathrm dH - T\,\mathrm dS$, and it seems to be a circular argument (since most derivations of $G$ use the equation $\mathrm dq=T\,\mathrm dS$).
Answer
$\def\d{\mathrm{d}}\def\c{_\mathrm{C}}$Firstly, note that a more correct version of your equation is
$$\delta q_\mathrm{rev}=T\d S.\tag1$$
We use a Greek delta to signify that heat is not, in general, a state function. Its value during a change does not depend simply on the beginning and end points but also on the path along which the change is traversed. The subscript '$\mathrm{rev}$' denotes a reversible process.
Secondly, I should warn that the proofs I supply are in no way rigorous but should suffice for a standard thermodynamics course.
Thirdly, I have used a small $q$ for heat, reserving capital $Q$ for a partition function.
Classical thermodynamics
The classical definition of entropy in thermodynamics is due to German scientist Rudolf Clausius. He showed that, for a cyclic process, the following inequality holds:
$$\oint\frac{\delta q}{T} \leq 0.\tag{2}$$
Lemma. For a Carnot cycle, the following line integral over a closed path equals zero: $$\oint\c \frac{\delta q_\mathrm{rev}}{T} = 0.\tag{L}$$
Proof
Recall that the efficiency $\eta_\mathrm{C}$ of a Carnot cycle may be expressed in two equivalent ways, as a heat quotient and as a temperature quotient.
$$\eta\c=1+\frac{q_2}{q_1}\tag{3a}$$ $$\eta\c=1-\frac{T_2}{T_1}\tag{3b}$$
Subtracting one from the other yields
$$\frac{q_2}{q_1} + \frac{T_2}{T_1}=0\tag{4}$$
which, through algebraic manipulation, is the same as
$$\frac{q_1}{T_1} + \frac{q_2}{T_2}=0.\tag{5}$$ Hence for a Carnot cycle, reversible by definition, we obtain
$$\oint\c \frac{\delta q_\mathrm{rev}}{T} = 0.\tag{L}$$
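As a quick numerical sanity check of equations $(3)$–$(5)$ (a sketch of my own, assuming one mole of a monatomic ideal gas and arbitrary cycle volumes), one can compute the two isothermal heats of a Carnot cycle explicitly and confirm that $q_1/T_1 + q_2/T_2 = 0$:

```python
import math

R = 8.314                      # gas constant, J/(mol K)
n = 1.0                        # mol of working gas (hypothetical)
gamma = 5.0 / 3.0              # heat capacity ratio of a monatomic ideal gas
T_hot, T_cold = 500.0, 300.0   # reservoir temperatures, K
V_A, V_B = 1.0, 3.0            # volumes bounding the hot isotherm (only the ratio matters)

# The adiabats connect the isotherms via T V^(gamma - 1) = const.
V_C = V_B * (T_hot / T_cold) ** (1.0 / (gamma - 1.0))
V_D = V_A * (T_hot / T_cold) ** (1.0 / (gamma - 1.0))

q_hot = n * R * T_hot * math.log(V_B / V_A)     # heat absorbed on the hot isotherm
q_cold = n * R * T_cold * math.log(V_D / V_C)   # heat released on the cold isotherm (negative)

print(q_hot / T_hot + q_cold / T_cold)          # ~0, equation (5)
print(1 + q_cold / q_hot, 1 - T_cold / T_hot)   # equal: equations (3a) and (3b)
```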
I will omit the subscript $\mathrm{C}$ for Carnot on the integral when it is clear from the text or from the same subscript on some other quantity.
Proof of Clausius integral inequality
Start with the first law of thermodynamics.
$$\d U = \delta q + \delta w \tag6$$
Since $U$ is a state function, $\oint \d U = 0$ over any cycle. Therefore, for a Carnot engine,
$$w\c = -\oint \delta q_\mathrm{rev}.\tag7$$
Let us take some other engine, called engine $2$. Again from the first law of thermodynamics,
$$w_2 = -\oint \delta q_2.\tag8$$
Assume that, contrary to the theorem,
$$\oint \frac{\delta q_2}{T}>0.\tag9$$
Let us join these two cycles to form a composite engine. The work done by this system will be
$$w_\mathrm{total} = w\c + w_2 = -\oint(\delta q_\mathrm{rev} + \delta q_2) = -\oint\delta q_\mathrm{total}. \tag{10}$$
By our lemma $\mathrm{(L)}$, assumption $(9)$ and result $(10)$, we see that
$$\oint\frac{\delta q_\mathrm{rev} + \delta q_2}{T} \stackrel{(10)}{=} \oint \frac{\delta q_\mathrm{total}}{T} \stackrel{\mathrm{(L)},(9)}{>} 0 \tag{11}.$$
Now reverse the direction of the Carnot engine and scale its size so that no net work is done by the composite cycle with engine $2$. (This should always be possible via Carnot's theorem, which I will not prove here.) Then
$$w_\mathrm{total} = -\oint\delta q_\mathrm{total} = 0. \tag{12}$$
If we break each cyclic line integral into $y-1$ smaller integrals over path segments $b_1,\ldots,b_{y-1}$ that add up to the full cycle, equations $(12)$ and $(11)$ become
$$w_\mathrm{total} = -\sum_{j=1}^{y-1}\sum_i q_i(b_j) = 0 \tag{13a}$$
$$\oint\frac{\delta q_\mathrm{total}}{T} = \sum_{j=1}^{y-1}\sum_i \frac{q_i(b_j)}{T_j} > 0 \tag{13b}$$
where $q_i(b_j)$ is the heat exchanged by engine $i$ along segment $b_j$ and $T_j$ is the (approximately constant) temperature at which that heat is exchanged.
The terms $q_i(b_j)$ may be both positive and negative but in equation $(13a)$ cancel out to give zero. In equation $(13b)$ the terms are weighted by temperatures in such a way that positivity wins. Therefore, on the whole, negative $q_i(b_j)$ terms have been divided by larger temperatures whereas positive terms possess smaller temperature denominators. Physically, this implies that heat is transferred from a lower temperature reservoir to a higher temperature bath without doing any net work. This is in violation of the second law of thermodynamics (Clausius statement).
Therefore, our assumption $(9)$ that
$$\oint \frac{\delta q_2}{T}>0$$
is actually false. The result
$$\oint \frac{\delta q_2}{T}\leq 0$$
immediately follows.
Corollary. For any reversible process, not just a Carnot cycle, $$\oint \frac{\delta q_\mathrm{rev}}{T} = 0.$$ In other words, the reciprocal temperature $1/T$ is an integrating factor for $\delta q_\mathrm{rev}$ in a reversible process.
Proof
By the Clausius integral inequality $(2)$, we know immediately that
$$\oint \frac{\delta q_\mathrm{rev}}{T} \leq 0. \tag{14}$$
Assume again to the contrary that
$$\oint \frac{\delta q_\mathrm{rev}}{T} < 0. \tag{15}$$
Since this is a reversible process, we may change the direction of the engine. Only the sign of the integral will alter.
$$\oint \frac{\delta q_\mathrm{rev}}{T} > 0. \tag{16}$$
If one proceeds analogously to the proof of the Clausius integral inequality, this will lead to a contradiction. Thus, assumption $(15)$ is not valid, and by $(14)$ we are left with an equality.
Why bother with all of this?
We have now proven that for any reversible process,
$$\oint \frac{\delta q_\mathrm{rev}}{T} = 0. \tag{17}$$
This means that we may define a change in a new state function $S$ which we call entropy such that
$$\Delta S = S_2 - S_1 = \int_1^2 \frac{\delta q_\mathrm{rev}}{T},\tag{18}$$
or in differential form,
$$\d S = \frac{\delta q_\mathrm{rev}}{T}. \tag{19}$$
This is the definition of a differential of entropy. There is nothing else to derive; what we showed above is that we may indeed define such a state function. To achieve your equation, multiply both sides of equation $(19)$ by temperature.
$$T\d S = \delta q_\mathrm{rev}. \tag{CT}$$
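To make the definition concrete, here is a minimal illustration of my own (not part of the original argument): for the reversible isothermal expansion of an ideal gas, all of the heat is absorbed at a single temperature, so equation $(19)$ integrates at once to $\Delta S = q_\mathrm{rev}/T = nR\ln(V_2/V_1)$.

```python
import math

R = 8.314            # gas constant, J/(mol K)
n = 1.0              # mol of ideal gas (hypothetical example)
T = 300.0            # K, constant along the reversible isotherm
V1, V2 = 1.0, 2.0    # initial and final volumes; only the ratio matters

q_rev = n * R * T * math.log(V2 / V1)   # reversible isothermal heat (dU = 0, so q = -w)
delta_S = q_rev / T                     # equation (19) integrated at constant T
print(delta_S, n * R * math.log(V2 / V1))   # both ~5.76 J/K
```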
Statistical thermodynamics
Another way to go would be to define entropy in statistical thermodynamics. Then the formula $\mathrm{(CT)}$ may be derived.
Statistics in general
For a discrete random variable $X$, entropy is defined in statistics to be the function $\mathrm{H}$ such that
$$\mathrm{H}(X)= -\sum_k p_k\ln p_k \tag{GS}$$
where small $p$ denotes probability.
Boltzmann statistical thermodynamics
Say we have a large number $\mathcal{A}$ of isolated systems which together we call an ensemble. Suppose the ensemble may be described as
$$\begin{array}{lccccc} \text{energies} & E_1 & E_2 & \ldots & E_n & \ldots\\ \text{number of such systems} & a_1 & a_2 & \ldots & a_n & \ldots \end{array}$$
where
$$\mathcal{A} = \sum_i a_i.\tag{20}$$
The systems of different energies in this ensemble are distinguishable. The systems that have the same energy, i.e., are degenerate, are indistinguishable. If we labelled each system, regardless of its energy, with an index of its energy, the number of ways to order these indices would be
$$\mathcal{A}!.\tag{21a}$$
But, as we said, systems of equal energy are indistinguishable. For each energy $E_j$, the number of ways of ordering its $a_j$ indices among themselves is
$$a_j!.\tag{21b}$$
So to avoid overcounting we must divide:
$$K(a_1,a_2,\ldots,a_n,\ldots) = \frac{\mathcal{A!}}{\prod_i a_i!}.\tag{21c}$$
You may think of this $K$ as the number of anagrams of a word with $\mathcal{A}$ letters in which the $j$-th distinct letter appears $a_j$ times. Boltzmann used the capital letter $W$ instead of $K$ and defined entropy to be
$$S = k_\mathrm{B}\ln W.\tag{BD}$$
The natural logarithm guarantees, among other things, that entropy is not a huge quantity, and that entropy for different (independent) systems is additive. In other words, if
$$W_\mathrm{total} = W_\mathrm{A}W_\mathrm{B}$$
then
$$S_\mathrm{total} = S_\mathrm{A} + S_\mathrm{B}.$$
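A small sketch (illustrative occupation numbers of my own) evaluating the multiplicity $(21\mathrm{c})$ and checking the additivity that the logarithm buys us:

```python
import math

kB = 1.380649e-23   # Boltzmann constant, J/K

def multiplicity(occupations):
    """W = A! / prod(a_j!), equation (21c), for occupation numbers a_j."""
    A = sum(occupations)
    W = math.factorial(A)
    for a in occupations:
        W //= math.factorial(a)
    return W

def boltzmann_entropy(occupations):
    return kB * math.log(multiplicity(occupations))   # equation (BD)

# Two independent ensembles A and B: W_total = W_A * W_B, hence S_total = S_A + S_B.
occ_A, occ_B = [3, 2, 1], [4, 4]
S_total = kB * math.log(multiplicity(occ_A) * multiplicity(occ_B))
print(math.isclose(S_total, boltzmann_entropy(occ_A) + boltzmann_entropy(occ_B)))  # True
```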
Boltzmann entropy as a special case of statistical entropy
Begin with the statistical definition.
$$\mathrm{H}(X)= -\sum_k p_k\ln p_k \tag{GS}$$
Say each of the $W$ states has equal probability. Then, since the sum of the probabilities must equal $1$, we obtain
$$p_j = \frac{1}{W}.\tag{22}$$
(You may prove that this constraint actually maximises entropy. This is what we expect from a measure of 'information' in a system where 'knowledge of the system' is smallest in the sense that each state has equal probability. See 'Intuitive explanation of entropy?' on math.stackexchange.com by An.Ditlev.)
Substituting
$$\mathrm{H}(X)= -\sum_k \frac{1}{W}\ln \frac{1}{W} = -W \frac{1}{W}\ln \frac{1}{W} = -\ln \frac{1}{W} = \ln W.\tag{23}$$
This shows that the Boltzmann constant is not fundamental but a figment of historical definitions. We shall see this in one other way further down. We conclude that Boltzmann's definition of entropy is a special case of a more general entropy, the case in which all probabilities are equal. In thermodynamics, this more general entropy is called the Gibbs entropy.
$$S\equiv -k_\mathrm{B}\sum_k p_k\ln p_k\tag{GD}$$
This is our new definition of entropy. Here we have an absolute quantity; in classical thermodynamics (without the third law) we defined the differential of entropy.
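The parenthetical claim above, that equal probabilities maximise the entropy and reproduce $\ln W$, is easy to verify numerically. A minimal sketch of my own, using random trial distributions:

```python
import numpy as np

def shannon_entropy(p):
    """H(X) = -sum p ln p, equation (GS); zero-probability states contribute nothing."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

W = 5
uniform = np.full(W, 1.0 / W)
print(np.isclose(shannon_entropy(uniform), np.log(W)))   # True, equation (23)

# No other distribution over the same W states exceeds the uniform one.
rng = np.random.default_rng(0)
trials = rng.dirichlet(np.ones(W), size=1000)
print(max(shannon_entropy(p) for p in trials) <= np.log(W))   # True
```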
Equivalence of Gibbs entropy and Clausius entropy
A partition function $Q$
Intuitively, we assume that the relative populations $a_m$ and $a_n$ of systems with energies $E_m$ and $E_n$ are given by some function of the energies.
$$\frac{a_m}{a_n} = f(E_m - E_n)\tag{24}$$
The difference in energy is a natural choice because any arbitrary zero of the energy scale should cancel. Furthermore, the function $f$ ought to be consistent under composition.
$$\frac{a_x}{a_n} = \frac{a_m}{a_n}\frac{a_x}{a_m} \iff f(E_x - E_n) = f(E_m - E_n)f(E_x - E_m)\tag{25}$$
The most obvious function with these properties is the exponential function. Thus we define
$$a_j = C\mathrm{e}^{-\beta E_j}, \ \ C\in\mathbb{R},\tag{26}$$
where $\beta$ is positive everywhere by convention. Its form is to be determined. We start by tackling the constant $C$. Sum both sides of equation $(26)$.
$$\mathcal{A}=\sum_i a_i = C\sum_i\mathrm{e}^{-\beta E_i} \tag{27}$$
Divide equation $(26)$ by equation $(27)$.
$$\frac{a_j}{\mathcal{A}} = \frac{\mathrm{e}^{-\beta E_j}}{\sum_i\mathrm{e}^{-\beta E_i}} \tag{28a}$$
In the limit of large $\mathcal{A}$, the left-hand side becomes the probability that a system occupies a state with energy $E_j$.
$$p_j = \frac{\mathrm{e}^{-\beta E_j}}{\sum_i\mathrm{e}^{-\beta E_i}} \tag{28b}$$
The denominator is defined to be the partition function $Q$. Assuming that $E=E(N,V)$ (number of particles and volume, respectively), we see immediately that
$$p_j(N,V,\beta) = \frac{\mathrm{e}^{-\beta E_j(N,V)}}{Q(N,V,\beta)}. \tag{29}$$
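A minimal sketch (with hypothetical energy levels of my own choosing) that builds the partition function $Q$ and the probabilities of equation $(29)$:

```python
import numpy as np

kB = 1.380649e-23                        # Boltzmann constant, J/K
E = np.array([0.0, 2.0, 5.0]) * 1e-21    # hypothetical energy levels E_j(N, V), J
T = 298.0                                # K
beta = 1.0 / (kB * T)                    # anticipating equation (40)

Q = np.sum(np.exp(-beta * E))            # partition function: the denominator of (28b)
p = np.exp(-beta * E) / Q                # Boltzmann probabilities p_j, equation (29)

print(p)           # lower-energy states are more probable
print(p.sum())     # 1.0: the probabilities are normalised
```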
The first law of thermodynamics with partition functions
From the definition of expectation value for a discrete variable $X$,
$$\mathrm{E}X = \sum_ip_ix_i.$$
Let us identify internal energy with an expectation value of energy. Then
$$U = \sum_i p_i(N,V,\beta)E_i(N,V).\tag{30}$$
Take a differential of both sides.
$$\d U = \sum_i p_i(N,V,\beta)\d E_i(N,V) + \sum_i E_i(N,V)\d p_i(N,V,\beta)\tag{31}$$
This expression for probability and energy holds for a reversible process. If we associate the first term with work, the second term will be heat. Doing so will allow us to prove the equivalence but it will also lead to correct results for ideal gases if you do the maths. For brevity (ha ha), I will gloss over such discussions but in my opinion the rigorous explanation (if there is one) deserves its own question. (I am not aware of the rigorous proof.) Hence,
$$\delta q_\mathrm{rev} = \sum_i E_i(N,V)\d p_i(N,V,\beta).\tag{32}$$
Proof of equivalence
Recall the definition of Gibbs entropy.
$$S\equiv -k_\mathrm{B}\sum_k p_k\ln p_k\tag{GD}$$
Take the differential of both sides.
$$\d S = -k_\mathrm{B}\sum_k (1 +\ln p_k)\d p_k\tag{33}$$
The term $\sum_k \d p_k$ will equal zero by linearity and because probabilities sum to $1$.
$$\sum_k \d p_k = \d\sum_k p_k = 0\tag{34}$$
Thus,
$$\d S = -k_\mathrm{B}\sum_k \ln p_k\d p_k.\tag{35}$$
Substituting equation $(29)$ into the natural logarithm term in equation $(35)$ yields
$$\d S = k_\mathrm{B}\sum_k \left[\beta E_k(N,V) + \ln Q(N,V,\beta)\right]\d p_k.\tag{36}$$
Similarly to equation $(34)$ the final logarithm term cancels.
$$\sum_k \ln Q(N,V,\beta) \d p_k = \ln Q(N,V,\beta)\d \sum_k p_k = 0\tag{37}$$
We are left with
$$\d S = k_\mathrm{B}\beta\sum_k E_k(N,V) \d p_k(N,V,\beta)\tag{38}$$
which by equation $(32)$ translates to
$$\d S = k_\mathrm{B}\beta\delta q_\mathrm{rev}.\tag{39}$$
Note that $k_\mathrm{B}\beta$ is also an integrating factor. From our earlier corollary, and to complete the equivalence, we are forced to choose
$$\beta = \frac{1}{k_\mathrm{B}T},\tag{40}$$
and finally,
$$\d S = \frac{\delta q_\mathrm{rev}}{T} \implies T\d S = \delta q_\mathrm{rev}.\tag{19,CT}$$
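As a numerical cross-check of equations $(32)$, $(39)$ and $(19)$ (a sketch of my own, again with hypothetical energy levels): perturb $\beta$ slightly, compute the change in Gibbs entropy directly, and compare it with $\sum_k E_k\,\d p_k$ divided by $T$.

```python
import numpy as np

kB = 1.380649e-23                              # Boltzmann constant, J/K
E = np.array([0.0, 1.0, 2.5, 4.0]) * 1e-21     # hypothetical energy levels, J

def probs(beta):
    w = np.exp(-beta * E)
    return w / w.sum()                         # Boltzmann probabilities, equation (29)

def gibbs_entropy(p):
    return -kB * np.sum(p * np.log(p))         # equation (GD)

T = 300.0
beta = 1.0 / (kB * T)                          # equation (40)
dbeta = beta * 1e-6                            # a small, reversible change in beta

p1, p2 = probs(beta), probs(beta + dbeta)
dS = gibbs_entropy(p2) - gibbs_entropy(p1)     # change in Gibbs entropy
dq = np.sum(E * (p2 - p1))                     # delta q_rev = sum_k E_k dp_k, equation (32)

print(dS, dq / T)                              # agree to first order: T dS = delta q_rev
```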
In conclusion, we have shown that Gibbs entropy is equivalent to Clausius entropy up to an additive constant. This is because the differential of a constant is zero. The choice of $\beta$ again demonstrates how superfluous Boltzmann's constant really is: we introduce it in the Gibbs entropy definition, only to be forced to get rid of it. For more information on Boltzmann's constant, and how it relates to temperature and the zeroth law, see 'Is the Boltzmann constant really that important?' on physics.stackexchange.com by DanielSank. And so,
$$S_\mathrm{Gibbs} = S_\mathrm{Clausius} + r,\ \ r\in \mathbb{R}.\tag{41}$$
If the most classical form of the third law of thermodynamics (zero entropy at absolute zero) holds, which it does not always,
$$r = 0. \tag{42}$$
TL; DR
The reason why
$$\delta q_\mathrm{rev}=T\d S\tag1$$
holds depends on your starting point.
- There are different forms of the first and second laws. Proving that a state function $S$ exists varies based on one's axioms.
- Entropy may be defined in various ways. If the classical definition is used, one only needs to prove that such a state function may indeed be defined. If the statistical definition is applied, one must first recover the classical definition as a special case.
- Note that the second law of thermodynamics only defines a difference or differential in entropy. Together with the third law, one may establish an absolute scale. However, the statistical definition is already absolute to begin with.
- Boltzmann entropy is a special case of Gibbs entropy, not the other way around. Gibbs entropy agrees with Clausius entropy (in reversible thermodynamics) up to an additive constant.
$$S_\mathrm{Gibbs} = S_\mathrm{Clausius} + r,\ \ r\in \mathbb{R}\tag{41}$$
If the most classical form of the third law holds, $r = 0$.