Saturday, May 27, 2017

The second law of thermodynamics

In this blog post I want to discuss one of the most fundamental laws of statistical physics, known as the second law of thermodynamics:

The entropy of a closed system tends to a maximum

This statement is a bit loose and assumes that you already know what entropy is. I will define entropy formally below, but for now let's say entropy is a measure of how spread out the energy in a system is. A system with the same density, temperature and pressure everywhere has a very high entropy. A system where one part is hotter than another, like the hot and cold plates shown in Figure 1 (left), has a low entropy, because the energy is clustered in the hot part.
Figure 1: Example of a low entropy system (left) and a high entropy system (right). Entropy is a measure for how spread out energy is in a system.
After the plates are brought into contact (Figure 1, middle), the temperatures equalise due to a flow of energy from hot to cold. The system on the right of the figure has a high entropy, since the energy is now evenly spread out.

Entropy 


The formal definition of entropy is given by
\[
S = k_B\ln\left[\Omega(E)\right],
\]where $k_B=8.617\times10^{-5}\,\text{eV/K}$ is the Boltzmann constant and $\Omega(E)$ is the number of states available to the system at energy $E$. The fact that a system has a discrete set of states it can occupy is a fundamental principle of quantum mechanics.
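As a quick numerical illustration (a minimal Python sketch; the state counts and the helper name entropy are made up), the logarithm is what makes entropy additive: two independent subsystems with $\Omega_1$ and $\Omega_2$ states combine into a system with $\Omega_1\Omega_2$ states, whose entropy is simply the sum of the two.

    import math

    k_B = 8.617e-5  # Boltzmann constant in eV/K

    def entropy(omega):
        """Entropy S = k_B * ln(Omega) of a system with Omega accessible states."""
        return k_B * math.log(omega)

    # Two independent subsystems with made-up state counts
    omega_1, omega_2 = 1e20, 1e30

    # The combined system can be in any pairing of states, so Omega = Omega_1 * Omega_2
    print(entropy(omega_1 * omega_2))           # ~9.92e-3 eV/K
    print(entropy(omega_1) + entropy(omega_2))  # same value: products of states become sums of entropy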

For example, an electron bound to a proton, forming a hydrogen atom, has a set of discrete energy levels it can occupy. The electron can drop to a lower energy state by emitting a photon. Since the energy levels are discrete, the photon energies emitted by a hydrogen atom are equal to the differences between the energy levels available to the electron.

The state with the lowest energy is called the ground state. Using the Bohr-Sommerfeld approximation, the energy levels of the hydrogen atom are given by
\[
E_n = - \frac{  m_e e^4}{2 ( 4 \pi \epsilon_0)^2 \hbar^2 } \frac{1}{n^2} = \frac{-13.606\,\text{eV}}{n^2},
\]where $n = 1$ refers to the ground state, $m_e$ is the electron mass, $e$ is the electron charge and $\epsilon_0$ is the vacuum permittivity. This equation ignores the angular momentum and spin degrees of freedom, which are labelled by the quantum numbers $\ell = 0$, ..., $n-1$, $m = -\ell$, ..., $\ell$ and the spin $\sigma = \pm\frac{1}{2}$; in this approximation all of these states share the energy $E_n$. A more complete treatment is given by the Dirac equation.
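As a small check (a Python sketch; the function name energy_level is mine), the formula above reproduces the level energies and the photon energies of the transitions used later in this post:

    # Hydrogen energy levels in the Bohr approximation, and the photon energies
    # released when the electron drops from a higher to a lower level.
    RYDBERG_EV = 13.606  # magnitude of the hydrogen ground state energy in eV

    def energy_level(n):
        """Energy of the level with principal quantum number n, in eV."""
        return -RYDBERG_EV / n**2

    for n in (1, 2, 3):
        print(f"E_{n} = {energy_level(n):8.3f} eV")   # -13.606, -3.401, -1.512 eV

    # A transition n_high -> n_low emits a photon carrying the level difference
    print(f"2 -> 1: {energy_level(2) - energy_level(1):.2f} eV")  # ~10.20 eV
    print(f"3 -> 2: {energy_level(3) - energy_level(2):.2f} eV")  # ~1.89 eV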

If we simplify the picture and assume that all states belonging to the principal quantum number $n$ have exactly the same energy $E_n$, we can count the number of states $\Omega$ as a function of energy.
Figure 2: Number of energy states for the hydrogen atom as a function of energy.
Figure 2 shows the enormous increase in the number of available states for the electron in a hydrogen atom towards larger energies. The ground state at $E = -13.606\,$eV is the minimum energy a hydrogen atom can have. Since the number of available states grows with energy, the entropy of a single atom is larger at larger energies.

The second law says that the total entropy of the system tends to a maximum. From intuition we know that when we bring cold and hot objects together, their temperatures average out. This means that atoms of the cold object are lifted into higher energy states while atoms of the hot object move to lower energy states. But does this really maximise entropy? After all, the hotter atoms now occupy lower energy states, which have smaller $\Omega$, so the entropy of those atoms has decreased.

Let's look at a system of 6 hydrogen atoms. The second law tells us that the total entropy of these atoms tends towards a maximum. Assume the system has just enough energy to put all atoms in $n=2$. We could move one atom from $n=2$ into the ground state, which releases energy that we can use to lift the other 5 atoms to $n=3$: the energy difference between $n=2$ and $n=3$ is only about $1.89\,$eV per atom, while moving one atom from $n=2$ to $n=1$ releases about $10.20\,$eV. Since the 5 atoms are now in a higher energy state they gain entropy, while the one atom in the ground state loses some entropy.
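The energy bookkeeping of this rearrangement can be checked explicitly (a minimal sketch; energy_level is the same helper as in the snippet above):

    # Does dropping one atom from n=2 to n=1 release enough energy
    # to lift the other five atoms from n=2 to n=3?
    RYDBERG_EV = 13.606

    def energy_level(n):
        return -RYDBERG_EV / n**2

    released = energy_level(2) - energy_level(1)        # one atom drops 2 -> 1
    needed   = 5 * (energy_level(3) - energy_level(2))  # five atoms lifted 2 -> 3

    print(f"released: {released:.2f} eV, needed: {needed:.2f} eV")
    # released: 10.20 eV, needed: 9.45 eV -> the rearrangement is energetically possible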

If such a process maximised the total entropy, we would expect a gas left alone to separate into cold and hot atoms over time. This is of course not what happens in reality, so let's check whether keeping all atoms at $n=2$ does indeed maximise the entropy.

First I have to say that this is a rough treatment, since the different angular momentum states are not actually all degenerate, so a proper treatment is more complicated. I will also drop the proportionality constant $k_B$ when calculating entropies.

We start with 6 atoms in $n=2$. Counting the quantum numbers of each level we have

$n = 1$: $\Omega = 2$, $E = -13.606\,$eV
$n = 2$: $\Omega = 12$, $E = -3.40\,$eV
$n = 3$: $\Omega = 30$, $E = -1.512\,$eV

The number of states is obtained by counting all available quantum numbers. The ground state, for example, has only the two possible spin orientations of the electron and hence $\Omega = 2$. With 6 atoms in $n=2$ we have $S_{\rm total} = 6\ln 12 = 14.88$, where I have dropped the Boltzmann constant.

If we now move one atom to $n=1$, we release enough energy to lift the other 5 atoms to $n=3$. What would be the entropy of this new configuration?
\[
S_{\rm total} = S_{n=1} + S_{n=3} = \ln 2 + 5\ln 30 = 10.89,
\]therefore we indeed maximise the entropy by keeping all hydrogen atoms in the same state. This explains why the temperatures of two gases average out when they come into contact, and why the second law can also be formulated as 'heat always flows from high to low temperatures'.

Finally I should say that 6 atoms are not a statistically meaningful sample, which makes the application of the second law to such a small system questionable. For the rest of this post we will move away from counting energy states $\Omega$, but you should keep in mind that entropy has a sound microscopic motivation based on quantum states.

Maxwell's Demon 


James Clerk Maxwell proposed a thought experiment which seems to violate the second law of thermodynamics, known as Maxwell's Demon. Maxwell imagined two gases A and B, isolated from their surroundings and separated by a wall. The wall contains a small door that can be opened to let particles move between A and B, and this door is operated by a demon. The demon measures the speed of particles moving towards the door and opens it whenever a particle with larger than average speed moves from A to B, or a particle with smaller than average speed moves from B to A. Since the average particle speed corresponds to temperature, this procedure increases the temperature of B and reduces the temperature of A without any energy being added to the system. This reduces the entropy of the system and therefore violates the second law.
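A toy Monte Carlo version of the demon illustrates the effect (a rough sketch, not a physical simulation: speeds are simply drawn from a fixed distribution, the 'temperature' of a side is taken to be its mean squared speed, and all names are made up):

    import random

    random.seed(1)

    # Both sides start as the same gas: speeds drawn from the same distribution.
    A = [abs(random.gauss(0.0, 1.0)) for _ in range(5000)]
    B = [abs(random.gauss(0.0, 1.0)) for _ in range(5000)]
    mean_speed = sum(A + B) / len(A + B)

    def temperature(side):
        """Toy temperature: proportional to the mean squared speed."""
        return sum(v * v for v in side) / len(side)

    print("before:", temperature(A), temperature(B))   # roughly equal

    # The demon's rule: only fast particles may pass A -> B,
    # only slow particles may pass B -> A.
    for _ in range(20000):
        if random.random() < 0.5 and A:
            v = random.choice(A)
            if v > mean_speed:
                A.remove(v); B.append(v)
        elif B:
            v = random.choice(B)
            if v < mean_speed:
                B.remove(v); A.append(v)

    print("after: ", temperature(A), temperature(B))   # B ends up hotter than A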

In 1929 the physicist Leo Szilard showed that Maxwell's Demon does not violate the second law: the demon needs to measure the velocities of a very large number of particles, and acquiring that information is itself a physical process with an unavoidable entropy cost. Once this cost is included in the calculation, the system as a whole does comply with the second law.

Heat engines 


A heat engine is a system in which the temperature difference between two reservoirs is used to produce work. For example, petrol is burned in a car engine and the hot gas moves pistons, producing mechanical work. Above we formulated the second law of thermodynamics as 'a closed system tends towards maximum entropy'. We can define the entropy change $\Delta S$ of a system as the heat $Q$ which has entered it divided by the temperature $T$ at which the heat transfer took place:
\[
\Delta S = \frac{Q}{T}.
\]A perfect heat engine with $100\%$ efficiency would take heat from a hot reservoir and turn all of it into work. This is impossible in practice, since the maximum efficiency of any heat engine operating between a hot and a cold reservoir is given by the Carnot limit
\[
e = 1 - \frac{T_{\rm cold}}{T_{\rm hot}},
\]which implies that one would need a reservoir with $T_{\rm cold} = 0$ or $T_{\rm hot} = \infty$ to achieve $100\%$ efficiency. A typical combustion engine operates between roughly $1700\,$K and $298\,$K, which gives a maximum efficiency of about $82\%$. Friction and other losses usually push the effective efficiency significantly below this limit.
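Both statements can be checked with a few lines of Python (a minimal sketch using the reservoir temperatures quoted above; the amount of heat Q is arbitrary): transferring heat from hot to cold increases the total entropy, and the Carnot limit for a 1700 K / 298 K engine comes out at about 82%.

    T_hot, T_cold = 1700.0, 298.0   # reservoir temperatures in Kelvin

    # Entropy bookkeeping for a heat transfer Q from the hot to the cold reservoir:
    # the hot reservoir loses Q/T_hot of entropy, the cold one gains Q/T_cold.
    Q = 1000.0                      # heat transferred, arbitrary energy units
    dS_total = -Q / T_hot + Q / T_cold
    print(f"total entropy change: {dS_total:+.2f}")  # positive, as the second law demands

    # Carnot limit: the best possible efficiency between these two reservoirs
    efficiency = 1.0 - T_cold / T_hot
    print(f"maximum efficiency: {efficiency:.1%}")   # about 82%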

Heat engines work because heat naturally flows from hot to cold places. If there were no cold reservoir towards which the heat could flow, there would be no heat flow and the engine would not work.

Arrow of time 


Given that the second law tells us that systems move towards maximum entropy, we can infer from the entropy of a system at two different times which state is earlier and which is later. If we see a hot coffee and a cold coffee, we can infer that the coffee was hot first and then cooled down; in a closed system we would never see the process go the other way.

The entropy of the hot coffee is smaller, since the energy is clustered in the coffee; after a while the heat is distributed to the surroundings and the entropy has increased. In this sense entropy defines an arrow of time.

You can even take this further and ask what this means for the Universe as a whole, since the second law says that the final state of the Universe will be a state of maximum entropy.

Why do we have structure in the Universe? 


The Universe can, in some approximation, be described as a closed system, and therefore its entropy will increase. That means that the Universe will tend towards a state where the mean density and temperature are equal everywhere, the so-called heat death.

You might ask how galaxies and black holes will ever give up their particles. Well, our planet actually loses material all the time, since gas which escapes the gravitational potential simply flies out into space. On very long timescales this will result in all stars, planets and galaxies evaporating into space.

Even though black holes in principle cannot lose material (not even light can escape them), they do emit radiation, called Hawking radiation, which will eventually cause all black holes to evaporate. The timescale for this is extremely long, but it will happen eventually. Black holes are probably the last low-entropy islands before the total entropy reaches its maximum.

The heat death state of the Universe is fairly hopeless for any form of life, since at that point the energy is spread out completely evenly and can no longer be used to produce mechanical work. To extract work we would need a reservoir with a lower temperature, but at the heat death the temperature is the same everywhere.

Given the second law as described above, one might ask how we ended up with structure in the Universe at all. The Universe started off very homogeneous. For example, the Cosmic Microwave Background (CMB) tells us that the typical fluctuation in the density field of the Universe about $380\,000$ years after the big bang was of order $10^{-5}$. Today, about 13.8 billion years after the big bang, these fluctuations are of order $1$. So matter (and hence energy) got more clustered over time. Isn't this violating the second law of thermodynamics and implying a process which reduces entropy?

Not really. Remember that the temperature appears in the denominator of the equation for $\Delta S$: the same amount of heat transferred at the very high temperatures of the early Universe corresponds to a much smaller entropy change than at the low temperatures of today, so the total entropy has in fact increased since then. Structure growth in the Universe therefore does not violate the second law. However, as mentioned before, given enough time the second law will smooth out all structure in the Universe.

cheers
Florian
