Nernst's theorem

Key Takeaways
  • Nernst's theorem, the foundation of the Third Law of Thermodynamics, states that the change in entropy for any process approaches zero as the temperature approaches absolute zero.
  • Quantum mechanics explains this theorem by postulating that a perfect crystal in equilibrium has a unique, non-degenerate ground state, resulting in zero absolute entropy at T=0.
  • A direct consequence of the theorem is the unattainability principle, which asserts that absolute zero cannot be reached in a finite number of steps because cooling processes become infinitely inefficient.
  • The theorem universally predicts that various thermal response properties, such as heat capacity and the coefficient of thermal expansion, must vanish as temperature approaches absolute zero.

Introduction

The grand principles of thermodynamics govern the flow of energy and the direction of spontaneous change throughout the universe. While the first two laws establish the conservation of energy and the inexorable rise of entropy, a fundamental question remained at the dawn of the 20th century: what happens in the extreme realm of absolute zero? Does the drive towards disorder, so powerful at room temperature, simply fade away, or does it follow a more profound rule? This question lies at the heart of Nernst's heat theorem, the principle that would become the Third Law of Thermodynamics. The article addresses this knowledge gap by exploring the behavior of entropy at the universe's coldest frontier.

This article delves into the foundational concepts and far-reaching implications of Nernst's theorem. In the "Principles and Mechanisms" section, we will unpack the core postulate that entropy change vanishes as temperature approaches absolute zero, explore its quantum mechanical basis in the concept of a unique ground state, and discuss apparent exceptions like residual entropy. Following this, the "Applications and Interdisciplinary Connections" section will demonstrate the theorem's predictive power, showing how it dictates the behavior of materials, governs chemical and electrochemical reactions, and even provides a surprising link to the quantum theory of black-body radiation, ultimately revealing why absolute zero remains an unattainable horizon.

Principles and Mechanisms

Imagine yourself a chemist in the late 19th century, trying to predict whether a chemical reaction will spontaneously occur. You have two great forces to consider: the drive towards lower energy, represented by the change in enthalpy (ΔH), and the drive towards greater disorder, represented by the change in entropy (ΔS). The great equation that balances these two tendencies is the Gibbs free energy relation, ΔG = ΔH − TΔS. A reaction proceeds spontaneously if ΔG is negative. At room temperature, the entropic term, TΔS, is a powerful player, often deciding the outcome. But what happens as you venture into the recently explored realm of ultra-low temperatures, towards the absolute zero of temperature, T = 0?

A Low-Temperature Puzzle

One might naively guess that as the temperature T approaches zero, the term TΔS simply fades into irrelevance because you are multiplying by an ever-smaller number. If this were the case, the spontaneity of a reaction would be dictated solely by the enthalpy change, ΔH. Experiments bore this out: as T → 0, the measured value of ΔG indeed approached ΔH. But is the reason really so trivial? What if the change in entropy, ΔS, became wildly large as the temperature dropped? Could a diverging ΔS conspire with a vanishing T to keep the product TΔS finite?

Physics abhors an unconstrained infinity. Relying on the mere fact that T → 0 is an incomplete argument, like concluding that a product must vanish because one factor shrinks, without checking whether the other factor is diverging. The true answer is far more profound and beautiful, and it was provided by the insight of Walther Nernst.

Nernst's Postulate: The Rule of Calm

Nernst's heat theorem, which forms the basis of the Third Law of Thermodynamics, is a statement of radical simplicity and power. It asserts that for any process that takes a system from one equilibrium state to another at the same temperature, the change in entropy, ΔS, itself approaches zero as the temperature approaches absolute zero.

lim_{T→0} ΔS = 0

This is the missing piece of the puzzle. It’s not just that the temperature factor T is vanishing; the entropy change ΔS is also vanishing. The universe, it seems, imposes a rule of calm at its coldest frontier. The consequence is that near absolute zero, all equilibrium states of a given substance have essentially the same entropy. Whether you compress it, expand it, or place it in a magnetic field, its entropy remains unchanged. The chaotic dance of particles that gives rise to entropy at high temperatures settles into a single, serene state.

This principle is what underpins the equivalence of the Nernst heat theorem and the unattainability principle: the famous declaration that absolute zero cannot be reached in a finite number of steps. Imagine entropy curves for a substance at two different pressures, P_1 and P_2. At high temperatures, these curves are distinct. You can cool a substance by using a cycle of compressing it isothermally (moving from the higher entropy curve to the lower one) and then expanding it adiabatically (moving down along a constant-entropy line to a lower temperature). But Nernst's theorem demands that as T → 0, these two entropy curves must merge into a single value. As the curves converge, the temperature drop you get from each adiabatic step becomes smaller and smaller. Reaching the point of convergence, T = 0, would require an infinite number of these ever-diminishing steps. Absolute zero is thus an asymptotic limit, a destination you can journey towards forever but never reach.
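To make the geometry concrete, here is a minimal numerical sketch (not from the original text, with invented coefficients) using Debye-like entropy curves S_P(T) = a_P·T³ at two pressures. Each compress-then-expand cycle multiplies the temperature by the same fixed factor below one, so the absolute drops shrink and T = 0 is never reached in finitely many steps.

```python
# Sketch: alternating isothermal / adiabatic steps between two entropy curves
# S_1(T) = a1*T^3 and S_2(T) = a2*T^3 (hypothetical Debye-like coefficients).
a1, a2 = 1.0, 0.5   # a1 > a2, so curve 1 lies above curve 2; both vanish at T = 0

T = 1.0             # starting temperature (arbitrary units)
history = [T]
for step in range(5):
    S = a2 * T**3              # isothermal step: drop onto the lower curve at fixed T
    T = (S / a1) ** (1 / 3)    # adiabatic step: follow S = const back up to curve 1
    history.append(T)

# Each cycle multiplies T by (a2/a1)^(1/3) ~ 0.794: a geometric approach to zero.
print(history)
```

The per-cycle ratio (a2/a1)^(1/3) is constant, so the sequence of temperatures is geometric: it gets arbitrarily close to zero but never arrives.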

The Quantum Basis: Why Things Must Get Orderly

Thermodynamics tells us what happens, but to understand why, we must descend into the strange and wonderful world of quantum mechanics. Entropy, from a microscopic perspective, is a measure of the number of ways a system can arrange its constituents (its atoms, molecules, and electrons) to have the same total energy. The famous Boltzmann formula captures this: S = k_B ln(W), where W is the number of accessible microstates.

As a system is cooled, it loses energy and seeks its lowest possible energy state, known as the ground state. For a perfect crystal (a flawless, repeating lattice of a single pure substance with every atom in its designated place and orientation) quantum mechanics predicts that there is often only one, unique arrangement that corresponds to this lowest energy. The ground state is non-degenerate. If there is only one way to be (W = 1), then the entropy is precisely zero: S = k_B ln(1) = 0. This is the Planck postulate, a stronger and more specific version of Nernst's theorem. At the absolute zero of temperature, perfect order reigns, and the entropy is absolutely zero. A perfect crystal at T = 0 has no disorder to give.

The Catch: Residual Entropy and Frozen Mistakes

Of course, the real world is messier than our idealized "perfect crystal." What happens if a system has more than one ground state configuration? This is known as ground-state degeneracy, and it gives rise to a residual entropy at absolute zero. If the ground state has a degeneracy of g_0, meaning there are g_0 different-but-equal-energy configurations, the entropy at T = 0 will be S_0 = k_B ln(g_0).

This is not just a theoretical curiosity; it happens in real materials.

  • Configurational Disorder: Consider solid carbon monoxide (CO). The CO molecule has a very small dipole moment, so the energy difference between lining up in a crystal as CO CO CO versus CO OC CO is tiny. As the liquid freezes, the molecules get stuck in random orientations. They are "frozen mistakes," a form of quenched disorder. The system doesn't have enough time or thermal energy to find the true, single lowest-energy state. This leads to a measurable residual entropy.
  • Isotopic Mixing: A crystal of chlorine might seem perfect, but natural chlorine is a mix of isotopes, primarily chlorine-35 and chlorine-37. Randomly distributing these two isotopes on a perfect lattice creates a huge number of possible arrangements, leading to a "mixing" entropy that persists even at T = 0.
  • Nuclear Spins: Many atomic nuclei have a property called spin. In the absence of strong interactions between them, these spins can point in random directions, contributing another source of degeneracy and residual entropy.
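The two-orientation case of CO can be checked with one line of arithmetic. The sketch below assumes exactly two equal-energy orientations per molecule and no partial ordering, giving S_0 = k_B ln(2^N_A) = R ln 2 per mole; the measured residual entropy of solid CO, roughly 4.6 J/(mol·K), sits close to but a little below this ideal value because the ordering is not perfectly random.

```python
import math

R = 8.314   # gas constant, J/(mol K)

# One mole of CO with two equally likely orientations per molecule:
# W = 2^N_A, so S_0 = k_B * ln(2^N_A) = N_A * k_B * ln(2) = R * ln(2).
S_residual = R * math.log(2)
print(f"ideal residual entropy of solid CO: {S_residual:.2f} J/(mol K)")
```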

For a substance to truly reach zero entropy, it must be a single-domain, isotopically pure, perfect crystal whose constituent particles are arranged in a unique, non-degenerate ground state, including the ordering of any nuclear spins. The Third Law, in its strictest form, is a statement about perfect equilibrium, a state that real, messy systems may struggle to achieve.

A Testable Prediction: The Vanishing of Heat

This journey into the quantum realm makes a powerful, testable prediction about a macroscopic property we can readily measure: heat capacity (C_p), the amount of heat required to raise a substance's temperature by one degree. The absolute entropy of a substance at temperature T is calculated by integrating the heat capacity from absolute zero:

S(T) = ∫₀ᵀ [C_p(T′)/T′] dT′

Now, we face a mathematical reckoning. The Third Law demands that S(T) be a finite number as T → 0. Look at the integral: we are dividing by T′. If C_p were to approach a non-zero constant as T′ → 0, the integral would behave like ∫(1/T′) dT′ = ln(T′), which diverges to negative infinity at the lower limit. This would be a physical catastrophe! The only way for the entropy to remain finite is if the heat capacity C_p vanishes as T → 0, and does so fast enough that the ratio C_p/T′ remains integrable. Any positive power law suffices: in metals, C_p ∝ T at low temperatures; in insulators, C_p ∝ T³. What is forbidden is a heat capacity that levels off at a non-zero constant.
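A quick numerical check (a sketch, with made-up coefficients) shows the dichotomy: with a constant C_p the entropy integral grows without bound as the lower limit ε shrinks, while a Debye-like C_p ∝ T³ gives a finite answer.

```python
import math

def entropy(c_p, T, eps, n=100_000):
    """Approximate S(T) = integral from eps to T of c_p(t)/t dt (midpoint rule, log grid)."""
    lo, hi = math.log(eps), math.log(T)
    h = (hi - lo) / n
    total = 0.0
    for i in range(n):
        t = math.exp(lo + (i + 0.5) * h)
        total += c_p(t) * h   # integrand c_p(t)/t with dt = t*h on the log grid
    return total

for eps in (1e-1, 1e-3, 1e-5):
    s_const = entropy(lambda t: 25.0, 1.0, eps)         # C_p = 25 J/(mol K), constant
    s_debye = entropy(lambda t: 25.0 * t**3, 1.0, eps)  # C_p = 25*T^3
    print(f"eps={eps:.0e}:  constant C_p -> S = {s_const:7.2f};  Debye C_p -> S = {s_debye:.4f}")
```

The constant-C_p column grows like 25·ln(1/ε) without limit, while the Debye column settles at 25/3, independent of how small ε becomes.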

So, the Third Law makes a bold and universal prediction: as any substance is cooled towards absolute zero, its ability to absorb heat must vanish. It becomes infinitely "reluctant" to warm up. This is precisely what experiments show.

Once again, quantum mechanics provides the beautiful explanation. In classical physics, all vibrational modes of a solid should be active at any temperature, leading to a constant heat capacity (the Law of Dulong and Petit). This classical theory fails spectacularly at low temperatures. Quantum mechanics reveals that energy is quantized. To excite a vibrational mode, you need to provide it with a minimum packet of energy.

  • In the simple Einstein model, all atoms vibrate at the same frequency. Below a certain temperature, there isn't enough thermal energy to excite this vibration, and the heat capacity plummets exponentially.
  • In the more realistic Debye model, the vibrations are collective sound waves (phonons). While there are low-energy phonons, the number of available modes at low frequencies is severely restricted. This leads to the famous Debye T³ law, where the heat capacity vanishes as the cube of the temperature.
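The two low-temperature behaviors can be compared directly. The sketch below is illustrative only: it assumes an arbitrary common characteristic temperature of 200 K and evaluates the exact Einstein heat capacity against the low-T Debye limit (valid only for T well below the characteristic temperature). The Einstein model's exponential freeze-out undershoots the T³ law dramatically as T drops.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def c_einstein(T, theta=200.0):
    """Einstein model: every atom vibrates at one frequency; C_V dies off exponentially."""
    x = theta / T
    return 3 * R * x**2 * math.exp(x) / (math.exp(x) - 1) ** 2

def c_debye_low_T(T, theta=200.0):
    """Debye low-temperature limit: C_V = (12*pi^4/5) * R * (T/theta)^3."""
    return (12 * math.pi**4 / 5) * R * (T / theta) ** 3

for T in (20.0, 10.0, 5.0, 2.0):
    print(f"T = {T:5.1f} K   Einstein: {c_einstein(T):9.3e}   Debye T^3: {c_debye_low_T(T):9.3e}  J/(mol K)")
```

Both columns vanish as T → 0, as the Third Law demands; only the rate of vanishing differs.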

In either case, quantum mechanics provides the mechanism for "freezing out" degrees of freedom, ensuring that heat capacities vanish and the Third Law is obeyed.

The Third Law of Thermodynamics is therefore not an isolated rule but a deep nexus connecting the macroscopic world of chemical reactions and heat with the microscopic rules of quantum mechanics. It is a testament to the fact that at the universe's ultimate limit of cold, a profound and elegant order must emerge.

Applications and Interdisciplinary Connections

Having grasped the principle that nature’s engine of disorder, entropy, grinds to a halt at absolute zero, we might wonder: is this just a theoretical curiosity, a mathematical limit at an unreachable temperature? The answer, wonderfully, is no. Nernst's theorem is not a distant abstraction; it is a powerful and practical law whose consequences are etched into the very fabric of matter. It dictates how materials must behave, how chemical reactions proceed, and why the final degree of cold is an unconquerable summit. Let's embark on a journey to see how this one simple idea—that entropy changes vanish at zero temperature—reaches out to touch nearly every corner of the physical sciences.

The Unresponsive World: The Vanishing of Thermal Responses

One of the most immediate and striking consequences of the Nernst theorem is that, as a system approaches absolute zero, it becomes increasingly unresponsive to changes in temperature. We are accustomed to materials expanding when heated and contracting when cooled. But this familiar behavior must cease in the extreme cold. The coefficient of thermal expansion, α, which measures the fractional change in volume with temperature, is thermodynamically tied to the way entropy changes with pressure. A clever piece of mathematics known as a Maxwell relation shows that (∂V/∂T)_P = −(∂S/∂P)_T. Since Nernst’s theorem demands that entropy S becomes independent of pressure as T → 0, its derivative with respect to pressure, (∂S/∂P)_T, must be zero. Consequently, the thermal expansion coefficient α must also plummet to zero. This is not an optional feature of some materials; it is a universal thermodynamic command.
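As a sketch of this logic (the model entropy and every constant below are invented for illustration), take a Debye-like entropy whose coefficient depends weakly on pressure, S(T, P) = a(P)·T³. Differentiating numerically and applying the Maxwell relation shows α collapsing like T³:

```python
# Hypothetical model: S(T, P) = a0 * (1 - b*P) * T^3 (all constants illustrative)
a0, b, V = 1.0e-3, 1.0e-10, 1.0e-5   # entropy coefficient, pressure sensitivity, molar volume

def S(T, P):
    return a0 * (1 - b * P) * T**3

def alpha(T, P, h=1.0):
    # Maxwell relation: (dV/dT)_P = -(dS/dP)_T, so alpha = -(1/V)*(dS/dP)_T
    dS_dP = (S(T, P + h) - S(T, P - h)) / (2 * h)   # central difference in pressure
    return -dS_dP / V

for T in (100.0, 10.0, 1.0, 0.1):
    print(f"T = {T:6.1f} K   alpha = {alpha(T, P=1e5):.3e} 1/K")
```

Because (∂S/∂P)_T shrinks as T³ in this model, α is dragged to zero with it, exactly the universal behavior the theorem commands.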

This principle of unresponsiveness is beautifully general. It's not just about volume and pressure. The same logic applies to any pair of what physicists call "conjugate variables." Consider a magnetic material placed in a magnetic field H. Its magnetization M typically changes with temperature, a phenomenon known as the pyromagnetic effect. But just as with thermal expansion, a Maxwell relation connects the pyromagnetic coefficient, (∂M/∂T)_{P,H}, to the way entropy changes with the magnetic field, (∂S/∂H)_{T,P}. At absolute zero, entropy cares not for the magnetic field, so this derivative vanishes, and the material's magnetization becomes insensitive to small temperature changes.

Let's take one more example: a simple elastic polymer, like a rubber band. The tension F you feel when you stretch it to a length L actually depends on its temperature. But as you cool it towards absolute zero, this dependence must fade away. The rate of change of tension with temperature, (∂F/∂T)_L, is forced to zero by the same deep principle, linked this time to how entropy changes with the polymer's length. The message is unified and profound: at the doorstep of absolute zero, matter becomes quiet and numb to the influence of heat.

A Bridge Between States: Isothermal and Adiabatic Processes

The distinction between different kinds of thermodynamic processes also begins to dissolve at low temperatures. An isothermal process is one conducted at a constant temperature, allowing heat to flow in or out. An adiabatic process is one conducted without any heat exchange, which means the entropy remains constant. In our everyday world, these are very different things.

However, near absolute zero, the Nernst theorem implies that the entropy of a system approaches a constant value, independent of other parameters like pressure or volume. This means that a process carried out at a constant temperature of (or near) zero is also, necessarily, a process carried out at constant entropy. The distinction blurs.

This has tangible consequences. Consider how a material compresses. We can measure its isothermal compressibility, κ_T, by squashing it slowly while it's in a bath that keeps its temperature fixed. Or we can measure its adiabatic compressibility, κ_S, by squashing it so quickly that no heat has time to escape. These two values are generally different. However, a direct consequence of the Nernst theorem is that as T → 0, the ratio of these two compressibilities, κ_T/κ_S, must approach exactly one. At the ultimate cold, the two ways of compressing become one and the same, another testament to the simplifying power of the third law.
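A sketch with a model Debye solid (every parameter value below is hypothetical) makes this concrete. Using the thermodynamic identities κ_T/κ_S = C_P/C_V and C_P − C_V = T·V·α²/κ_T, together with the Grüneisen relation α = γ·C_V·κ_T/V, the ratio is pushed to one like T⁴:

```python
V = 1.0e-5        # molar volume, m^3/mol (hypothetical)
kappa_T = 1.0e-11 # isothermal compressibility, 1/Pa (hypothetical)
gamma = 2.0       # Grueneisen parameter (hypothetical)
a = 1.0e-3        # Debye coefficient: C_V = a*T^3, J/(mol K^4) (hypothetical)

for T in (20.0, 10.0, 1.0, 0.1):
    C_V = a * T**3
    alpha = gamma * C_V * kappa_T / V        # Grueneisen relation
    C_P = C_V + T * V * alpha**2 / kappa_T   # identity: C_P - C_V = T*V*alpha^2/kappa_T
    ratio = C_P / C_V                        # equals kappa_T / kappa_S
    print(f"T = {T:5.1f} K   kappa_T/kappa_S = {ratio:.8f}")
```

In this model the excess over one scales as T⁴, so the isothermal and adiabatic responses merge rapidly on the way down.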

A Symphony of Disciplines: Chemistry, Electricity, and Light

The reach of Nernst's theorem extends far beyond the simple properties of materials, acting as a unifying thread connecting disparate scientific disciplines.

  • Chemistry: Chemical reactions are fundamentally governed by changes in entropy. The Gibbs free energy, ΔG = ΔH − TΔS, which determines whether a reaction proceeds spontaneously, is dominated by the entropy change at high temperatures. Nernst's theorem imposes a critical boundary condition at low temperatures: the entropy of reaction, Δ_r S, must approach zero. This seemingly simple rule has subtle and powerful implications, constraining, for example, how the heat capacity of a reaction mixture can change in response to pressure as T → 0. It provides a fundamental anchor for the entire edifice of chemical thermodynamics.

  • Electrochemistry: A battery is nothing more than a cleverly packaged chemical reaction. Its voltage, or electromotive force E, is directly proportional to the change in Gibbs free energy of the reaction (ΔG = −nFE). The change in this voltage with temperature, (∂E/∂T)_P, turns out to be directly proportional to the reaction's entropy change, ΔS. Since the Nernst theorem demands that ΔS → 0 as T → 0, the voltage of any ideal battery must become perfectly stable and independent of temperature as it approaches absolute zero. While your car battery may struggle in the winter, in the impossible winter of absolute zero, its voltage would be unwavering.
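As an illustration (all numbers are hypothetical, and the T³ falloff of ΔS is an assumed Debye-like form, not measured data), the temperature coefficient of an ideal cell follows directly from (∂E/∂T)_P = ΔS/(nF):

```python
F = 96485.0       # Faraday constant, C/mol
n = 2             # electrons transferred per reaction (hypothetical cell)
dS_room = -50.0   # reaction entropy at 298 K, J/(mol K) (hypothetical)

for T in (298.0, 100.0, 10.0, 1.0):
    dS = dS_room * (T / 298.0) ** 3   # assumed Debye-like vanishing of Delta_S
    dE_dT = dS / (n * F)              # (dE/dT)_P = Delta_S / (n F)
    print(f"T = {T:6.1f} K   dE/dT = {dE_dT:+.3e} V/K")
```

As ΔS collapses toward zero, the voltage's temperature sensitivity collapses with it: the cell's EMF flattens out near absolute zero.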

  • Radiation and Quantum Theory: Perhaps the most breathtaking example of the theorem's unifying power is its connection to the quantum world of light. Consider the "vacuum" inside an empty, heated box. It is not truly empty; it is filled with a "gas" of photons known as black-body radiation. From classical electromagnetism, we know that the pressure of this photon gas is one-third of its energy density, P = u/3. Combining this with the thermodynamic identity for (∂U/∂V)_T, and anchoring the entropy to zero at T = 0 as Nernst's theorem requires, a simple differential equation emerges whose solution is that the energy density u must be proportional to the fourth power of temperature: u = σ′T⁴. This is the celebrated Stefan-Boltzmann law, a cornerstone of early quantum mechanics! It is astounding that a macroscopic law of thermodynamics can reach into the quantum realm and dictate the fundamental behavior of light itself.
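The differential equation mentioned above can be sketched explicitly (a standard route, using the thermodynamic energy equation with U = u(T)·V and P = u/3; the third law then fixes the entropy to vanish at T = 0):

```latex
\left(\frac{\partial U}{\partial V}\right)_T = T\left(\frac{\partial P}{\partial T}\right)_V - P
\;\Longrightarrow\;
u = \frac{T}{3}\frac{du}{dT} - \frac{u}{3}
\;\Longrightarrow\;
\frac{du}{u} = 4\,\frac{dT}{T}
\;\Longrightarrow\;
u = \sigma' T^4 .
```

The corresponding entropy, S = (4/3)σ′VT³, then vanishes as T → 0, exactly as Nernst's theorem requires.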

The Final Frontier: Quantum Phenomena and the Unattainable Zero

As we push our understanding to the very edge of physics, the consequences of Nernst's theorem become even more profound, shaping the landscape of quantum mechanics and defining the ultimate limits of our exploration.

  • Quantum Phase Transitions: We are familiar with phase transitions driven by temperature, like ice melting into water. But at absolute zero, where thermal fluctuations are absent, a system can still transition between different phases (e.g., from one magnetic ordering to another) by tuning a parameter like pressure or a magnetic field. These are called quantum phase transitions. The boundary line separating two phases in a phase diagram obeys a rule analogous to the Clausius-Clapeyron equation, which relates the slope of the line dB/dT to the change in entropy ΔS and magnetization ΔM across the boundary. As we trace this boundary down to T = 0, Nernst’s theorem guarantees that the entropy change ΔS must vanish. For a first-order transition where the magnetization change ΔM remains finite, this forces the slope of the phase boundary, dB/dT = ΔS/ΔM, to become zero. The phase boundary must therefore meet the T = 0 axis with zero slope, a horizontal tangent in the B-T plane. This universal, visually striking feature of quantum critical points is a direct, geometric manifestation of the third law.

  • The Unattainable Zero: This is the most famous consequence of all. Why can we not reach absolute zero? One powerful technique for getting close is adiabatic demagnetization. In this process, a magnetic material is cooled, a strong magnetic field is applied to align its atomic magnetic moments (releasing heat), the material is thermally isolated, and then the field is switched off. The moments randomize, and in doing so, they draw thermal energy from the material's atomic lattice, causing its temperature to plummet. It seems you could repeat this process, getting colder and colder until you hit zero. But the third law forbids it. The cooling power of this method depends on how much the system's entropy changes with the magnetic field, (∂S/∂B)_T. Nernst’s theorem dictates that this very quantity must vanish as T → 0. As the system gets colder, its entropy becomes less and less sensitive to the magnetic field, and the cooling effect of each demagnetization step diminishes. The cooling curve, whose slope is (∂T/∂B)_S, flattens out and approaches zero slope at T = 0. Absolute zero acts like a horizon that recedes as you approach it, forever unattainable in a finite number of steps.
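A sketch of why the cycles yield diminishing returns, using an idealized model: a paramagnet whose spins feel a small residual internal field b alongside the applied field B, so its entropy depends only on B_eff/T with B_eff = √(B² + b²) (all field and temperature values below are hypothetical). Each full magnetize/demagnetize cycle shrinks T by a fixed factor, so the absolute temperature drop per cycle shrinks geometrically and no finite number of cycles reaches zero:

```python
import math

B, b = 1.0, 0.05                 # applied and residual fields, tesla (hypothetical)
r = b / math.sqrt(B**2 + b**2)   # per-cycle temperature ratio, from B_eff/T = const

T = 1.0            # starting temperature, K
temps = []
for cycle in range(1, 6):
    T *= r         # adiabatic demagnetization: sqrt(B^2 + b^2)/T_initial = b/T_final
    temps.append(T)
    print(f"after cycle {cycle}: T = {T:.3e} K")
```

The temperature falls geometrically: impressively fast at first, but each successive cycle removes a smaller absolute amount, and T = 0 sits at the end of an infinite sequence of steps.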

A Note on Reality: Disorder and the Limits of the Law

What happens when we apply these ideas to the messy, complex systems of the real world, like a biological protein? Experiments on such systems sometimes reveal a puzzle: when the entropy is measured at low temperatures and extrapolated down to T = 0, the result isn't zero. This "residual entropy" seems, at first glance, to defy the third law.

However, this apparent contradiction does not invalidate the law; it clarifies its scope. The third law, in its strictest form, applies to systems in perfect thermodynamic equilibrium. A complex molecule like a protein, when cooled, often fails to find its single, lowest-energy ground state. Instead, its intricate chain gets trapped in a "glassy" state—a frozen, disordered landscape of countless different conformations that all have nearly the same energy. The system is kinetically stuck, like a photograph of high-temperature disorder, because it lacks the thermal energy to overcome the barriers between these states. Since the system is not in true equilibrium, the third law does not demand its entropy be zero. This seeming exception sharpens our understanding, highlighting the crucial difference between the ideal, ordered equilibrium of a perfect crystal and the kinetically arrested reality of complex matter. The law remains a perfect guide, so long as we appreciate the territory it describes.