Thermostats in Molecular Dynamics

Key Takeaways
  • Thermostats are essential algorithms in Molecular Dynamics that enable simulations to model systems at a constant temperature (the canonical NVT ensemble), which is more representative of real-world experiments than the isolated constant-energy (NVE) ensemble.
  • While simple methods like the Berendsen thermostat are effective for bringing a system to a target temperature, they fail to generate the correct statistical fluctuations, leading to inaccurate calculations for properties like heat capacity.
  • The Nosé-Hoover thermostat and its extensions, like the Nosé-Hoover chain, provide a physically rigorous method for generating a true canonical ensemble by incorporating the heat bath into the system's fundamental equations of motion.
  • The choice of thermostat significantly influences a simulation's dynamics, which has profound consequences for advanced applications such as Replica Exchange MD, drug binding pathway analysis, and the calculation of transport properties.
  • Thermostats are not universally applicable; for studying processes in isolated systems, such as unimolecular reactions described by RRKM theory, a constant-energy (NVE) simulation without a thermostat is the physically correct approach.

Introduction

Molecular Dynamics (MD) simulation offers a powerful digital microscope, allowing us to watch the intricate dance of atoms and molecules that governs the world around us. By applying the fundamental laws of motion to a collection of particles in a virtual box, we can generate a movie of matter in action. However, the most straightforward simulation of this kind describes an isolated system where total energy is conserved—a scenario known as the microcanonical or NVE ensemble. This creates a knowledge gap, as most real-world chemical and biological processes occur not in isolation, but in contact with a surrounding environment that maintains a constant temperature, a condition described by the canonical or NVT ensemble.

To bridge this divide between computational simplicity and physical reality, we need a special class of algorithms: thermostats. This article explores the vital role of thermostats in MD simulations, explaining how they function as a virtual heat bath to control a system's temperature. First, in "Principles and Mechanisms," we will journey through the evolution of these algorithms, from simple but flawed approaches to the elegant and rigorous solutions like the Nosé-Hoover thermostat, and uncover the subtle but critical importance of getting statistical fluctuations right. Following that, "Applications and Interdisciplinary Connections" will reveal how these thermostats are not just theoretical constructs but practical tools used by scientists to ensure simulation accuracy, enable advanced methods, and sculpt digital realities that lead to profound discoveries across chemistry, physics, and materials science.

Principles and Mechanisms

The Universe in a Box: A Tale of Two Ensembles

Imagine you want to study a protein folding, or water freezing, or a metal bending. The most direct way to ask nature how this happens is simply to watch the atoms. In a computer, we can do just that. We can build a virtual box, fill it with atoms described by some physical laws (a force field), give them a push, and watch what happens. By applying Newton's simple law, $F = ma$, to every single atom at every single instant, we can generate a movie of matter in motion. This is the heart of Molecular Dynamics (MD).

Now, a simulation built this way has a peculiar property. If you add up all the kinetic energy (the energy of motion) and all the potential energy (the energy stored in the bonds and interactions), this total energy will remain constant, barring any small numerical errors. The system is isolated, a tiny universe unto itself. In the language of statistical mechanics, we say it samples the microcanonical ensemble, or the NVE ensemble, for constant Number of particles (N), Volume (V), and Energy (E).

But here’s the catch. Is an isolated system with constant energy a good model for reality? Think about a real-world chemistry experiment. Is it performed in a perfectly insulated box where the total energy never changes? Almost never! Instead, it sits on a lab bench, in a room that stays at a more or less constant temperature. The little test tube is in thermal contact with the whole room, the building, and effectively the entire planet—a gigantic heat bath. It can freely borrow a bit of energy from the room or lend some back to it, all to keep its own temperature stable. This scenario, a system at constant N, V, and Temperature (T), is called the canonical ensemble, or NVT ensemble.

This puts us in a bind. The most natural way to simulate atoms gives us a constant-energy world (NVE), but the world we want to describe is a constant-temperature one (NVT). How do we bridge this gap? We need a way to build a "heat bath" inside the computer. We need an algorithm that can intelligently add or remove energy to keep our simulated system at the temperature we desire. We need a thermostat.

The Digital Demon's Hand

So, what exactly is a thermostat in a simulation? And what do we even mean by "temperature"? In a collection of atoms, temperature is nothing more than a measure of the average kinetic energy. It’s a reflection of how vigorously the atoms are jiggling around. A thermostat, then, is an algorithm whose job is to watch this jiggling and give the atoms a little push or pull to keep the average jiggling constant. Its primary function is to modify the velocities of the particles in a way that steers the system's average kinetic energy towards a value that corresponds to our target temperature, effectively coupling our simulation to a virtual heat bath.
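To make this concrete, here is a minimal sketch (assuming plain NumPy arrays in SI units, and ignoring constraints and removal of centre-of-mass motion) of how a simulation converts kinetic energy into an instantaneous temperature; the function name is illustrative, not any particular package's API:

```python
import numpy as np

K_B = 1.380649e-23  # Boltzmann constant, J/K

def instantaneous_temperature(masses, velocities):
    """Kinetic temperature of N particles: T = 2*KE / (f * k_B).

    masses: shape (N,) in kg; velocities: shape (N, 3) in m/s.
    f is taken as 3N, i.e. no constraints or removed degrees of freedom.
    """
    kinetic = 0.5 * np.sum(masses[:, None] * velocities**2)
    f = 3 * len(masses)
    return 2.0 * kinetic / (f * K_B)
```

A thermostat watches exactly this number, step after step, and decides how hard to push.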

Let's watch one in action. Imagine we start our simulation from a perfectly ordered crystal structure, nearly "frozen" at 0 K. We want to study it at a warm 300 K. We turn on the thermostat. What happens? The thermostat sees the atoms are too "cold" and begins injecting kinetic energy, scaling up their velocities. The measured temperature of the system shoots up. Like an overeager driver hitting the gas, it might even overshoot the target of 300 K slightly before the thermostat's feedback kicks in to cool it down. After a brief period of settling, the system reaches equilibration.

And now for a truly beautiful point. Once equilibrated, is the instantaneous temperature exactly 300.000 K at every moment? Absolutely not! That would be deeply unphysical. A finite number of particles in contact with a heat bath will always experience fluctuations. The temperature will jitter around the average of 300 K. These fluctuations are not a mistake or an imperfection in our thermostat; they are a fundamental and correct property of the canonical ensemble. The size of these fluctuations is itself a predictable physical quantity, related to the size of the system. For a system with $f$ degrees of freedom at temperature $T$, the variance of the temperature is $\operatorname{Var}(T) = \frac{2}{f} T^{2}$. Only for an infinitely large system would the fluctuations vanish. Seeing these fluctuations tells us our thermostat is allowing the system to "breathe" as it exchanges energy with the virtual heat bath.
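As a quick sanity check of that relation, one can sample kinetic energies from the ideal canonical (Maxwell-Boltzmann) distribution, where the kinetic energy is gamma-distributed, and confirm that the resulting temperature variance matches $2T^2/f$; the numbers below are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
f, T0 = 3000, 300.0  # degrees of freedom, target temperature in K (illustrative)

# In the canonical ensemble, KE/k_B ~ Gamma(shape=f/2, scale=T0),
# so the instantaneous temperature is T_inst = 2*(KE/k_B)/f.
ke_over_kB = rng.gamma(shape=f / 2, scale=T0, size=200_000)
T_inst = 2.0 * ke_over_kB / f

print(np.var(T_inst))    # close to the predicted value
print(2 * T0**2 / f)     # 2*T0^2/f = 60 K^2 for these numbers
```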

A Gallery of Thermostats: From Brute Force to Finesse

How would one go about building such a device? Let's try to invent one.

The most straightforward idea might be what we can call the brute-force method, or simple velocity rescaling. At every single step of the simulation, we calculate the instantaneous kinetic energy. If it doesn't correspond to our target temperature, we just multiply all particle velocities by whatever factor, $\lambda$, is needed to force it to be correct, instantly.
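In code, the rescaling is essentially one line; the sketch below (hypothetical helper, NumPy arrays assumed) shows just how blunt the instrument is:

```python
import numpy as np

def rescale_velocities(velocities, T_current, T_target):
    """Brute-force rescaling: force the kinetic temperature to be exactly T_target."""
    lam = np.sqrt(T_target / T_current)
    return lam * velocities
```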

It's simple, and it works, in a way. The average temperature will be correct. But it's a terrible thermostat for doing science. Why? Because it completely destroys the very fluctuations we just learned are so important! By forcing the kinetic energy to be constant, it creates a bizarre, unphysical state that is not the canonical ensemble. It's like trying to understand crowd behavior by forcing every single person to stand perfectly still. You learn nothing about how they naturally move.

So, we need a gentler touch. This brings us to the Berendsen thermostat. Instead of a sledgehammer, it uses a soft nudge. It still rescales velocities by a factor $\lambda$, but this factor is calculated to gently guide the temperature towards the target, $T_0$, over a characteristic time, $\tau_T$. The scaling factor is given by a simple feedback formula: $\lambda^2 = 1 + \frac{\Delta t}{\tau_T} \left( \frac{T_0}{T} - 1 \right)$, where $\Delta t$ is the simulation time step. If the system is too hot ($T > T_0$), the term in the parentheses is negative, so $\lambda < 1$, and the velocities are scaled down. If it's too cold ($T < T_0$), $\lambda > 1$, and they are scaled up. It's a beautiful, simple negative feedback loop.
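A minimal sketch of that feedback rule, assuming the instantaneous temperature has already been measured (names and example values here are illustrative, not any particular package's API):

```python
import numpy as np

def berendsen_lambda(T_current, T_target, dt, tau_T):
    """Berendsen scaling factor: nudge T toward T_target over the timescale tau_T."""
    return np.sqrt(1.0 + (dt / tau_T) * (T_target / T_current - 1.0))

# Example use, with a 2 fs time step and a 0.1 ps coupling time:
# velocities *= berendsen_lambda(T, 300.0, dt=2e-15, tau_T=1e-13)
```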

This thermostat is much better. It's wonderfully effective for equilibrating a system—getting it to the desired temperature in the first place. You can even tune its "strength" with the coupling parameter $\tau_T$. A small $\tau_T$ (strong coupling) gets you to the target temperature very quickly, but at the cost of suppressing the natural fluctuations. A large $\tau_T$ (weak coupling) is much gentler, perturbing the system less and allowing for larger, more realistic fluctuations, but it takes longer to equilibrate. This reveals a classic trade-off between speed and physical accuracy.

The Peril of the Plausible: Getting the Right Average, but the Wrong Physics

For a long time, the Berendsen thermostat was a workhorse of the field. It's simple, robust, and it gives the correct average temperature. What more could you ask for?

Well, it turns out you should ask for more. There is a subtle but profound flaw lurking beneath the surface. While the Berendsen thermostat is much gentler than brute-force rescaling, it still artificially suppresses the size of the system's natural energy fluctuations. The distribution of kinetic energies it produces is narrower than the true one predicted by the canonical ensemble.

Why is this a disaster? It's a disaster because in statistical mechanics, some of the most important physical properties are derived not from averages, but from the magnitude of fluctuations! A prime example is the heat capacity ($C_V$), which tells you how much energy a substance can absorb for a given increase in temperature. The formula for heat capacity derived from statistical mechanics is directly proportional to the variance of the total energy: $C_V = \frac{\langle E^2 \rangle - \langle E \rangle^2}{k_B T^2}$. If your thermostat is tampering with the energy fluctuations, your calculated value of $C_V$ will be wrong.
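In practice, this fluctuation formula is applied directly to a saved trajectory; a sketch (assuming total energies recorded from an NVT run, SI units, illustrative function name):

```python
import numpy as np

K_B = 1.380649e-23  # J/K

def heat_capacity_from_fluctuations(total_energies, T):
    """C_V = Var(E) / (k_B * T^2); np.var(E) equals <E^2> - <E>^2."""
    E = np.asarray(total_energies, dtype=float)
    return np.var(E) / (K_B * T**2)
```

Feed this estimator a trajectory whose energy fluctuations have been artificially narrowed, as the Berendsen thermostat does, and the result comes out systematically too small.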

This is a deep and sobering lesson. To correctly model nature, it is not enough to get one number right (the average temperature). You must get the entire statistical distribution right. The Berendsen thermostat, for all its utility in equilibration, does not generate a true canonical ensemble. It gives you a plausible-looking system that fails under closer scrutiny.

The Hamiltonian's Ghost: A Rigorous and Beautiful Solution

So, how does one create a thermostat that is both gentle and rigorously correct? The answer, when it came, was a stroke of genius. Instead of imposing temperature control from the "outside" with an ad-hoc rule, the Nosé-Hoover thermostat builds the heat bath right into the fundamental laws of motion.

It does this by augmenting the system with a new, fictitious degree of freedom—a "thermostat variable" with its own "mass" and "momentum." This variable couples to the real particles and acts as a dynamic energy reservoir. The whole setup—physical particles plus thermostat variable—is described by an extended Hamiltonian. The beauty of this formulation is that the total energy of this extended system is conserved, and the equations of motion that result from it can be proven to generate trajectories for the physical particles that sample the exact canonical (NVT) distribution. The physical energy is no longer conserved; it fluctuates correctly as it's exchanged with the thermostat variable.
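In the Hoover formulation (a standard way of writing the method, with $\xi$ the thermostat's friction-like variable, $Q$ its fictitious "mass", $f$ the number of degrees of freedom, and $T_0$ the target temperature), the coupled equations of motion read:

$$
\dot{\mathbf{r}}_i = \frac{\mathbf{p}_i}{m_i}, \qquad
\dot{\mathbf{p}}_i = \mathbf{F}_i - \xi\,\mathbf{p}_i, \qquad
\dot{\xi} = \frac{1}{Q}\left( \sum_i \frac{\mathbf{p}_i^2}{m_i} - f\,k_B T_0 \right)
$$

When the system runs hotter than $T_0$, $\xi$ grows and acts as friction; when it runs colder, $\xi$ turns negative and feeds energy back in.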

The Nosé-Hoover approach is the difference between a clever hack and a fundamental law. It doesn't just nudge the temperature; it generates the correct physics from first principles.

To appreciate its elegance, consider a "corrupt" thermostat that tries to do the opposite. Imagine an algorithm that, at every step, fixes the system's total energy $H$ to be exactly equal to the average energy you'd expect in the NVT ensemble. This algorithm would conserve energy precisely but would generate the microcanonical (NVE) ensemble, because it explicitly kills the energy fluctuations that define the canonical ensemble. The brilliance of Nosé-Hoover is that it allows the physical energy to fluctuate while still being part of a larger, deterministic, energy-conserving system.

The Limits of Genius: When Order Resists Chaos

Is the Nosé-Hoover thermostat the final word, the perfect algorithm? For most complex, messy, chaotic systems like liquids or proteins, it is extraordinarily effective. But science always pushes at the boundaries, and a fascinating failure mode was discovered in highly regular, orderly systems.

Consider a perfect harmonic crystal, which can be described as a set of independent, non-interacting harmonic oscillators. Or even just a single 1D harmonic oscillator. If you apply a standard Nosé-Hoover thermostat to such a system, something strange can happen. The dynamics can fail to be ergodic. Ergodicity is the crucial assumption that a system, over a long time, will explore all possible configurations it's allowed to access. It's the assumption that lets us substitute a time average from one long simulation for an average over all possible states (an ensemble average).

In the case of the harmonic oscillator, the deterministic Nosé-Hoover dynamics can be too regular. The trajectory gets trapped in a smooth, repetitive loop in its phase space, confined to a surface known as an invariant torus. It never visits other accessible states. As a result, even though the thermostat's equations are technically correct, the simulation fails to sample the full canonical distribution simply because it never gets there.

The solution to this problem is as clever as it is beautiful. If one thermostat variable isn't chaotic enough to properly "stir" the system and ensure ergodicity, what's the answer? Add more! This leads to the Nosé-Hoover chain (NHC). In this scheme, the first thermostat is coupled to the physical system. A second thermostat is coupled to the first one. A third is coupled to the second, and so on. This chain of coupled, nonlinear equations creates a robust source of deterministic chaos that is strong enough to break the unwanted regularity of the harmonic system. It ensures the trajectory can explore the entire phase space, restoring ergodicity and guaranteeing that our time averages converge to the true canonical ensemble averages.
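In one common formulation (written here for a chain of $M$ thermostat variables $\xi_1,\dots,\xi_M$ with masses $Q_1,\dots,Q_M$, following the Martyna-Klein-Tuckerman scheme), only the first thermostat touches the physical momenta, while each subsequent one thermostats the one before it:

$$
\dot{\mathbf{p}}_i = \mathbf{F}_i - \xi_1\,\mathbf{p}_i, \qquad
Q_1\,\dot{\xi}_1 = \sum_i \frac{\mathbf{p}_i^2}{m_i} - f\,k_B T_0 - Q_1\,\xi_1 \xi_2,
$$
$$
Q_k\,\dot{\xi}_k = Q_{k-1}\,\xi_{k-1}^2 - k_B T_0 - Q_k\,\xi_k \xi_{k+1} \quad (1 < k < M), \qquad
Q_M\,\dot{\xi}_M = Q_{M-1}\,\xi_{M-1}^2 - k_B T_0.
$$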

This journey—from a simple need, through a series of increasingly sophisticated and elegant solutions, to the discovery of subtle limitations and the invention of yet more clever fixes—is the story of science in miniature. It reveals that controlling a concept as seemingly simple as "temperature" in a simulation is a deep and fascinating challenge, a beautiful dance between physics, mathematics, and computation.

The Sculptor's Hand: Thermostats as Tools of Creation and Discovery

In our journey so far, we have unraveled the clever mechanisms that physicists and chemists have invented to control temperature in the universe of a computer simulation. We've seen them as mathematical gadgets, elegant sets of equations that inject and remove energy to keep our simulated atoms jiggling at just the right pace. But to see a thermostat as merely a heater or a refrigerator is to see a sculptor's chisel as just a sharp piece of metal. In the hands of a master, a simple tool can create worlds. The art and science of molecular simulation is not just about writing code; it's about knowing how to wield these tools to sculpt a digital reality that is a faithful, insightful, and beautiful reflection of the real world. This chapter is about that art. It's about the applications, the connections, and the subtle wisdom required to use thermostats not just to simulate, but to discover.

Beyond the Zero-Kelvin Stillness

Let's start with a simple, almost paradoxical observation. Imagine you have a chemical bond described perfectly by a force field. There's a certain length, let’s call it $r_0$, where the potential energy of the bond is at its absolute minimum. You might naturally assume that if you run a simulation at room temperature, the average length of this bond would be, well, $r_0$. It seems obvious, doesn't it? The bond will vibrate, sometimes shorter, sometimes longer, but on average it should settle at its most comfortable length.

But if you actually perform this experiment in a computer, you find that's not what happens. The average bond length $\langle r \rangle$ is almost always slightly longer than $r_0$. Why? This isn't a bug in the code or an error in the thermostat. It is a profound consequence of what "temperature" really means. The potential energy well for a bond is not a perfect, symmetric parabola. It's much steeper on the short side (it’s hard to squash two atoms together) and shallower on the long side (it's easier to stretch them apart). At absolute zero, the atoms would sit motionless at the bottom of this well, at $r_0$. But at any finite temperature, thanks to the thermostat, the bond has energy to explore the landscape around the minimum. Because the landscape is asymmetric, the bond spends a little more time in the gentler, stretched-out region than in the steep, compressed region. So, its time-averaged position is shifted outwards. The thermostat has allowed the system to sample a Boltzmann distribution of positions, and for an anharmonic potential, the average of the distribution is not the same as the minimum of the potential. This is our first clue: a thermostat doesn't just make things move; it reveals the true statistical nature of a world warmed by thermal energy.
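A small numerical experiment makes the shift visible. The sketch below uses a hypothetical Morse-like bond potential and a classical one-dimensional Boltzmann average; the parameters are purely illustrative (deliberately soft, to exaggerate the anharmonic shift), with no quantum or Jacobian corrections:

```python
import numpy as np

# Hypothetical Morse-like bond: V(r) = D * (1 - exp(-a*(r - r0)))^2
D, a, r0 = 40.0, 2.0, 1.0   # kJ/mol, 1/angstrom, angstrom (illustrative)
kT = 2.49                   # roughly k_B * 300 K in kJ/mol

r = np.linspace(0.6, 2.5, 20001)
V = D * (1.0 - np.exp(-a * (r - r0)))**2
w = np.exp(-V / kT)         # classical Boltzmann weight

r_avg = np.sum(r * w) / np.sum(w)   # uniform grid, so plain sums suffice
print(r_avg)                # comes out slightly above r0 = 1.0
```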

The Art of the Possible: Ensuring Your Simulated World is Real (Enough)

A thermostat is our portal to the canonical ensemble, the statistical reality of a system in contact with a heat bath. But just because we've turned on a thermostat, does that mean our simulation is instantly a perfect replica of that reality? Alas, no. A common pitfall is to run a simulation for a short time, see that the temperature and energy have stopped drifting, and declare the system "equilibrated."

This is a dangerous assumption. What we've likely found is a stationary state, but it might not be the true thermodynamic equilibrium. The universe of possible configurations for a complex molecule like a protein is vast and rugged, filled with deep valleys separated by high mountain passes. Our simulation might have simply rolled into the nearest valley and gotten stuck. It appears stable, but it's only exploring a tiny, unrepresentative fraction of the world. This is the great dragon of simulation science: the problem of ergodicity. We might be in a metastable state, a local minimum, not the true global one.

How do we slay this dragon? One of the most powerful weapons in our arsenal is a technique called Replica Exchange Molecular Dynamics (REMD). The idea is brilliant: instead of one simulation, we run many copies (replicas) of our system simultaneously, each at a different temperature. The high-temperature replicas have enough energy to fly over those high mountain passes with ease, exploring the entire landscape. The low-temperature replicas explore the local valleys in fine detail. Every so often, we propose a swap: the configuration of a high-temperature replica is given to a low-temperature thermostat, and vice versa. This allows the detailed, low-temperature exploration to "teleport" to new, previously unexplored valleys found by a high-flying replica.
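The swap decision itself is a one-line Metropolis criterion on the two replicas' potential energies and temperatures; a sketch (names and units illustrative):

```python
import numpy as np

K_B = 0.008314  # kJ/(mol K)

def remd_swap_accepted(E_i, E_j, T_i, T_j, rng=np.random.default_rng()):
    """Metropolis test for exchanging configurations between replicas i and j."""
    delta = (1.0 / (K_B * T_i) - 1.0 / (K_B * T_j)) * (E_i - E_j)
    return rng.random() < np.exp(min(0.0, delta))
```

The exponent comes from requiring that the swap preserve the product of the two replicas' canonical distributions.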

Here, the thermostat plays a dual role. It not only maintains the temperature of each replica but its very correctness is the lynchpin of the entire method. The acceptance rule for a swap is derived from the laws of statistical mechanics, and it presumes that each replica is a perfect canonical ensemble at its designated temperature. If one of our thermostats is faulty—for example, a simple Nosé-Hoover thermostat that can fail to be ergodic for certain systems—it fails to generate a proper canonical sample. This poisons the well. The intricate dance of replica exchanges breaks down, and the entire simulation, for all its computational cost, may yield a biased and incorrect result.

This idea of relying on the thermostat to create a "correct" statistical reality that we can then manipulate is at the heart of many advanced methods. In umbrella sampling, for instance, we want to map out the energy of a rare event, like two molecules binding. This is like trying to measure the height of a mountain pass when you're stuck in a valley. The solution is to add an artificial "umbrella" potential that pushes our system up the hillside. Now the system is evolving under a biased, "unreal" potential. But because our thermostat guarantees that we are sampling the canonical ensemble of this biased world, we can use the mathematics of statistical mechanics (like the Weighted Histogram Analysis Method, or WHAM) to precisely subtract the effect of our umbrella and reconstruct the true, unbiased energy landscape. It is like taking a photograph through a cleverly designed distorting lens, knowing that because you understand the lens's properties perfectly, you can digitally un-distort the photo to see the original, pristine image. The thermostat provides the guarantee that our "lens" is well-behaved.

A Tale of Two Timescales: Sculpting Dynamics vs. Statics

Here we come to one of the most subtle and important dualities in the world of simulation. Thermostats are designed to get the statics right—that is, to ensure that if we take a very long-time average, our system correctly reproduces the properties of the canonical ensemble. But to do this, they must interfere with the dynamics—the moment-to-moment trajectory of the particles. Sometimes this interference is an unwanted side effect; other times, it's a feature we can exploit with astonishing cleverness.

Consider the vital work of drug design. Scientists simulate a drug molecule (a ligand) trying to bind to its target protein, for example, a Cytochrome P450 enzyme involved in metabolizing drugs. For the drug to work, it must be able to get into the protein's active site. This often requires the protein itself to "breathe"—loops of the protein must transiently move out of the way to open a path. This breathing is a dynamical process. The choice of thermostat can have a dramatic impact here. A weakly coupled Nosé-Hoover thermostat might allow for realistic, long-lived fluctuations, while a strongly damped Langevin thermostat, which adds a lot of friction to the system, might slow these motions down, making it seem like the ligand can't get in. The choice is no longer a mere technicality; it directly influences the prediction of whether a drug will be effective.
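To see why the friction parameter matters, here is a minimal Euler-Maruyama sketch of a Langevin velocity update (a deliberately simplified scheme; production codes typically use more careful integrators such as BAOAB):

```python
import numpy as np

def langevin_step(v, force, mass, gamma, kT, dt, rng=np.random.default_rng()):
    """One Euler-Maruyama Langevin step: deterministic force, friction, random kicks.

    gamma is the friction (collision) rate; large gamma strongly damps the
    system's natural motions, which can artificially slow conformational changes.
    """
    noise = rng.standard_normal(np.shape(v))
    return (v + (force / mass - gamma * v) * dt
              + np.sqrt(2.0 * gamma * kT / mass * dt) * noise)
```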

The influence can be even more subtle. In a Quantum Mechanics/Molecular Mechanics (QM/MM) simulation, we treat a small, important region (like a chromophore that absorbs light) with the full rigor of quantum mechanics, while the surrounding environment (like water) is treated classically. We apply a thermostat only to the classical water molecules. You might think the quantum region is safe from the thermostat's influence. Not so! The thermostat alters the dance of the water molecules. This changes the time-varying electric field that the water molecules exert on the quantum region. While the average properties of the quantum system might be correct, its dynamical response—how it absorbs and emits light over time, for instance—is now indirectly coupled to the artificial dynamics imposed by the thermostat on its environment.

But here is where the art of the computational sculptor truly shines. Sometimes, we can use this "unphysical" aspect of thermostats to our advantage. To make simulations more realistic, so-called polarizable force fields have been developed. In one popular model, the Drude oscillator, electronic polarization is mimicked by attaching a small, charged "Drude particle" to each atom with a spring. This adds fast, difficult-to-manage vibrations to the system. A brilliant solution was found: use two thermostats. One keeps the real atoms at the desired physical temperature (say, 300 K). Another, separate thermostat keeps the fictitious Drude particles at an extremely low temperature (say, 1 K). This "cold Drude" setup is profoundly unphysical—it creates a constant, artificial flow of heat from the hot atoms to the cold Drudes. But it works wonders! By keeping the Drudes cold, we force them to stick very close to their optimal positions, effectively taming their wild vibrations and making the simulation vastly more stable. We have sacrificed the realistic dynamics of the fictitious particles to better approximate a more fundamental physical principle—the Born-Oppenheimer separation of nuclear and electronic motion. Static properties remain correct, and the simulation is saved from numerical catastrophe. This is a beautiful act of scientific jujitsu, turning a "problem" with thermostats into a powerful solution.

The Measured Touch: Applications in Engineering and Physics

The thoughtful application of thermostatting extends deep into the worlds of engineering and theoretical physics. Imagine you are a materials scientist studying why materials break. A crucial factor is stress concentration: at the tip of a tiny crack or notch, the stress can be many times higher than in the bulk material. How can we measure this local stress in a simulation?

The naive approach of thermostatting the entire block of metal while pulling on it is wrong. A thermostat adds and removes momentum, scrambling the very quantity—the flux of momentum—that defines the stress tensor. The elegant solution is to perform computational surgery. We divide our system into regions. The atoms far away from the notch are coupled to a thermostat, turning them into a realistic heat sink that absorbs the heat generated by deformation. But the "gauge region" right around the notch is left to evolve under pure, unadulterated Newtonian dynamics (an NVE ensemble). It is in this pristine, untampered region that we measure the stress. This hybrid scheme allows the best of both worlds: stable temperature control for the bulk system, and pure, physical dynamics where it matters most for the measurement.

The subtlety reaches its zenith when we try to compute transport properties like viscosity or thermal conductivity. The famous Green-Kubo relations in physics state that these properties, which describe a system's response to a perturbation, can be calculated from the time-integral of an equilibrium flux autocorrelation function. For example, shear viscosity is related to the integral of the stress-tensor autocorrelation function. This presents a paradox. The property we want, viscosity, is a dynamical one, depending on the system's natural time evolution. But to compute the equilibrium average required by the Green-Kubo formula, we need a long simulation at a stable temperature, which seems to demand a thermostat that will meddle with the dynamics!

The resolution is a testament to the power of linear response theory. It turns out that if we are careful, we can have our cake and eat it too. As long as our thermostat is (1) weakly coupled, meaning its characteristic timescale is much longer than the decay time of the flux correlations, and (2) it respects the fundamental conservation laws of the system (for instance, to calculate viscosity, the thermostat must not break total momentum conservation), then the "damage" it does to the dynamics is minimal and primarily at high frequencies. The zero-frequency component of the flux spectrum—which is equivalent to the time integral of the correlation function—remains miraculously intact. The thermostat's meddling affects the short-time wiggles of the system, but the long-time collective behavior that determines transport is preserved. This allows us to use a thermostatted simulation to calculate a property of the underlying, unperturbed Hamiltonian dynamics.
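As a sketch of how this is done in practice (assuming P_xy is a stored time series of one off-diagonal pressure-tensor component from a weakly thermostatted equilibrium run, with all quantities in consistent units):

```python
import numpy as np

def green_kubo_viscosity(P_xy, volume, kT, dt, max_lag=None):
    """Shear viscosity: eta = V/(k_B T) * time-integral of <P_xy(0) P_xy(t)>."""
    P = np.asarray(P_xy, dtype=float)
    n = len(P)
    max_lag = max_lag or n // 2
    # brute-force flux autocorrelation function up to max_lag
    acf = np.array([np.mean(P[: n - k] * P[k:]) for k in range(max_lag)])
    return volume / kT * np.sum(acf) * dt   # rectangle-rule time integral
```

The long-time tail of that autocorrelation, which dominates the integral, is exactly the part a well-chosen, weakly coupled thermostat leaves untouched.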

Knowing When to Put the Chisel Down: The Limits of Temperature

A master sculptor knows not only how to use their chisel but also when to put it down. The same is true for the computational scientist and their thermostat. A thermostat is a tool for simulating a system in contact with a heat bath—an open system. What if our question is about an isolated system?

Consider the theory of unimolecular chemical reactions in the gas phase. The foundational RRKM theory calculates the rate constant $k(E)$ for a reaction at a specific, fixed total energy $E$. This describes an isolated molecule with no environment to exchange energy with. If we want to simulate this process, the physically correct ensemble is the microcanonical (NVE) ensemble, where energy is strictly conserved.

It is tempting to think one could run a canonical (NVT) simulation with a thermostat and then simply "reweight" the results to get the microcanonical rate. This is fundamentally wrong. A thermostat doesn't just manage the system's energy; it completely changes its equations of motion. The rate of a reaction depends on the dynamical flux of trajectories crossing from reactant to product. Because a thermostat adds friction and random forces, it alters these very trajectories. It changes how the molecule explores its own potential energy surface. The dynamics generated by a thermostat are not the dynamics of an isolated molecule. Therefore, trying to calculate $k(E)$ with a thermostat is like trying to study the behavior of a single, isolated star by observing it while it's inside a giant oven. The context is wrong, and the results will be meaningless.

Here, the proper action is to put the thermostat away and run a true, energy-conserving NVE simulation. It reminds us that for all their power and sophistication, thermostats are tools designed for a specific physical picture. The first and most important step in any simulation is to ask: what is the physical reality I am trying to capture? Choosing the right tool—or choosing to use no tool at all—is the beginning of wisdom.