
Mott Variable-Range Hopping

Key Takeaways
  • Mott's variable-range hopping describes how electrons in disordered solids conduct electricity via phonon-assisted quantum tunneling between distant but energetically favorable localized states.
  • The theory predicts a unique temperature-dependent conductivity, $\sigma \propto \exp[-(T_0/T)^{1/(d+1)}]$, whose form provides a clear experimental signature that depends on the system's dimensionality $d$.
  • Analysis of hopping conduction is a powerful tool to probe microscopic properties, like electron localization length and density of states, from macroscopic resistance measurements.
  • In systems with strong electron-electron repulsion, conduction transitions to Efros-Shklovskii hopping, which has a universal temperature dependence of $\sigma \propto \exp[-(T_1/T)^{1/2}]$.

Introduction

In the orderly world of a perfect crystal, electrons glide effortlessly through a repeating lattice, creating the familiar flow of electrical current. But what happens when this order breaks down? In materials like glass, amorphous semiconductors, or conducting polymers, the atomic structure is chaotic, trapping electrons in localized energy 'islands.' At low temperatures, these materials should be perfect insulators. Yet, a current can still flow. This article addresses the fundamental question of how electricity navigates such a disordered landscape. We will explore the elegant theory of variable-range hopping (VRH), first proposed by Sir Nevill Mott, which provides the answer. The following chapters will first illuminate the quantum mechanical principles and trade-offs that govern this hopping process. We will then discover the theory's far-reaching applications and interdisciplinary connections, from designing next-generation electronics to probing the fundamental properties of the disordered state.

Principles and Mechanisms

Imagine you are standing in a vast, rugged, and dark landscape. You want to get from where you are to a distant point, but the ground is not flat. It is a chaotic terrain of deep valleys and isolated peaks. This is the world of an electron in a disordered solid—a material like a pane of glass or a doped semiconductor, where the perfect, repeating lattice of a crystal is lost. In such a material, electrons are not free to roam. Instead, they find themselves trapped, or localized, on atomic-scale "islands" of stability. How, then, can electricity flow?

Life on the Islands: The World of Localized States

In a perfectly ordered crystal, electron waves can propagate freely, forming what we call extended states. These are like interstate highways for charge. But heavy disorder shatters these highways. As the physicist P.W. Anderson first showed, if the disorder is strong enough, electron wavefunctions can become confined to small regions. These are localized states. An electron in such a state is stuck. At the absolute zero of temperature, it has no energy to move, and the material is a perfect insulator, even if there are plenty of available states at the electron's energy (the Fermi level).

But what if we turn up the heat, even just a little? The atoms in the material are constantly vibrating, creating little packets of energy called phonons. These phonons can give an electron just the little "kick" it needs to do something remarkable: to disappear from its current island and reappear on another. This is quantum tunneling, and when it's assisted by a phonon, we call it phonon-assisted hopping. This is the only way to travel in the rugged landscape of a disordered insulator at low temperatures.

The Hopper's Dilemma: A Tale of Two Penalties

Now, an electron sitting on its island at the Fermi level looks around for a place to hop. It faces a fundamental dilemma, a trade-off between two competing factors. Think of it like trying to visit a friend at a huge, noisy, multi-story party.

First, there is a spatial penalty. The probability of quantum tunneling drops off exponentially with distance. Hopping to a nearby island is easy, just like talking to someone standing next to you at the party. Hopping to a distant island is incredibly unlikely, like shouting across the entire building. The hopping probability contains a factor of roughly $\exp(-2R/\xi)$, where $R$ is the hopping distance and $\xi$ is the localization length, a measure of the size of the electron's island.

Second, there is an energy penalty. Hopping usually requires a change in energy, $\Delta E$, because the destination island will almost certainly have a slightly different energy level. To make this hop, the electron needs to absorb a phonon of exactly that energy. The availability of such phonons is governed by the temperature $T$, and the probability has a Boltzmann factor of $\exp(-\Delta E / k_B T)$, where $k_B$ is the Boltzmann constant. Hopping to an island with a similar energy (a small $\Delta E$) is easy, as it only requires a low-energy, common phonon. This is like finding a friend on the same floor of the party. Hopping to a state with a much different energy requires a rare, high-energy phonon, making it difficult—like having to climb several flights of stairs.

The electron is therefore caught. A nearby site is probably at a very different energy. A site at a similar energy is probably very far away. What is the optimal strategy?

Mott's Solution: The Optimal Journey

The genius of Sir Nevill Mott was to realize that the electron does not simply hop to its nearest spatial neighbor. Instead, it effectively "surveys" its surroundings and chooses the hop that is, on average, the easiest—the one that maximizes the total probability by finding the best compromise in the trade-off. Because the electron considers hopping to sites at various distances, this mechanism is called variable-range hopping (VRH).

The electron wants to minimize the sum of the penalties in the overall probability exponent, $S = 2R/\xi + \Delta E/(k_B T)$. The key insight is that the typical energy difference, $\Delta E$, is itself related to the hopping distance $R$. If you are willing to look in a larger radius $R$, you are searching a larger volume and are more likely to find a state that is very close in energy.

Let's see how this works. Suppose the density of states (the number of available islands per unit volume per unit energy) is a constant, $g(E_F)$, near the Fermi level. If we consider a search region of volume proportional to $R^d$ in a system of dimension $d$, the number of states available in an energy window $\Delta E$ is proportional to $g(E_F) R^d \Delta E$. The typical energy spacing between states at a distance $R$ is the energy $\Delta E$ for which this number is about one. This gives us a crucial link: $\Delta E \approx 1/(g(E_F) R^d)$.

Now we can write the penalty exponent purely in terms of the distance $R$:

$$S(R) \approx \frac{2R}{\xi} + \frac{1}{g(E_F)\, R^d\, k_B T}$$

The first term wants $R$ to be small. The second term, the thermal penalty, wants $R$ to be large. There must be an optimal distance, $R_{opt}$, that minimizes this sum. A little calculus shows that this optimal hopping distance depends on temperature: as the temperature $T$ drops, the second term becomes more punishing, and it becomes worthwhile for the electron to tunnel over a longer distance to find a state with a smaller energy gap. The percolation network of these optimal hops also has a characteristic length scale that grows as the temperature is lowered.
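We can check this optimization numerically. The sketch below works in dimensionless toy units (all parameter values illustrative): it minimizes the penalty exponent in closed form and confirms that cooling stretches the optimal hop.

```python
def penalty(R, xi, g, kT, d=3):
    """Hopping penalty exponent S(R) = 2R/xi + 1/(g * R**d * kB*T)."""
    return 2.0 * R / xi + 1.0 / (g * R ** d * kT)

def optimal_hop(xi, g, kT, d=3):
    """Closed-form minimizer of S(R): R_opt = (d*xi / (2*g*kB*T))**(1/(d+1))."""
    return (d * xi / (2.0 * g * kT)) ** (1.0 / (d + 1))

# Toy units with xi = g = 1. Halving the temperature should stretch the
# optimal hop by a factor 2**(1/(d+1)) = 2**(1/4) in 3D.
r_warm = optimal_hop(xi=1.0, g=1.0, kT=1.0)
r_cold = optimal_hop(xi=1.0, g=1.0, kT=0.5)
print(r_cold / r_warm)  # -> 2**0.25 ≈ 1.189
```

Plugging $R_{opt}$ back into the exponent is exactly the step that produces the universal temperature law discussed next.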

A Universal Law Emerges: Dimensionality and the Signature of Hopping

When we plug this optimal distance $R_{opt}$ back into the exponent $S$, we find a beautiful and surprising result. The conductivity, $\sigma$, which is proportional to the hopping probability, follows a universal law:

$$\sigma(T) \propto \exp\left[ - \left(\frac{T_0}{T}\right)^p \right] \quad \text{with} \quad p = \frac{1}{d+1}$$

The exponent $p$ depends directly on the dimensionality $d$ of space!

  • In a 3D material, conduction follows the famous Mott $T^{-1/4}$ law.
  • In a thin film or at a 2D interface, it follows a $T^{-1/3}$ law.
  • In a long, thin wire, it follows a $T^{-1/2}$ law.

This provides an elegant experimental "fingerprint". To see if a material obeys the 3D Mott law, for instance, an experimentalist doesn't plot conductivity versus temperature. Instead, they plot the natural logarithm of the resistivity (which is $1/\sigma$) against $T^{-1/4}$. If they see a straight line, they have a strong indication that variable-range hopping is at play.

The parameter $T_0$ is called the characteristic temperature. It's not a physical temperature the sample ever reaches. Rather, it is a measure of the "difficulty" of hopping, set by the material's properties: $k_B T_0$ is proportional to $1/(g(E_F)\,\xi^d)$. A large $T_0$ means the states are highly localized (small $\xi$) or sparse (small $g(E_F)$), making hopping very difficult. Indeed, practical calculations show that $T_0$ can be enormous, often hundreds of thousands or even millions of Kelvin, underscoring just how insulating these materials can be.
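A back-of-the-envelope estimate makes this concrete. The values below for the density of states, the localization length, and the order-unity prefactor $\beta$ are illustrative assumptions, not measurements of any particular material:

```python
kB = 1.380649e-23          # Boltzmann constant, J/K
eV = 1.602176634e-19       # joules per electron-volt
beta = 18.0                # order-unity numerical prefactor (assumed)

g_EF = 1e20 / (eV * 1e-6)  # assumed g(E_F): 1e20 states/(eV cm^3), in SI units
xi = 1e-9                  # assumed localization length: 1 nm

T0 = beta / (kB * g_EF * xi ** 3)   # from kB*T0 = beta / (g(E_F) * xi^3) in 3D
print(f"T0 ≈ {T0:.1e} K")           # about 2e6 K: millions of kelvin
```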

The Bigger Picture: Hopping in Context

Variable-range hopping doesn't happen in a vacuum; it's part of a larger story of electronic transport that changes dramatically with temperature.

At very high temperatures, electrons have so much thermal energy that they don't need to hunt for an optimal long-distance hop. They have enough energy to be thermally excited into the extended states above the mobility edge, $E_c$, where they can travel freely. This leads to a simple, activated behavior known as Arrhenius conduction, where $\sigma \propto \exp(-E_a/k_B T)$.

At intermediate temperatures, it might be too cold to reach the extended states, but warm enough that hopping to the nearest neighboring site is the most efficient process. This nearest-neighbor hopping (NNH) also follows an Arrhenius-like law, but with a smaller activation energy related to the average energy difference between adjacent sites.

Only at very low temperatures, where thermal energy is scarce, does the system become picky. The long-range, energy-optimizing strategy of variable-range hopping finally takes over and becomes the dominant way electricity flows. The dimensionality of this flow can even be a matter of scale. On the surface of a thin cylinder, for example, hopping at higher temperatures might be 2D because the hops are short. As the temperature drops, the optimal hop length grows, and when it becomes comparable to the cylinder's circumference, the transport effectively becomes 1D!
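We can estimate where the handoff from activated conduction to Mott VRH happens by comparing the two penalty exponents; whichever is smaller wins. The parameter values below ($E_a$ and $T_0$) are assumptions chosen only for illustration:

```python
kB = 8.617333262e-5  # Boltzmann constant, eV/K

def arrhenius_exponent(T, Ea):
    """Penalty exponent for activated conduction over the mobility edge; Ea in eV."""
    return Ea / (kB * T)

def mott_exponent(T, T0, d=3):
    """Penalty exponent for Mott VRH in d dimensions."""
    return (T0 / T) ** (1.0 / (d + 1))

# Assumed values: Ea = 0.1 eV, T0 = 1e6 K. In 3D the exponents cross at
# T_c = (Ea / (kB * T0**0.25))**(4/3); below T_c, VRH has the smaller
# penalty and carries the current.
Ea, T0 = 0.1, 1.0e6
T_c = (Ea / (kB * T0 ** 0.25)) ** (4.0 / 3.0)
print(f"crossover near {T_c:.0f} K")
```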

When Electrons Talk Back: The Coulomb Gap and a New Kind of Hopping

Our story so far has a hidden assumption: that the electrons ignore each other. This is never strictly true. Electrons are charged particles, and they repel each other via the Coulomb force. This interaction adds a fantastic new wrinkle to the tale of hopping.

Consider an electron hopping away from a site. It leaves behind a positive charge (a "hole"). The electron-hole pair has a Coulomb interaction energy. This means that to create a low-energy excitation, the electron can't just find a state with low single-particle energy; it also needs to move far away from the hole to minimize the Coulomb penalty. This long-range interaction has a profound effect: it suppresses the density of states near the Fermi level, creating a soft gap called the Efros-Shklovskii (ES) Coulomb gap.

With this new feature in the landscape, the rules for optimal hopping change. The optimization game, when re-played, yields a different hopping law:

$$\sigma(T) \propto \exp\left[ - \left(\frac{T_1}{T}\right)^{1/2} \right]$$

Amazingly, the exponent is now $1/2$, regardless of the system's dimension ($d \ge 2$). The new characteristic temperature, $T_1$, is set by the strength of the Coulomb interaction.

This means that at sufficiently low temperatures, the Mott VRH we've been discussing should give way to ES VRH. We can even control this crossover. By placing a metal plate (a gate) near the material, we can screen the Coulomb interaction. Hops over distances larger than the gate distance no longer feel the full Coulomb repulsion. Consequently, at very low temperatures where the optimal hop distance would exceed the gate distance, the ES behavior is switched off, and Mott's law is restored!
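The Mott-to-ES crossover can be located the same way, by asking where the two penalty exponents match. For 3D Mott hopping this gives $T_\times = T_1^2/T_0$; below $T_\times$ the ES exponent is the larger penalty and dominates the resistivity. A sketch with assumed (illustrative) characteristic temperatures:

```python
def mott_exponent(T, T0, d=3):
    """Mott VRH penalty exponent, (T0/T)**(1/(d+1))."""
    return (T0 / T) ** (1.0 / (d + 1))

def es_exponent(T, T1):
    """Efros-Shklovskii penalty exponent, (T1/T)**(1/2)."""
    return (T1 / T) ** 0.5

# Assumed characteristic temperatures, illustrative only.
T0, T1 = 1.0e6, 1.0e3
T_cross = T1 ** 2 / T0   # exponents match here (3D Mott vs ES)
print(T_cross)           # -> 1.0 K for these numbers
```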

From a simple puzzle of conduction in a disordered world, we have uncovered a rich tapestry of physics—a story of quantum leaps, strategic compromises, and the profound influence of dimensionality and interaction, all written in the simple language of how resistance changes with temperature.

Applications and Interdisciplinary Connections

Having journeyed through the theoretical landscape of Mott's variable-range hopping, we now arrive at a crucial question: So what? What good is this elegant piece of physics? Like any profound scientific principle, its true beauty is revealed not just in its internal logic, but in its power to illuminate the world around us. Mott's law is not an abstract curiosity; it is a workhorse tool for the physicist and a design guide for the engineer. It is the key that unlocks the electronic secrets hidden within the fascinating world of disordered materials.

A Window into the Disordered World

How do we even know if a material is playing by Mott's rules? The first and most direct application of the theory is as an analytical tool. Imagine you have a strange new material—an amorphous semiconductor, a sheet of conducting polymer, or a composite of nanoparticles—and you want to understand how electrons move within it. You cool it down and measure its electrical conductivity, $\sigma$, at different temperatures, $T$. The raw data might look like a messy curve. But then, you take a leap of faith guided by Mott's theory. You re-plot your data not as $\sigma$ versus $T$, but as the natural logarithm of conductivity, $\ln\sigma$, versus $T^{-1/(d+1)}$ (where $d$ is the dimensionality, often 3).

If the electrons in your material are indeed engaged in the hopping dance we've described, a miracle occurs: the messy curve transforms into a beautiful straight line. This is the tell-tale signature of variable-range hopping. But it's more than just a pretty graph. That straight line is a message from the microscopic world. Its slope is directly related to $-T_0^{1/(d+1)}$, where $T_0$ is the characteristic Mott temperature. By measuring this slope, we have captured a single, powerful number that encapsulates the essence of the disorder in our material.

This is where the real magic begins. The theory tells us that $T_0$ is not just an arbitrary parameter; it is built from fundamental properties of the material. For instance, in a three-dimensional system, $T_0$ is given by $T_0 = \beta / (k_B N(E_F)\, \xi^3)$, where $N(E_F)$ is the density of available states for hopping and $\xi$ is the localization length—the characteristic size of the quantum "cloud" of a trapped electron. If we have a reasonable estimate for the density of states, we can use our experimentally determined $T_0$ to calculate the localization length, $\xi$. Suddenly, from a macroscopic measurement with a voltmeter and a thermometer, we have deduced a quantum-mechanical length scale on the order of nanometers! This powerful technique allows scientists to probe the degree of electron confinement in a vast array of materials, from glassy semiconductors to organic electronics.
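The whole pipeline, from resistance data to localization length, fits in a few lines. The sketch below generates perfectly Mott-like 3D data, fits the slope of $\ln\rho$ versus $T^{-1/4}$, and inverts for $\xi$; the density of states and the prefactor $\beta \approx 18$ are assumed inputs for illustration, not universal values:

```python
import math

kB = 1.380649e-23     # J/K
eV = 1.602176634e-19  # J
beta = 18.0           # assumed numerical prefactor

# Synthetic "measurement": resistivity obeying the 3D Mott law exactly.
T0_true, rho0 = 2.0e6, 1e-3
temps = [2.0, 4.0, 8.0, 16.0, 32.0]  # kelvin
lnrho = [math.log(rho0) + (T0_true / T) ** 0.25 for T in temps]

# Fit ln(rho) against T**(-1/4); the slope equals T0**(1/4).
xs = [T ** -0.25 for T in temps]
n = len(xs)
mx, my = sum(xs) / n, sum(lnrho) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, lnrho))
         / sum((x - mx) ** 2 for x in xs))
T0_fit = slope ** 4

# Invert T0 = beta / (kB * N(EF) * xi**3) for the localization length.
N_EF = 1e20 / (eV * 1e-6)  # assumed density of states, states/(J m^3)
xi = (beta / (kB * T0_fit * N_EF)) ** (1.0 / 3.0)
print(f"T0 ≈ {T0_fit:.2e} K, xi ≈ {xi * 1e9:.2f} nm")
```

With real, noisy data the same least-squares fit applies; only the recovered slope (and hence $T_0$) carries an error bar.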

Engineering Disorder for Modern Technology

Once we can measure and understand disorder, the next logical step is to control it. This is where physics becomes engineering, and Mott's law transforms from an analytical tool into a design principle for cutting-edge technologies.

A stunning example of this is in Phase-Change Memory (PRAM), a revolutionary type of non-volatile memory that could one day replace the flash memory in your phone or computer. These devices use materials like the alloy Ge₂Sb₂Te₅, which can be rapidly switched between a highly ordered crystalline state (a good conductor, the "1" bit) and a disordered amorphous state (a poor conductor, the "0" bit).

In the amorphous "0" state, conduction is dominated by Mott hopping. The degree of disorder—and thus the resistivity of the "0" state—can be precisely tuned by the protocol used to create it, for example, by how quickly the material is cooled, or "quenched," from its molten form. A faster quench might create a more disordered state, with a higher $T_0$ value and more strongly localized electrons. This, in turn, results in a much lower conductivity. By understanding this relationship through the lens of Mott's law, engineers can fine-tune the material's properties to maximize the resistance contrast between the "0" and "1" states, leading to more reliable and efficient memory devices.

The extreme sensitivity of hopping conduction to external factors is a feature that can be exploited in many types of sensors.

Consider the effect of temperature. The exponential dependence of resistivity on $(T_0/T)^{1/(d+1)}$ means that even a small change in temperature can cause a very large change in resistance. We can quantify this sensitivity using the Temperature Coefficient of Resistance (TCR), $\alpha = (1/\rho)(d\rho/dT)$. A quick calculation based on Mott's formula reveals that $\alpha$ is large, negative, and strongly dependent on temperature itself. This high sensitivity makes materials in the hopping regime excellent candidates for highly sensitive thermometers, or bolometers, which can detect minute temperature changes, such as those caused by absorbing a single photon of infrared radiation.
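That "quick calculation" is short enough to write down: for $\rho \propto \exp[(T_0/T)^p]$, differentiation gives $\alpha = -(p/T)(T_0/T)^p$. The sketch below checks the formula against a numerical derivative, using an assumed $T_0$ for illustration:

```python
import math

def rho(T, T0, p=0.25, rho0=1.0):
    """Mott resistivity: rho0 * exp((T0/T)**p)."""
    return rho0 * math.exp((T0 / T) ** p)

def tcr(T, T0, p=0.25):
    """Analytic TCR for the Mott law: alpha = -(p/T) * (T0/T)**p."""
    return -(p / T) * (T0 / T) ** p

# Assumed T0 = 1e6 K. At liquid-helium temperature the TCR is large
# and negative: a tiny warming produces a big drop in resistance.
T, T0 = 4.2, 1.0e6
print(f"alpha ≈ {tcr(T, T0):.2f} per kelvin")
```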

Now, what if we stretch the material? Consider a disordered semiconducting polymer, which you can think of as a tangled mess of molecular chains. If we apply a small uniaxial strain, we slightly change the average distance between the localized states where electrons are trapped. Since the hopping probability is exponentially sensitive to this distance, the overall resistance of the polymer will change. Mott's framework allows us to make this quantitative, predicting the piezoresistive coefficient—the fractional change in resistance for a given strain. This effect is the basis for creating flexible and sensitive strain gauges from "plastic" electronics.

Broader Horizons and Deeper Connections

The influence of Mott's hopping principle extends far beyond simple electrical resistance. It provides a unified framework for understanding various transport phenomena in the presence of disorder, often revealing surprising deviations from the behavior seen in conventional metals and semiconductors.

For example, if you heat one end of a metal bar and cool the other, a voltage develops across it—the Seebeck effect. This is the basis of thermoelectric generators. Does this also happen in a hopping system? Yes, but the physics is more subtle. In a perfectly symmetric system, hopping is equally likely in all directions, so no net voltage appears. However, if the density of available electronic states has a slight energy dependence—if it's not perfectly flat around the Fermi level—an asymmetry is introduced. An extended version of Mott's theory can predict the resulting Seebeck coefficient, which, unsurprisingly, shows a unique temperature dependence that serves as another distinct signature of hopping transport.

Another classic principle of metal physics is the Wiedemann-Franz law, which states that the ratio of thermal conductivity to electrical conductivity, divided by the temperature, is a universal constant (the Lorenz number) for all metals. This reflects the fact that the same mobile electrons are responsible for carrying both charge and heat. But in a hopping system, this law spectacularly fails. An electron hopping from one site to another carries its energy with it, contributing to thermal conduction. However, the convoluted and constrained nature of this process completely decouples the flow of heat from the flow of charge. The Lorenz number, the supposedly "constant" ratio in the Wiedemann-Franz law, becomes strongly temperature-dependent, signaling in the clearest possible way that we have left the familiar world of metallic conduction and entered a new physical regime.

We can even ask how this hopping dance is affected by a magnetic field. A magnetic field can interact with the intrinsic spin of the electrons. This Zeeman effect can shift the energy levels of spin-up and spin-down electrons. If the density of states has some curvature, this energy shift can alter the number of available final states for hopping, leading to a change in the overall resistance. This effect, known as magnetoresistance, provides yet another handle for physicists to probe the subtle features of the electronic landscape within a disordered material.

The Percolation Maze

So far, we've painted a picture of an electron intelligently choosing its best hop. But the reality is more chaotic. The material isn't a simple chain of sites; it's a vast three-dimensional maze of possible hops, a network of random resistors. How does a current find its way through this mess?

The answer lies in percolation theory. Imagine flooding the network with water. At first, you only fill small, isolated puddles. But as you raise the water level, at a specific critical point, a single, continuous, wet path suddenly forms, connecting one end of the network to the other. This is the percolation threshold. Electrical conduction in a hopping system is analogous. The overall conductivity of the entire sample is dominated not by the average hop, but by the "easiest" interconnected path that just manages to span the material.

This perspective gives us even deeper insights. For instance, the low-frequency electrical noise—the tiny, random fluctuations in the current—is not just random static. The percolation model tells us that this noise is dominated by the fluctuations of a few "critical" resistors on this tenuous conducting backbone. This powerful idea predicts a universal scaling relationship between the magnitude of the noise and the overall conductivity of the sample. By listening to the electrical "crackle" of a hopping system, we are, in a sense, learning about the fractal geometry of the percolation cluster that carries the current!

This percolation viewpoint also helps us understand what happens in finite systems. As we lower the temperature, the optimal hopping distance grows. What happens if our sample is a tiny sphere, and the optimal hop distance becomes comparable to the sphere's radius? The electron can't hop any further! At this point, the finite size of the system begins to dominate, and the transport behavior crosses over into a new regime. Mott's theory allows us to predict the crossover temperature at which this transition occurs, beautifully linking the microscopic hopping length to the macroscopic dimensions of the system.
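Setting the optimal hop length equal to the sample size and solving for the temperature gives this crossover directly. A sketch in dimensionless toy units (all values illustrative):

```python
def optimal_hop(xi, g, kT, d=3):
    """Optimal Mott hop length, from minimizing 2R/xi + 1/(g*R**d*kB*T)."""
    return (d * xi / (2.0 * g * kT)) ** (1.0 / (d + 1))

def crossover_kT(L, xi, g, d=3):
    """Thermal energy at which R_opt reaches the sample size L."""
    return d * xi / (2.0 * g * L ** (d + 1))

# Toy units (xi = g = 1): a sample only 10 localization lengths across.
L = 10.0
kT_x = crossover_kT(L, xi=1.0, g=1.0)
print(kT_x)  # below this thermal energy, R_opt would exceed the sample
```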

Finally, it is essential to recognize the limits of our simplest model. We have largely ignored the Coulomb repulsion between electrons. But in some systems, like strongly compensated semiconductors, these interactions are paramount. Here, a large number of ionized donor and acceptor atoms create strong potential fluctuations that enhance localization, making an otherwise metallic material insulating. Even more profoundly, the repulsion between electrons can tear open a "Coulomb gap" in the density of states right at the Fermi level. This modification changes the hopping rules, leading to a different temperature dependence known as Efros-Shklovskii VRH. This serves as a beautiful reminder that our models are stepping stones, and understanding their limitations—like the role of compensation—points the way toward a richer and more complete picture of nature.

From explaining lab data to designing next-generation memory, and from predicting thermal transport to revealing connections with the abstract mathematics of percolation, the simple law derived by Sir Nevill Mott continues to be a source of profound insight. It demonstrates the remarkable power of a simple physical idea to unify a vast range of phenomena, showcasing the inherent beauty and utility of physics in action.