
Keldysh Formalism

Key Takeaways
  • The Keldysh formalism uses a closed-time path to define Green's functions that track both the available states and their occupation in non-equilibrium systems.
  • It provides a rigorous derivation of the fluctuation-dissipation theorem, connecting a system's fluctuations to its dissipative properties in equilibrium.
  • It is a cornerstone for calculating electronic transport properties, such as current, Fano resonances, and shot noise, in nanoelectronic devices.
  • The framework is broadly applicable, unifying the description of charge, heat, and spin transport, and providing a microscopic basis for open quantum system theories.

Introduction

In the quantum realm, many systems exist in a state of placid thermal equilibrium, where their properties are stable and well-understood. However, the most interesting and technologically relevant phenomena—from a transistor switching on to a chemical reaction unfolding—occur when systems are driven far from this equilibrium. Standard theoretical tools, designed for static conditions, often fail to capture this dynamic evolution. This creates a significant knowledge gap in our ability to describe and predict the behavior of quantum systems in action.

The Keldysh formalism emerges as a powerful and elegant solution to this very problem. It is a theoretical framework designed specifically for non-equilibrium quantum statistical mechanics, providing a language to describe how systems evolve in time. This article serves as a comprehensive introduction to this cornerstone of modern condensed matter physics. In the first part, "Principles and Mechanisms," we will journey through the conceptual foundations of the formalism, exploring the ingenious closed-time contour, the different types of Green's functions that capture a system's full history, and the powerful matrix algebra that makes calculations tractable. Following this, the "Applications and Interdisciplinary Connections" section will demonstrate the framework's immense practical utility, showcasing how it is used to analyze everything from electronic transport in nanodevices to the subtle dynamics of quantum information, revealing a unified picture of the quantum world in motion.

Principles and Mechanisms

A Journey Through Time, and Back Again

Imagine trying to understand a complex story—not just what happens at the end, but the intricate web of cause and effect, the relationships between events unfolding at different moments. A simple chronological timeline wouldn't be enough. You might want to jump back and forth, comparing a scene from the beginning with one from the end to see how a character has changed.

This is the challenge we face in non-equilibrium physics. The theoretical tools developed for systems in tranquil, unchanging thermal equilibrium, like the elegant imaginary-time Matsubara formalism, are like a single, ordered timeline. They are powerful for describing static properties, but they fall short when we want to watch a system in action—a transistor switching on, a molecule reacting, or a current developing after we flip a switch. These are transient phenomena, where the system is actively evolving from one state to another, and its properties depend on the absolute time that has passed since the process began. To capture this drama, we need a more powerful storytelling device.

Enter the Keldysh formalism, a brilliant conceptual leap developed independently by Julian Schwinger, Leonid Keldysh, and others. The central idea is as simple as it is profound: to keep track of a system's full history, we must travel not only forward in time but also backward. We construct a "closed-time" path, often called the Keldysh contour. It starts in the distant past ($t \to -\infty$), runs forward to the distant future ($t \to +\infty$), and then, like a film being rewound, it runs all the way back to the start.

This two-way journey allows us to ask much more nuanced questions. Instead of just asking "what is the state of the system now?", we can ask "if we poke the system at time $t'$, what is the correlation with an observation at time $t$?" The contour provides a natural bookkeeping system for this. By placing our "poke" and "observation" on different parts of the contour, we can define different kinds of physical correlations. This leads to two fundamental quantities known as Green's functions:

  1. The greater Green's function ($G^>$): This function is typically associated with placing the observation on the forward branch and the poke on the backward branch. Intuitively, it describes the correlation of "particle-like" excitations. You can think of it as being related to the empty states available in the system—the "holes" a particle could occupy. It answers the question, "If a state at energy $\omega$ is available, what is the correlation?"

  2. The lesser Green's function ($G^<$): This function corresponds to placing the poke on the forward branch and the observation on the backward branch. It describes the correlation of "hole-like" excitations and is related to the states that are already occupied by particles. It answers the question, "If a state at energy $\omega$ is occupied, what is the correlation?"

Together, $G^>$ and $G^<$ give us a complete picture of the system's quantum state: not just which energy levels exist, but which ones are filled and which are empty. This is the essential information needed to describe a system that is not in a simple thermal equilibrium state.

The Rosetta Stone: Fluctuation and Dissipation

Before we can trust our new, sophisticated time-travel machine in the uncharted territory of non-equilibrium, we must test it in a familiar landscape: a system in perfect thermal equilibrium. If the formalism is sound, it should reproduce the celebrated results of equilibrium statistical mechanics. And it does, in a most beautiful way.

In an equilibrium system, nothing is really happening on a macroscopic scale. The forward and backward paths in time are not independent. They are intimately linked by the Kubo-Martin-Schwinger (KMS) condition. This condition is a deep statement about thermal equilibrium, and for fermions like electrons, it provides a simple algebraic relation between the Green's functions' Fourier transforms:

$$G^>(\omega) = -e^{\beta \hbar \omega}\, G^<(\omega)$$

where $\beta$ is the inverse temperature ($1/k_B T$). This equation tells us that in equilibrium, the probability of creating an excitation of energy $\hbar\omega$ is related to the probability of destroying one by a simple Boltzmann factor and a sign factor characteristic of fermions.

This single relation is a "Rosetta Stone" that allows us to connect two seemingly different aspects of the system. Let's define two combinations of our Green's functions that make this connection clear:

  • The spectral function, $A(\omega)$: Defined as $A(\omega) = i\left(G^>(\omega) - G^<(\omega)\right)$, this function tells us about the available states in the system at energy $\hbar\omega$. It represents the system's capacity to have excitations. In a material, it maps out the electronic band structure. It is directly related to the system's ability to absorb energy, or dissipation.

  • The Keldysh function, $G^K(\omega)$: Defined as $G^K(\omega) = G^>(\omega) + G^<(\omega)$, this function tells us how those available states are actually occupied. It is a measure of the statistical population of states, or the quantum fluctuations.

Using the KMS condition, we can easily find a direct relationship between these two quantities. It is none other than the famous fluctuation-dissipation theorem (for fermions):

$$G^K(\omega) = -i \tanh\!\left(\frac{\beta\hbar\omega}{2}\right) A(\omega)$$

This is a stunning result. It says that in equilibrium, the fluctuations in a system (its "content," $G^K$) are completely determined by its dissipative properties (its "capacity," $A$) and the temperature. The random thermal jiggling of particles is not random at all in this statistical sense; it is rigidly dictated by how the system responds to external pokes. The Keldysh formalism not only contains this profound wisdom but derives it as a natural consequence of its time-contour structure.
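
Both the KMS relation and the fluctuation-dissipation theorem can be checked numerically for a single equilibrium level. This is a minimal sketch, assuming $\hbar = 1$, an illustrative Lorentzian spectral function, and the standard equilibrium forms $G^<(\omega) = iA(\omega)f(\omega)$ and $G^>(\omega) = -iA(\omega)[1-f(\omega)]$; all parameter values are arbitrary.

```python
import numpy as np

# Equilibrium Green's functions for one fermionic level with a Lorentzian
# spectral function A(w).  Units: hbar = 1; parameters are illustrative.
beta, eps0, gamma = 2.0, 0.5, 0.3        # inverse temperature, level, width
w = np.linspace(-10, 10, 2001)

A = gamma / ((w - eps0)**2 + (gamma / 2)**2)   # spectral function (any A > 0 works)
f = 1.0 / (np.exp(beta * w) + 1.0)             # Fermi-Dirac occupation

G_less = 1j * A * f              # occupied states:  G^<(w) =  i A(w) f(w)
G_grtr = -1j * A * (1.0 - f)     # empty states:     G^>(w) = -i A(w) [1 - f(w)]

# KMS condition: G^>(w) = -exp(beta*w) * G^<(w)
assert np.allclose(G_grtr, -np.exp(beta * w) * G_less)

# Fluctuation-dissipation theorem: G^K = G^> + G^< = -i tanh(beta*w/2) A(w)
G_K = G_grtr + G_less
assert np.allclose(G_K, -1j * np.tanh(beta * w / 2) * A)
```

The two assertions pass for any positive spectral function, which is the point: in equilibrium the occupation information carried by $G^K$ is fully fixed by $A(\omega)$ and the temperature.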

A Practical Toolkit for the Time-Traveler

While the greater and lesser functions ($G^>$, $G^<$) are physically intuitive, they can be cumbersome for calculations. It turns out to be incredibly useful to reshuffle our toolkit into a basis that respects causality more explicitly. This is the retarded-advanced-Keldysh (R/A/K) basis.

  1. The retarded Green's function, $G^R(\tau)$: This is non-zero only for times $\tau > 0$. It describes the system's response to a perturbation in the past. It answers the question, "If I poke the system now, what will happen in the future?" It is built from the difference $G^R(t-t') = \theta(t-t')\left(G^>(t-t') - G^<(t-t')\right)$.

  2. The advanced Green's function, $G^A(\tau)$: This is non-zero only for times $\tau < 0$. It describes how the present state was influenced by past events.

  3. The Keldysh Green's function, $G^K$: This is our old friend, describing the statistical occupation of states.

The true magic appears when we assemble these components into a $2 \times 2$ matrix. This Keldysh-space matrix has a remarkably simple and powerful structure:

$$\check{G} = \begin{pmatrix} G^R & G^K \\ 0 & G^A \end{pmatrix}$$

This upper-triangular form is a deep mathematical representation of causality. The retarded and advanced functions on the diagonal describe causal propagation forward and backward in time, while the Keldysh function sits in the off-diagonal "statistical" slot. The zero in the lower-left corner dramatically simplifies the algebra of non-equilibrium calculations, ensuring that our results are always causally well-behaved. Perturbative expansions, which would be a nightmare of contour time-orderings, become a clean and systematic matrix algebra. This matrix structure is the physicist's equivalent of an organized toolbox, where every tool has its proper place.
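
The causal bookkeeping is easy to see in a toy calculation: multiplying two upper-triangular Keldysh matrices (scalar components with random illustrative values) preserves the triangular structure, and the components combine exactly as causality demands. This is only a structural sketch, not a physical Green's function.

```python
import numpy as np

rng = np.random.default_rng(0)

def keldysh(GR, GK, GA):
    """Assemble a 2x2 Keldysh-space matrix from scalar R/K/A components."""
    return np.array([[GR, GK], [0.0 + 0j, GA]])

# Two arbitrary complex Keldysh-space matrices
A = keldysh(*(rng.standard_normal(3) + 1j * rng.standard_normal(3)))
B = keldysh(*(rng.standard_normal(3) + 1j * rng.standard_normal(3)))

C = A @ B   # Keldysh-space matrix product

# The triangular (causal) structure is preserved...
assert abs(C[1, 0]) < 1e-12
# ...and the components combine causally:
assert np.isclose(C[0, 0], A[0, 0] * B[0, 0])                      # C^R = A^R B^R
assert np.isclose(C[1, 1], A[1, 1] * B[1, 1])                      # C^A = A^A B^A
assert np.isclose(C[0, 1], A[0, 0] * B[0, 1] + A[0, 1] * B[1, 1])  # C^K = A^R B^K + A^K B^A
```

The retarded and advanced sectors never mix, and the Keldysh component is always propagated between a retarded function on the left and an advanced one on the right, which is exactly the structure exploited in perturbation theory.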

The Engine of Change: Self-Energy

We now have the language to describe the state of a system. But what drives the system away from equilibrium? What is the engine of change? In the Keldysh formalism, this role is played by the self-energy, $\Sigma$.

The self-energy is a catch-all term that encapsulates every possible interaction a particle can experience. It might scatter from another particle, create a lattice vibration (a phonon), or, crucially for transport problems, tunnel from the system into an external reservoir like a wire. Just like the Green's function, the self-energy is a matrix in Keldysh space, $\check{\Sigma}$, with its own retarded, advanced, and Keldysh components.

The evolution of the system is then governed by the Dyson equation, which relates the full, interacting Green's function $\check{G}$ to the "bare" Green's function of the non-interacting system, $\check{g}$, and the self-energy $\check{\Sigma}$:

$$\check{G} = \check{g} + \check{g}\,\check{\Sigma}\,\check{G}$$

This equation is the heart of the theory. It tells us how the simple propagation of a free particle ($\check{g}$) is modified, or "dressed," by all the complex interactions ($\check{\Sigma}$) to produce the true behavior of the particle in the interacting system ($\check{G}$).
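
For a single level at a fixed frequency, the Dyson equation reduces to a scalar identity that can be checked directly. A minimal sketch, assuming $\hbar = 1$ and an arbitrary illustrative retarded self-energy:

```python
import numpy as np

# Scalar (single-level) Dyson equation at one fixed frequency w:
#   G = g + g * Sigma * G   =>   G = 1 / (1/g - Sigma)
w, eps0, eta = 0.7, 0.2, 1e-6
Sigma = 0.1 - 0.25j                    # illustrative retarded self-energy (Im < 0)

g = 1.0 / (w - eps0 + 1j * eta)        # bare retarded Green's function
G_exact = 1.0 / (1.0 / g - Sigma)      # closed-form solution of the Dyson equation

# The same result emerges from iterating the "dressing" series
# G_{n+1} = g + g * Sigma * G_n  (geometric series in g*Sigma)
G = 0.0 + 0j
for _ in range(200):
    G = g + g * Sigma * G

assert np.isclose(G, G_exact)
```

The iteration is the perturbative expansion in disguise: each pass adds one more scattering event, and the resummed series is exactly the dressed propagator.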

The real power of this framework becomes apparent when we look at the Keldysh component of the Dyson equation. Consider a system that starts out "empty" or at zero temperature, so its initial Keldysh component is zero. What is the particle distribution later on, after the interactions are turned on? The equation gives a stunningly simple and intuitive answer:

$$G^K = G^R \Sigma^K G^A$$

This compact equation is a universe of physics. It tells us that the final particle distribution ($G^K$) is created by a source term—the statistical part of the self-energy, $\Sigma^K$—which is then propagated through the system according to its causal response, described by $G^R$ and $G^A$. The self-energy is the source of particles and holes, and the Green's functions are the conduits that carry them.
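
The same statement can be verified numerically: assemble the $2 \times 2$ Keldysh matrices for a single level whose bare Keldysh component vanishes, invert the matrix Dyson equation, and the Keldysh component of the result is exactly $G^R \Sigma^K G^A$. The component values below are illustrative.

```python
import numpy as np

# 2x2 Keldysh-space Dyson equation for a single level (scalar components),
# with an initially "empty" system: bare Keldysh component g^K = 0.
w, eps0, eta = 0.3, 0.0, 1e-6
SigmaR = -0.2j                      # retarded self-energy (broadening)
SigmaK = -0.4j                      # statistical ("source") component
SigmaA = np.conj(SigmaR)

gR = 1.0 / (w - eps0 + 1j * eta)
gA = np.conj(gR)

g_check = np.array([[gR, 0.0], [0.0, gA]])                 # g^K = 0
Sigma_check = np.array([[SigmaR, SigmaK], [0.0, SigmaA]])

# Dyson: G = g + g Sigma G  =>  G = (g^{-1} - Sigma)^{-1}
G_check = np.linalg.inv(np.linalg.inv(g_check) - Sigma_check)

GR, GK, GA = G_check[0, 0], G_check[0, 1], G_check[1, 1]
assert np.isclose(GK, GR * SigmaK * GA)    # G^K = G^R Sigma^K G^A
```

The upper-triangular structure guarantees this: inverting a triangular matrix keeps it triangular, and the off-diagonal slot picks up exactly one factor of $\Sigma^K$ sandwiched between the causal propagators.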

To make this less abstract, let's consider the prime example of a nanoelectronic device: a single molecule (a "quantum dot") connected between two metal contacts, Left and Right, held at different voltages. This is a system far from equilibrium. The self-energy $\Sigma^<$ describes the rate at which electrons tunnel from the contacts onto the molecule. The Keldysh formalism allows us to calculate it directly. The result is perfectly intuitive:

$$\Sigma^<(\omega) = i \left[ \Gamma_L f_L(\omega) + \Gamma_R f_R(\omega) \right]$$

Here, $\Gamma_L$ and $\Gamma_R$ are the tunneling rates to the left and right contacts, and $f_L(\omega)$ and $f_R(\omega)$ are the Fermi functions of the contacts, which simply tell us the probability that a state at energy $\hbar\omega$ is occupied in that contact. The self-energy is literally just summing up the streams of incoming electrons from each contact, weighted by their coupling strength. The more general form of the Keldysh-Dyson equation correctly describes this process even in systems that don't start empty.
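
As a sketch of how this self-energy feeds into an observable, the dot occupation follows from $G^< = G^R \Sigma^< G^A$ for a non-interacting resonant level. The wide-band form $G^R = 1/(\omega - \epsilon_0 + i\Gamma/2)$ and all parameter values are assumptions of this example; the symmetry check at the end (a level midway between the two chemical potentials ends up half filled) is the physically expected result.

```python
import numpy as np

# Occupation of a single-level dot between two biased contacts, from
# G^< = G^R Sigma^< G^A with Sigma^< = i [Gamma_L f_L + Gamma_R f_R].
GammaL = GammaR = 0.2
Gamma = GammaL + GammaR
eps0, beta = 0.0, 10.0
muL, muR = +0.5, -0.5                   # symmetric bias around the level

w = np.linspace(-200, 200, 400001)
dw = w[1] - w[0]
fL = 0.5 * (1.0 - np.tanh(beta * (w - muL) / 2))   # Fermi function, overflow-safe
fR = 0.5 * (1.0 - np.tanh(beta * (w - muR) / 2))

GR = 1.0 / (w - eps0 + 1j * Gamma / 2)  # retarded GF with lead broadening
Sigma_less = 1j * (GammaL * fL + GammaR * fR)
G_less = GR * Sigma_less * np.conj(GR)  # G^< = G^R Sigma^< G^A

# Dot occupation: n = -i * integral dw/(2*pi) G^<(w)
n = (-1j * G_less).real.sum() * dw / (2 * np.pi)

assert 0.0 < n < 1.0
assert abs(n - 0.5) < 1e-3              # symmetric setup => half filling
```

Away from this symmetric point the same integral gives the genuinely non-equilibrium occupation, a weighted average of the two contact distributions.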

A Note on Making Sense

The Keldysh formalism is a complete and exact framework. In practice, however, we can never solve the full equations for any realistically complex system. We must make approximations. For instance, we might calculate the self-energy only to a certain order in the interaction strength. Herein lies a subtle trap.

A physical theory must respect the fundamental symmetries of nature. The most sacred of these in transport is charge conservation. Our calculations must not create or destroy charge out of thin air. An arbitrary or naive approximation can easily violate this principle, leading to nonsensical results like a steady-state current flowing into a device that doesn't equal the current flowing out.

Fortunately, the formalism itself contains the antidote: Ward identities. These identities are the mathematical embodiment of charge conservation within the theory of Green's functions. They establish a rigid consistency condition between the self-energy $\Sigma$ and the "vertex function," which describes how the system's particles couple to an electromagnetic field (i.e., how they constitute a current).

The Ward identity dictates that if you use an approximate, "dressed" self-energy to calculate your Green's function, you cannot get away with using a "bare," non-interacting vertex to calculate your current. The approximation for the vertex must be consistent with the approximation for the self-energy. A "conserving approximation" is one that respects the Ward identity. For example, in many common approximations, the vertex corrections needed to preserve conservation take the form of "ladder diagrams" that mirror the structure of the self-energy calculation. Failing to include these consistent vertex corrections is a common pitfall that leads to unphysical results. The Ward identities are the essential guardrails that keep our theoretical explorations on the path of physical reality.

Applications and Interdisciplinary Connections

Having journeyed through the intricate machinery of the Keldysh formalism, with its curious time contours and matrix Green's functions, one might be left wondering, "What is this all for?" It is a fair question. The true power of a physical theory lies not in its abstract elegance alone, but in its ability to describe, predict, and unify the phenomena we observe in the world around us. Now, we shall pivot from the "how" to the "what," exploring the vast landscape of problems that the Keldysh formalism unlocks. We will see that this is not merely a tool for specialists, but a universal language for describing the vibrant, dynamic life of quantum systems driven far from the quiet slumber of equilibrium. Our tour will take us from the heart of modern electronics to the frontiers of quantum computing, revealing a beautiful unity across seemingly disparate fields of science.

The Heart of the Matter: Electronic Transport in Nanostructures

The most natural home for the Keldysh formalism is in the world of mesoscopic physics and nanoelectronics, where quantum devices operate under finite voltages. Here, the flow of electrons is not a simple classical drift but a delicate quantum dance.

Imagine a single molecule or a tiny island of semiconductor—a quantum dot—sandwiched between two metallic contacts, a source and a drain. How much current flows when we apply a voltage? The Keldysh formalism provides the definitive answer through the celebrated Meir-Wingreen formula. It allows us to calculate the full current-voltage ($I$-$V$) characteristic, which is the fundamental "fingerprint" of any electronic device. We can account for remarkable details, such as how the applied voltage itself can electrostatically shift the energy levels within the dot, creating a fascinating feedback loop that shapes the flow of current. The central quantity that emerges is the transmission function, $T(\omega)$, which tells us the probability for an electron of energy $\omega$ to make it through the device. The total current is then an integral of this transmission over the energy window opened up by the applied voltage.
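
For a non-interacting resonant level, the Meir-Wingreen result reduces to a Landauer-type integral with $T(\omega) = \Gamma_L \Gamma_R |G^R(\omega)|^2$, which is easy to evaluate numerically. This is a sketch in units $e = \hbar = 1$ with illustrative parameters and the Lorentzian $G^R$ of the wide-band limit, not a general interacting calculation.

```python
import numpy as np

# Landauer-type current through a single resonant level (units: e = hbar = 1).
GammaL, GammaR = 0.1, 0.3
Gamma = GammaL + GammaR
eps0, beta = 0.0, 20.0
muL, muR = +0.4, -0.4                   # bias window around the level

w = np.linspace(-50, 50, 200001)
dw = w[1] - w[0]
fermi = lambda mu: 0.5 * (1.0 - np.tanh(beta * (w - mu) / 2))

GR = 1.0 / (w - eps0 + 1j * Gamma / 2)
T = GammaL * GammaR * np.abs(GR)**2     # transmission probability T(w)

# I = (1/2pi) * integral dw  T(w) [f_L(w) - f_R(w)]
I = (T * (fermi(muL) - fermi(muR))).sum() * dw / (2 * np.pi)

assert np.all(T <= 1.0 + 1e-12)                              # unitarity bound
assert np.isclose(T.max(), 4 * GammaL * GammaR / Gamma**2, rtol=1e-3)
assert I > 0.0                          # current flows from higher to lower mu
```

Note the peak transmission $4\Gamma_L\Gamma_R/\Gamma^2$: perfect transmission requires symmetric coupling, a standard result this sketch reproduces.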

But the story gets more interesting. What if the electron has more than one path to take? Consider a quantum wire with a quantum dot coupled to its side. An electron traveling down the wire can either pass directly or it can take a brief detour into the dot and back out. These two pathways interfere, just like light waves in a double-slit experiment. This quantum interference gives rise to a strikingly asymmetric feature in the transmission known as a Fano resonance. The Keldysh Green's function approach elegantly captures this interference, predicting the exact shape of the resonance and revealing the coherent nature of quantum transport.
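
The resulting asymmetric lineshape is captured by the textbook Fano formula, quoted here rather than derived from the Green's functions; the asymmetry parameter $q$ below is an arbitrary illustrative value.

```python
import numpy as np

# Textbook Fano lineshape for a wire with a side-coupled dot:
#   T(e) = (e + q)^2 / ((e^2 + 1)(1 + q^2)),  e = (w - w0)/(Gamma/2)
q = 2.0                              # Fano asymmetry parameter (illustrative)
eps = np.linspace(-10, 10, 4001)     # dimensionless detuning from resonance
T = (eps + q)**2 / ((eps**2 + 1) * (1 + q**2))

# Hallmarks of two-path interference: a perfect antiresonance at e = -q,
# perfect transmission at e = 1/q, and a lineshape asymmetric about e = 0.
assert np.isclose(T[np.argmin(np.abs(eps + q))], 0.0, atol=1e-6)
assert np.isclose(T.max(), 1.0, atol=1e-4)
assert np.all((T >= 0) & (T <= 1 + 1e-12))
```

The zero of transmission (destructive interference between the direct path and the detour through the dot) is the signature that distinguishes a Fano resonance from an ordinary Lorentzian peak.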

An average current, however, doesn't tell the whole story. The flow of discrete charges is inherently a noisy process. If we could "listen" to the current, we would hear not a smooth hum, but a series of clicks corresponding to individual electrons passing through. This is shot noise, and its characteristics reveal profound information about the transport process. Using the Keldysh formalism, we can go beyond the average current and compute the full power spectrum of these current fluctuations. The magnitude of the noise tells us whether electrons are moving independently, like shoppers in a sparse checkout line, or in correlated bunches.

This ability to handle correlations is crucial when we consider another of the electron's properties: its spin. This tiny quantum-mechanical magnetic moment is the foundation of spintronics, a field aiming to build electronics that manipulate spin as well as charge. Picture a current of spin-polarized electrons tunneling through a barrier into a tiny magnet. The Keldysh framework predicts that this spin current will exert a torque on the magnet, a phenomenon known as spin-transfer torque. It’s like a stream of tiny spinning tops hitting a larger one, causing it to precess and even flip its orientation. This very effect, first calculated with Green's function techniques, is the principle behind modern magnetic random-access memory (MRAM).

Broadening the Horizon: Beyond Simple Electronics

The true genius of a fundamental framework is its generality. The logic of Keldysh—of reservoirs held at different potentials driving a current of quasiparticles—is not limited to electrons.

Consider heat transport at the nanoscale. Heat in a crystal is carried by quantized lattice vibrations called phonons. We can construct a phononic device analogous to our quantum dot, with a central scattering region connected to two heat baths at different temperatures, $T_L$ and $T_R$. The Keldysh formalism can be adapted seamlessly to describe this scenario. The electron Green's functions are replaced by phonon Green's functions, and the Fermi-Dirac distributions are replaced by Bose-Einstein distributions. The result is a Landauer-like formula for heat current, where a transmission function for phonons, $\mathcal{T}(\omega)$, determines the flow of thermal energy. This demonstrates a deep unity between the transport of charge and the transport of heat.
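
A minimal sketch of this phononic Landauer formula (units $\hbar = k_B = 1$; the transmission function below is an arbitrary illustrative choice) confirms the expected behavior: heat flows from hot to cold and vanishes at equal temperatures.

```python
import numpy as np

# Landauer-style phonon heat current: the electronic formula with Fermi
# functions replaced by Bose-Einstein distributions (hbar = k_B = 1).
def bose(w, T):
    return 1.0 / np.expm1(w / T)            # Bose-Einstein occupation

def heat_current(TL, TR, w, transmission):
    # J = (1/2pi) * integral dw  w * T(w) * [n_B(w, TL) - n_B(w, TR)]
    dw = w[1] - w[0]
    return (w * transmission * (bose(w, TL) - bose(w, TR))).sum() * dw / (2 * np.pi)

w = np.linspace(1e-4, 20.0, 200001)         # phonon frequencies (> 0)
trans = w**2 / (w**2 + 1.0)                 # illustrative transmission, 0 <= T <= 1

J = heat_current(TL=1.5, TR=1.0, w=w, transmission=trans)
assert J > 0.0                              # heat flows hot -> cold
assert np.isclose(heat_current(1.2, 1.2, w, trans), 0.0)
```

Only the statistics changed; the structure of the transport formula is identical to the electronic one, which is the unity the text describes.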

The formalism truly shines when we connect our nano-device to more exotic materials, like a superconductor. At low energies, an electron from a normal metal cannot enter a superconductor alone because of the superconducting energy gap. Instead, it can trigger a remarkable process called Andreev reflection: the incoming electron joins with another electron from the normal metal to form a Cooper pair, which then enters the superconductor. To conserve charge, a "hole"—the absence of an electron—is reflected back into the normal metal. The net result is the transfer of a charge of $2e$ across the interface for every incoming electron. The Keldysh formalism, when formulated in a particle-hole basis (the Nambu space), is perfectly designed to handle this melding of particles and holes. It correctly predicts that the shot noise in such a junction corresponds to a charge carrier of $q^* = 2e$, providing unambiguous proof of this beautiful, two-particle transport mechanism.

Connecting Frameworks and Fundamental Principles

Beyond specific applications, the Keldysh formalism serves as a powerful bridge connecting different theoretical constructs and validating the fundamental symmetries of nature.

Many of the most fascinating phenomena in condensed matter physics arise from the strong repulsive interaction between electrons. The single-impurity Anderson model is a cornerstone problem that captures this competition between electron hopping and on-site repulsion. The Keldysh machinery allows us to take this equilibrium problem and drive it, asking what happens when a voltage is applied. In certain regimes, a remarkable thing can happen: the current flow itself can cause the impurity to spontaneously magnetize, a purely non-equilibrium many-body effect that can be explored through self-consistent Green's function calculations.

Any valid physical theory must respect the fundamental symmetries of the universe, like time-reversal symmetry. The Onsager-Casimir reciprocity relations are a macroscopic manifestation of this microscopic reversibility. They relate, for instance, transport coefficients measured with a magnetic field $\mathbf{B}$ to those measured with a field $-\mathbf{B}$. The Keldysh formalism not only respects these symmetries but provides a direct path to derive them from first principles, even for complex quantities like the cross-correlations of current fluctuations in multi-terminal devices. This provides a stringent check on the internal consistency of the theory.

Furthermore, the Keldysh formalism can serve as a rigorous foundation for more phenomenological theories. In quantum optics and quantum information, the evolution of an open quantum system is often described by a Lindblad master equation, which models dissipation through "jump operators" that occur at certain rates. Where do these rates come from? The Keldysh real-time formalism provides the answer. By microscopically modeling the system and its reservoir, we can compute the transition rates for particles tunneling in and out, thereby deriving the Lindblad dissipator from first principles. This connects the two major frameworks for open quantum systems, lending microscopic justification to the phenomenological model.
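
A toy version of this derivation: for a single level coupled to one reservoir, golden-rule-style rates $\Gamma_{\text{in}} = \Gamma f(\epsilon)$ and $\Gamma_{\text{out}} = \Gamma[1 - f(\epsilon)]$ define a Lindblad-type rate equation whose steady state is the reservoir's thermal occupation. All parameter values are illustrative, and the rate forms are the standard weak-coupling result assumed here rather than derived.

```python
import numpy as np

# Rate equation for one level coupled to a thermal reservoir, with
# microscopically motivated jump rates:
#   Gamma_in = Gamma * f(eps),  Gamma_out = Gamma * (1 - f(eps))
Gamma, eps, beta, mu = 0.5, 0.2, 4.0, 0.0
f = 1.0 / (np.exp(beta * (eps - mu)) + 1.0)     # reservoir Fermi function
rate_in, rate_out = Gamma * f, Gamma * (1.0 - f)

# Evolve dn/dt = Gamma_in * (1 - n) - Gamma_out * n by simple Euler steps
n, dt = 0.0, 0.01
for _ in range(10000):
    n += dt * (rate_in * (1.0 - n) - rate_out * n)

# The dissipator drives the level to the reservoir's thermal occupation
assert np.isclose(n, f, atol=1e-6)
```

The balance of the two jump rates encodes detailed balance, so the Lindblad dynamics relaxes to the correct equilibrium even though the equation of motion itself is purely phenomenological.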

The Quantum Frontier: Information and Entanglement

Perhaps the most exciting applications of the Keldysh formalism today lie at the cutting edge of quantum science and technology. To build a quantum computer or a secure quantum communication network, we must master the control of quantum systems in the face of environmental noise, which causes decoherence.

Consider an entanglement swapping protocol, a key building block for a "quantum repeater" that can distribute entanglement over long distances. The protocol relies on quantum memories that store fragile entangled states. In any realistic solid-state device, these memories are plagued by noise. The Keldysh formalism provides the tools to model the effect of realistic, non-Markovian noise sources—such as the ubiquitous $1/f$ noise common in electronics—on the stored quantum information. By calculating how the entanglement of the final state decays over time, we can understand the fundamental limits of our quantum hardware and learn how to design more robust protocols and devices.

From the simple flow of charge in a nanoscale transistor to the subtle decay of entanglement in a quantum memory, the Keldysh formalism provides a single, powerful, and unified language. It is the language of the quantum world in motion. It allows us to peer into the complex, dynamic, and often counter-intuitive behavior of systems pushed far from equilibrium, and in doing so, it not only helps us understand the world but also gives us the tools to engineer it in new and powerful ways.