
# Lesser Green's Function

## Key Takeaways
  • The lesser Green's function, $G^<(t,t')$, measures the correlation of occupied quantum states across two different points in time, providing a dynamic "movie" of particle populations and quantum coherences.
  • In thermal equilibrium, the Fluctuation-Dissipation Theorem provides a simple link between the lesser Green's function, the spectrum of available states, and the Fermi-Dirac occupation factor.
  • For systems out of equilibrium, $G^<$ is essential for calculating physical observables like electrical current (via the Meir-Wingreen formula) and spin accumulation in nanoelectronic and spintronic devices.
  • By analyzing its time dependence, the lesser Green's function can be used to model transient phenomena, such as the charging of a quantum dot or a system's response to a sudden change (quantum quench).

## Introduction

In the quantum realm, a static snapshot is not enough to capture the intricate dance of many interacting particles. To understand the flow of electrons in a microchip or the response of a material to light, we need more than a census of where particles are; we need a "movie" that tracks their dynamic behavior. The lesser Green's function, often denoted $G^<$, is the physicist's primary tool for creating this movie. It moves beyond static properties to describe how quantum states are occupied and how these occupations are correlated across time. This article bridges the gap between the abstract concept of particle populations and the measurable, dynamic phenomena they produce.

This article will guide you through the fundamental nature and practical power of the lesser Green's function. In the first section, **Principles and Mechanisms**, we will dissect the definition of $G^<$, revealing its deep connection to the density matrix and exploring its behavior for a single quantum level. We will then see how it provides a complete picture of occupied states in both the time and energy domains through powerful concepts like the Fluctuation-Dissipation Theorem. Following this theoretical foundation, the second section, **Applications and Interdisciplinary Connections**, will demonstrate how this single mathematical object is used to explain and predict a vast range of physical phenomena, from the current flowing through a single-molecule transistor to the light measured in spectroscopic experiments and the transient response of quantum systems.

## Principles and Mechanisms

Imagine you are trying to understand a bustling city. You could take a snapshot—a census—to see how many people are in each building at a single moment. This gives you a static picture of **population**. But this misses the life of the city: the flow of people from home to work, the currents of traffic, the dynamic patterns of life. To truly understand the city, you need to track not just where everyone is, but where they are coming from and where they are going. You need a movie, not a photograph.

The quantum world of many particles is much like this city. The **lesser Green's function**, which we can call $G^<$, is the physicist's tool for making that movie. It goes beyond a simple census of occupied quantum states to describe the beautiful and complex dynamics of how electrons—the inhabitants of our quantum city—move, correlate, and flow.

### A Tale of Two Times: The Correlation of Occupied States

Let's start with the census. In quantum mechanics, the one-particle density matrix, $\boldsymbol{\rho}(t)$, is our census-taker. For a set of states indexed by $i$ and $j$, its elements $\rho_{ij}(t) = \langle \hat{c}_j^\dagger(t) \hat{c}_i(t) \rangle$ tell us everything about the single-particle properties at a specific time $t$. The operator $\hat{c}_i(t)$ annihilates a particle in state $i$ at time $t$, and $\hat{c}_j^\dagger(t)$ creates one in state $j$.

The diagonal elements, $\rho_{ii}(t) = \langle \hat{c}_i^\dagger(t) \hat{c}_i(t) \rangle$, are the most intuitive: they are simply the average **population** or occupation number of state $i$. The off-diagonal elements, $\rho_{ij}(t)$ for $i \neq j$, are more subtle and purely quantum mechanical. They measure the **coherence** between states $i$ and $j$—the degree to which the system exists in a definite phase relationship between these two states, much like how two waves can be in or out of sync.

This is a great start, but it's still just a snapshot. To capture the dynamics, we need to ask a more sophisticated question: if we find a particle in a state at time $t$, what is the probability amplitude that this is correlated with a particle having been "injected" into another state at an earlier time $t'$? This is a two-time correlation question, and it is the very heart of the lesser Green's function.

The **lesser Green's function** is defined as:

$$G^<_{ij}(t,t') = i \langle \hat{c}_j^\dagger(t') \hat{c}_i(t) \rangle$$

The name "lesser" comes from the Keldysh formalism, where this function is associated with a particular ordering on a time contour. But its physical meaning is far more evocative: it describes the correlation of occupied states across time and space. Look what happens when we set the times to be equal, $t = t'$:

$$G^<_{ij}(t,t) = i \langle \hat{c}_j^\dagger(t) \hat{c}_i(t) \rangle = i \rho_{ij}(t)$$

In an instant, we see the profound connection. The equal-time lesser Green's function is the density matrix (times a factor of $i$). The populations and coherences we started with are simply the instantaneous, equal-time slice of this richer, two-time object. All the information about particle number and quantum coherence is contained within $G^<$. The total number of particles in a system, for instance, is simply the sum of all the populations:

$$N(t) = \sum_i \rho_{ii}(t) = -i \sum_i G^<_{ii}(t,t) = -i\, \mathrm{Tr}\left[G^<(t,t)\right]$$

### The Simplest Beat: A Solo Performance

To truly appreciate this, let's listen to the simplest possible sound in our quantum city: a single, isolated fermionic energy level, $\epsilon$. Imagine it's in a room at a fixed temperature $T$. What is its lesser Green's function?

We can calculate this from first principles. The time evolution of the creation and annihilation operators in the Heisenberg picture is wonderfully simple for this non-interacting system: $\hat{c}(t) = \hat{c}(0) \exp(-i\epsilon t/\hbar)$ and $\hat{c}^\dagger(t') = \hat{c}^\dagger(0) \exp(i\epsilon t'/\hbar)$. The operators pick up a phase that oscillates at a frequency determined by the energy. Plugging this into the definition of $G^<(t,t')$:

$$G^<(t, t') = i \langle \hat{c}^\dagger(t') \hat{c}(t) \rangle = i \langle \hat{c}^\dagger(0) \exp(i\epsilon t'/\hbar)\, \hat{c}(0) \exp(-i\epsilon t/\hbar) \rangle$$

Since both operators now act at time zero, we can group them. The time-dependent exponentials are just numbers and can be pulled out of the expectation value:

$$G^<(t,t') = i \langle \hat{c}^\dagger(0)\hat{c}(0) \rangle \exp\left(-\frac{i\epsilon(t-t')}{\hbar}\right)$$

This expression is a small poem. It splits the physics into two beautiful parts.
1. The term $\exp(-i\epsilon(t-t')/\hbar)$ is the pure **quantum dynamics**. It's the rhythmic, unitary evolution of the quantum state's phase, beating solely with the energy $\epsilon$.
2. The term $\langle \hat{c}^\dagger(0)\hat{c}(0) \rangle$ is the pure **statistical mechanics**. It's the average occupation of the level, which for fermions in thermal equilibrium is given by the famous **Fermi-Dirac distribution**, $f(\epsilon) = \frac{1}{\exp(\beta\epsilon) + 1}$ (where $\beta = 1/k_B T$).

So, for our simple solo performer, the lesser Green's function is:

$$G^<(t, t') = i f(\epsilon) \exp\left(-\frac{i\epsilon(t-t')}{\hbar}\right)$$

This function tells us that if we find the state occupied, its "memory" of being occupied in the past decays not in amplitude, but by oscillating with a precise quantum phase.

### The Spectrum of What Is: From Time to Energy

While the time-domain view is powerful, physicists often gain deeper insight by using a Fourier transform to switch to the frequency (or energy) domain. What does the "spectrum" of occupied states look like? This is analogous to taking the complex sound of an orchestra and breaking it down into the pure notes—the frequencies—that compose it.

Let's imagine our single level is now a quantum dot connected to wires, as in a real electronic device. The connection to the outside world blurs the sharp energy level $\epsilon_0$ into a range of available energies. We describe this range with the **spectral function**, $A(\omega)$. It tells you the density of states available for an electron at energy $\omega$. It's a map of the "available rooms" in our quantum city.

But just because a room is available doesn't mean it's occupied. We still need the Fermi-Dirac distribution, $f(\omega)$, which tells us the probability that a room at energy $\omega$ is occupied at a given temperature and chemical potential.

In thermal equilibrium, there is a monumentally important relationship known as the **Fluctuation-Dissipation Theorem**, which connects these ideas. For our Green's functions, it takes the form:

$$G^<(\omega) = -f(\omega) \left[ G^R(\omega) - G^A(\omega) \right] = i f(\omega) A(\omega)$$

Here, $G^R$ and $G^A$ are the retarded and advanced Green's functions, and their difference, $i[G^R(\omega) - G^A(\omega)]$, is precisely the spectral function $A(\omega)$. This equation is one of the most beautiful statements in many-body physics. It says:

> The spectrum of occupied states ($G^<$) is simply the spectrum of available states ($A$) weighted by the probability of their occupation ($f$).

This immediately tells us how to calculate the total number of electrons in our quantum dot. We just sum up the occupied states over all energies:

$$\langle n \rangle = \int_{-\infty}^{\infty} \frac{d\omega}{2\pi}\, A(\omega) f(\omega)$$

The lesser Green's function, through the fluctuation-dissipation theorem, gives us the exact tool to see not just which states *could* be occupied, but which ones *are*.

### The Full Ensemble: Particles, Holes, and Detailed Balance

If there is a $G^<$, logic suggests there must be a $G^>$. And indeed there is. The **greater Green's function** is defined as $G^>_{ij}(t,t') = -i \langle \hat{c}_i(t) \hat{c}_j^\dagger(t') \rangle$. Notice the reversed order of operators. While $G^<$ describes the propagation of particles (in occupied states), $G^>$ describes the propagation of **holes** (in unoccupied states). It correlates the creation of a particle at $t'$ with its annihilation at $t$.

At equal times, these two functions reveal the heart of the Pauli exclusion principle. We saw that $-i G^<_{pp}(t,t)$ is the probability of state $p$ being occupied. The greater function gives the probability of state $p$ being unoccupied: $i G^>_{pp}(t,t) = 1 - \langle \hat{c}_p^\dagger(t) \hat{c}_p(t) \rangle$. The two must sum to one! More generally, the fundamental anticommutation relation for fermions, $\{\hat{c}_p(t), \hat{c}_q^\dagger(t)\} = \delta_{pq}$, leads directly to a profound sum rule for the Green's functions:

$$G^>_{pq}(t,t) - G^<_{pq}(t,t) = -i \delta_{pq}$$

This shows that $G^<$ and $G^>$ are not independent; they are two sides of the same coin, perfectly balancing the occupied and unoccupied parts of the quantum world.

This balancing act becomes even clearer in the energy domain. In a system at thermal equilibrium, there is a continuous, dynamic exchange of energy with the environment. Particles are excited into higher states, and they de-excite into lower ones. This principle of **detailed balance** is encoded in the **Kubo-Martin-Schwinger (KMS) condition**. For fermions, it leads to a simple, powerful relation between the Fourier transforms of the greater and lesser functions:

$$G^>(\omega) = -e^{\beta \hbar \omega}\, G^<(\omega)$$

At a given temperature, the ratio of creating a hole to creating a particle at energy $\omega$ is fixed by the Boltzmann factor $e^{\beta \hbar \omega}$. It's much easier to find a hole (create an excitation) at high energy than it is to find a particle there.

Ultimately, all these different Green's functions—lesser ($G^<$), greater ($G^>$), retarded ($G^R$), advanced ($G^A$), and the so-called Keldysh function ($G^K = G^< + G^>$)—are just different projections of a single, unified mathematical object. They can be expressed as linear combinations of each other, revealing the deep internal consistency and elegance of the theory.

The lesser Green's function, then, is our key. It is the thread that ties the quantum dynamics of phase evolution to the statistical mechanics of occupation. It provides the script for the movie of the quantum world, revealing not just the static population of states, but the very flow and rhythm of quantum life.

## Applications and Interdisciplinary Connections

We have spent some time getting to know a rather curious mathematical object, the lesser Green's function, $G^<$. You might be thinking of it as a meticulous, if somewhat abstract, census of the quantum world—a list telling us precisely which electronic states are occupied and which are empty. A fascinating piece of bookkeeping, to be sure. But does it *do* anything? Does this knowledge help us build, or predict, or understand the world around us?

The answer is a spectacular testament to the power of theoretical physics. This function, $G^<$, is no dusty museum piece. It is a master key, a versatile tool that unlocks a profound understanding of how the quantum realm behaves, responds, and evolves. It is our guide on a journey from simply observing the quantum world to watching it in action. So, let's take this key and open a few doors. We will find that the same concept connects phenomena as diverse as the light from a television screen, the current in a microchip, the storage of data in magnetic memory, and the fundamental crackle of electronic noise.

### Peeking into the Electron Sea: Spectroscopy

Perhaps the most direct and intuitive application of the lesser Green's function is in answering a very simple question: "What's in there?" Imagine you have a new material, a crystal, or a metallic surface. How can you map out its electronic landscape? The most powerful technique is called **photoemission spectroscopy**. The idea is wonderfully simple: you shine a beam of light (high-energy photons) onto your material. When a photon strikes an electron, it can give the electron enough energy to be kicked completely out of the material. We can then catch this escaping electron and measure its kinetic energy with great precision.

By knowing the energy of the photon we sent in and the energy of the electron that came out, we can work backward to figure out the energy the electron had when it was inside the material. If we repeat this for billions of electrons, we can build a map of the material's occupied energy levels. The brightness of the signal at a given energy is directly proportional to how many electrons were sitting at that energy level, ready to be kicked out.

And what theoretical quantity tells us exactly that—the population of electrons at each energy? The lesser Green's function! Specifically, the photoemission intensity $I(\omega)$ measured in an experiment is directly proportional to the imaginary part of the local lesser Green's function, which we know is related to the density of available states $\rho(\omega)$ multiplied by the probability they are occupied, $f(\omega)$. So, when an experimentalist measures a photoemission spectrum, they are, in a very real sense, measuring a quantity that theorists can calculate directly from $G^<$. It provides a beautiful and direct bridge between an intricate quantum field theory calculation and a real-world experimental measurement.

### The Flow of Discovery: Quantum Transport and Nanoelectronics

Observing a system in equilibrium is one thing, but the real excitement often begins when we push it. What happens if we apply a voltage across a tiny quantum device? This is the central question of **nanoelectronics**, the field that seeks to build electronic components out of single molecules or nanoscale structures.

Imagine a single quantum dot—a tiny "artificial atom"—sandwiched between two large metal contacts, which we'll call the left and right leads. This is the blueprint for a single-molecule transistor. If we apply a voltage, we create a difference in the chemical potential (the "electron sea level") between the two leads, say $\mu_L > \mu_R$. Electrons in the left lead are now "uphill" from the right lead and will try to flow through the quantum dot to get there.

This is a system far from equilibrium. The lesser Green's function is the perfect tool for the job. The occupation of the dot, encoded in its $G^<$, is now determined by a tug-of-war. It's being fed electrons from the left lead (with its energy distribution $f_L(\omega)$) and also from the right lead (with its distribution $f_R(\omega)$). The famous Keldysh equation, $G^< = G^R\, \Sigma^<\, G^A$, becomes the physicist's engine for calculation. The term $\Sigma^< = i[\Gamma_L f_L(\omega) + \Gamma_R f_R(\omega)]$ acts as the source, describing the injection of electrons from both leads, while the retarded and advanced functions ($G^R$, $G^A$) describe how the dot accommodates these electrons.

By solving this, we can calculate precisely how the occupation of the dot changes as we tune the voltage or the dot's energy level. Even more importantly, we can calculate the net flow of charge—the electrical current. This leads to one of the cornerstone results of mesoscopic physics, the **Meir-Wingreen formula**, which expresses the current directly in terms of the dot's Green's functions and the lead properties. This is not just a theoretical triumph; it is the working equation used to design and understand transport through quantum dots, molecules, and other nanoscale junctions.

This transport story has a fascinating sequel: **spintronics**. Electrons, as you know, have an intrinsic property called spin. What if one of our leads is a ferromagnet, acting as a gatekeeper that only allows, say, spin-up electrons to pass through easily? We can then use a voltage to inject a spin-polarized current. The lesser Green's function handles this with beautiful ease; we simply use a separate function for each spin, $G^<_\uparrow$ and $G^<_\downarrow$. By calculating the occupation for each spin on the dot, $\langle n_\uparrow \rangle$ and $\langle n_\downarrow \rangle$, we can determine the net **spin accumulation** on the dot. This ability to control and detect spin currents is the foundation of spintronic devices like the giant magnetoresistance (GMR) read heads in hard drives and future MRAM memory technologies.

### The Quantum Stopwatch: Dynamics and Transients

So far, we have looked at static pictures or steady flows. But the world is full of change. What happens in the moments after we flick a switch? The lesser Green's function, in its full time-dependent glory, becomes our quantum stopwatch.

Consider an empty quantum dot, isolated from the world. At time $t=0$, we suddenly connect it to an electron reservoir. How does the dot fill up? Will electrons rush in instantly? The equal-time lesser Green's function, $G^<(t,t)$, gives us the occupation $\langle n(t) \rangle$ at any instant. The calculation reveals a wonderfully intuitive result: the occupation grows exponentially towards its steady-state value, $\langle n(t) \rangle \propto (1 - e^{-\Gamma t})$. This is exactly analogous to the way a capacitor charges in a simple RC circuit! It is a beautiful example of the unity of physics, where the same mathematical behavior describes the charging of a macroscopic capacitor and the filling of a single quantum level.

We can ask even more subtle questions. What if the dot is already happily connected to its reservoir, and we suddenly change its energy level—like tuning a guitar string while it's vibrating? The system has to readjust. For a short time, it "remembers" its old state. This memory is encoded in the transient parts of the full two-time Green's function, $G^<(t,t')$. These transients fade away as the system forgets its past and settles into a new rhythm, a new steady state determined by its new energy level. This ability to track quantum memory is essential for understanding ultrafast processes in materials, such as those triggered by laser pulses.

This "quantum quench" idea is not limited to a single dot. Imagine a perfect, one-dimensional wire of atoms, representing a pristine electrical conductor. Now, at $t=0$, we suddenly switch on a single impurity—a "pothole" at one location. The river of electrons flowing through the wire is disrupted. It scatters, reflects, and rearranges itself into a new, complex, and stationary flow pattern. The lesser Green's function allows us to calculate the properties of this new non-equilibrium steady state, such as the final electron density right at the impurity site, long after the initial disturbance has passed.

### The Social Life of Electrons: Interactions and Noise

For simplicity, we have often imagined electrons as polite, independent particles that move without acknowledging each other. This is, of course, a fairy tale. Electrons are charged, and they repel each other. This Coulomb interaction is the source of nearly all the complexity—and richness—of chemistry and materials science.

The Green's function framework provides a systematic way to deal with this "social" behavior. In a first, simple approximation known as the Hartree-Fock method, we can say that an electron of, say, spin-up feels an average repulsion from all the spin-down electrons. This repulsion effectively raises its energy level. To calculate this energy shift, a quantity called the **self-energy**, we need to know the average occupation of the spin-down electrons. And where do we get that from? From the lesser Green's function of the non-interacting system!

This reveals a beautiful, bootstrap-like structure that is central to many-body physics. We use the simple, non-interacting $G^<_0$ to calculate the first correction due to interactions. We can then use this corrected system to build an even better Green's function, and so on, iteratively building up the effects of these complex correlations.

Finally, let us listen to the sound of electricity. An electrical current is not a smooth, continuous fluid. It is composed of a stream of discrete electrons. This granularity means the current is not perfectly constant; it fluctuates. This is called **shot noise**, the quantum equivalent of the sound of rain on a roof. These fluctuations are not just bothersome static; they carry profound information.

It turns out that while the average current is given by a simple integral over $G^<$, the current-current correlation function—the noise—is given by a more complex combination of Green's functions. Using the full Keldysh formalism, we can compute this noise. For a quantum conductor, this leads to a remarkable prediction: the shot noise is largest not when the channel is fully open or fully closed, but when its transmission probability is exactly one-half. The noise peaks when the system is most uncertain about whether to let an electron pass or to reflect it. It is a direct manifestation of the probabilistic nature of quantum mechanics, made audible in the static of a quantum circuit.

From seeing the filled states in a metal, to watching a single-electron transistor in action, to clocking the system's response to a sudden jolt, to taming the complex dance of interacting electrons, and even to listening to the quantum crackle of current—the lesser Green's function has been our constant companion. It is a testament to the remarkable power of a single theoretical idea to unify a vast and dazzling landscape of physical phenomena, reminding us that in the intricate machinery of the quantum world, there is a deep and profound order.
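The equilibrium fluctuation-dissipation relation is easy to check numerically. The sketch below is my own illustration, not code from the article: it assumes a Lorentzian spectral function of width $\Gamma$ for the broadened level (a standard choice, but an assumption here) and arbitrary demonstration values for $\epsilon_0$, $\Gamma$, and $k_BT$. It builds $G^<(\omega) = i f(\omega) A(\omega)$ and integrates to recover the sum rule and the occupation $\langle n \rangle = \int \frac{d\omega}{2\pi} A(\omega) f(\omega)$:

```python
import numpy as np

# Hedged sketch: a single level eps0 broadened by Gamma, in equilibrium
# (chemical potential mu = 0). Parameter values are illustrative only.
eps0, Gamma, kT = 0.5, 0.1, 0.05

w = np.linspace(-20.0, 20.0, 400001)   # energy grid
dw = w[1] - w[0]

# Lorentzian spectral function: the "available rooms" at each energy
A = Gamma / ((w - eps0)**2 + (Gamma / 2)**2)

# Fermi-Dirac occupation, written via tanh for numerical stability
f = 0.5 * (1.0 - np.tanh(w / (2.0 * kT)))

# Fluctuation-dissipation theorem: G<(w) = i f(w) A(w)
G_lesser = 1j * f * A

# Sum rule: integrating A(w)/2pi over all energies counts one state
norm = A.sum() * dw / (2.0 * np.pi)

# Occupation: available states weighted by their occupation probability,
# equivalently <n> = integral of -i G<(w) dw / 2pi
n = (-1j * G_lesser).real.sum() * dw / (2.0 * np.pi)

print(f"sum rule: {norm:.4f}")   # close to 1
print(f"<n>     : {n:.4f}")      # small, since eps0 sits well above mu
```

Because $\epsilon_0$ sits roughly $10\,k_BT$ above the chemical potential in this toy setup, the occupation comes almost entirely from the Lorentzian tail of $A(\omega)$ that dips below $\mu$, not from thermal excitation: broadening and temperature fill states in different ways.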
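The non-equilibrium machinery discussed in the Applications section can also be sketched in a few lines. The following is again my own hedged construction, not the article's code: it assumes the wide-band limit for a single non-interacting level, units $\hbar = e = 1$, and illustrative parameter values. It solves the Keldysh equation $G^< = G^R \Sigma^< G^A$ for a biased dot, extracts the steady-state occupation, and evaluates the current, which for this simple model reduces to a Landauer-style integral over the transmission $T(\omega) = \Gamma_L \Gamma_R |G^R(\omega)|^2$ (a known special case of the Meir-Wingreen formula):

```python
import numpy as np

# Hedged sketch of a biased quantum dot (wide-band limit assumed;
# hbar = e = 1; all parameter values are illustrative).
eps0 = 0.0
Gamma_L, Gamma_R = 0.05, 0.05
Gamma = Gamma_L + Gamma_R
kT = 0.02
mu_L, mu_R = 0.15, -0.15               # bias window straddling eps0

w = np.linspace(-10.0, 10.0, 200001)
dw = w[1] - w[0]
f_L = 0.5 * (1.0 - np.tanh((w - mu_L) / (2.0 * kT)))   # left lead
f_R = 0.5 * (1.0 - np.tanh((w - mu_R) / (2.0 * kT)))   # right lead

# Retarded/advanced Green's functions of the broadened level
G_R = 1.0 / (w - eps0 + 1j * Gamma / 2.0)
G_A = np.conj(G_R)

# Keldysh equation: Sigma< injects electrons from both leads
Sigma_lesser = 1j * (Gamma_L * f_L + Gamma_R * f_R)
G_lesser = G_R * Sigma_lesser * G_A

# Steady-state dot occupation: n = integral of -i G<(w) dw / 2pi
n = G_lesser.imag.sum() * dw / (2.0 * np.pi)

# Current through the dot, Landauer-style special case:
# I = integral dw/2pi T(w) [f_L(w) - f_R(w)]
T_w = Gamma_L * Gamma_R * np.abs(G_R)**2
I = (T_w * (f_L - f_R)).sum() * dw / (2.0 * np.pi)

# Transient filling after connecting an empty dot at t = 0: RC-like charging
t = np.linspace(0.0, 100.0, 1001)
n_t = n * (1.0 - np.exp(-Gamma * t))

print(f"occupation n = {n:.3f}")   # approx 0.5, by symmetry of this setup
print(f"current    I = {I:.4f}")   # positive: net flow from left to right
```

With the level centered in a symmetric bias window and equal couplings, the tug-of-war between the leads ends in a draw and the dot is half filled; detuning `eps0` or making `Gamma_L != Gamma_R` tips the occupation toward whichever lead wins.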