
Second-order coherence

Key Takeaways
  • The second-order coherence function at zero delay, $g^{(2)}(0)$, classifies light as bunched ($>1$), coherent ($=1$), or antibunched ($<1$), revealing the source's nature.
  • Photon antibunching, where $g^{(2)}(0) < 1$, is a definitive non-classical effect that serves as the gold-standard proof of a single-photon source.
  • Chaotic thermal light from sources like stars exhibits photon bunching ($g^{(2)}(0) = 2$), which forms the basis for intensity interferometry to measure astronomical objects.
  • Measuring the statistical signature of light provides a powerful tool for probing physical interactions, from nonlinear optical effects to complex nuclear decay processes.

Introduction

Light is more than just brightness and color; it possesses a hidden statistical "texture" that tells the story of its creation. While the light from a steady laser and a chaotic lightbulb may appear identical to the naked eye, their underlying photon streams behave in drastically different ways. This article explores the concept of second-order coherence, a powerful tool in physics that allows us to quantify these statistical differences and unlock profound insights into the nature of light and its source. We will bridge the gap between the classical and quantum worlds by understanding how a simple measurement of photon arrival times can distinguish between waves and particles.

This article will guide you through the fundamental principles and far-reaching applications of this concept. In the first section, "Principles and Mechanisms," we will define the second-order coherence function, $g^{(2)}(\tau)$, and discover how it classifies light into three families: the bunched light of thermal sources, the random light of lasers, and the uniquely quantum antibunched light of single emitters. Following this, the "Applications and Interdisciplinary Connections" section will demonstrate how these principles are applied in the real world, from measuring the size of distant stars to verifying the building blocks of quantum computers and probing the heart of the atomic nucleus.

Let's begin by examining the core ideas behind this powerful concept and learning how to read the statistical signature of light.

Principles and Mechanisms

Imagine you're at a party, standing by the door and watching people arrive. Do they come in a steady, random stream, one by one, like they're just showing up whenever? Or do they arrive in tight-knit groups and clusters? Or perhaps there's a strict doorman who only lets one person in every few minutes, forcing them to arrive spaced out. By simply observing the timing of arrivals, you could learn a lot about the social dynamics of the guests.

In the world of physics, we can do the same with light. Instead of people, we watch for the arrival of photons at a detector. The "social life" of photons—whether they tend to clump together, arrive randomly, or keep their distance—tells us something incredibly deep about the nature of the light source that produced them. The tool we use for this is the normalized second-order correlation function, a fancy name for a simple idea. We denote it by $g^{(2)}(\tau)$. It answers the question: "Given that I've just detected a photon, what is the relative probability of detecting another one a time delay $\tau$ later?"
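In practice, $g^{(2)}(\tau)$ is estimated by time-tagging photons on two detectors, histogramming the delays between clicks, and normalizing by what two uncorrelated streams would give. The sketch below is a minimal estimator with invented toy data (not any instrument's API); two independent Poisson streams come out flat at $g^{(2)} \approx 1$:

```python
import numpy as np

def g2_estimate(t_a, t_b, t_max, bin_width, duration):
    """Histogram click delays between detectors A and B, normalized by
    the coincidence rate that two uncorrelated streams would produce."""
    counts = np.zeros(int(t_max / bin_width))
    for ta in t_a:
        j = np.searchsorted(t_b, ta)              # first B click at/after ta
        while j < len(t_b) and t_b[j] - ta < t_max:
            counts[int((t_b[j] - ta) / bin_width)] += 1
            j += 1
    # uncorrelated streams: rate_A * rate_B * bin_width * duration per bin
    return counts / (len(t_a) * len(t_b) * bin_width / duration)

rng = np.random.default_rng(0)
T = 1000.0                                        # acquisition time (s), toy
t_a = np.sort(rng.uniform(0.0, T, 50_000))        # Poissonian click times
t_b = np.sort(rng.uniform(0.0, T, 50_000))
print(g2_estimate(t_a, t_b, t_max=0.01, bin_width=0.002, duration=T))
# -> roughly [1. 1. 1. 1. 1.]: random (coherent-like) light is flat at 1
```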

The most revealing moment is the instant of the first detection, at zero time delay, $\tau = 0$. The value of $g^{(2)}(0)$ classifies light into three fundamental families:

  • Bunched Light ($g^{(2)}(0) > 1$): Photons are gregarious; they like to arrive in groups. Detecting one makes it more likely that you'll detect another one right away.
  • Coherent or Random Light ($g^{(2)}(0) = 1$): Photons are indifferent. The arrival of one has no bearing on when the next will arrive. They follow a random, Poissonian pattern, like raindrops in a steady shower.
  • Antibunched Light ($g^{(2)}(0) < 1$): Photons are shy. Detecting one makes it less likely that another will arrive immediately. They are lone wolves.

This simple number, $g^{(2)}(0)$, is a powerful fingerprint, a key that unlocks the story of how light is born.

The Chaos of the Crowd: Thermal Light and Photon Bunching

Let's start with the most common type of light in the universe: the chaotic glow of a hot object, like a star or the filament in an old-fashioned light bulb. This is thermal light. It's the product of a colossal number of independent atoms, each emitting a little light wave at a random time and with a random phase. These countless wavelets add up, interfering with each other—sometimes constructively, creating a bright flash, and sometimes destructively, creating a dim spot. The result is a light field whose intensity fluctuates wildly and randomly in time and space.

Now, imagine your tiny photodetector sitting in this field. If it clicks, signaling the arrival of a photon, it's most likely because it was just hit by a statistical "hotspot"—a fleeting moment of high intensity. But if it's in a hotspot, it stands to reason that another photon is probably close behind. This is the intuitive origin of photon bunching. The photons aren't actually "attracted" to each other; they just tend to be born in bursts from the random, chaotic interference of their parent waves.

When physicists first did the math for this, they found a beautiful and universal result. For a single, well-defined mode of thermal light, the probability of detecting two photons at the same instant is exactly twice what you'd expect from random chance. In our language, this means $g^{(2)}(0) = 2$. This value is a hallmark of single-mode thermal chaos, a signature as clear as a fingerprint.
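The factor of two follows directly from the intensity statistics: single-mode thermal intensity is exponentially distributed, for which $\langle I^2 \rangle = 2\langle I \rangle^2$. A minimal sampling check:

```python
import numpy as np

rng = np.random.default_rng(1)
I = rng.exponential(scale=1.0, size=1_000_000)  # single-mode thermal intensities
print(np.mean(I**2) / np.mean(I)**2)            # g2(0) = <I^2>/<I>^2 -> 2.0
```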

What happens if we look for the second photon a little later, at a time $\tau > 0$? That initial bright spot will have flickered and faded, its internal correlations lost. As $\tau$ increases, the memory of the first photon is lost, and the probability of finding a second one returns to being random. So, the function $g^{(2)}(\tau)$ starts at a "bunching peak" of 2 at $\tau = 0$ and decays down to 1 for longer times. The characteristic time it takes to decay is the light's coherence time, $\tau_c$. This is no coincidence. For chaotic light, there is a direct connection between the intensity fluctuations and the phase stability of the wave, a relationship known as the Siegert relation: $g^{(2)}(\tau) = 1 + |g^{(1)}(\tau)|^2$. The term $g^{(1)}(\tau)$ measures the phase coherence, and its decay time is the coherence time. The shape and width of the bunching peak, therefore, tell us directly about the spectral properties of the source.
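The Siegert relation is easy to test on simulated chaotic light. The sketch below models the field as a complex Gaussian AR(1) process (an assumed toy with exponentially decaying $g^{(1)}$) and compares $g^{(2)}(\tau)$ against $1 + |g^{(1)}(\tau)|^2$ at a few lags:

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(2)
n, a = 1_000_000, 0.95   # samples; per-step field correlation (toy value)
xi = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)
# AR(1) complex Gaussian field: chaotic light with g1(lag) = a**lag
E = lfilter([np.sqrt(1 - a**2)], [1.0, -a], xi)
I = np.abs(E) ** 2

for lag in (0, 5, 20, 60):
    sl = slice(None, -lag or None)      # pair samples (t, t + lag)
    g1 = abs(np.mean(np.conj(E[sl]) * E[lag:])) / np.mean(I)
    g2 = np.mean(I[sl] * I[lag:]) / np.mean(I) ** 2
    print(lag, round(g2, 3), round(1 + g1**2, 3))  # columns should agree
```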

This raises a wonderful puzzle. If the light from a lamp is so fiercely bunched and fluctuating, why does it look so perfectly smooth and steady to our eyes? The answer lies in the sheer scale of the chaos. A light bulb doesn't produce one clean mode of light; it spews out an astronomical number of independent modes—different frequencies, different directions, different polarizations. Our eye, or any large detector, collects all of them. If you average over $M$ independent thermal modes, the fluctuations are suppressed, and the resulting second-order coherence becomes $g^{(2)}(0) = 1 + \frac{1}{M}$. For the sun, $M$ is effectively infinite. The violent bunching in each individual mode is washed out in the massive crowd, and the resulting light stream appears perfectly random, with $g^{(2)}(0) = 1$. The chaos is hidden by its own immensity.
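A quick numerical check of the $1 + 1/M$ law, using the fact that the sum of $M$ independent exponential mode intensities is a Gamma($M$) variate:

```python
import numpy as np

rng = np.random.default_rng(3)
for M in (1, 2, 10, 100):
    # total intensity of M independent single-mode thermal fields
    I = rng.gamma(shape=M, scale=1.0, size=1_000_000)
    print(M, round(np.mean(I**2) / np.mean(I)**2, 3))   # -> 1 + 1/M
```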

The Lone Emitter: Photon Antibunching and the Quantum Signature

Classical wave theory, with its interfering wavelets, can explain bunching and randomness perfectly well. It can describe any situation where $g^{(2)}(0) \ge 1$. But there, it hits a wall. No classical wave, no matter how you shape it, can produce intensity fluctuations that are less than random. A hotspot can't be "less than a hotspot." So, if we ever measure a light source with $g^{(2)}(0) < 1$, we have witnessed something that is fundamentally impossible in the classical world. We have witnessed a quantum effect.

This phenomenon is photon antibunching, and it is the definitive proof of the particle nature of light. The quintessential source of antibunched light is a single quantum emitter, for instance, a single atom held in a trap. Let's walk through the process. We shine a weak laser on the atom, giving it energy.

  1. The atom absorbs a quantum of energy and jumps to an excited state.
  2. After a short, unpredictable time, it spontaneously relaxes, spitting out a single photon and falling back to its ground state.
  3. Our detector clicks. At that very instant, we know the atom is in the ground state. It has given up its energy quantum.
  4. For the atom to emit a second photon, it must first absorb another quantum of energy from the laser and get re-excited. This process is not instantaneous; it takes time.

Therefore, it is physically impossible for the atom to emit a second photon at the exact same time as the first. The probability of detecting a second photon at a time delay of $\tau = 0$ is zero. For a perfect single emitter, $g^{(2)}(0) = 0$.

The photons are forced to come out one by one, separated in time. There is a built-in "refractory period" after each emission. This is the ultimate signature of a single-photon source. It's not a wave breaking up; it's a single entity doing one thing at a time. The full function $g^{(2)}(\tau)$ beautifully tells this story: it starts at 0, rises as the atom gets a chance to be re-excited by the laser, and eventually levels off at 1 for long time delays, when the atom has completely forgotten about the first emission event.
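This cycle of re-excitation and emission can be caricatured as a renewal process. The sketch below uses assumed pump and decay rates $R$ and $\Gamma$ in a rate-equation toy (ignoring coherent Rabi dynamics), draws photon times, and histograms pair delays; the estimated $g^{(2)}(\tau)$ is suppressed near zero delay and recovers toward 1:

```python
import numpy as np

rng = np.random.default_rng(4)
R, Gamma = 1.0, 5.0        # assumed pump (re-excitation) and decay rates
n = 100_000
# each emission: wait to be re-excited, then wait to spontaneously decay
t = np.cumsum(rng.exponential(1 / R, n) + rng.exponential(1 / Gamma, n))

t_max, nbins = 8.0, 80
bin_width = t_max / nbins
counts = np.zeros(nbins)
for i in range(n):
    j = i + 1
    while j < n and t[j] - t[i] < t_max:
        counts[int((t[j] - t[i]) / bin_width)] += 1
        j += 1
rate = n / t[-1]
g2 = counts / (n * rate * bin_width)   # normalize to an uncorrelated stream
print(np.round(g2[:8], 2))    # suppressed at short delays: antibunching
print(np.round(g2[-8:], 2))   # -> 1 once the emitter has "forgotten"
```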

A Spectrum of Statistics: From Super-Bunching to Coherent Light

The universe of light is not just composed of these three pure cases. There exists a rich continuum of statistical behaviors. What happens when we mix light from different kinds of sources? Suppose we take the chaotic, bunched light from a thermal source ($g^{(2)}(0) = 2$) and combine it, as an independent beam whose intensity simply adds, with the steady, random light from an ideal laser ($g^{(2)}(0) = 1$). As you might intuitively guess, the resulting field will have statistics somewhere in between. If the two sources contribute equal average intensity, the mixture exhibits a reduced amount of bunching, with $g^{(2)}(0) = 1.25$. By varying the mixing proportion, one can dial in any degree of bunching between 1 and 2, creating custom-tailored light fields.

Is thermal light the most "bunched" that light can be? Surprisingly, no. The value $g^{(2)}(0) = 2$ is a direct consequence of the specific exponential statistics of thermal light intensity. It is possible to conceive of classical light sources with even wilder fluctuations. For instance, a hypothetical classical field whose amplitude (not intensity) follows a Gaussian distribution would exhibit what can be called "super-bunching," with an even more dramatic value of $g^{(2)}(0) = 3$. This reminds us that nature can be more extreme than our everyday examples suggest.
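Both numbers are simple moment identities and can be checked by sampling. The sketch below assumes the thermal-plus-laser mixing is incoherent (intensities add with no interference term, which is what yields 1.25) and models the super-bunched source as $I = A^2$ with Gaussian amplitude $A$:

```python
import numpy as np

rng = np.random.default_rng(5)
N = 2_000_000

# thermal + laser of equal mean intensity, intensities adding independently
I_mix = rng.exponential(1.0, N) + 1.0
print(np.mean(I_mix**2) / np.mean(I_mix)**2)   # -> 1.25

# hypothetical field with Gaussian-distributed amplitude: I = A^2
A = rng.normal(0.0, 1.0, N)
print(np.mean(A**4) / np.mean(A**2)**2)        # <A^4>/<A^2>^2 -> 3
```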

Even more fascinating are the quantum states that bridge the gap between the quantum and classical worlds. Consider the exotic state created by taking a single, antibunched photon ($g^{(2)}(0) = 0$) and superimposing it on a strong, coherent laser field. This "displaced single-photon state" is a truly quantum object. The remarkable thing is that by simply tuning the strength of the background laser field (the displacement $\alpha$), we can continuously morph the light's statistics. When the displacement is zero, we have a pure single photon with $g^{(2)}(0) = 0$. As we make the laser field overwhelmingly strong, the quantum "hiccup" of the single photon is washed out, and the statistics smoothly approach those of a perfect laser, with $g^{(2)}(0) \to 1$.
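This interpolation can be computed in a truncated Fock space. The sketch below builds the displacement operator numerically and evaluates $g^{(2)}(0) = \langle a^\dagger a^\dagger a a \rangle / \langle a^\dagger a \rangle^2$ for $D(\alpha)|1\rangle$; the closed form it is checked against, $(4|\alpha|^2 + |\alpha|^4)/(1 + |\alpha|^2)^2$, is our own calculation, offered as a cross-check rather than a quoted result:

```python
import numpy as np
from scipy.linalg import expm

def g2_displaced_single_photon(alpha, dim=120):
    """g2(0) of D(alpha)|1>; dim truncates Fock space (need |alpha|^2 << dim)."""
    a = np.diag(np.sqrt(np.arange(1, dim)), k=1)        # annihilation operator
    D = expm(alpha * a.conj().T - np.conj(alpha) * a)   # displacement operator
    psi = D @ np.eye(dim)[:, 1]                         # displaced |1> state
    n = np.real(psi.conj() @ (a.conj().T @ a) @ psi)    # mean photon number
    a2psi = (a @ a) @ psi
    return np.real(a2psi.conj() @ a2psi) / n**2         # <a+ a+ a a>/<n>^2

for alpha in (0.0, 0.5, 1.0, 2.0, 4.0):
    x = alpha**2
    print(alpha, round(g2_displaced_single_photon(alpha), 4),
          round((4 * x + x**2) / (1 + x) ** 2, 4))      # numeric vs closed form
```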

So we see that this single number, $g^{(2)}(0)$, takes us on a grand tour of the nature of light. From the perfect solitude of an antibunched photon at $g^{(2)}(0) = 0$, through the orderly randomness of a laser at $g^{(2)}(0) = 1$, to the chaotic clamor of thermal light at $g^{(2)}(0) = 2$, and even beyond. It's a simple measurement—just counting photons and looking at their timing—but it reveals whether the source is a lonely quantum atom, a disciplined army of photons in a laser, or the riotous mob from a star.

Applications and Interdisciplinary Connections

In the previous section, we explored the fascinating and sometimes counter-intuitive principles of second-order coherence. We learned that the value of $g^{(2)}(\tau)$, the normalized intensity correlation function, is far more than a mere number. It's a fingerprint, a unique signature that reveals the very soul of a light source. For the perfectly ordered stream of photons from an ideal laser, we found $g^{(2)}(\tau) = 1$. For the chaotic, haphazard jostle of photons from a thermal source like a light bulb or a star, we saw them "bunch" together, leading to $g^{(2)}(0) = 2$. And in the most quantum of cases, a single atom emitting one photon at a time, we discovered "antibunching," where $g^{(2)}(0) = 0$.

Now, we move from principle to practice. If this concept is truly as fundamental as we claim, it must do more than fascinate us in thought experiments; it must allow us to do something. It must connect to the real world, solve problems, and open doors to new frontiers of science and technology. And indeed, it does. In this section, we will embark on a journey to see how measuring the statistical "texture" of light has revolutionized fields from the cosmic to the quantum. We will see how this single idea provides a unified language to describe phenomena in astronomy, quantum computing, nuclear physics, and even at the edge of a black hole.

Peering into the Cosmic Abyss: The Astronomical Revolution

For centuries, astronomers have been limited by the resolving power of their telescopes. A distant star, no matter how large, appears as a mere point of light. So, how could one possibly measure its size? In the 1950s, Robert Hanbury Brown and Richard Twiss (HBT) proposed a radically new approach. Instead of trying to form a better image (a task limited by first-order coherence and atmospheric turbulence), they suggested measuring the correlation in the intensity fluctuations between two separate, widely spaced detectors. This technique, known as intensity interferometry, is a direct application of second-order coherence.

Imagine a distant binary star system. To a conventional telescope, it might be a single, unresolved blur. But if we point two detectors at it, separated by a distance $d$, and measure how the intensity at one detector correlates with the intensity at the other, a hidden pattern emerges. The thermal light from each star is chaotic, causing its intensity to fluctuate. These fluctuations, arriving at the two detectors, create a correlation pattern that depends on the detector separation. For a binary system, this correlation will oscillate, with the peaks and valleys of the oscillation revealing the angular separation of the two stars and even their relative brightness.

This is a profound idea. We are not "seeing" the stars directly. Instead, we are listening to the statistical "chatter" of the photons arriving on Earth. By analyzing the structure of this chatter, we can reconstruct the structure of the source. The van Cittert-Zernike theorem, which we know relates the spatial coherence of the field to the source's structure, has a powerful cousin in the world of intensity correlations: the measured second-order coherence pattern is directly related to the spatial profile of the light source. What's more, intensity correlations survive the atmospheric and instrumental phase disturbances that scramble the field itself, and this robustness is at the heart of why intensity interferometry can succeed where traditional amplitude interferometry fails. A simple laboratory experiment using a thermal source and a double-slit setup can reproduce this phenomenon beautifully, demonstrating how bunches of photons create correlated intensity patterns in the far field. By measuring these correlations, we can work backward to deduce the properties of the source, effectively using quantum statistics as a cosmic ruler.
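As a sketch of the idea (with invented numbers for wavelength, angular separation, and brightness ratio), the spatial analogue of the Siegert relation, $g^{(2)}(d) = 1 + |\gamma(d)|^2$, applied to two incoherent point sources predicts an oscillating correlation versus baseline whose period and contrast encode the binary's geometry:

```python
import numpy as np

lam = 500e-9        # observing wavelength (m) - assumed
dtheta = 1e-8       # angular separation of the binary (rad) - assumed
I1, I2 = 1.0, 0.4   # relative brightnesses of the two stars - assumed

d = np.linspace(0.0, 120.0, 1201)   # detector baseline (m)
# van Cittert-Zernike: degree of coherence of two incoherent point sources
gamma = (I1 + I2 * np.exp(2j * np.pi * dtheta * d / lam)) / (I1 + I2)
g2 = 1 + np.abs(gamma) ** 2         # spatial Siegert relation, chaotic light

print(lam / dtheta)                 # oscillation period in baseline: 50 m
print(g2.max(), round(g2.min(), 3)) # contrast of the dips encodes I2/I1
```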

The Quantum Signature: Bunching, Antibunching, and the Nature of Light

The HBT experiment revealed that photons from stars tend to arrive in groups, a signature of their thermal origin. This "photon bunching" is a hallmark of all chaotic light sources, and it has tangible consequences. When chaotic light, like that from a thermal source, illuminates a rough surface, it creates a random pattern of bright and dark spots called a speckle pattern. If you were to measure the intensity correlation at a single point, you would find $g^{(2)}(0) = 2$, a direct consequence of this bunching tendency. In stark contrast, if you shine an ideal laser on the same surface, the resulting speckle pattern is statistically "smoother." The laser's Poissonian photon statistics mean there is no bunching, and $g^{(2)}(\tau) = 1$ everywhere. The difference between these two sources is not just an abstract number; it's written into the very texture of the light they produce.

But the story of second-order coherence holds an even more dramatic character: the single-photon emitter. Consider a single atom, continuously excited by a laser. It absorbs a photon, jumps to an excited state, and then, after a short time, decays by emitting a fluorescence photon. If we place a detector to capture this photon, what is the probability of detecting a second photon immediately after the first? The answer must be zero. The atom has just given up its energy to emit the first photon; it is back in the ground state. Before it can emit another, it must first be re-excited. This creates a "quiet time" after each emission event.

This phenomenon, known as photon antibunching, is the quintessential signature of a single quantum emitter. When measured, it yields the striking result $g^{(2)}(0) = 0$. It's a direct statement that the source emits photons one by one, not in random bunches or a steady stream. This is not a subtle effect; it is an absolute prohibition rooted in the quantum nature of the atom. Today, measuring $g^{(2)}(0)$ and finding a value very close to zero is the gold standard for verifying the creation of a single-photon source—an essential building block for quantum computing, quantum cryptography, and secure communications.

Probing Physical Processes: From Nonlinear Optics to the Atomic Nucleus

So far, we have used $g^{(2)}$ as a tool to characterize the source of light. But we can turn this idea on its head and use light with known statistics to probe the physical processes it passes through.

Let's venture into the world of nonlinear optics. Imagine taking a beam of chaotic thermal light, with its characteristic bunching ($g^{(2)}_{in}(0) = 2$), and focusing it into a special crystal. This crystal has the property that it can generate light at three times the frequency of the input light, a process called Third-Harmonic Generation (THG). The intensity of this new light, $I_{3H}$, is proportional to the cube of the input intensity, $I_f$. Now what happens to the photon statistics? Since the input intensity fluctuates, the output intensity will fluctuate even more wildly—a small peak in the input thermal light becomes a massive peak in the third-harmonic output. If we calculate the second-order coherence of this new light, we find a staggeringly large value. For a thermal input, the theoretical value is $g^{(2)}_{3H}(0) = 20$! The light has become "super-bunched." By measuring the change in $g^{(2)}$, we learn about the nonlinear nature of the interaction in the crystal.
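The factor of 20 is a moment identity for exponential intensities: $g^{(2)}_{3H}(0) = \langle I_f^6 \rangle / \langle I_f^3 \rangle^2 = 6!/(3!)^2 = 20$. A quick sampling check, assuming an ideal $I_{3H} \propto I_f^3$ response:

```python
import numpy as np

rng = np.random.default_rng(6)
I_f = rng.exponential(1.0, 5_000_000)        # thermal input, g2(0) = 2
I_3h = I_f ** 3                              # ideal third-harmonic response
print(np.mean(I_3h**2) / np.mean(I_3h)**2)   # -> 6!/(3!)^2 = 20
```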

This tool is not limited to optics labs. Consider the Mössbauer effect in nuclear physics. Certain radioactive nuclei embedded in a crystal can emit gamma-ray photons. Some of these emissions happen "recoillessly," transferring no momentum to the crystal lattice. This light is extremely monochromatic, with a coherence time related to the natural lifetime of the nucleus. Other emissions involve the crystal lattice, creating vibrations (phonons) and producing a much broader, less coherent spectrum of light. The total radiation is a mixture of these two independent channels. How can we dissect this? By measuring the total second-order coherence function, $g^{(2)}_{\text{total}}(\tau)$. The result is a beautiful mixture of the signatures of each process. It contains a term corresponding to the bunching from the recoilless channel and another term for the bunching from the recoil channel, each weighted by its relative contribution. By fitting this function to experimental data, physicists can untangle the complex physics of the nucleus and its interaction with the surrounding crystal.
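Schematically, if both channels are independent chaotic (Gaussian) fields with fractional intensities $p_r$ and $p_{nr}$ (our notation, with $p_r + p_{nr} = 1$), the Siegert argument applied to the summed field suggests a form like

$$g^{(2)}_{\text{total}}(\tau) = 1 + \left| p_r\, g^{(1)}_r(\tau) + p_{nr}\, g^{(1)}_{nr}(\tau) \right|^2,$$

whose expansion contains $p_r^2 |g^{(1)}_r(\tau)|^2$, $p_{nr}^2 |g^{(1)}_{nr}(\tau)|^2$, and a cross term; the decay times of the two $g^{(1)}$ factors carry the recoilless and phonon-broadened linewidths.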

The Final Frontier: Ghost Imaging and Black Holes

The applications of second-order coherence continue to push into ever more exotic and mind-bending territory. One such area is "ghost imaging." In a typical ghost imaging setup, light from a chaotic source is split into two paths. One path travels through an object (like a double-slit mask) and then to a simple "bucket" detector that measures total intensity but has no spatial resolution. The other path goes to a high-resolution camera, but this path never interacts with the object. Miraculously, by measuring the intensity correlation between the bucket detector and the pixels of the camera, an image of the object can be reconstructed. This works precisely because the spatial correlations inherent in the chaotic light field—the same correlations revealed by $g^{(2)}$—contain information about the object that can be extracted non-locally.
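A one-dimensional toy simulation makes this concrete. Each "frame" below is an independent speckle realization shared by both arms (standing in for the beamsplitter); correlating the bucket signal's fluctuations against each camera pixel recovers the double-slit mask, blurred by the speckle size. All parameters are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)
npix, nframes = 200, 20_000
x = np.arange(npix)
mask = ((np.abs(x - 80) < 8) | (np.abs(x - 120) < 8)).astype(float)  # two slits

bucket = np.zeros(nframes)
ref = np.zeros((nframes, npix))
filt = np.exp(-(np.fft.fftfreq(npix) * npix / 25.0) ** 2)  # sets speckle size
for k in range(nframes):
    noise = rng.normal(size=npix) + 1j * rng.normal(size=npix)
    field = np.fft.ifft(np.fft.fft(noise) * filt)   # one chaotic speckle field
    I = np.abs(field) ** 2
    ref[k] = I                       # camera arm: never touches the object
    bucket[k] = I @ mask             # object arm: a single, unresolved number

# <dB dI(x)>: fluctuation correlation between bucket and each camera pixel
image = (bucket - bucket.mean()) @ (ref - ref.mean(axis=0)) / nframes
print(np.where(image > 0.5 * image.max())[0])  # pixel indices near the slits
```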

Finally, we arrive at one of the most profound predictions in all of physics. In the 1970s, Stephen Hawking showed that, due to quantum effects near their event horizons, black holes are not truly black. They should emit radiation as if they were hot bodies, with a temperature inversely proportional to their mass. This "Hawking radiation" is predicted to be perfectly thermal. If this is true, then it must carry the unmistakable fingerprint of thermal light: photon bunching. A measurement of the second-order coherence of Hawking radiation should yield $g^{(2)}(0) = 2$. This simple number connects the quantum fluctuations of the vacuum, the arcane geometry of curved spacetime, and the fundamental laws of thermodynamics. While measuring this directly is far beyond our current technological reach, it stands as a testament to the unifying power of physics. The same principle that allows us to measure the size of a star, verify a single-photon source for a quantum computer, and probe the heart of a nonlinear crystal may one day allow us to confirm one of the most extraordinary predictions about the nature of the cosmos itself. The story of light's texture is, in many ways, the story of physics itself.