Local Hidden Variables

Key Takeaways
  • Local realism combines the intuitive ideas of definite properties (realism) and no faster-than-light influence (locality) to form a classical worldview.
  • Bell's theorem, particularly the CHSH inequality, sets a statistical limit (C ≤ 2) on the strength of correlations possible within any local realist theory.
  • Quantum mechanics predicts, and experiments confirm, correlations for entangled particles (C = 2√2) that violate Bell's inequality, proving the universe is not locally real.
  • The disproof of local hidden variables has practical applications, enabling technologies like Device-Independent Quantum Key Distribution (DIQKD) by using nonlocality as a security guarantee.

Introduction

Our everyday intuition tells us that the world is solid, predictable, and local—an object has definite properties whether we look at it or not, and actions here don't instantly affect things far away. This 'common-sense' worldview, known as local realism, was long assumed to apply to the universe at large. However, the bizarre predictions of quantum mechanics presented a profound challenge, suggesting that reality at the smallest scales behaves in ways that defy this intuition. This article explores the deep conflict between classical intuition and quantum reality by examining the concept of local hidden variables—a final attempt to reconcile quantum strangeness with our classical worldview.

We will first delve into the "Principles and Mechanisms," unpacking the core ideas of local realism, the Einstein-Podolsky-Rosen paradox, and the introduction of hidden variables. We will then see how John Bell's theorem forged an undeniable mathematical line in the sand with his famous inequality, leading to a direct, non-statistical contradiction with the GHZ paradox. Following this, the "Applications and Interdisciplinary Connections" chapter will explore the profound consequences of this conflict, from the experimental tests that have definitively sided with quantum mechanics to the practical use of nonlocality as a resource in quantum cryptography and information theory. This journey reveals why the failure of local hidden variables is not just a philosophical footnote, but a cornerstone of modern physics and technology.

Principles and Mechanisms

Imagine you are a detective investigating a mysterious event. You have two fundamental rules you trust above all else. First, an object has definite properties—a ball is red, a car is heavy—even when you're not looking at it. Second, an action you take right here cannot instantly affect something a mile away. These two ideas, which we can call realism and locality, are the bedrock of our everyday intuition. They feel less like scientific theories and more like the basic rules of the game. For a long time, we physicists thought they were the rules of the universe's game, too. The story of local hidden variables is the story of how quantum mechanics forced us to realize that the universe plays by a very different, and much stranger, set of rules.

The Common-Sense Worldview: Local Realism

Let's sharpen these intuitive ideas. The first, realism, is the philosophical stance that the world "out there" is concrete and has definite properties, independent of our observation. The moon is still there when nobody looks. A more precise version of this used in physics is counterfactual definiteness. This is a fancy term for a simple idea: if you could have measured a property, it must have had a value. Suppose you have a particle and you can measure its spin along the z-axis or the x-axis. You choose to measure the z-axis spin and get +1. Counterfactual definiteness asserts that even though you didn't measure it, the spin along the x-axis also had a definite value at that moment, either +1 or −1. This un-measured value is simply unknown to you.

The second idea, locality, is rooted in Einstein's theory of relativity. It states that no influence can travel faster than the speed of light. An event here cannot instantaneously affect an event over there; there must be a time delay of at least the distance divided by the speed of light, L/c. This forbids "spooky action at a distance."

When you put these two ideas together, you get a powerful and intuitive worldview called local realism. Albert Einstein, Boris Podolsky, and Nathan Rosen (EPR) used this very worldview to argue that quantum mechanics must be incomplete. In a famous thought experiment, they imagined a particle decaying into two smaller particles that fly apart. If the original particle was at rest, then by conservation of momentum, the two new particles must have equal and opposite momenta. If an observer, Alice, measures the momentum of her particle to be p₁, she instantly knows that the other particle, now light-years away with Bob, must have momentum p₂ = −p₁ (or more generally K − p₁ if the total momentum was K).

According to local realism, since Alice's measurement can't affect Bob's particle (locality), this momentum value must have been an element of reality for Bob's particle all along (realism). This led to the idea of hidden variables: perhaps quantum particles are like little machines carrying around a hidden instruction set, a set of variables (let's call them λ) that pre-determines the outcome of any measurement you could possibly perform on them. The apparent randomness of quantum mechanics, in this view, is just ignorance—our ignorance of these hidden variables.

Forging a Classical Reality: Hidden Variables

If this hidden variable idea is right, can we build a model that acts like the quantum world? Let's try. The physicist John Bell used a wonderful analogy: Dr. Bertlmann's socks. Dr. Bertlmann is known to always wear mismatched socks. If you see that one of his feet has a pink sock, you know, with 100% certainty and faster than the speed of light, that the other is not pink. There is no spooky action at a distance. The "hidden variable" was simply the color of each sock, determined when he got dressed in the morning.

We can create a more sophisticated "Bertlmann's socks" model for entangled particles. Imagine a source that emits pairs of particles. The hidden variable λ is a random direction in space. For each pair, particle A is set to be "spin-up" along λ and particle B is "spin-down" along λ. If Alice and Bob both measure the spin along the same axis, they will always find opposite results, a perfect anti-correlation, just like in the simple sock analogy and just like quantum mechanics predicts for that specific case.

But what if they measure along different axes? Let the angle between Alice's axis a and Bob's axis b be θ. For a simple local hidden variable (LHV) model, one might plausibly guess that the chance of getting a disagreement in outcomes is proportional to how far apart their measurement angles are. A simple model predicts the probability of disagreement is just θ/π. It's linear—the more you turn the detector, the more the disagreement changes, which seems reasonable. However, quantum mechanics predicts the probability of disagreement is sin²(θ/2). These two functions are very different! For small angles, they are close, but they diverge significantly as the angle increases.
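The divergence is easy to check numerically. Here is a minimal sketch comparing the two predictions (the function names are ours, not from any library):

```python
import math

def lhv_disagreement(theta):
    """Simple linear LHV model: disagreement grows in proportion to the angle."""
    return theta / math.pi

def quantum_disagreement(theta):
    """Quantum prediction for the singlet state: sin^2(theta/2)."""
    return math.sin(theta / 2) ** 2

# Compare at a few angles (radians): the two agree at 0, pi/2, and pi,
# but differ everywhere in between -- the seed of Bell's theorem.
for theta in (0.0, math.pi / 4, math.pi / 2, 3 * math.pi / 4, math.pi):
    print(f"theta = {theta:5.3f}  LHV = {lhv_disagreement(theta):.3f}  "
          f"QM = {quantum_disagreement(theta):.3f}")
```

At θ = π/4, for instance, the LHV model gives 0.25 while quantum mechanics gives about 0.146; it is exactly this in-between discrepancy that Bell's inequality turns into a sharp test.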

This is a clue. It seems that while LHV theories can explain perfect correlations, they struggle to reproduce the in-between correlations of quantum mechanics. But this is just one specific LHV model. Could some other, more clever LHV model succeed?

The Line in the Sand: Bell's Inequality

This is where John Bell made his monumental contribution in the 1960s. He found a way to test not just one LHV model, but the entire framework of local realism. The result is a theorem, encapsulated in an inequality, that acts as a definitive line in the sand.

The setup is a thought experiment known as a Bell test. A source emits entangled pairs. Alice can choose to measure one of two properties of her particle, let's call the settings x and x′. Bob can likewise measure along one of two settings, y or y′. For each pair, they record their settings and their outcomes (+1 or −1). After many runs, they calculate the average correlation for each pair of settings, E(x, y), which is the average value of the product of their outcomes.

To derive Bell's theorem, we need the assumptions of local realism and one other, subtle assumption: measurement independence (or "freedom of choice"). This means that the choice of measurement setting an experimenter makes is not secretly correlated with the hidden variables of the particles just produced. To violate this would require a grand conspiracy where the particle source "knows" what settings you are going to choose in the future and prepares the particles accordingly. Most scientists find this possibility too far-fetched, so they accept measurement independence as a reasonable condition for any sensible scientific experiment.

With these assumptions in place, we can construct the Clauser-Horne-Shimony-Holt (CHSH) version of Bell's inequality. Consider a single particle pair described by some hidden variables λ. The outcomes A(x, λ), A(x′, λ), B(y, λ), and B(y′, λ) are all pre-determined to be either +1 or −1. Let's look at the quantity

S(λ) = A(x, λ)B(y, λ) − A(x, λ)B(y′, λ) + A(x′, λ)B(y, λ) + A(x′, λ)B(y′, λ).

We can rearrange this to get

S(λ) = A(x, λ)(B(y, λ) − B(y′, λ)) + A(x′, λ)(B(y, λ) + B(y′, λ)).

Now, notice something remarkable. Since B(y, λ) and B(y′, λ) are each ±1, one of the terms in the parentheses must be 0 and the other must be ±2. For instance, if B(y, λ) = B(y′, λ), then the first parenthesis is 0 and the second is ±2. Since A(x′, λ) is ±1, the whole expression S(λ) becomes ±2. If B(y, λ) = −B(y′, λ), the second parenthesis is 0 and the first is ±2, and again the whole expression S(λ) becomes ±2. No matter what the values are, for any single event, |S(λ)| = 2.

If every single event gives a value whose magnitude is 2, then the average over many events, which we call the CHSH parameter

C = |E(x, y) − E(x, y′) + E(x′, y) + E(x′, y′)|,

cannot possibly be greater than 2. This is the famous Bell-CHSH inequality:

C ≤ 2.

This is a universal speed limit for any theory built on local realism. It doesn't matter how complex the hidden variables are or what their probability distribution is. If the world is locally real, this bound is unbreakable. Indeed, one can construct explicit LHV models that give a value of exactly 2, showing the bound is tight.
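The case analysis above can also be verified by brute force: there are only sixteen deterministic assignments of the four pre-determined outcomes, and every one of them gives S(λ) = ±2. A small sketch:

```python
from itertools import product

# Every deterministic local assignment of A(x), A(x'), B(y), B(y') in {+1, -1}.
s_values = set()
for ax, axp, by, byp in product((+1, -1), repeat=4):
    s = ax * by - ax * byp + axp * by + axp * byp
    s_values.add(s)

# Each single assignment yields S(lambda) = +2 or -2, so any probabilistic
# mixture over hidden variables lambda must average to a value in [-2, +2].
print(sorted(s_values))  # -> [-2, 2]
```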

Quantum Mechanics Throws Down the Gauntlet

So, what does quantum mechanics predict for this value C? By choosing the measurement settings cleverly (e.g., a₁ = 0, a₂ = π/2, b₁ = π/4, b₂ = 3π/4), the predictions of quantum mechanics for a spin-singlet state yield

C_QM = 2√2 ≈ 2.828.

This is the bombshell. Quantum mechanics predicts a result that is unequivocally larger than 2. It predicts a violation of the "speed limit" for local realism. The two worldviews—the intuitive, common-sense world of local realism and the strange, mathematical world of quantum mechanics—make demonstrably different predictions for the same experiment. One of them has to be wrong.
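This number is easy to reproduce. Using the standard quantum-mechanical result that the singlet-state correlation for measurement angles a and b is E(a, b) = −cos(a − b), the settings quoted above give exactly 2√2 (a sketch; the variable names are ours):

```python
import math

def E(a, b):
    # Singlet-state correlation for measurement angles a and b.
    return -math.cos(a - b)

a1, a2 = 0.0, math.pi / 2              # Alice's two settings
b1, b2 = math.pi / 4, 3 * math.pi / 4  # Bob's two settings

C = abs(E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2))
print(C)  # -> 2.828..., i.e. 2*sqrt(2), comfortably above the LHV bound of 2
```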

Over the past few decades, increasingly sophisticated experiments have been performed. They have had to be carefully designed to close loopholes, for instance, by ensuring the choices of measurement settings and the measurements themselves happen so quickly that no light-speed signal could travel between the detectors to coordinate the results—closing the locality loophole. And the results are in. Over and over again, experiments have vindicated quantum mechanics. The measured correlations violate Bell's inequality, just as predicted. The universe is not locally real.

A Contradiction Set in Stone: The GHZ Paradox

The Bell-CHSH inequality is a statistical argument. It relies on averaging over many measurements. You might wonder, is there a way to show the conflict without statistics? Is there a single, all-or-nothing contradiction? In 1989, Daniel Greenberger, Michael Horne, and Anton Zeilinger found one.

Imagine a source that emits triplets of particles, one each to Alice, Bob, and Charlie, in a special entangled state called the GHZ state. Each observer can choose to measure spin along an x-axis or a y-axis. The strange nature of the GHZ state leads to a series of perfect correlations predicted by quantum mechanics and confirmed by experiment:

  1. If Alice measures x-spin, and Bob and Charlie measure y-spin, the product of their outcomes (+1 or −1) is always +1.
  2. If Bob measures x-spin, and Alice and Charlie measure y-spin, the product is always +1.
  3. If Charlie measures x-spin, and Alice and Bob measure y-spin, the product is always +1.

Now, let's put on our "local realism" hat one last time. This means there are pre-determined values for all these possible measurements: let's call them v_A(x), v_A(y), v_B(x), etc. The experimental facts above translate into simple algebraic equations for these pre-determined values:

  1. v_A(x) v_B(y) v_C(y) = +1
  2. v_A(y) v_B(x) v_C(y) = +1
  3. v_A(y) v_B(y) v_C(x) = +1

Now watch what happens when we multiply these three equations together. The left side is a jumble of terms, but notice that each y-spin value (v_A(y), v_B(y), v_C(y)) appears twice. Since each value is ±1, its square is just 1: (v_A(y))² = (v_B(y))² = (v_C(y))² = 1. The product simplifies beautifully:

(v_A(x) v_B(y) v_C(y)) × (v_A(y) v_B(x) v_C(y)) × (v_A(y) v_B(y) v_C(x)) = v_A(x) v_B(x) v_C(x).

The right side of the product is just (+1) × (+1) × (+1) = +1. Therefore, local realism, combined with the first three experimental facts, forces us to the logical conclusion that v_A(x) v_B(x) v_C(x) = +1.

But now we reveal the fourth experimental fact about the GHZ state:

  4. If Alice, Bob, and Charlie all measure x-spin, the product of their outcomes is always −1.

In the language of local realism, this means v_A(x) v_B(x) v_C(x) = −1. We have a direct contradiction. Logic derived from local realism says the product must be +1. Experiment says the product is −1. Our comfortable, common-sense worldview has shattered, not on a statistical subtlety, but on a head-on collision with an undeniable fact. Somewhere in the chain of reasoning, a fundamental assumption—realism, locality, or both—must be abandoned. The universe, at its deepest level, is far stranger than we ever imagined.
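The contradiction can be confirmed by exhaustive search: among all 64 assignments of pre-determined ±1 values, every one consistent with facts 1 through 3 predicts +1 for the all-x product, so fact 4 is unreachable. A sketch:

```python
from itertools import product

# Try every assignment of the six pre-determined values
# v_A(x), v_A(y), v_B(x), v_B(y), v_C(x), v_C(y) in {+1, -1}.
forced = set()
for vAx, vAy, vBx, vBy, vCx, vCy in product((+1, -1), repeat=6):
    if (vAx * vBy * vCy == +1 and   # fact 1
        vAy * vBx * vCy == +1 and   # fact 2
        vAy * vBy * vCx == +1):     # fact 3
        forced.add(vAx * vBx * vCx)

# Every local-realist assignment satisfying facts 1-3 gives +1,
# while the GHZ state delivers -1 (fact 4).
print(forced)  # -> {1}
```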

Applications and Interdisciplinary Connections

Now, it is only natural to ask: So what? We have journeyed through the abstract landscape of local realism, navigated the twists and turns of Bell's inequalities, and seen how the predictions of quantum mechanics stand in stark opposition. But is this merely a philosopher's plaything, a curious footnote in the grand textbook of physics? Or does this strange conflict between the local and the quantum echo out into the real world, shaping our technology and deepening our understanding of the universe?

The answer, you will not be surprised to hear, is a resounding "yes!" The clash between Local Hidden Variable (LHV) theories and reality is not a polite disagreement; it is a fundamental schism with consequences that are as practical as they are profound. Let's trace some of these echoes, from the laboratory bench to the very foundations of information and thermodynamics.

The Acid Test: Nature's Verdict

The first and most crucial application is simply to ask Nature for the answer. A theory is only as good as its experimental verification. Bell's theorem, in its Clauser-Horne-Shimony-Holt (CHSH) formulation, doesn't just make a philosophical point; it makes a quantitative, testable prediction. Local Hidden Variable theories are bound by the rule |S| ≤ 2. Quantum mechanics, for certain entangled states, predicts S = 2√2 ≈ 2.828. The gauntlet is thrown down. Who is right?

Laboratories around the world have taken up this challenge. In a typical experiment, physicists create millions of entangled particle pairs, sending them to two separate detectors. At each detector, an experimenter (our old friends, Alice and Bob) randomly chooses one of two measurement settings. After collecting vast amounts of data, they compute the correlations and calculate the CHSH parameter, S. Now, the real world is a noisy place. Measurements are never perfect, detectors have efficiencies, and randomness introduces statistical fluctuations. One never measures a perfect 2.828. Instead, one gets a result like S_exp = 2.080 with some statistical uncertainty, say σ_S = 0.035.

Is this enough? A value of 2.080 is certainly greater than 2, but is it significantly greater? Could it just be a statistical fluke? This is where the real work of an experimentalist lies. They must quantify their confidence. They calculate how many standard deviations their result is from the classical boundary. In this hypothetical case, the result is (2.080 − 2)/0.035 ≈ 2.29 standard deviations away from the LHV limit. While this is suggestive, a physicist would demand more—typically a "five-sigma" result (five standard deviations) to claim a discovery. Over decades, starting with the pioneering work of Alain Aspect and followed by others like Anton Zeilinger, experiments have improved, closing loopholes and pushing the measured value of S higher and higher, with statistical certainty that is now beyond any reasonable doubt. Nature has spoken, and its verdict is unambiguously quantum.
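The significance calculation for this hypothetical example is a one-liner (numbers taken from the text above):

```python
s_exp, s_classical, sigma = 2.080, 2.0, 0.035

n_sigma = (s_exp - s_classical) / sigma
print(f"{n_sigma:.2f} sigma above the LHV bound")  # ~2.29: suggestive, not decisive

# For a five-sigma claim with this same central value, the statistical
# uncertainty would need to shrink to:
print((s_exp - s_classical) / 5)  # -> 0.016
```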

The Quantum-Classical Frontier

The experimental results tell us that our universe is, at its heart, quantum. But we live in a world that appears classical. What happens at the border? What does it take to wash away the "spooky action at a distance" and recover our comfortable, local reality? This question is not academic; it is the central challenge in building any quantum technology.

Imagine we have a source that produces a mixture: partly a perfectly entangled "singlet state" and partly just random noise (a "maximally mixed state"). The purity of the quantum state is measured by a "visibility" parameter, p. If p = 1, we have a perfect quantum state. If p = 0, we have pure noise. As we add more and more noise (decreasing p), the quantum correlations are diluted. At some point, the correlations become so weak that they no longer violate the Bell inequality. They become... classical. For the CHSH inequality, this happens at a critical visibility of p_c = 1/√2 ≈ 0.707. Below this threshold, even though the system still has quantum components, its correlations can be perfectly mimicked by a Local Hidden Variable model. The quantum magic has been erased by noise.
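For such a noisy mixture, the CHSH value scales linearly with the visibility, S(p) = p · 2√2, since the maximally mixed component contributes no correlation. A quick sketch of the threshold, assuming this linear scaling (the function name is ours):

```python
import math

def chsh_value(p):
    # Visibility p times the maximal singlet value 2*sqrt(2);
    # the maximally mixed component contributes zero correlation.
    return p * 2 * math.sqrt(2)

p_c = 2 / (2 * math.sqrt(2))  # solve chsh_value(p) = 2
print(f"critical visibility p_c = {p_c:.4f}")  # -> 0.7071, i.e. 1/sqrt(2)

for p in (0.6, 0.75, 0.9):
    s = chsh_value(p)
    print(f"p = {p:.2f}  S = {s:.3f}  "
          f"{'violates CHSH' if s > 2 else 'local model suffices'}")
```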

This principle is the foundation for one of the most exciting applications of Bell's theorem: Device-Independent Quantum Key Distribution (DIQKD). Imagine Alice and Bob want to create a secret cryptographic key. They can do this by measuring an entangled state. An eavesdropper, Eve, might try to intercept their communication or, even more cunningly, supply them with compromised measurement devices. How can Alice and Bob trust their key?

The trick is to not trust the devices at all. They only need to trust the laws of quantum mechanics. They use their (potentially compromised) devices to play the CHSH game. If they measure a value of S > 2, they know for a fact that their system possesses genuine quantum correlations that no LHV model—and therefore no classical computer controlled by Eve—could have faked. The magnitude of the violation can also be used to bound the amount of information an eavesdropper could possibly have.

However, even this clever scheme has subtleties. What if Eve supplies devices with a built-in memory? For instance, a device's output could be influenced by its previous measurement setting. Such a memory effect, which is entirely local and classical, can be exploited to fake a violation of Bell's inequality. It has been shown that a local model with memory can achieve a CHSH value that exceeds the classical limit of 2, tricking Alice and Bob into believing they have a secure quantum link when they don't. This illustrates the delicate dance of experimental quantum physics: Bell's theorem provides a powerful security guarantee, but only if all physical assumptions of the test—including the absence of memory in the devices—hold true.

Nonlocality as a Resource

So far, we have seen nonlocality as a strange phenomenon to be tested and a security feature to be exploited. But a modern perspective, born from the fusion of physics and computer science, sees it as something more: a resource.

Let's try to quantify this resource. We know that no LHV model can reproduce the correlations of a singlet state, which can achieve S = 2√2. But what if we give the classical model a little help? Suppose Alice, after receiving her input, is allowed to send a classical message to Bob before he makes his measurement. How much communication would they need to fake the quantum result? You might think it would require a lot of information to coordinate their "spooky" actions. The astonishing answer, discovered by Toner and Bacon, is that it takes just one bit.

Think about what this means. A single bit, sent from Alice to Bob, is enough to boost a classical strategy from its limit of S = 2 all the way to the quantum maximum of S = 2√2. It's as if the shared entangled state provides Alice and Bob with a correlated resource equivalent to one bit of communication, a channel that transcends the space between them. This reframes nonlocality not as a paradox, but as an advantage in information processing. It is a tangible property that quantifies how much "more powerful" quantum correlations are than anything classical.

The Deep Connections: Subtlety, Freedom, and Thermodynamics

The journey does not end with technology. The questions raised by Bell's theorem push us to the very limits of what we mean by physical reality, forging unexpected links between disparate fields of science.

First, a lesson in humility. It is tempting to declare all LHV models dead and buried. But the situation is more subtle. It is possible to construct an LHV model that perfectly reproduces the quantum correlations for any set of measurements, as long as all the measurement directions lie in the same plane. This was, in fact, noted by John Bell himself. The model fails spectacularly as soon as one observer tilts their measurement apparatus out of that plane. This tells us something profound: quantum nonlocality is an intrinsically three-dimensional phenomenon. The conflict with local realism is not absolute; it is tied to the full rotational freedom of our world.

This leads us to the deepest assumption of all, one we have taken for granted: "freedom of choice," or measurement independence. We assume that the experimenters' choice of settings is independent of the hidden variable λ that determines the outcome. What if this is false? What if the universe is "superdeterministic," and the choice you are about to make is already correlated with the state of the particles you are about to measure? In such a universe, the statistical arguments of Bell's theorem collapse. It is possible to build an LHV model that violates the CHSH inequality by correlating the hidden variable with the measurement settings. This "conspiracy loophole" seems philosophically unpalatable to most scientists—it undermines the very notion of a controlled experiment—but it is logically possible.

Can we say anything more about such a conspiracy? Remarkably, yes. If one insists on explaining a Bell violation with a local model by sacrificing freedom of choice, it comes at a thermodynamic cost. Creating the necessary correlations between the measurement settings and the hidden state requires information, and processing information generates entropy. A beautiful theoretical result connects the magnitude of a Bell violation, S, to the minimum irreversible entropy production, ΔS_irr, required for any local model to simulate it. The relationship is given by

|S| ≤ 2 cosh(√(ΔS_irr/2)).

To reproduce a quantum violation of S > 2, you must "pay" a non-zero thermodynamic cost in the form of heat and disorder. The "spookiness" of quantum mechanics is not free; trying to explain it away classically requires an investment of entropy, a real physical quantity.
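Inverting the quoted bound gives the minimum entropy price for any given violation. A sketch (the function name is ours; the cost is quoted in the same units as ΔS_irr appears in the bound):

```python
import math

def min_entropy_cost(S):
    """Smallest Delta-S_irr compatible with |S| <= 2*cosh(sqrt(dS_irr/2))."""
    if abs(S) <= 2:
        return 0.0  # within the classical bound: no violation, no cost
    x = math.acosh(abs(S) / 2)  # sqrt(dS_irr/2) must be at least this
    return 2 * x ** 2

# Cost of simulating the full quantum violation S = 2*sqrt(2):
print(min_entropy_cost(2 * math.sqrt(2)))  # ~1.554
```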

Finally, what happens when we move beyond two particles? The plot thickens, and the story becomes even more dramatic. For three particles shared between Alice, Bob, and Charlie, one can formulate a similar test called the Mermin inequality. For this test, LHV theories predict a bound of 2. Quantum mechanics, however, predicts a value of 4. The violation is no longer a factor of √2, but a factor of two! The gap between the classical and quantum worlds grows wider and more undeniable as we consider more complex entangled systems.
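The value 4 can be checked with a small linear-algebra computation on the GHZ state. This sketch uses the sign convention matching the GHZ correlations listed earlier (x·y·y products equal to +1, x·x·x product equal to −1), and evaluates the Mermin combination ⟨XYY⟩ + ⟨YXY⟩ + ⟨YYX⟩ − ⟨XXX⟩ in pure Python:

```python
# Pauli matrices as nested lists of (complex) numbers
X = [[0, 1], [1, 0]]
Y = [[0, -1j], [1j, 0]]

def kron(A, B):
    """Kronecker product of two square matrices (nested lists)."""
    n, m = len(A), len(B)
    return [[A[i // m][j // m] * B[i % m][j % m]
             for j in range(n * m)] for i in range(n * m)]

def triple(a, b, c):
    return kron(kron(a, b), c)

def expectation(op, psi):
    """<psi| op |psi> for a matrix and a state vector (lists)."""
    opsi = [sum(op[i][j] * psi[j] for j in range(len(psi)))
            for i in range(len(psi))]
    return sum(psi[i].conjugate() * opsi[i] for i in range(len(psi)))

# GHZ state (|000> - |111>)/sqrt(2): this sign convention reproduces
# the correlations listed earlier (XYY = YXY = YYX = +1, XXX = -1).
s = 2 ** -0.5
ghz = [s, 0, 0, 0, 0, 0, 0, -s]

# Mermin combination: <XYY> + <YXY> + <YYX> - <XXX>
M = sum(expectation(triple(*ops), ghz).real
        for ops in [(X, Y, Y), (Y, X, Y), (Y, Y, X)])
M -= expectation(triple(X, X, X), ghz).real
print(round(M, 10))  # -> 4.0, double the LHV bound of 2
```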

From experimental tests to the security of our communications, from the theory of computation to the laws of thermodynamics, the legacy of Local Hidden Variables is one of rich and continuing discovery. It began as an attempt to restore a classical, intuitive picture of the world. Instead, it has become one of our sharpest tools for proving just how wonderfully non-intuitive the world is, and for harnessing that strangeness to build a new understanding of reality and a new generation of technology.