Quasi-Energy
Key Takeaways
  • In periodically driven quantum systems where energy is not conserved, quasi-energy emerges as a conserved quantity that governs the system's dynamics.
  • The stability of a driven system is determined by its Floquet multipliers, with instability indicated by complex quasi-energies that lead to exponential growth.
  • Periodic driving, or Floquet engineering, allows for the creation of effective Hamiltonians and novel phases of matter, such as topological insulators and time crystals, that do not exist in static systems.
  • Generic many-body driven systems are expected to absorb energy and heat to an infinite temperature state, but can exhibit long-lived, prethermal states at high driving frequencies.

Introduction

In the study of physics, the law of energy conservation is a foundational pillar, providing a powerful framework for understanding closed systems. However, this bedrock principle appears to crumble when systems are subjected to external, time-varying forces, such as an atom in a laser field or a material shaken by an oscillating potential. This raises a critical question: in the absence of energy conservation, what principles govern the evolution and stability of these periodically driven quantum systems?

This article introduces the concept of quasi-energy, a remarkable analogue to energy that emerges from Floquet theory to restore order to periodically driven systems. We will embark on a journey to understand this powerful idea, beginning with its fundamental principles. The first chapter, "Principles and Mechanisms," will demystify how quasi-energy arises, how it dictates system stability, and how approximate methods allow us to engineer effective, static Hamiltonians from complex, time-dependent ones. Following this, the chapter on "Applications and Interdisciplinary Connections" will reveal the tangible impact of these concepts, showcasing how Floquet engineering is used to control quantum particles, forge exotic states of matter like topological insulators and time crystals, and even provide new insights into quantum computation. By exploring both the theory and its groundbreaking applications, this article illuminates how manipulating systems in time opens a new frontier for controlling the quantum world.

Principles and Mechanisms

In our journey through physics, we grow to love and cherish conserved quantities. Energy, in particular, is our steadfast companion. In a closed system, its total value never changes. This simple fact, the conservation of energy, is the bedrock upon which much of classical and quantum mechanics is built. It gives us a powerful lens to understand everything from a falling apple to the spectrum of a distant star.

But what happens when the system is not closed? What if we are constantly fiddling with it? Imagine a child on a swing. To keep them going, you must push them periodically. The total energy of the swing is certainly not conserved; it increases with each push. Or consider an atom bathed in the oscillating electric field of a laser. The "rules" of the game, encoded in the system's Hamiltonian H, are changing from moment to moment. In such cases, where the Hamiltonian is a function of time, H(t), the familiar comfort of energy conservation seems to vanish. Does this mean we are lost in a sea of chaos, with no guiding principles?

Fortunately, nature is more elegant than that. When the disturbance is periodic—like the steady pushes on a swing or the sinusoidal wave of a laser—a new, more subtle kind of order emerges. This is the world of Floquet theory, and its central character is a new concept that plays the role of energy: the quasi-energy.

The Rotating Frame Trick: Making Time Stand Still

Let's try to get a feel for this. One of the most powerful tools in a physicist's toolkit is choosing a clever point of view. If a problem looks complicated, perhaps we are just looking at it from the wrong angle.

Consider a classic textbook case: a single electron spin, which behaves like a tiny magnetic compass needle, placed in a magnetic field that is rotating in a circle. Let's say the field has a static part B₀ pointing up (the z-direction) and a part of strength B₁ rotating in the x–y plane with frequency ω. In our lab, we see this poor spin being twisted and turned by a field that is constantly changing direction. The Hamiltonian H(t) is a mess of sines and cosines. Trying to solve the Schrödinger equation directly is a headache.

But what if we jump on the carousel? Let's switch to a reference frame that rotates right along with the magnetic field at the same frequency ω. From this new perspective, the rotating part of the field looks completely stationary! Of course, this change of perspective isn't free. Just as jumping onto a moving carousel creates the illusion of a "centrifugal force" pushing you outward, switching to a rotating quantum frame of reference introduces a new, fictitious term into the Hamiltonian. But the payoff is immense. The new, effective Hamiltonian, let's call it H′, is completely time-independent.

We have traded a difficult time-dependent problem for a simple time-independent one. We know how to solve that! We find the energy eigenvalues of this new, static Hamiltonian H′. It turns out that these eigenvalues are precisely the quasi-energies of our original system. For our spinning electron, the time-independent problem reveals two energy levels, and the difference between them is the quasi-energy splitting, Δε:

Δε = ℏ√((ω₀ − ω)² + ω₁²)

Here, ω₀ and ω₁ are the Larmor frequencies associated with the static and rotating fields, respectively. This famous formula describes the phenomenon of magnetic resonance. Notice something fascinating: the splitting depends on the difference between the natural precession frequency of the spin, ω₀, and the driving frequency, ω. When the drive matches the natural frequency (ω = ω₀), we are "in resonance", and even a small driving field (ω₁) can have a large effect.
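This splitting can be checked directly. The sketch below is a minimal numerical illustration, in units where ℏ = 1 and with arbitrary frequency values: it diagonalizes the standard rotating-frame Hamiltonian for a driven spin, H′ = (ℏ/2)[(ω₀ − ω)σz + ω₁σx], and compares the resulting splitting with the closed-form formula.

```python
import numpy as np

# Pauli matrices
sz = np.array([[1, 0], [0, -1]], dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)

hbar = 1.0                                # natural units
omega0, omega1, omega = 5.0, 1.2, 4.0     # Larmor, drive-strength, drive frequencies (illustrative)

# Time-independent rotating-frame Hamiltonian: H' = (hbar/2) [(w0 - w) sz + w1 sx]
H_rot = 0.5 * hbar * ((omega0 - omega) * sz + omega1 * sx)

# Its eigenvalues are the quasi-energies; their difference is the splitting
evals = np.linalg.eigvalsh(H_rot)
splitting = evals[1] - evals[0]
expected = hbar * np.sqrt((omega0 - omega) ** 2 + omega1 ** 2)

print(splitting, expected)   # the two agree
```

At resonance (ω = ω₀) the splitting collapses to ℏω₁, which is why even a weak drive dominates there.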

This "rotating frame" trick is a general and powerful idea. It allows us to find the quasi-energies for a particle on a ring driven by a rotating electric field and for many other periodically driven systems. The principle is always the same: find a point of view from which the relentless passage of time seems to freeze, and in that frozen world, discover the quasi-energies that govern the true dynamics.

Stability and Instability: When Pushing Goes Wrong

So, we have this new quantity, quasi-energy. What is it good for? One of its most critical roles is in determining the stability of a system. If you push a swing, you can get it to soar higher and higher. But if you get the timing wrong, you can stop it dead. Periodic driving can both stabilize and destabilize.

To understand this, let's look at the system stroboscopically. Instead of watching the continuous evolution, we'll just take a snapshot at the end of each period: at times t = T, 2T, 3T, …. The evolution over one full period, from t = 0 to t = T, can be described by a single matrix, the monodromy matrix M. If the state of the system at time t = 0 is described by a vector x(0), then after one period it is x(T) = M x(0). After n periods, it's simply x(nT) = Mⁿ x(0).

The entire long-term behavior of the system is governed by the powers of the matrix M. The stability, therefore, depends entirely on the eigenvalues of M, which are called Floquet multipliers. If all the multipliers have a magnitude of 1 or less, the system's state will remain bounded—it is stable. But if even one multiplier has a magnitude greater than 1, the state will grow exponentially with each cycle, flying off to infinity. The system is unstable.

This leads to a wonderfully counter-intuitive result. You might think that to check for stability, you should look at the system's instantaneous tendency to grow or shrink at every moment within the cycle. But you would be wrong! A system can be instantaneously unstable at every single moment during its cycle, yet be perfectly stable over the long run. Imagine a simple two-step process. In the first half of the cycle, we stretch the system in the x-direction by a factor of 2 and squash it in the y-direction by a factor of 3. Unstable! In the second half, we do the reverse: squash in x by 3 and stretch in y by 2. Also unstable! But the net effect of the full cycle is to squash the system in both directions by a factor of 2/3. The system is, in fact, incredibly stable. What matters is not the instantaneous behavior, but the cumulative effect over a full period, which is captured perfectly by the Floquet multipliers.
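The two-step example is easy to verify in a few lines: build each half-cycle as a diagonal map, multiply them into the monodromy matrix, and read off the Floquet multipliers.

```python
import numpy as np

# First half-cycle: stretch x by 2, squash y by 3 (instantaneously unstable in x)
M1 = np.diag([2.0, 1.0 / 3.0])
# Second half-cycle: squash x by 3, stretch y by 2 (instantaneously unstable in y)
M2 = np.diag([1.0 / 3.0, 2.0])

# Monodromy matrix for the full cycle (second map applied after the first)
M = M2 @ M1

# Floquet multipliers: eigenvalues of the monodromy matrix
multipliers = np.linalg.eigvals(M)
print(np.abs(multipliers))   # both equal 2/3 < 1: the full cycle is stable
```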

What does an unstable multiplier mean in terms of quasi-energy? Remember that the state evolves with a phase factor e^(−iεt/ℏ). If the quasi-energy ε becomes a complex number, ε = ε_R + iε_I, then this factor becomes:

e^(−i(ε_R + iε_I)t/ℏ) = e^(ε_I t/ℏ) × e^(−iε_R t/ℏ)

The imaginary part of the quasi-energy, ε_I, leads to an exponential growth (if ε_I > 0) or decay (if ε_I < 0) of the state's amplitude. An unstable system is one with complex quasi-energies.

A classic example of this is parametric resonance. This is precisely how a child on a swing builds up amplitude: they "pump" by shifting their center of mass not at the frequency of the swing, but at twice the natural frequency. A small periodic modulation at the right frequency can feed a huge amount of energy into an oscillator. In a quantum field context, this instability can lead to the spontaneous creation of particles from the vacuum. For a scalar field whose mass is modulated at exactly twice its natural frequency, unstable modes appear with a well-defined growth rate, given by the imaginary part of their quasi-energy.
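Here is a hedged numerical sketch of parametric resonance, using an illustrative classical oscillator x″ + ω₀²(1 + ε cos 2ω₀t) x = 0 with arbitrary parameter values: integrating the two fundamental solutions over one drive period builds the monodromy matrix, whose largest Floquet multiplier exceeds 1.

```python
import numpy as np
from scipy.integrate import solve_ivp

omega0, eps = 1.0, 0.2        # natural frequency and modulation depth (illustrative)
T = np.pi / omega0            # one period of the 2*omega0 drive

def rhs(t, y):
    # Parametrically driven oscillator: x'' + omega0^2 (1 + eps*cos(2*omega0*t)) x = 0
    x, v = y
    return [v, -omega0**2 * (1 + eps * np.cos(2 * omega0 * t)) * x]

# Build the monodromy matrix column by column from the two fundamental solutions
M = np.zeros((2, 2))
for j, y0 in enumerate([[1.0, 0.0], [0.0, 1.0]]):
    sol = solve_ivp(rhs, (0.0, T), y0, rtol=1e-10, atol=1e-12)
    M[:, j] = sol.y[:, -1]

mults = np.linalg.eigvals(M)
print(np.max(np.abs(mults)))  # > 1: parametric resonance, exponential growth
```

At exact resonance the textbook growth rate is roughly εω₀/4 per unit time, so the largest multiplier per period here should come out near e^(επ/4), a bit above 1.1.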

Approximate Realities: The Engineer's View

In the real world, we often deal with situations where the periodic drive is either very fast or very slow compared to the natural timescales of the system. In these limits, we can develop powerful approximations.

When the driving frequency ω is very high, the system can't keep up with the rapid oscillations. It's like trying to read a sign on a car that's whizzing past. You don't see the details, you just see a blur. The system effectively responds only to the time-averaged effect of the drive. This intuition can be made precise: for high frequencies, the complicated time-dependent Hamiltonian H(t) can be replaced by a simpler, time-independent effective Hamiltonian H_eff. This powerful idea, known as Floquet engineering, allows us to use periodic drives to design new systems with desired properties.
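As an illustrative check (a sketch, not a derivation; the parameter values are arbitrary), one can compare the exact propagator of a rapidly driven two-level system over one period with the propagator generated by the zeroth-order effective Hamiltonian, which is simply the time average of H(t).

```python
import numpy as np
from scipy.linalg import expm

sz = np.array([[1, 0], [0, -1]], dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)

delta, A, omega = 1.0, 0.5, 20.0       # splitting, drive amplitude, fast drive (illustrative)
T = 2 * np.pi / omega                  # one drive period

def H(t):
    return 0.5 * delta * sz + A * np.cos(omega * t) * sx

# Exact stroboscopic propagator over one period, built as a product of small steps (hbar = 1)
N = 4000
dt = T / N
U = np.eye(2, dtype=complex)
for k in range(N):
    U = expm(-1j * H((k + 0.5) * dt) * dt) @ U

# Zeroth-order effective Hamiltonian: the time average (the cosine drive averages to zero)
H_eff = 0.5 * delta * sz
U_eff = expm(-1j * H_eff * T)

err = np.linalg.norm(U - U_eff)
print(err)   # small: the neglected corrections enter at order 1/omega
```

The agreement improves as ω grows; the systematic corrections to H_eff are organized by the Magnus expansion in powers of 1/ω.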

A tangible example is the AC Stark shift. When you shine a non-resonant laser on an atom, its energy levels shift. This shift is a direct consequence of the periodic drive. Using perturbation theory for the quasi-energy, we can calculate this shift precisely. The second-order correction to the quasi-energy of a state |2⟩ coupled to states |1⟩ and |3⟩ is:

ε₂⁽²⁾ = 2Δ(g₁² − g₂²) / (Δ² − (ℏω)²)

This formula shows how the energy level is "pushed" up or down depending on whether the drive frequency ω is below or above the transition frequency Δ/ℏ. High-frequency drives can even be used to turn an unstable system (one with gain and loss) into a stable one by effectively modifying its properties.

Conversely, when the drive is very slow, the system can adjust itself "adiabatically" to the slowly changing conditions. In this limit, the quasi-energy is, to a good approximation, simply the time-average of the instantaneous energy of the system. However, subtle geometric effects related to Berry's phase can also appear, though in certain simple cyclic evolutions, the leading correction to this average can be exactly zero.

The Ultimate Fate: Heating to Oblivion?

Let's now ask the ultimate question. What happens if we take a complex, interacting system—like a block of metal, with its countless vibrating atoms and scurrying electrons—and we shake it periodically, forever?

The picture that emerges from modern physics is both simple and profound. Energy is not conserved. The drive is a boundless source of energy. The system, being a complex many-body object, has an unbelievably dense forest of available energy states. The drive, no matter its frequency ω, will inevitably find resonances—pairs of many-body states whose energy difference is an integer multiple of the energy of a drive "photon," E_m − E_n ≈ kℏω.

A generic, non-integrable system is "chaotic" in the sense that it readily makes transitions between any states it's allowed to. The drive provides the means, and the dense spectrum provides the opportunity. The system will hop from resonance to resonance, steadily absorbing energy and climbing the infinite ladder of energy levels. The inevitable conclusion is that the system will heat up. And it won't stop heating until it has explored every possible configuration, reaching a state of maximum entropy—a featureless, infinitely hot thermal equilibrium. This is the generic fate of a periodically driven many-body system: a "heat death."

But there is a glorious plot twist. If the driving frequency is very high, these resonant transitions are rare and inefficient. For a very, very long time—a time that can be exponentially long in the drive frequency—the system is oblivious to this slow heating. It behaves as if it were governed by a new, approximately conserved effective Hamiltonian, H_eff. During this intermediate era, the system settles into a seemingly normal thermal state, but one described by H_eff. This is a long-lived state of prethermalization. It's a quasi-stable island in time, before the inevitable, slow slide into infinite-temperature chaos.

The concept of quasi-energy, born from the simple problem of a periodic drive, thus leads us on a grand tour. It gives us a new way to think about energy, provides tools to engineer and stabilize quantum systems, and ultimately forces us to confront the fundamental questions of thermalization and the arrow of time in a world that is perpetually in motion.

Applications and Interdisciplinary Connections

In our journey so far, we have met a curious new character on the stage of quantum mechanics: the quasi-energy. We've seen that for a system tossed about by a periodic force, the old, comfortable notion of a fixed energy level gives way to a shimmering, dynamic landscape of quasi-energies. You might be tempted to think this is just a mathematical trick, a convenient fiction for a messy situation. But nothing could be further from the truth! This concept is not merely a calculational tool; it is a key that unlocks a new realm of physics, a realm where we don't just find materials with interesting properties, but we create those properties on demand by "sculpting" matter with time. We are about to see how this idea allows us to stop particles in their tracks, forge exotic materials that don't exist in nature, and even build clocks that tick in a new way.

Controlling the Quantum World, One Pulse at a Time

Let's start with a single atom. In its quiet, undisturbed state, it has a clear ladder of energy levels. An electron can jump from a lower rung to a higher one by absorbing a photon of just the right energy—no more, no less. This gives atoms their characteristic, sharp absorption spectra, their unique fingerprints. But what happens if we don't just nudge the atom with a gentle probe of light, but instead blast it with a powerful, rhythmically oscillating laser field?

The atom is no longer its own master. The atom and the powerful light field become a single, inseparable entity—a "dressed atom." This new composite object has its own energy levels, and these are precisely the quasi-energies we have been discussing. If we now come along with a second, weak light probe to perform spectroscopy, we find something remarkable. The old absorption lines have shifted, and new ones have appeared out of thin air! The atom can now absorb a probe photon while also absorbing or emitting one, two, or more photons from the strong driving field. These are the "sidebands," and their existence is a direct confirmation that the quasi-energy states are physically real. The strength of these new transitions can be calculated, and they are beautifully described by mathematical functions known as Bessel functions, which depend on the strength and frequency of the driving field. We have, in effect, redesigned the atom's energy structure with light.
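A small sketch of the sideband arithmetic, assuming for concreteness the common textbook model of a sinusoidally phase-modulated transition with modulation index α: the n-th sideband then carries weight J_n(α)², and summing over all sidebands recovers the full transition strength.

```python
import numpy as np
from scipy.special import jv   # Bessel function J_n

alpha = 1.5                    # modulation index (drive strength over drive frequency); illustrative

# Weight of the n-th sideband for a sinusoidally phase-modulated transition
n = np.arange(-20, 21)
weights = jv(n, alpha) ** 2

print(float(weights[n == 0][0]), float(weights[n == 1][0]))  # carrier vs. first sideband
print(weights.sum())   # ~1: the drive redistributes, but conserves, the total strength
```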

This control can lead to even more astonishing effects. Imagine an electron in a semiconductor superlattice, which is like a perfectly ordered egg carton, a long chain of quantum wells. An electron placed in one well can normally "tunnel" through the barriers to its neighbors. This tunneling is what allows for electrical current. Now, let's shake this whole system by applying a uniform, oscillating electric field. What do you expect? Surely, shaking the system should make the electron jiggle around more and help it move. But that's not what happens.

Under certain, finely tuned conditions, the exact opposite occurs: the tunneling is completely suppressed! The electron becomes frozen in its well, unable to move to its neighbors, and the material effectively becomes an insulator. This bizarre phenomenon is called dynamical localization. The periodic drive, averaged over a full cycle, creates an effective tunneling rate that can be tuned. And just as you can push a child on a swing with just the right timing to stop them, the driving field can be tuned to make the effective tunneling precisely zero. Again, the condition for this beautiful effect is governed by the zeros of a Bessel function, J₀(α) = 0, where α is a parameter relating the field strength to its frequency. Shaking a system to make it stand still—this is the kind of profound and counter-intuitive magic that Floquet engineering makes possible.
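In the standard high-frequency result for a driven tight-binding lattice (the Dunlap–Kenkre formula), the bare hopping J is renormalized to J_eff = J·J₀(α). The sketch below simply locates the first zero of J₀, where the tunneling switches off; the value of J itself is just illustrative.

```python
from scipy.optimize import brentq
from scipy.special import j0   # Bessel function J_0

J = 1.0                        # bare tunneling amplitude (illustrative)

# First zero of J0: it lies between 2 and 3, so bracket it and root-find
alpha_star = brentq(j0, 2.0, 3.0)

print(alpha_star)              # ~2.405
print(J * j0(alpha_star))      # the effective tunneling J_eff vanishes here
```

Tuning the field strength or frequency so that α crosses this value switches the effective hopping through zero, which is the "shaking to stand still" effect described above.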

Forging New Phases of Matter

The power of quasi-energy goes far beyond manipulating single particles. It allows us to create entirely new, collective phases of matter—materials with properties that are impossible in any static, equilibrium system.

Perhaps the most spectacular success story is the creation of Floquet topological insulators. A normal insulator, as you know, does not conduct electricity. A topological insulator is a strange beast: it is an insulator in its bulk, but its surface or edges are forced by the laws of quantum topology to be perfect conductors. These edge currents are remarkably robust; they can flow around imperfections without scattering.

The astonishing discovery was that one can start with a completely ordinary, "trivial" insulating material and, simply by shining a carefully patterned laser light on it, turn it into a topological insulator. The topology isn't found in the "snapshots" of the Hamiltonian at any instant in time, but in the twisting, winding nature of the quantum evolution over a full cycle. The system can go from topologically trivial to nontrivial by driving it across a phase transition, which is marked by the closing and reopening of a gap in the quasi-energy spectrum.

This opens a door to "on-demand" topological materials. We can switch these exotic edge states on and off with a laser pulse. This idea isn't just for electrons. In topological photonics, similar principles are used to create robust channels for light, guiding photons along paths that are immune to defects. Even more excitingly, these techniques might be a path toward realizing Majorana fermions—exotic particles that are their own antiparticles—in driven superconducting wires. Since Majorana fermions are predicted to be a key ingredient for building a fault-tolerant quantum computer, Floquet engineering has placed us at the forefront of this technological revolution.

As if that weren't enough, periodic driving has given birth to a phase of matter once thought to be impossible: the time crystal. We are familiar with crystals in space, like a salt crystal, where a pattern of atoms repeats itself over and over. A time crystal was originally envisioned as a system whose lowest-energy state exhibits a pattern that repeats in time; such equilibrium time crystals turned out to be forbidden, but periodic driving provides a loophole. A discrete time crystal, realized in a periodically driven quantum system, is even more peculiar. When you drive it with a period T, it doesn't respond with period T. Instead, it spontaneously chooses to oscillate with a period of 2T, or 3T, or some integer multiple of the driving period. It spontaneously breaks the discrete time-translation symmetry of the drive. This subharmonic ticking is a robust phase of matter, not just a simple resonance, and its stability is protected by a gap in its quasi-energy spectrum. It is a fundamentally new form of order, a collective quantum state that ticks with its own internal rhythm.

A New Look at Quantum Computation

The unifying power of a great physical concept is revealed when it appears in unexpected places. The idea of quasi-energy provides a surprising and profound new perspective on, of all things, quantum algorithms.

Consider Grover's algorithm, a famous quantum recipe for searching an unstructured database, like finding a specific name in a giant phone book. The algorithm consists of applying the same two-step operation—an "oracle" kick followed by a "diffusion" mix—over and over again. Each application is an iteration, a step in time.

But wait! A process that repeats the same operation over and over is exactly the kind of stroboscopic dynamics we see in a Floquet system. We can, therefore, view the Grover operator as a Floquet operator, the unitary matrix that evolves the system over one "period" of the algorithm. With this lens, the search for the marked state becomes a problem in Floquet physics! The starting state and the marked state span a two-dimensional plane, and within that plane the Floquet operator has two eigenstates, each with its own quasi-energy. The entire dynamics of the search is nothing more than a simple rotation in this two-dimensional space, with the rotation angle per iteration set by the quasi-energy difference. This beautiful connection gives us a powerful physical intuition for how the algorithm works and why its speed is what it is. It shows that the deep ideas of quantum dynamics are woven into the very fabric of quantum information.
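This correspondence can be verified in a few lines. The sketch below builds the Grover operator for an illustrative database of N = 16 items and reads the rotation angle per iteration directly off the eigenphases (the quasi-energies, in units where one iteration is one period) of the Floquet operator.

```python
import numpy as np

N = 16                                   # database size (illustrative); marked item at index 0
w = 0
s = np.ones(N) / np.sqrt(N)              # uniform superposition: the starting state

# Oracle: flip the sign of the marked state
O = np.eye(N)
O[w, w] = -1.0

# Diffusion: reflection about the uniform superposition
D = 2.0 * np.outer(s, s) - np.eye(N)

# One "period" of the algorithm: the Grover operator as a Floquet operator
G = D @ O

# Eigenphases of the Floquet operator: the two nontrivial ones are +/- theta,
# while all remaining eigenvalues sit at -1 (phase pi)
phases = np.angle(np.linalg.eigvals(G))
theta = np.min(phases[phases > 1e-6])    # rotation angle per iteration

print(theta, 2 * np.arcsin(1 / np.sqrt(N)))   # the two agree
```

The number of iterations needed to rotate onto the marked state scales like π/(2θ) ≈ (π/4)√N, which is exactly the famous quadratic speedup.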

From controlling atoms to forging new states of matter and understanding quantum algorithms, the concept of quasi-energy has proven to be far more than a mathematical curiosity. It is the language of a new kind of physics—the physics of systems in motion, sculpted and engineered by time itself. It is a testament to the fact that sometimes, by shaking things up, we discover a deeper and more beautiful order than we ever thought possible.