L16 §4 · Build Something Real ~20 min

Decoherence — The Enemy of Quantum

Quantum computers must be kept colder than outer space. They fail if a single stray photon touches them. They require the most isolated environments ever built by human hands — and still decay in microseconds. The reason is decoherence. Understanding it tells you almost everything about why quantum computing is hard to build.

✦ One Idea
Decoherence is when the environment accidentally measures your qubit before you do — destroying the superposition and erasing the quantum information you were trying to compute with. It is not a hardware failure. It is physics.
decoherence · T₁ amplitude damping · T₂ dephasing · coherence time · NISQ era · quantum error correction · fault tolerance
Section 01
① Hook

The Problem That Nearly Killed Quantum Computing

💀
Test your intuition first
This reveals the most brutal practical constraint in quantum computing.

A superconducting qubit in a modern quantum computer can stay in a coherent superposition for approximately how long before the environment destroys it?

In the early 1990s, quantum computing was mostly theoretical. Physicists had shown quantum algorithms could outperform classical ones on paper. But there was a quiet, devastating objection almost everyone took seriously.

Real physical systems are noisy. Atoms vibrate. Photons stray. Magnetic fields fluctuate. Every qubit you build is sitting inside a physical universe that is constantly bumping into it, poking it, and observing it — completely by accident. And you already know what observation does to a qubit.

It collapses the superposition.

If a qubit loses its quantum state faster than you can compute with it, quantum computing is not just hard. It is impossible.

🤔
The objection that shook the field — Rolf Landauer, 1995
Rolf Landauer — one of the founders of information theory — argued publicly that decoherence would always destroy quantum states faster than useful computation could run. He was not a sceptic of quantum mechanics. He understood it deeply. He simply thought keeping a qubit coherent long enough was a physical impossibility. He turned out to be wrong — but it took decades of extraordinary engineering to prove it.

This lesson is about decoherence — what it is, why it happens, and what engineers do about it. It is also about why the existence of working quantum computers today is one of the greatest experimental achievements in the history of physics.

Section 02
② Intuition

What Decoherence Actually Is

You learned earlier that measuring a qubit collapses its superposition. Decoherence is that same process — but done by the environment, not by you, and not on purpose.

🌊 The Perfect Wave
Imagine a perfect, clean wave on the surface of a still pond — precise shape, crests and troughs at exact positions, a specific wavelength. Beautiful. Coherent.

Now the pond is not perfectly still. Wind ruffles the surface. A fish moves. A raindrop falls. Each disturbance adds a tiny random ripple, interfering with your wave — smearing it, distorting it, eventually washing it out entirely.

After enough disturbances, there is no wave left. Just choppy noise.

That is decoherence. The qubit's superposition is the perfect wave. The environment — stray photons, thermal vibrations, fluctuating electromagnetic fields — is the wind and the rain. Each tiny interaction smears the quantum state a little more, until nothing coherent remains.

The key insight — and the reason decoherence is so hard to fight — is that you do not need a dramatic interaction to destroy a quantum state. A single photon touching a qubit is enough. A tiny fluctuation in a nearby electric field. A vibration of the surrounding metal. Any interaction that leaks even one bit of information about the qubit's state into the environment begins destroying the superposition.

The most important thing to understand about decoherence
Decoherence is not a quantum computer breaking. It is a quantum computer being accidentally measured — by the universe itself — before you finish your computation. The environment has no intention. It simply obeys physics. And physics says: any interaction that carries information about a system's state is a measurement. The universe is full of unintended measurements.
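The "smearing" picture can be sketched numerically. The model below is a hedged toy, not any real device: each environmental bump adds a small Gaussian phase kick to the qubit's state, and averaging over many runs shows the coherence washing out as kicks accumulate.

```python
import numpy as np

rng = np.random.default_rng(0)

def coherence_after_kicks(n_kicks, kick_sigma, n_trials=20_000):
    """Toy dephasing model: each environmental 'bump' adds a small random
    phase kick ~ N(0, kick_sigma). The surviving coherence is the magnitude
    of the run-averaged phase factor <e^{i*phi}>, roughly exp(-n*sigma^2/2)."""
    phases = rng.normal(0.0, kick_sigma, size=(n_trials, n_kicks)).sum(axis=1)
    return abs(np.exp(1j * phases).mean())

for n in (0, 10, 100, 1000):
    print(f"{n:>5} kicks -> coherence {coherence_after_kicks(n, 0.05):.3f}")
```

No single kick is dramatic (0.05 radians here), yet a thousand of them leave almost no coherence. That is the pond analogy in numbers.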
Section 03
③ Framework

Why the Environment Is Always Listening

The physical universe is not obliging. It does not stay out of the way because you are trying to compute. Every particle in your quantum processor is surrounded by other particles, fields, and vibrations that are constantly probing it.

🎻 The Violin in a Noisy Room
A violin string vibrating at a pure, precise frequency in a silent concert hall resonates freely — holding its note, vibrating for a long time, maintaining its exact character.

Move that violin to a construction site. Sound waves from pneumatic drills hit it. Air currents from passing trucks disturb it. Vibrations from the building floor couple into it. The pure note gets muddied. Other frequencies creep in. The clean vibration degrades into noise.

The violin has not broken. But the precise, coherent vibration it was maintaining has been destroyed — not by any single catastrophic event, but by constant, unavoidable interaction with a noisy environment. The qubit's situation is identical — and the construction site is everything that exists.

At room temperature, thermal energy is enormous on the quantum scale. Every atom around the qubit is vibrating with enough energy to completely obliterate the qubit's quantum state in nanoseconds. This is why real quantum computers look the way they do.

15 mK
Operating temperature of superconducting quantum computers — 180× colder than outer space (2.7 K) and 20,000× colder than room temperature
~100 µs
Typical T₂ coherence time for a superconducting qubit — the window before dephasing destroys the ability to interfere
~10 ns
Time for a typical two-qubit gate — must be much faster than coherence time to compute before the state collapses

A superconducting qubit stays coherent for around 100 microseconds before decoherence destroys it. Each gate operation takes about 10 nanoseconds. So you have roughly 10,000 gate operations before the quantum state falls apart. For many useful algorithms, that is not nearly enough — especially before factoring in the extra gates needed for error correction.
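The arithmetic behind that gate budget is worth making explicit. A quick sketch using the round figures quoted above, not any specific machine's specs:

```python
# Back-of-envelope gate budget, using the lesson's round figures
T2 = 100e-6         # ~100 µs of usable phase coherence
gate_time = 10e-9   # ~10 ns per gate operation

budget = round(T2 / gate_time)
print(f"~{budget:,} gate operations before the state decays")  # ~10,000
```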

Section 04
④ Theory

T₁ and T₂ — Two Ways a Qubit Can Decay

Physicists distinguish two fundamentally different kinds of decoherence, each with its own timescale and mechanism. Understanding the difference matters because they have different physical causes and respond differently to engineering solutions.

T₁ — Amplitude Damping
⬇️
Energy relaxation — the qubit falls to its ground state
The qubit loses energy to the environment and relaxes from $|1\rangle$ toward $|0\rangle$. Think of a ball rolling to the lowest point in a bowl. Whatever superposition the qubit held, it drifts toward the lowest-energy state — destroying any quantum information. T₁ is the timescale for this energy relaxation. State-of-the-art: ~500 µs for superconducting, minutes for trapped ions.
T₂ — Phase Damping (Dephasing)
🌀
Phase randomisation — interference is destroyed
The probabilities of measuring 0 or 1 stay the same — but the relative phase between $|0\rangle$ and $|1\rangle$ drifts randomly. Since quantum algorithms depend entirely on phase for interference, dephasing destroys computation. Always $T_2 \leq 2T_1$ — and typically $T_2 \ll T_1$. Dephasing is usually the binding constraint for algorithms.
💡
Why dephasing (T₂) matters more than energy loss (T₁) for computation
Quantum algorithms work by engineering precise phase relationships between states — interference is the entire computational mechanism. Phase damping scrambles those relationships randomly. Even if the qubit has not "fallen" to a wrong energy state, dephasing corrupts the phase information that drives interference. A qubit with perfect T₁ but short T₂ is useless for computation — it holds correct probabilities but generates random nonsense when used in a circuit.
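Both decay channels are commonly modelled as simple exponentials (the standard Bloch-equation picture): the excited-state population falls off on the T₁ timescale, the off-diagonal coherence on the T₂ timescale. A minimal sketch with illustrative numbers in the spirit of the lesson:

```python
import math

def excited_population(t, T1):
    """Energy relaxation (T1): probability of still finding |1> at time t."""
    return math.exp(-t / T1)

def phase_coherence(t, T2):
    """Dephasing (T2): magnitude of the off-diagonal term that drives interference."""
    return math.exp(-t / T2)

# Illustrative values: T1 = 500 µs, T2 = 100 µs (T2 is the binding constraint)
T1, T2 = 500e-6, 100e-6
t = 100e-6  # after 100 µs of idling
print(f"population remaining: {excited_population(t, T1):.2f}")  # ~0.82
print(f"coherence remaining:  {phase_coherence(t, T2):.2f}")     # ~0.37
```

After one T₂, interference contrast is down to roughly 37% even though the energy has barely relaxed. This is why T₂, not T₁, usually sets the computational budget.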

Not all qubits are equal — technology comparison

Different physical implementations have very different coherence properties. There is no perfect technology yet. Every platform makes tradeoffs between coherence, gate speed, connectivity, and scalability.

| Technology | T₁ (energy) | T₂ (phase) | Gate time | Connectivity | Scalability |
|---|---|---|---|---|---|
| Superconducting | ~100–500 µs | ~50–200 µs | 10–50 ns ✓ | Limited (2D grid) | Best current (IBM, Google) |
| Trapped Ion | Minutes–hours ✓ | Seconds–minutes ✓ | 1–100 µs (slow) | All-to-all ✓ | Hard to scale |
| Photonic | Immune in transit ✓ | Immune in transit ✓ | Fast ✓ | Very hard to entangle | No good 2-qubit gate |
| Neutral Atom | Seconds ✓ | Milliseconds | ~1 µs | Programmable ✓ | Emerging |
| Spin (Silicon) | ~1 s ✓ | ~1 ms | ~50 ns ✓ | Very limited | CMOS-compatible ✓ |

The pattern is clear: longer coherence and better connectivity usually come at the cost of slower gates or harder manufacturing. The field has not yet found the perfect qubit — and doing so may require combining multiple technologies.

Section 05
④ Theory

How Engineers Fight Back — and Where We Stand

Decoherence is not a solved problem. But engineers have developed a toolkit of strategies that, together, have pushed coherence times from nanoseconds in the 1990s to hundreds of microseconds today — and climbing every year.

1
🧊 Physical Isolation
Remove the environment — make it as absent as possible
Cool to near absolute zero to eliminate thermal photons. Surround the qubit in vacuum to eliminate air molecules. Shield it from electromagnetic radiation. Vibration-isolate the entire apparatus.

IBM's quantum computers sit inside a dilution refrigerator — a machine the size of a car, shaped like a giant chandelier of metal discs, each colder than the last. The qubits at the bottom are at 15 millikelvin. And it is still not cold enough to eliminate decoherence entirely.

Think of a surgical operating theatre — extraordinary isolation effort, extraordinary contamination prevention — and still not zero-risk. The goal is not perfect isolation. It is good enough isolation to finish the algorithm before the state decays.
2
⚡ Dynamical Decoupling
Fight slow noise with fast pulses — average it out before it accumulates
A technique borrowed from NMR spectroscopy. Instead of waiting passively while noise accumulates, apply rapid sequences of carefully timed pulses to the qubit. If the noise is slower than the pulse sequence, the noise averages out before it can accumulate. The quantum state survives longer.

The analogy: balancing a stick on your finger. Stand still — it falls quickly. Make constant small corrective movements — it stays upright far longer. The stick is the quantum state. The corrective movements are the pulse sequence. Dynamical decoupling does not stop decoherence — it buys time against slow noise.
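The stick-balancing idea can be sketched with the simplest pulse sequence: a single mid-point echo pulse (a Hahn echo) against purely quasi-static noise. This is a hedged toy, not a real control sequence: real noise also has fast components a single echo cannot remove.

```python
import numpy as np

rng = np.random.default_rng(1)

def avg_coherence(total_time, echo=False, sigma=2e4, n_trials=5_000):
    """Quasi-static noise toy: each run sees one fixed random detuning
    delta (rad/s). Freely evolving, the phase delta*T differs run to run
    and the average washes out. A midpoint pi-pulse flips the sign of the
    phase, so the two halves cancel exactly for static noise."""
    delta = rng.normal(0.0, sigma, n_trials)
    if echo:
        phase = delta * (total_time / 2) - delta * (total_time / 2)
    else:
        phase = delta * total_time
    return abs(np.exp(1j * phase).mean())

T = 200e-6  # 200 µs of idling
print(f"free evolution: {avg_coherence(T):.3f}")
print(f"with one echo:  {avg_coherence(T, echo=True):.3f}")
```

The echo only cancels noise that is constant over the sequence; faster fluctuations need denser pulse trains (CPMG-style sequences), which is the general dynamical-decoupling trade-off.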
3
🛡 Quantum Error Correction
Accept that errors happen — detect and fix them before they matter
The most powerful approach. The idea seems paradoxical: you cannot measure a qubit without destroying its superposition — so how do you detect an error without collapsing the state?

The answer: spread the information of one logical qubit across many physical qubits. Measure the relationships between qubits (parity checks, stabilisers) — not their individual states. These syndrome measurements detect errors without revealing the logical qubit's state. Find the error, apply a correction gate, continue. The logical qubit survives even though individual physical qubits were corrupted.

The threshold theorem (1996) proved that if physical error rates fall below ~1%, arbitrarily long quantum computations are possible. Today's best systems already meet this threshold on individual gates. The barrier is now the engineering cost of implementing full error correction at scale.
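The parity-check idea can be illustrated classically with the simplest code, the three-qubit bit-flip repetition code. This toy leaves out everything genuinely quantum (superposition, phase errors), but it shows the key move: the syndrome reads relationships between qubits, never an individual value.

```python
def encode(bit):
    """One logical bit spread across three physical bits (repetition code)."""
    return [bit, bit, bit]

def syndrome(q):
    """Parity checks on PAIRS of qubits -- relationships, not values.
    The pair (s01, s12) pinpoints a single flipped qubit without decoding."""
    return (q[0] ^ q[1], q[1] ^ q[2])

def correct(q):
    """Map the syndrome to the culprit qubit and flip it back."""
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(q))
    if flip is not None:
        q[flip] ^= 1
    return q

code = encode(1)
code[2] ^= 1            # the environment flips one physical qubit
print(syndrome(code))   # (0, 1): error located on qubit 2
print(correct(code))    # [1, 1, 1]: logical bit recovered
```

Real quantum error correction replaces these parity reads with stabiliser measurements that likewise reveal only the error, never the encoded state, and must handle phase errors as well as bit flips.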
🔮
The overhead is enormous — and entirely worth it
Current estimates: a fully fault-tolerant logical qubit requires 1,000–10,000 physical qubits. To run Shor's algorithm at scale to threaten real encryption would require millions of physical qubits. Today's largest quantum computers have a few thousand. The gap is real. The work is ongoing. And the people doing it are among the most talented engineers and physicists alive — at Google, IBM, Microsoft, IonQ, PsiQuantum, and dozens of academic labs worldwide.

Where we are now — the NISQ era

Today's quantum computers live in what researchers call the NISQ era — Noisy Intermediate-Scale Quantum. Coined by physicist John Preskill in 2018, the term captures exactly where the field stands.

Now — NISQ Era
🔬
~1,000–5,000 physical qubits, no full error correction
Noisy: decoherence and gate errors present and significant. Intermediate-scale: too large to simulate classically, too small for full error correction. Quantum: genuinely quantum, with superposition and entanglement — just imperfect. Useful for specific research. Cannot run Shor's at practical scale.
Future — Fault-Tolerant Era
🚀
Millions of physical qubits, full error correction
Logical qubits with arbitrarily long coherence through full quantum error correction. Can run Grover's, Shor's, and quantum chemistry at practically useful scale. Timeline: widely estimated at 10–20 years. Every year the hardware gets measurably closer.
💡
Decoherence is an obstacle — not a wall
In 1995, Peter Shor proved quantum error correction was theoretically possible — that decoherence could be fought systematically, not just minimised. In 1996, the threshold theorem showed that if gate errors fall below ~1%, arbitrarily long quantum computations are achievable. Today's best systems already beat that threshold on individual gates. The barrier is now engineering scale and cost — not fundamental physics. Landauer was wrong. Quantum computing is possible. It is just very, very hard.
Section 06
⑤ Interactive

Decoherence Decay Simulator

Watch what decoherence looks like in real time. Choose a qubit technology to set its natural coherence time, adjust environmental noise, toggle shielding, and press Run. Watch the quantum wave become noise and the coherence bar fall to zero.

Lesson Summary

What You Now Know About Decoherence

  • 🌊
    Decoherence is accidental measurement by the environment
    Any interaction that leaks information about a qubit's state into the environment — a stray photon, thermal vibration, fluctuating field — is a partial measurement that gradually destroys the superposition. The environment does not need to be malicious. It just needs to exist and obey physics.
  • ❄️
    This is why quantum computers require extraordinary isolation
    Superconducting qubits operate at 15 millikelvin — 180× colder than outer space. The dilution refrigerators that house them are among the most precisely engineered machines ever built. All of this to remove the environment that would otherwise destroy the quantum state in nanoseconds at room temperature.
  • ⏱️
    T₁ and T₂ set the computational budget
    T₁ (energy relaxation) and T₂ (dephasing) are the two timescales of qubit decay. T₂ ≤ 2T₁ always, and it is usually the binding constraint — because algorithms depend on phase coherence for interference. A qubit with T₂ ≈ 100 µs and gate time ≈ 10 ns gives roughly 10,000 gate operations before the state is lost.
  • 🛡️
    Three strategies: isolate, decouple, correct
    Physical isolation removes the environment. Dynamical decoupling uses pulse sequences to average out slow noise. Quantum error correction spreads logical information across many physical qubits and measures relationships — not states — to detect and fix errors without collapsing the computation. Only the third path leads to fault tolerance.
  • 🔬
    We are in the NISQ era — noisy, intermediate-scale, genuinely quantum
    Today's machines are real, quantum, and research-useful — but too noisy for full error correction. The threshold theorem proves fault tolerance is physically achievable when gate error rates fall below ~1%. Today's best systems already meet this on individual gates. The challenge now is scale: millions of physical qubits per logical qubit.

Decoherence is the enemy.
But there is another fundamental limit —
one that is not about noise or temperature,
but about information itself.
You cannot copy a quantum state. Ever.

→ The No-Cloning Theorem — L17