Quantum Computing beyond 2026: From Lab Breakthroughs to the Hard Work of Scaling

Jan 18, 2026 · 22 min read
Q2B25 field notes: scaling fault-tolerant quantum systems is now an engineering campaign.

In late 2024, Google's Willow chip hit an error rate of 0.000015%. Quantinuum shipped real-time quantum error correction integrated with Nvidia GPUs. Princeton pushed qubit coherence past 1 millisecond—roughly triple the prior record.

If you follow quantum casually, these might sound like incremental lab improvements. They're not. They mark the shift from “interesting physics” to “engineering a product.”

I spent the last month analyzing Q2B25 conference transcripts across hardware, software, controls, benchmarking, and commercialization. The industry's center of gravity has moved:

The debate is no longer “Is quantum computing possible?” It's “Can we engineer fault-tolerant systems at scale—and can anyone build a business before that day arrives?”


Quantum Computing 2025: The Snapshot

Before we go deep, here are the numbers that matter.

Market signals

  • Commercialization focus: ~93% of Q2B25 talks mentioned commercialization (vs. ~40% in 2020).
  • Potential value: ~$250B across pharma, finance, and materials (high uncertainty, but directionally useful).
  • Near-term market: low single-digit billions today, with healthy growth rates—but still tiny relative to the hype.

Technical milestones

  • ~50 logical qubits demonstrated (best-of-class public results).
  • Order-of-magnitude QEC efficiency gains via overhead reduction techniques (algorithmic + systems).
  • 99.9%+ two-qubit gate fidelity in ion traps.
  • Below-threshold error rates in superconducting platforms (a necessary condition for scalable QEC).

Commercial reality

  • Multiple hardware approaches still competing—no decisive winner yet.
  • Many enterprise projects exist, but most are pilots or co-development (not repeatable product revenue).
  • Almost no pure-play quantum company is profitable today.

1) The real inflection point: “More qubits” is not the story anymore

For years, the headline metric was physical qubits. But physical qubits are fragile: with error rates of 0.1% to 1%, something goes wrong every few hundred operations. You cannot build a reliable computer from components that fail that often.

The only credible path to large-scale, general-purpose quantum computing is quantum error correction (QEC): turning many noisy physical qubits into fewer, more reliable logical qubits. The overhead is brutal—often hundreds to thousands of physical qubits per logical qubit—so the entire industry has become obsessed with reducing that overhead and making QEC real-time.
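To make that overhead concrete, here is a back-of-the-envelope surface-code estimate. This is a sketch under common textbook approximations, not any vendor's roadmap: logical error per cycle ≈ A·(p/p_th)^((d+1)/2), roughly 2d² physical qubits per logical qubit, and the threshold p_th and prefactor A below are illustrative ballpark values.

```python
# Toy surface-code resource estimate (illustrative assumptions).
# Standard approximations:
#   logical error per cycle: p_L ≈ A * (p / p_th) ** ((d + 1) // 2)
#   physical qubits per logical qubit (rotated surface code): 2 * d**2 - 1
# p_th (threshold) and A depend on the code and decoder; the values
# here are commonly quoted ballparks, not measured numbers.

def distance_needed(p_phys, p_target, p_th=0.01, A=0.1):
    """Smallest odd code distance d with estimated p_L <= p_target."""
    d = 3
    while A * (p_phys / p_th) ** ((d + 1) // 2) > p_target:
        d += 2
    return d

def overhead(d):
    """Physical qubits per logical qubit for a rotated surface code."""
    return 2 * d * d - 1

for p in (1e-3, 5e-4):  # physical error rates near today's best
    d = distance_needed(p, 1e-12)
    print(f"p={p:.0e}: distance {d}, ~{overhead(d)} physical qubits per logical qubit")
```

Even with these friendly assumptions, a physical error rate of 10⁻³ lands in the high hundreds of physical qubits per logical qubit for deep-circuit targets, which is exactly why the industry obsesses over overhead reduction.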

Across Q2B25 content, three themes dominated:

  • QEC is becoming engineering, not just physics. Real-time decoding, automated calibration, and systematic workflows are moving from papers into systems.
  • The ecosystem is shifting to system-level integration. Control, calibration, decoding infrastructure, and HPC/cloud integration now matter as much as qubit physics.
  • Commercialization is real—but early. Many customer projects exist, but few resemble scalable product revenue.

A useful model:

  • NISQ: Noisy circuits, interesting experiments, limited correctness guarantees.
  • QEC era: Logical error suppression improves predictably as codes scale.
  • FTQC: Fault-tolerant systems run long computations with undeniable economic value.

2) Five myths that won't die (and why they're wrong)

  • “Quantum breaks all encryption tomorrow.” Breaking RSA-class crypto needs thousands of logical qubits. We're far from that. The practical move is migrating to post-quantum cryptography (PQC) now.
  • “Quantum makes LLMs 100x faster.” Quantum ML has a data loading bottleneck; most ML workloads don't map cleanly to gate models.
  • “Most qubits wins.” 1,000 noisy qubits can be less useful than 10 high-quality logical qubits.
  • “NISQ is useless.” NISQ is valuable for learning workflows, integration, benchmarking discipline, and early customer discovery.
  • “Only PhDs can work in quantum.” The growth areas are software, controls, integration, and applications—many roles don't require a physics PhD.

3) The stack that matters: system engineering wins

Quantum computing is not one breakthrough away. It's a stack:

  • Qubit hardware (physics)
  • Control + readout (timing, RF/laser control, feedback)
  • Calibration + automation (drift correction, yield, characterization)
  • Error correction + decoding (real-time syndromes + classical co-processors)
  • Software + compilation (abstractions, optimization, resource estimation)
  • Applications + business models (why anyone pays)

Even a “best-in-lab” qubit technology loses if its control/calibration/QEC pipeline can't scale economically. This is why “picks-and-shovels” companies (controls, test, compilers, cloud platforms) show up so prominently in commercialization-focused conferences.


4) The hardware horse race: six approaches, six tradeoffs

No single technology has pulled decisively ahead. Each platform optimizes a different bottleneck:

  • Superconducting (Google / IBM / Rigetti): fast gates, strong toolchains, cryogenics + wiring challenges.
  • Trapped ions (Quantinuum / IonQ): best fidelities, slower operations, scaling via modular traps remains the hard part.
  • Neutral atoms (QuEra / Pasqal): large arrays, improving fidelities, promising scaling narrative if error rates keep improving.
  • Photonics (PsiQuantum / Xanadu): room-temp vision, but deterministic photon sources + loss remain difficult.
  • Spin / silicon (Intel + startups): CMOS compatibility long-game, but fidelity and control challenges persist.
  • Topological (Microsoft): potentially dramatic QEC overhead reduction, but proof burden and timelines are uncertain.

5) What's compounding vs. what may hit a wall

Likely to keep compounding

  • QEC engineering + real-time decoding (the main highway).
  • Control systems + automation + testing (manual calibration doesn't scale).
  • Hybrid integration with HPC/GPUs/cloud (quantum augments classical, not replaces it).
  • Resource estimation (turning sci-fi into budgets and engineering plans).
  • PQC migration (real revenue now; compliance and security need it).

Hype likely to get punished

  • “Quantum ML will revolutionize AI next year.” Data loading + unclear advantages make this a late-stage bet.
  • “Our NISQ optimizer beats classical solvers.” Classical OR tooling is brutal; proof must be public and reproducible.
  • “We'll pivot to revenue when hardware scales.” Many companies won't survive long enough without near-term cash flow.

6) Companies: who is building what—and how do they survive financially?

  • Strategic giants (IBM / Google / Microsoft / Amazon): quantum as an embedded option inside cloud and platforms.
  • Full-stack specialists (Quantinuum / IonQ): hardware + software + services; survival depends on renewals, margins, runway.
  • Pure-play hardware (Rigetti / others): highest technical leverage, but toughest cash-flow math.
  • Picks-and-shovels (controls, test, compilers, cloud integration): diversified across architectures, earlier revenue, usually best risk-adjusted exposure.

Hard truth: the valley of death is long. Runway and milestone velocity matter as much as qubit physics.


7) Where value lands first

| Application | Confidence | Likely window | Why |
| --- | --- | --- | --- |
| Chemistry & materials | High | 2028–2032 | Quantum systems map naturally; even modest gains can be worth billions. |
| PQC migration | High | Now (2025–2030) | Compliance + long crypto lifecycles create real enterprise spend today. |
| Optimization (select cases) | Medium | 2027–2032 | Needs reproducible advantage vs. world-class classical solvers. |
| Financial modeling | Medium | 2028–2032 | High-value, high-bar; must beat HPC with accuracy and ROI. |
| Quantum ML | Low | 2035+ | Data loading bottleneck; unclear killer app. |
| Cryptanalysis | Low | 2030–2035+ | Requires large logical qubit counts; mostly nation-state level. |

8) My outlook: 3–7 years

  • 2025–2027: Engineering proof phase. Benchmarking discipline improves; QEC and automation become the core battles; revenue is mostly pilots + gov + PQC.
  • 2027–2032: Early domain value (if QEC keeps scaling). Chemistry/materials likely first; tooling consolidates; M&A increases.
  • 2032+: Platform shift (high uncertainty). The key variable is cost per reliable logical operation.

9) What you can do now (without building qubits)

  • Start with PQC migration. It has budgets today and doesn't require deep quantum physics.
  • Build “picks and shovels.” Controls, testing, compilers, resource estimation, and cloud integration will compound across architectures.
  • Become a translator. If you know pharma, finance, logistics, or materials: learn enough quantum to map real problems to workflows.
  • Invest like it's a long-duration option. Diversify, size small, and focus on runway + milestones—not headlines.
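On the "start with PQC" point: the first concrete step most teams take is a crypto inventory. Here is a minimal sketch of the triage logic; the algorithm names and categories are illustrative (the standard analysis: Shor breaks RSA/ECC/DH, Grover roughly halves symmetric security), and ML-KEM / ML-DSA are the NIST-standardized replacements.

```python
# Sketch of a first PQC-migration step: triage an inventory of crypto
# primitives by quantum risk. Names and sets are illustrative, not a
# complete catalog.

SHOR_BROKEN = {"RSA-2048", "RSA-4096", "ECDSA-P256", "ECDH-P256", "DH-2048"}
GROVER_WEAKENED = {"AES-128", "SHA-256"}  # usable near-term; prefer larger sizes
QUANTUM_SAFE = {"AES-256", "SHA-384", "ML-KEM-768", "ML-DSA-65"}

def classify(algorithm: str) -> str:
    if algorithm in SHOR_BROKEN:
        return "replace with PQC (e.g. ML-KEM / ML-DSA)"
    if algorithm in GROVER_WEAKENED:
        return "acceptable near-term; upgrade to 256-bit variants"
    if algorithm in QUANTUM_SAFE:
        return "quantum-safe at current knowledge"
    return "review"

inventory = ["RSA-2048", "AES-128", "ECDSA-P256", "AES-256"]
for alg in inventory:
    print(alg, "->", classify(alg))
```

The real work is harder than the classifier, of course: finding every place those algorithms are baked into certificates, firmware, and protocols is why migrations take years and why the spend is happening now.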

Closing thought

Quantum computing is entering the hardest part: scaling reliability, economics, and systems integration all at once. The industry is increasingly honest about that—which is a good sign. If you position yourself now, you don't need to predict the exact winner to benefit when quantum crosses from “interesting” to “essential.”