The signs of a technology in breakthrough mode are everywhere. Google cracked a 30-year error barrier. Microsoft built a new type of qubit. IBM revealed its Nighthawk processor and a roadmap to the world’s first large-scale fault-tolerant quantum computer. Meanwhile, pure-play quantum stocks posted gains of hundreds — in some cases thousands — of percent over the past year, as investors poured somewhere between $4 billion and $5 billion into quantum startups.
And yet, an uncomfortable truth lingers just beneath the surface of the excitement: Quantum computers have produced extraordinary benchmark results and a handful of promising early-stage commercial demonstrations, but nothing that is commercially indispensable. The field sits in a strange, liminal place: something is clearly happening, yet just as clearly, the revolution is not here yet.
This is the quantum paradox of 2026. The breakthroughs are real. The timelines are real. And the gap between today’s machines and the fault-tolerant quantum computers needed to unlock transformational applications is also very real.
“Why Quantum Is Around The Corner And Why It Is Not” was the title of a panel moderated by The Innovator’s Editor-in-Chief at the World Economic Forum’s annual meeting in Davos. Panelists included IBM Chairman Arvind Krishna; Lene Oddershede, Chief Scientific Officer, Planetary Science and Technology, at the Novo Nordisk Foundation; Doreen Bogdan-Martin, Secretary-General of the International Telecommunication Union; and 2025 Nobel Prize Laureate in Physics John Martinis (watch the video of the Davos panel).
This story builds on that conversation and looks at the breakthroughs in, and barriers to, quantum computing, a technology that promises to optimize business operations through advanced machine learning and simulation, improve the accuracy of disease detection, and secure data transmission with cutting-edge encryption, which is crucial for future-proofing cybersecurity.
Google’s Willow: Cracking The Error Threshold
On December 9, 2024, Google published a paper in Nature announcing that its 105-qubit Willow chip had achieved what the quantum field calls ‘below threshold’ performance — the point at which adding more qubits to a system reduces its overall error rate, rather than increasing it. This had been an unsolved challenge since Peter Shor first proposed quantum error correction in 1995. Every previous quantum system had behaved the opposite way: the more qubits you added, the worse the errors became.
Willow solved that problem by demonstrating exponential error suppression as its qubit array scaled from a 3×3 grid up to a 5×5 and then 7×7 configuration, halving the error rate at each step. As if to make the point in the most dramatic way possible, Google also ran the standard random circuit sampling benchmark on Willow: it completed the task in under five minutes. The equivalent calculation would take the world’s fastest classical supercomputer — the 1.68 exaflop Frontier system at Oak Ridge National Laboratory — approximately 10 septillion years (that is 10 followed by 24 zeros, a number that vastly exceeds the age of the universe).
Critics have been quick to note the limitations of this achievement. The benchmark itself does not correspond to any practically useful computation. The logical error rate Willow demonstrated — around 0.14% per cycle — remains far above the 10⁻⁶ levels estimated to be necessary for running meaningful large-scale quantum algorithms. And Willow’s below-threshold performance was demonstrated in quantum memory, not in logic gate operations. But the principle it proved is enormously important: it shows, for the first time in hardware, that building a large, error-corrected quantum computer is physically possible.
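To make the scaling concrete: below threshold, a surface code’s logical error rate falls exponentially as the code distance d grows. A hedged sketch in standard notation, using only the figures quoted above (Willow’s 3×3, 5×5, and 7×7 grids correspond to distances 3, 5, and 7, and the reported halving implies a suppression factor Λ of roughly 2):

```latex
% Below-threshold scaling of the logical error rate \varepsilon_d for a
% distance-d surface code; \Lambda is the suppression factor per step.
\varepsilon_d \;\propto\; \Lambda^{-(d+1)/2}, \qquad \Lambda \approx 2
% Illustrative sequence consistent with the halving and the ~0.14% figure:
% \varepsilon_3 \approx 0.56\% \;\to\; \varepsilon_5 \approx 0.28\%
% \;\to\; \varepsilon_7 \approx 0.14\%
```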
IBM’s Nighthawk and the 2029 Roadmap
IBM’s strategy has never been to wait for perfection before deploying. The company has operated quantum computers over the cloud for nearly a decade and has a paying user base across finance, pharmaceuticals, and logistics. At its Quantum Developer Conference in November 2025, it unveiled Nighthawk, its most advanced quantum processor to date, featuring 120 qubits connected by 218 next-generation tunable couplers. Nighthawk can accurately execute circuits with 30 percent more complexity than its predecessor, the Heron processor, while maintaining low error rates. IBM expects Nighthawk to support up to 5,000 two-qubit gates, the fundamental entangling operations critical for useful quantum computation, with subsequent iterations delivering 10,000 gates by 2027.
In parallel, IBM announced Quantum Loon, an experimental processor that demonstrates all the key components needed for fault-tolerant quantum computing. The company’s fault-tolerant roadmap centres on IBM Quantum Starling, a system targeting 200 logical qubits capable of executing 100 million error-corrected operations, slated for 2029. The plan extends to 1,000 logical qubits by the early 2030s and quantum-centric supercomputers with 100,000 qubits by 2033.
IBM has already claimed a meaningful real-world result: in September 2025, HSBC announced it had used IBM’s Heron quantum computer to improve its bond trading predictions by 34 percent compared with classical computing alone. IBM also partnered with RIKEN in Japan to simulate molecules at a level beyond classical computers, using the Heron processor alongside the Fugaku supercomputer, in what IBM called a ‘utility-scale’ result.
This month IBM unveiled the industry’s first published quantum-centric supercomputing reference architecture, a new blueprint for integrating quantum computing into modern supercomputing environments. The architecture shows how quantum processors (QPUs) can work alongside GPUs and CPUs, across on-premises systems, research centers, and the cloud, to tackle scientific challenges that no single computing approach can solve on its own.

Designed for today’s workloads and built to evolve over time, the architecture brings quantum and classical systems together into a unified computing environment. It combines quantum hardware with powerful classical infrastructure, including CPU and GPU clusters, high-speed networking, and shared storage, to support computationally intensive workloads and algorithms research.

On top of this foundation, IBM’s approach enables coordinated workflows that span quantum and classical computing. Integrated orchestration and open software frameworks, including Qiskit, allow developers and scientists to access quantum capabilities through familiar tools and workflows, which the company says will make it easier to apply quantum computing to problems in areas such as chemistry, materials science, and optimization.
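To illustrate what ‘familiar tools and workflows’ means in practice, here is a minimal sketch using the open-source Qiskit stack. It is not drawn from IBM’s reference architecture itself: it runs a small entangling circuit on a local simulator (the qiskit and qiskit-aer packages are assumed installed), but targeting real IBM hardware through qiskit-ibm-runtime follows the same build-compile-run pattern.

```python
# Minimal Qiskit workflow sketch: build a small entangling circuit,
# compile it for a target backend, and sample measurement results.
# Assumes: pip install qiskit qiskit-aer  (local simulator, not real hardware)
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

circuit = QuantumCircuit(2)
circuit.h(0)           # put qubit 0 into an equal superposition
circuit.cx(0, 1)       # entangle qubits 0 and 1 with a two-qubit gate
circuit.measure_all()  # read both qubits out into classical bits

backend = AerSimulator()
compiled = transpile(circuit, backend)  # map the circuit to the target's gate set
counts = backend.run(compiled, shots=1024).result().get_counts()
print(counts)  # expect roughly half '00' and half '11' for a Bell state
```

In broad strokes, that build-compile-run-post-process loop is the unit of work a quantum-centric supercomputer must schedule alongside classical GPU and CPU jobs.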
Microsoft’s Topological Gamble: Majorana 1
Microsoft is pursuing the most radical departure from conventional quantum computing. In February 2025, the company unveiled Majorana 1, the world’s first quantum processor powered by a Topological Core — a chip architecture built on exotic quasiparticles called Majorana Zero Modes. Unlike the superconducting qubits used by IBM and Google, which require extensive error correction because they are so sensitive to environmental disturbances, topological qubits are designed to store quantum information non-locally in the fabric of the material itself, making them inherently more stable and far less prone to errors.
Majorana 1 is built from a novel combination of indium arsenide (a semiconductor) and aluminum (a superconductor), cooled to near absolute zero, which creates topological superconductivity — a new state of matter that had previously existed only in theory. Microsoft claims its architecture could eventually scale to a million qubits on a single chip. The company is in the final phase of DARPA’s Utility-Scale Quantum Computing (US2QC) program and says it is on track to build the first fault-tolerant prototype ‘in years, not decades.’
Sceptics note that topological qubits remain unproven at any meaningful scale, and that Microsoft has previously faced scrutiny over earlier Majorana claims that required retraction. The company’s approach is best described as a long-odds, high-reward bet — one that, if it pays off, could make the rest of the field’s careful error-correction work look unnecessary.
Superconducting Startups
Big players like IBM and Google are not the only companies pursuing a superconducting approach to quantum computing. Martinis, a 2025 Nobel Prize winner in physics and the man behind some of the breakthroughs that enabled superconducting qubits, co-founded his own startup, Qolab, a Silicon Valley-based hardware company developing utility-scale superconducting quantum computers. By combining deep physics and engineering expertise with strategic semiconductor partnerships, it hopes to solve the toughest challenges on the path to fault-tolerant quantum computing. (For more on Qolab see the separate interview with Martinis).
Finland’s IQM, which raised $320 million in September at a $1 billion valuation, is also in the superconducting camp. It uses the same broad approach as Google and IBM but differentiates itself through its tunable-coupler design, high fidelity, and novel architecture. It has announced plans to become one of Europe’s first publicly listed companies in the sector.
IonQ: Trapped Ions And Aggressive Targets
IonQ, the publicly traded Maryland-based quantum company, is betting on a different underlying technology from IBM and Google: trapped ions. Rather than manufacturing qubits in silicon, IonQ uses individual charged atoms suspended in electromagnetic fields, which naturally offer longer coherence times and higher fidelity than superconducting systems. In June 2025, IonQ unveiled an accelerated roadmap targeting 20,000 physical qubits across two entangled chips by 2028, potentially equating to around 1,600 error-corrected logical qubits — enough, the company argues, to pose a threat to RSA encryption. IonQ has already delivered a real-world milestone: in March 2025, in collaboration with engineering company Ansys, it ran a medical device fluid simulation on its 36-qubit computer that outperformed classical high-performance computing by 12%, one of the first documented cases of practical quantum advantage in a real application.
There are others. In May of last year Einride, a Swedish transport company specializing in electric and autonomous vehicles, and IonQ entered into a three-year partnership to explore ways to further enhance Einride’s Saga platform through quantum technology. By combining advanced quantum technology with proven classical computing methods, the partnership aims to help navigate the complexity of today’s electric transport ecosystem. The engineering teams at Einride and IonQ have modularized the fleet orchestration problem, allowing quantum algorithms to target the optimization of shipment allocation while accounting for critical real-world constraints across shipments, vehicles, drivers, and charging infrastructure.
Initially the companies assessed 15 potential quantum use cases within the Einride ecosystem, ranging from shipment scheduling and load building to energy trading and enhancing the safety and security of autonomous trucks through improved training, navigation, and quantum key distribution. Current benchmarks validate the effective integration of quantum processing within the existing workflow. This foundational work positions the companies to capture competitive advantages as quantum hardware matures, offering customers optimization capabilities that are unattainable with classical computing alone, says IonQ.
In June of last year IonQ also announced results of a collaborative research program with AstraZeneca, Amazon Web Services and Nvidia to develop and demonstrate a quantum-accelerated computational chemistry workflow with the potential to power world-changing innovation in healthcare, life sciences and chemistry. The IonQ-designed workflow provided an end-to-end example of a hybrid quantum-classical workflow for complex pharmaceutical development challenges, with the potential to improve speed and efficiency in the drug development process. The demonstration focused on a critical step in a Suzuki-Miyaura reaction, a class of chemical transformations used in the synthesis of small-molecule drugs. By integrating IonQ’s Forte quantum processing unit (QPU) with the Nvidia CUDA-Q platform through Amazon Braket and AWS ParallelCluster services, the team achieved a more than 20-fold improvement in end-to-end time-to-solution compared with previous implementations, maintaining accuracy while reducing the overall expected runtime from months to days, according to a press release.
These cases are examples of how quantum computing is already beginning to prove that it can enhance classical workflows and tackle simulation problems more efficiently, says Marco Pistoia, Ph.D., CEO of IonQ Italy, former head of quantum computing at JPMorgan Chase, and principal investigator of the first real-world quantum computing application unattainable on any classical supercomputer, published in Nature in April 2025. “Organizations are using it to tackle real-world problems; it is no longer a hypothesis.”
Quantinuum: A Leader in Fidelity
Quantinuum, valued at $10 billion and majority-owned by Honeywell, has quietly built what many experts consider the world’s highest-fidelity quantum computer. The company’s H-series machines use trapped-ion technology and have consistently set records in quantum volume — the standard benchmark of overall system quality. In a landmark result published in Nature, Quantinuum, working with JPMorgan Chase, Oak Ridge National Laboratory, and others, generated cryptographically certified true randomness — an application with immediate and demonstrable commercial value in cybersecurity. The company’s Apollo roadmap targets a universal fault-tolerant machine by 2030. Oxford Ionics, a UK-based Quantinuum partner, demonstrated 99.99% fidelity for two-qubit gates in 2025, widely considered one of the highest-quality qubit results ever achieved.
D-Wave: The Annealing Outlier
D-Wave, founded more than 25 years ago, has been selling commercial quantum computers longer than any of its peers and takes an entirely different approach from all of the above. Its quantum annealing architecture, now with over 5,000 qubits in its Advantage2 system, is purpose-built for optimisation problems rather than general-purpose computation. In March 2025, D-Wave claimed the world’s ‘first demonstration of quantum computational supremacy on a useful, real-world problem,’ outperforming a classical supercomputer on a magnetic materials simulation. More prosaically, Ford Otosan used D-Wave’s technology to reduce production scheduling times from 30 minutes to less than five. D-Wave generates real commercial revenue where most of its peers do not, suggesting there is a real market for narrow, near-term quantum advantage even before fault-tolerant machines arrive.
PsiQuantum and the Photonic Path
In September 2025, PsiQuantum became the world’s most funded quantum startup, raising $1 billion in a single round to bring its total funding to over $1.3 billion at a $7 billion valuation. The company is building a photonic quantum computer — one that encodes qubits in single photons travelling through optical circuits, rather than in superconducting chips or trapped atoms. The appeal of photons is significant: they operate at room temperature, can be manufactured using existing semiconductor fabrication infrastructure, and can travel long distances without losing coherence. PsiQuantum’s bold claim is that it can reach one million physical qubits by 2027–2028 using standard chip-fab techniques. Critics question whether the extremely high photon loss rates in optical components can be managed at scale.
Amazon’s Ocelot and the Cat Qubit Approach
Amazon Web Services entered the quantum hardware race in 2025 with the Ocelot chip, which takes a hybrid approach combining cat qubits with transmon qubits. Both Amazon’s Ocelot and the processors of French quantum computing startup Alice & Bob are built around cat qubits, a type of superconducting qubit named after Schrödinger’s cat. Cat qubits have a major advantage: inherent protection against bit-flip errors. Increasing the number of photons in the oscillator can make bit-flip error rates exponentially small, meaning that instead of increasing qubit count, you can simply increase the energy of an oscillator, making error correction far more efficient. Both organizations have independently arrived at this as the most promising path to fault-tolerant quantum computing, with Alice & Bob’s founders credited with pioneering much of the foundational work.
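The scaling behind that claim, as a hedged sketch in the notation standard in the cat-qubit literature (exact prefactors are device-specific), is exponential suppression of the bit-flip rate with the oscillator’s mean photon number |α|²:

```latex
% Cat-qubit bit-flip rate versus mean photon number \bar{n} = |\alpha|^2:
% each added photon suppresses bit flips exponentially, so protection is
% bought by raising oscillator energy rather than by adding more qubits.
\Gamma_{\mathrm{bit\text{-}flip}} \;\propto\; e^{-2|\alpha|^2}
```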
Nvidia’s Bid To Build Quantum-Classical Hybrid Supercomputers
Nvidia, meanwhile, announced NVQLink, an open system architecture for coupling its GPUs with quantum processors to build quantum-classical hybrid supercomputers, with over twenty quantum companies joining as launch partners. The message is clear: the largest players in classical computing see quantum as a component of future heterogeneous computing systems, not a standalone replacement.
Spin Qubits: Europe’s Bet
Spin qubits represent one of the most promising approaches to building scalable, fault-tolerant quantum computers. At their core, spin qubits encode quantum information in the intrinsic angular momentum (spin) of electrons, holes, or atomic nuclei confined within nanoscale semiconductor structures called quantum dots. The two possible spin orientations — “spin up” and “spin down” — serve as the quantum analogue of the classical bit’s 0 and 1, while quantum mechanics permits these states to exist in superposition, unlocking the computational power unique to quantum systems.
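In standard notation, such a state is written as a superposition of the two spin orientations, with complex amplitudes whose squared magnitudes give the probabilities of measuring each outcome:

```latex
% A spin qubit state: a weighted superposition of spin-up and spin-down,
% normalized so that the two measurement probabilities sum to one.
|\psi\rangle \;=\; \alpha\,|\uparrow\rangle + \beta\,|\downarrow\rangle,
\qquad |\alpha|^2 + |\beta|^2 = 1
```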
Spin qubits are commonly realized in silicon (Si) or silicon-germanium (Si/SiGe) heterostructures, as well as in germanium (Ge) devices. Their fabrication exploits the very same semiconductor manufacturing infrastructure — photolithography, ion implantation, CMOS processing — that underpins the global microchip industry. This compatibility with existing industrial processes is a defining differentiator of the spin qubit approach compared to superconducting qubits, trapped ions, or neutral atoms.
Intel and Australia’s Diraq are embracing this approach, along with a whole host of European startups. Europe has emerged as a significant center for spin qubit research and commercialization, driven by world-class academic institutions in Germany, the Netherlands, Finland and France.
The concentration of European spin qubit activity in the Netherlands and Germany is no coincidence. QuTech — the collaborative research center of TU Delft and TNO — has been central to global advances in spin qubit science for over a decade, producing foundational results in silicon, germanium, and diamond spin qubits. Its spinout ecosystem now includes Groove Quantum (hardware), Qblox (control electronics), and Orange Quantum Systems (testing equipment), forming a near-complete spin qubit supply chain in Delft.
Germany’s €2 billion national quantum investment (2020–2024) has supported Arque Systems’ development at Forschungszentrum Jülich and RWTH Aachen, as well as collaborations with Infineon Technologies for industrial-grade chip fabrication. The European Innovation Council has been a primary funding vehicle for both the Netherlands’ Groove Quantum and Finland’s SemiQon, reflecting the EU’s strategic prioritization of semiconductor-compatible quantum hardware.
The Netherlands launched Quantum Inspire — Europe’s first public cloud quantum computing platform — which includes access to silicon-spin qubit processors alongside superconducting ones, making the technology directly accessible to researchers and developers across the continent. The EuroHPC Joint Undertaking is also driving toward deploying spin qubit-based systems in major European supercomputing centers, with the Dutch national quantum computer — to be installed in Amsterdam — specified to use spin qubit technology.
The European spin qubit cluster — anchored by QuTech, Forschungszentrum Jülich, and their respective startup ecosystems — is well-positioned to lead this transition. With Arque’s electron shuttling architecture, Groove Quantum’s germanium platform, SemiQon’s cryo-CMOS foundation, and Austria’s ParityQC’s error correction blueprints, Europe has assembled the key components of a full spin qubit stack. The next milestone to watch is the delivery of the first multi-hundred qubit spin qubit systems to European supercomputing centers, expected in the latter half of this decade.
France’s Quobly is a natural and significant member of this ecosystem, though it sits in a somewhat distinct national cluster. Quobly, formerly known as Siquance, is a French quantum computing startup founded in 2022 as a spin-off from CEA-Leti and CNRS, two prominent French research institutions based in Grenoble. The startup’s strategy is to use proven semiconductor technologies to bring an operable quantum computer to market, leveraging the physical properties of semiconductors to fabricate quantum dots, the basis for high-quality spin qubits.
Quobly’s approach is built on silicon spin qubits using Fully Depleted Silicon-On-Insulator (FD-SOI) technology. By adopting a fabless model and utilizing FD-SOI, a commercially available CMOS platform manufactured by industry leaders like STMicroelectronics, GlobalFoundries, and Samsung, Quobly aims to capitalize on decades of semiconductor infrastructure investments. This places it in direct philosophical alignment with Groove Quantum’s germanium platform and SemiQon’s cryo-CMOS foundation: all three are betting that the path to a million-qubit machine runs through standard semiconductor fabs, not bespoke cryogenic hardware.
Quobly also carries a distinctly French national-strategy dimension that gives it a slightly different character from the QuTech/Jülich cluster. Its recent €21 million funding round included a €15 million grant from Bpifrance under the France 2030 program, and it has announced a manufacturing partnership with STMicroelectronics targeting production-readiness by 2027. France is building its own sovereign quantum stack, and Quobly is a centerpiece of that effort.
Although it is early days, Quobly’s quantum computers are poised to be sufficiently fast, integrated, and affordable to enable practical uses for industries that face challenges in chemistry, materials science, optimization, and logistics, says CEO Maud Vinet. “We will be able to provide access to qubit machines by the end of the year,” she says. The machines will be limited to around 12 qubits, but people in the R&D market are willing to pay for a limited quantity to begin testing the technology, she says.
It is an example of how spin qubit technology is transitioning from academic proof-of-concept to early commercialization. The fundamental physics is well understood, world-class gate fidelities have been demonstrated, and the compatibility with CMOS manufacturing is increasingly validated in industrial settings. The core challenge for the next five years is scaling: moving from 12-qubit research chips to hundreds and then thousands of operational qubits while maintaining fidelity and managing the classical control overhead.
China’s Role In The Global Quantum Race
China has made remarkable strides in quantum computing across multiple hardware platforms and on one of quantum computing’s biggest challenges: error correction. It is also making significant moves toward commercialization. The country’s Zuchongzhi 3.0-based system has been opened for commercial use via the “Tianyan” quantum cloud platform. It has reportedly received over 37 million visits from users across 60 countries and handled over 2 million experiments since its launch. China’s new Five-Year Plan (2026–2030) explicitly identifies quantum technology as a new driver of economic growth, signaling a bid for leadership underpinned by a coordinated ecosystem linking universities, research institutes, and industry.
The NISQ Problem
Despite the headlines, the quantum computers that exist today — from Google’s Willow to IBM’s Nighthawk to IonQ’s Forte — belong to a category called NISQ: Noisy Intermediate-Scale Quantum. The term, coined by theoretical physicist John Preskill in 2018, describes machines that have too few qubits and too many errors to perform the algorithms that would make quantum computing genuinely transformative. NISQ hardware is characterized by restricted qubit connectivity, imperfect gate fidelity, limited coherence times, and extreme sensitivity to temperature, vibration, and electromagnetic interference. Any of these factors can cause a qubit to lose its quantum state — a failure known as decoherence — and current machines must operate inside dilution refrigerators cooled to temperatures colder than outer space just to maintain any quantum behaviour at all.
The fundamental barrier is this: the algorithms that would give quantum computing its most powerful advantages, such as Shor’s algorithm for breaking RSA encryption or quantum chemistry simulations capable of designing new drugs and materials, require millions of error-corrected logical qubits running billions of operations with extremely low error rates. Today’s best machines have hundreds of physical qubits and cannot yet demonstrate even a single logical qubit operating at fault-tolerant levels for useful computations. As researchers noted in a 2025 review published in an MDPI journal, ‘achieving a definitive quantum advantage over classical methods for practical problems remains unresolved.’
The Benchmark Problem
The quantum field has a long history of dramatic benchmark demonstrations that turn out not to correspond to any commercially useful calculation. Google’s 2019 ‘quantum supremacy’ claim with its Sycamore processor — the computation that would take a classical supercomputer 10,000 years — was quickly countered by IBM, which showed an improved classical algorithm could do the same task in days. Similarly, Willow’s 10-septillion-year benchmark is based on random circuit sampling, a task specifically designed to be hard for classical computers but which has no practical application. Scientists and investors alike need to be careful to distinguish between benchmark supremacy and practical advantage.
Nvidia CEO Jensen Huang crystallized this skepticism at CES 2025, where he stated that practical quantum computing was 15 to 30 years away. His comments triggered a sharp sell-off in quantum stocks. Huang walked back the timeline in March 2025, acknowledging faster-than-expected progress, and has since backed that revised outlook with over $100 million in quantum computing infrastructure investments, a reminder that even the most prominent technology sceptics are keeping their options open.
The Talent and Infrastructure Gap
Perhaps the most underreported constraint on quantum computing’s progress is the workforce shortage. Quantum error correction (QEC) is the key to unlocking fault-tolerant machines, yet there are estimated to be only 600 to 700 QEC specialists worldwide, and the field will need 5,000 to 16,000 of them by 2030. Training a QEC specialist typically takes up to ten years. The skills pipeline simply does not yet exist to build the machines the industry is promising at the speed the roadmaps imply.
Commercially useful quantum computing also requires the development of entirely new supply chains for cryogenic systems, specialized microwave electronics, novel materials for qubit fabrication, and low-latency classical co-processors for real-time error decoding. These are engineering challenges of extraordinary complexity. As Riverlane, a quantum error correction specialist, noted in its year-end 2025 review, ‘this lack of clear, consistent communication about machine capabilities and real progress has hindered understanding’ — a polite way of saying that the field’s marketing has consistently outpaced its engineering.
What Real-World Progress Looks Like
Amid the noise, genuinely useful quantum results are beginning to emerge — modest by the transformative promises of the field, but real, nonetheless. HSBC’s 34% improvement in bond trading predictions using IBM’s Heron processor is one of the first public examples of a financial institution gaining a measurable edge from quantum computing. IonQ and Ansys’s 12% speedup in medical device simulation is another. D-Wave’s reduction of a Ford Otosan scheduling task from 30 minutes to under five demonstrates that even narrow, non-universal quantum approaches can deliver commercial value today.
In academia, University of Michigan scientists used quantum simulation to solve a 40-year puzzle about the stability of quasicrystals. IBM and RIKEN jointly achieved molecular simulations at a scale beyond classical computing alone. Quantinuum, in collaboration with JPMorgan Chase and national laboratories, generated verifiable certified randomness for cybersecurity applications. In China, scientists at the University of Science and Technology of China reported that their Jiuzhang 4.0 photonic quantum computer achieved quantum advantage in an independent experiment.
The consulting firm McKinsey projects the quantum computing industry could be worth between $28 billion and $72 billion by 2035, up from $750 million in 2024. The DARPA Quantum Benchmarking Initiative, for its part, is investing to determine whether a $1 billion-scale, industrially useful quantum computer can be built by 2033. These numbers reflect not imminent widespread deployment, but a genuine industry forming around the anticipation of future capability.
The “Quantum Advantage” Threshold
At around 50 qubits, a quantum computer can represent and manipulate a superposition of 2⁵⁰ (~1 quadrillion) states simultaneously. This is roughly the point where classical supercomputers begin to struggle to simulate the quantum system exactly — you’d need petabytes of RAM just to store the full quantum state. So 50 qubits is often cited as where quantum systems start to operate beyond classical brute-force simulation.
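The arithmetic behind that memory claim, assuming the conventional 16 bytes per complex amplitude (two double-precision floats):

```latex
% Exact state-vector simulation of n qubits stores 2^n complex amplitudes.
% For n = 50 qubits:
2^{50}\ \text{amplitudes} \times 16\ \text{bytes} \;=\; 2^{54}\ \text{bytes}
\;\approx\; 1.8 \times 10^{16}\ \text{bytes} \;\approx\; 18\ \text{petabytes}
```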
The Magne quantum computer, provided by Atom Computing and Microsoft and to be located in Copenhagen, is expected to hit that threshold later this year.
This is when we will begin to see real-world use cases, though quantum computing still won’t be applicable to all problems, says the Novo Nordisk Foundation’s Oddershede. Together with the Export and Investment Fund of Denmark, her organization has acquired the Magne computer, which will be operational in January 2027, she said in an interview with The Innovator.
Every company is being advised to become quantum ready. Which industries will likely see the most impact first? “Number one is going to be simple materials, molecules,” IBM’s Krishna said during the Davos panel. “Think lubricants that can reduce the amount of energy needed right now. We only get 30% of the oil out from an oil well; if we can create a better lubricant and maybe need 40% less drilling, there will be less environmental impact. Maybe we can use quantum to create a better material to do carbon sequestration. The materials industry is a multi-trillion-dollar industry, so even 5% productivity could have massive impact. I hold aside hope that we can invent a better fertilizer. Quantum can be applied to certain problems in finance. Complicated instruments are very hard to solve through the current methods.”
The Road Ahead
The picture that emerges from the latest round of announcements is of a field that has genuinely crossed several important scientific thresholds but has not yet bridged the gap to practical utility at scale.
The different approaches being pursued — superconducting qubits (IBM, Google, Rigetti, IQM), trapped ions (IonQ, Quantinuum), topological qubits (Microsoft), neutral atoms (QuEra, Atom Computing, Pasqal), photonics (PsiQuantum, Xanadu), and silicon spin qubits (Intel, Diraq, Quobly, Arque, Groove Quantum, SemiQon and ParityQC) — mean there is no consensus yet on which technology will dominate. The market may ultimately support several different quantum architectures optimized for different problem types, much as classical computing supports CPUs, GPUs, and specialized processors side by side.
What the field’s leading indicators do suggest is that the period between 2026 and 2033 will be decisive. IBM is promising a 200-logical-qubit fault-tolerant machine by 2029. IonQ targets a cryptographically relevant quantum computer by 2028. Google’s CEO has hinted at a useful error-corrected machine around 2029. Microsoft says its fault-tolerant prototype is ‘years, not decades’ away.
The Quantum Insider, in its 2026 predictions, cautions that ‘announcements will continue — some significant, some incremental’ and warns that ‘advances may be marketed more aggressively than the underlying data justifies.’
2025 Nobel Laureate in Physics Martinis bemoans the “murky” marketing language used to hype quantum and rejects the notion that the science has been proven and it is now just an engineering problem. “I understand companies want to give that message so that all the stockholders feel good about what they’re doing,” Martinis said in an interview with The Innovator. “And in some sense, it’s true, but I try to be a little bit more forthright about how there’s still some real big problems to solve. I think they can be solved, but we still need to do it, and it’s not easy.”
The key advice for organizations and investors watching this space is to look past the benchmarks and toward the specific engineering problems being solved, the quality of the qubits being demonstrated, and the real-world problems for which a quantum advantage has been genuinely verified.
The quantum future is real. It is also, for now, still arriving.
