D-Wave Quantum

Stock Symbol: QBTS | Exchange: NYSE


D-Wave Quantum Inc.: The Unlikely Pioneer of Commercial Quantum Computing

I. Introduction & Episode Roadmap

Picture this: February 2007, the Computer History Museum in Mountain View, California. In a room full of skeptics and cameras, a company almost nobody has heard of is about to do something that, at the time, feels borderline impossible.

The world’s top physics departments are still debating what “real” quantum computing will even look like. Meanwhile, a scrappy Canadian startup named D-Wave walks onstage and demos a machine tackling Sudoku and running a toy “drug discovery” style search for molecules similar to a given compound. The stunt is undeniably showy, but it lands. By the standards of that moment, it looks astonishing—especially compared to what most people think of as “standard” quantum progress back then, where the flashy benchmark was often something like factoring the number 21.

And then the backlash hits.

The reaction from the quantum computing world is swift and brutal. “I think it is not too strong to say they were initially ridiculed by the academic community,” Jeremy O’Brien, a physicist at the University of Bristol, later said. The issue wasn’t just the underlying adiabatic approach—which did have real academic roots, even if the literature was thin. It was the way D-Wave sold it: bold claims, big demos, and a very Silicon Valley style “we built it” confidence that made a conservative scientific community bristle.

That’s D-Wave in a nutshell: a company that has spent more than two decades both ahead of its time and perpetually on trial.

Here’s the paradox we’re unpacking: how did D-Wave—the world’s first, and for years the only, company selling quantum computers—win recognition from customers like Lockheed Martin, which bought a D-Wave system in 2011 for about $10 million, and later draw serious interest from Google… and yet remain controversial among the very researchers whose field it helped push into the commercial world?

To answer that, we’re going to follow the whole arc. The tension between scientific purity and commercial pragmatism. The courage—or stubbornness—to bet on an approach the gate-based crowd didn’t respect. And the razor-thin line between “visionary” and “delusional” in deep tech, where the product barely works, the market barely exists, and the story you tell can matter almost as much as the physics.

We’ll trace D-Wave from its origins in a University of British Columbia lab to its eventual New York Stock Exchange listing. We’ll dig into the annealing versus gate-based fight that defined its reputation. And we’ll end in the present, where the AI boom has breathed new life into optimization—and may finally be giving D-Wave the moment it’s been promising for years.

And yes, something has shifted. In 2025, quantum computing stocks became some of the biggest winners in tech. D-Wave Quantum surged—up 256% on the year—outpacing not just its quantum peers but even the broader market and the so-called “Magnificent Seven.” After years of skepticism, ugly drawdowns, and endless “is this even quantum?” arguments, investors started paying attention again.

Whether that move reflects a real inflection point—or just another spin of the hype cycle—is the question hanging over everything that follows.


II. Quantum Computing 101 & Why It Matters

Before we dive back into D-Wave’s saga, we need to understand the battlefield it chose to fight on.

Classical computers—the ones in your phone, your car, and every data center on Earth—process information using bits. Each bit is either a 0 or a 1. That simple rule is the source of their reliability and their speed… and also the root of their limits. When a problem explodes into a huge number of possible combinations, classical machines still have to grind through candidate solutions essentially one at a time. They do it incredibly fast, and often across many processors in parallel, but they are still checking and discarding possibilities rather than working with a different kind of math.

Quantum computers swap bits for qubits. And qubits don’t have to “pick a side” the way bits do. Thanks to superposition, a qubit can be in a mix of 0 and 1—described not as a definite value, but as probability amplitudes. When you scale that up across many qubits, the system can represent a vast space of possibilities at once. When qubits become linked through entanglement, what happens to one is tied to the others, and the system’s state becomes something you can’t neatly describe as independent parts. That’s where the potential power comes from.

It’s tempting to translate all of this into a single, catchy line like “a qubit is both 0 and 1 at the same time.” It’s not wrong, but it can be misleading. The real point is: quantum mechanics gives you a different kind of math to compute with. And for certain classes of problems, that different math can be a game-changer.
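
If you want to see that “different math” in its smallest form, here is an illustrative sketch in plain NumPy (nothing vendor-specific) that treats a qubit as a two-entry vector of probability amplitudes, applies a Hadamard gate to create an equal superposition, and reads off the measurement probabilities.

```python
# Illustrative only: one qubit represented as two complex probability amplitudes.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)  # the basis state |0>
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate

psi = H @ ket0                 # an equal superposition of |0> and |1>
probs = np.abs(psi) ** 2       # measurement probabilities = |amplitude|^2
print(probs)                   # [0.5 0.5]: a 50/50 chance of reading 0 or 1
```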

If quantum computing works at scale, it could reshape everything from simulating molecules for drug discovery, to optimizing logistics networks, to cracking cryptography, to accelerating certain parts of AI and machine learning. That’s the dream that keeps this field perpetually “ten years away” and perpetually funded.

But the catch is brutal: quantum states are fragile. Interactions with the environment—heat, vibration, electromagnetic noise—can cause decoherence, collapsing the system back into classical behavior and wiping out the advantage. Keeping qubits coherent long enough to do useful work is one of the hardest engineering challenges in modern technology.

And this is the fork in the road where D-Wave’s whole story lives. Because “quantum computing” isn’t one thing. There are two fundamentally different approaches—and they lead to very different products, timelines, and expectations.

Quantum annealing is built for optimization. You take a problem, encode it into the energy landscape of a physical system, and then let physics do what physics loves to do: roll downhill. The machine starts in an easy-to-prepare state and evolves toward a final state whose lowest-energy configuration corresponds to a solution. In plain English: you translate your problem into “find the lowest valley,” then you let the system try to settle into the deepest one.
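
To make “find the lowest valley” concrete, here is a deliberately classical, toy-sized sketch: a few yes/no decisions with pairwise preferences, encoded as an energy function, and the least-conflicted assignment found by brute force. The variables and weights are made up for illustration; this shows the encoding idea, not what the quantum hardware does internally.

```python
# Illustrative only: encode a toy problem as an "energy" over binary choices,
# then find the lowest-energy (least-conflicted) assignment by brute force.
from itertools import product

# Three yes/no decisions; each weight says whether a pair "wants" to differ (+)
# or to match (-). Lower total energy means fewer violated preferences.
weights = {(0, 1): +1.0,   # 0 and 1 prefer to differ
           (1, 2): -1.0,   # 1 and 2 prefer to match
           (0, 2): +0.5}   # 0 and 2 mildly prefer to differ

def energy(bits):
    # Matching pairs contribute +w, differing pairs contribute -w.
    return sum(w if bits[i] == bits[j] else -w for (i, j), w in weights.items())

best = min(product([0, 1], repeat=3), key=energy)
print(best, energy(best))   # (0, 1, 1) -2.5: the deepest valley for this toy problem
```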

Gate-based quantum computing is the general-purpose model most people picture. You initialize qubits, then apply a sequence of operations—quantum gates—like instructions in a program. In theory, this is the path to universal quantum computing: the kind that could run famous algorithms like Shor’s (the encryption-breaker) and Grover’s (the search accelerator). This is the route IBM, Google, and most academic labs have pursued.
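
And here is the gate-model idea in the same toy style: start two qubits in |00>, apply a short “program” of gates (a Hadamard, then a CNOT), and you end up with an entangled state whose measurement outcomes are perfectly correlated. Again, a pedagogical NumPy sketch, not any vendor’s API.

```python
# Illustrative only: a two-gate "program" on two qubits, written as matrix algebra.
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard
I = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)               # flip qubit 1 if qubit 0 is 1

psi = np.zeros(4, dtype=complex)
psi[0] = 1.0                          # start in |00>
psi = CNOT @ (np.kron(H, I) @ psi)    # gate sequence: H on qubit 0, then CNOT

probs = np.abs(psi) ** 2
print(probs.round(3))  # [0.5 0. 0. 0.5]: only 00 and 11 are ever observed, a signature of entanglement
```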

So here’s the simplest way to hold it in your head: gate-based is the generalist; annealing is the specialist. Gate-based is more flexible in principle, but harder to build into something practical. Annealing is narrower—mostly optimization—but potentially achievable sooner.

D-Wave bet everything on the specialist.

That decision is why they got a machine into customers’ hands while much bigger players were still in the lab. And it’s also why the company has spent years being argued over like a court case: not just “does it work,” but “does it count?”

This distinction matters because it’s the foundation of D-Wave’s entire thesis. The company isn’t claiming it will be the universal quantum computer that does everything. It’s arguing something far more pragmatic: that quantum annealing can deliver real value on real problems now—or at least sooner—even if gate-based universality remains a long way off.

And whether you see that as a brilliant shortcut or a permanent dead end determines how you interpret everything that comes next.


III. Origins: From UBC Research to Startup Ambition (1999–2007)

D-Wave starts with a founder who reads like a character someone would tone down in the edit: a wrestler who became a physicist who became an entrepreneur—and never really stopped being any of those things.

“I’ve been doing combative stuff since I was born,” Geordie Rose once said, half-joking, half-confessing. For most of his early life, that meant wrestling: a sport he described as offering the least reward for the most work. He later quipped that the same could be said of D-Wave: “an unbearable amount of pain and very little recognition.”

Rose earned his PhD in theoretical physics from the University of British Columbia. He was also a two-time Canadian national wrestling champion, later won a world title in Brazilian jiu-jitsu, and went on to be named Canadian Innovator of the Year. On paper, it sounds like a jumble. In practice, it’s a pretty clean through-line: obsessive preparation, comfort with discomfort, and a willingness to take on problems other people avoid because they look impossible.

D-Wave itself was founded in 1999 by Haig Farris, Geordie Rose, Bob Wiens, and Alexandre Zagoskin. Farris taught a business course at UBC, where Rose did his PhD, and Zagoskin was a postdoctoral fellow. Even the name had a very “physics lab turned startup” origin: it referred to their first qubit designs, which used d-wave superconductors.

From the start, the company carried this distinctive D-Wave blend of academic credibility and entrepreneurial hustle. The first investment check arrived on May 28, 1999. In hindsight, that moment matters not just because it launched D-Wave, but because it marked one of the earliest serious attempts to fund quantum computing as a commercial industry. After that, Haig and Geordie did what founders always do: they started making the rounds, trying to convince anyone with capital and curiosity to take a leap with them.

The first “yes” came from Barry Lando, a friend of Haig’s and a longtime producer of 60 Minutes. Paul Lee followed. Then Norm Francis came on board—his willingness to both show up for Haig’s class as a volunteer and write a check for something this speculative helped get the whole thing moving. Eventually, they scraped together roughly $500,000 from investors around Vancouver. Not a war chest—just enough runway to keep the idea alive.

And like a lot of ambitious Canadian startups, they eventually pointed themselves south and tried their luck with the real money. Rose and Farris ended up on Sand Hill Road in the final days of the dot-com boom, pitching anyone who would take the meeting. One of Rose’s first shots was Kleiner Perkins. He started explaining quantum computing—physicist mode fully engaged—and about 30 seconds in, the investor cut him off and asked, “Where’s your six-month IPO?” Rose hesitated, trying to process the question. The investor didn’t wait. “Get out.”

Those early years were defined by a strategy that, in retrospect, feels like a startup trying to buy time while the world catches up. From 1999 to 2003, D-Wave’s model was to fund academic research groups to explore ideas that might someday matter, and then own the resulting intellectual property. But by 2003, they decided it wasn’t going to work. They needed to stop sponsoring science and start building hardware.

That pivot changed everything. D-Wave went from a portfolio of bets to a single, brutal, all-in engineering program: build a real quantum computer. One of the first key hires in that era was Andrew, an experimental physicist who became central to the hands-on work—taking the company from its earliest lab steps toward the systems it would eventually ship.

Rose later called the earlier model “just a bad, bad, bad idea.” His new obsession was adiabatic quantum computing—an approach that naturally fit optimization problems, where you’re not asking for one correct answer so much as the best answer. The easy way to picture it is the “wedding seating chart” problem: you’re trying to satisfy a pile of competing constraints at once—best friends near each other, sworn enemies apart, family politics managed—until the whole arrangement settles into something that’s “lowest energy,” meaning least conflicted. Or think protein folding: finding the most stable configuration out of an astronomical number of possibilities. These weren’t parlor tricks. They were the kinds of real-world problems that seemed tailor-made for the one thing annealing promised: letting physics search a landscape of trade-offs.

The financing story kept evolving too, and it only got stranger. In 2012, years after that 2003 pivot, D-Wave secured $30 million in funding from Amazon founder Jeff Bezos and In-Q-Tel, the CIA’s venture capital arm. The fact that In-Q-Tel was willing to back a quantum computing company that early—well before most people could explain the difference between a qubit and a bit—says a lot about how compelling the potential applications looked to the right kind of buyer, and how much conviction D-Wave had managed to build with a certain class of sophisticated investor.

Underneath all of it was the same founding belief: quantum computing could be made practical, but only if you accepted constraints that purists hated. When Rose helped start D-Wave in 1999, he had an engineering degree, some early progress toward a PhD in theoretical physics at UBC, and essentially no concrete blueprint for how to build a quantum computer. What he did have was a founder’s most dangerous asset: the conviction that if you waited for perfect, you’d wait forever.


IV. The Breakthrough Moment: First Commercial System (2007–2011)

The quantum computing community was not ready for what D-Wave did in 2007.

On February 13, 2007, at the Computer History Museum in Mountain View, California, D-Wave rolled out the Orion prototype: a 16-qubit quantum annealing processor. Later that same year, on November 12, D-Wave demonstrated what it said was a 28-qubit quantum annealing processor—fabricated at NASA’s Jet Propulsion Laboratory Microdevices Lab in Pasadena.

This wasn’t just a demo. It was a dare.

At the time, academic labs were still treating “a few entangled qubits” as a publishable milestone. D-Wave, meanwhile, was publicly running applications—most famously, solving a Sudoku puzzle. And that’s exactly where the mistrust crystallized: many computer scientists and physicists argued the device was being driven by plain old classical physics, not quantum mechanics.

The criticism got sharp fast, and it got personal. A common charge was that D-Wave hadn’t produced convincing evidence that the device was genuinely quantum-mechanical. To skeptics, the company looked like it had built an extremely expensive, inefficient classical machine—and then wrapped it in quantum marketing.

No one became the face of that skepticism more than Scott Aaronson, then at MIT and later at the University of Texas at Austin. He didn’t dismiss quantum computing as impossible, and he didn’t call D-Wave a scam. But he did argue that their behavior triggered every alarm bell scientists are trained to recognize. As he wrote at the time: “Let me be clear: I think that quantum computers are possible in principle, and that D-Wave’s approach might even get us there. I’ve also met people from D-Wave; I don’t think they’re frauds. But the human capacity for self-deception being what it is, scientists train themselves to look for red flags—and D-Wave is pretty much a red-flag factory.”

The fight wasn’t really about whether optimization problems were important. It was about what counted as proof, and what counted as progress. The academic community kept coming back to the same objections: D-Wave hadn’t published the sort of rigorous, peer-reviewed evidence that would satisfy the field. They hadn’t definitively demonstrated entanglement. And even if the machine was quantum, there was no clear demonstration that it was faster than the best classical methods on a problem anyone cared about.

Rose’s response was unapologetic. He neither denied nor softened the brashness. He argued that this was how you built a company: you ship systems, you learn from customers, and you iterate in public. He also insisted he had no regrets about the 2007 press event—especially because it got Google’s attention, leading to informal work between Google and D-Wave soon afterward. “We’re not in this business to be popular,” he said.

It was a profound philosophical divide. Academics wanted airtight proof before bold claims. D-Wave believed building machines and getting them into the world was the fastest way to find truth. Neither side was entirely wrong. But they weren’t playing the same game.

Then came the commercial breakthrough that changed the conversation.

On May 11, 2011, D-Wave announced the D-Wave One: an integrated system built around a 128-qubit processor. The processor performed one kind of operation—discrete optimization—using quantum annealing. D-Wave called it the world’s first commercially available quantum computer, and the price tag being quoted was about $10 million.

The headline customer was even more important than the headline price.

Back in November 2010, Lockheed Martin had signed a multi-year contract with D-Wave to explore applying quantum annealing to some of Lockheed’s most challenging computation problems. Now, with the D-Wave One announcement, that relationship became the signal the industry couldn’t ignore: a serious buyer was taking this seriously.

Lockheed wasn’t buying on vibes. According to communications manager Thad Madden, Lockheed spent a year reviewing the D-Wave One before purchasing. The company said it planned to use the technology to help build “cyber-physical systems,” integrating software with environmental sensors.

Lockheed researchers also developed an algorithm aimed at a particularly thorny problem: determining whether software code is bug-free. They argued that, with classical computers, you could never be completely certain. Ray Johnson, Lockheed Martin’s chief technology officer, put it bluntly: “You would never know” for sure if classical-computer code was clean. “But now you can say with certainty,” he said. “We have great hope, and confidence, in the ability of the computer to scale to real-world complex problems.”

For Rose, the sale was the proof point he’d been chasing: quantum computing, he argued, was finally starting to deliver on its decades-long promise. Aaronson wasn’t persuaded. A flagship company buying a system didn’t settle the science. “Just because a flagship company has bought the system, doesn’t mean that it now works,” he said.

And the academic community, broadly, stayed unconvinced. A research team led by Matthias Troyer and Daniel Lidar found evidence consistent with quantum annealing in the D-Wave One, but reported no speed increase compared to classical computers. They built and ran an optimized classical algorithm on the same type of problem and didn’t see D-Wave pulling away.

But here’s the thing: for D-Wave, the Lockheed deal wasn’t about winning a debate. It was about proving you could sell the future before everyone agreed it was real. You didn’t need academic consensus to build a business. You needed a customer willing to spend $10 million on a machine that might—just might—solve problems their existing systems couldn’t.

The controversy didn’t go away. If anything, it became part of the brand. But from this moment on, D-Wave had a claim no one else in quantum could match: it was actually selling quantum computers. And that distinction would shape the next decade.

V. Scaling Up: The NASA, Google, and Enterprise Era (2013–2018)

If Lockheed Martin was validation, Google and NASA felt like coronation—or at least, that’s how it played in the headlines.

On May 16, 2013, Google Research announced the launch of the Quantum Artificial Intelligence Lab, hosted at NASA’s Ames Research Center. The lab was a collaboration between Google, NASA, and the Universities Space Research Association, and it centered on D-Wave’s newest system at the time: the D-Wave Two, a 512-qubit quantum annealer installed at the NASA Advanced Supercomputing Facility.

For D-Wave, the symbolism was hard to overstate. Google was the kind of company that didn’t buy cutting-edge computers because they were trendy. It built its own. So the fact that Google showed up at all suggested that, whatever the academic arguments said, some of the world’s most demanding engineers saw enough promise to start experimenting in earnest.

The lab’s mandate also hit the sweet spot of D-Wave’s pitch: machine learning. Not “quantum computers will replace everything,” but “there are specific, messy, constraint-filled problems—optimization problems—where this approach might help.” Google’s blog post framed it as a search for new kinds of solutions, even calling quantum machine learning “the most creative problem-solving process under the known laws of physics.” The ambition was clear: explore whether annealing could help with the kinds of tasks behind translation, image search, and speech recognition—problems that often boil down to choosing the best option from an overwhelming number of possibilities.

Underneath the marketing, D-Wave was doing what it always did: scaling hardware relentlessly.

After the 128-qubit D-Wave One sold to Lockheed in 2011, D-Wave moved to the 512-qubit D-Wave Two in 2013. In 2015, it announced general availability of the D-Wave 2X, a system with more than 1,000 qubits, and later that year said the 2X had been installed at the Quantum AI Lab at NASA Ames. In January 2017 came the D-Wave 2000Q, which pushed the headline number again, to around 2,000 qubits. Each release told the same story: more qubits, better performance, and an unspoken dare to critics—if this is “just classical,” why is it getting harder to build every year?

But the central critique didn’t go away. The question wasn’t whether D-Wave could manufacture larger machines. It was whether those machines were delivering a meaningful advantage over the best classical methods on problems anyone actually cared about.

D-Wave tried to address that head-on, including by bringing in outside evaluation. It hired an external expert who concluded that the D-Wave Two was dramatically faster—3,600 times faster—than a leading conventional computer on the specific kind of problem the machine was designed to solve. The fine print mattered, of course: “specific kind of problem” is doing a lot of work in that sentence. And even then, researchers like USC’s Daniel Lidar remained unconvinced, arguing that in their tests classical computers were still faster: “Every problem we have tested can still be solved faster on classical computers.”

Then Google’s own experiments added fuel to the fire.

In 2015, researchers from Google’s Quantum AI Lab reported results that seemed, at least on the surface, like the proof D-Wave had been chasing for years. Hartmut Neven, who led the lab, said the team saw a stunning performance gap when the D-Wave system at NASA was raced against a conventional computer running on a single processor. For what Neven described as a carefully crafted proof-of-concept problem, he claimed a “100-million-fold speed-up.”

Scott Aaronson, D-Wave’s most famous skeptic, responded with a mix of genuine interest and familiar caution. He called it “the most impressive demonstration so far of the D-Wave machine’s capabilities,” while still emphasizing that it was “totally unclear” whether this translated into what he’d consider true quantum speedup on practical problems.

And that’s where the argument landed, again: not “did it run fast,” but “what was it running, and why?”

In the Google paper, the authors argued that even if classical approaches like quantum Monte Carlo could match the asymptotic behavior, the constants favored the D-Wave machine massively—on the order of that 100 million figure. The counterargument, echoed by critics, was that the comparison was being made on problem instances that looked a lot like simulating the D-Wave machine itself. If you spend enormous effort building special-purpose hardware, it shouldn’t shock anyone when that hardware outperforms a general-purpose classical machine at tasks tailored to the hardware’s structure. The hard part is mapping a practically important optimization problem onto the D-Wave topology in a way that preserves the value.

Still, this period did change something important: D-Wave was no longer a weird startup with a flashy demo and one defense contractor customer. It had a credible ecosystem forming around it—Google, NASA, and research groups trying to figure out what this thing was good for.

And in 2018, D-Wave made what might have been the most strategically important move of the era: it leaned into the cloud.

That year, the company launched the Leap quantum cloud service, giving developers and organizations real-time access to D-Wave systems, plus tooling and learning resources. It was an acknowledgement of reality. Selling multimillion-dollar machines with long installation cycles was never going to create a broad market. But letting people try quantum annealing without buying the refrigerator-sized hardware? That could.

Leap wasn’t just a product. It was D-Wave quietly rewriting the business model from “buy our box” to “use our capability.” And as the rest of the quantum industry began to gather momentum, that shift would prove to be one of the few advantages D-Wave could defend without having to win every physics argument first.


VI. The SPAC and Going Public: Timing and Turbulence (2019–2022)

By 2022, D-Wave had been building quantum computers for more than two decades. It had shipped multiple generations of hardware, signed up customers from Lockheed Martin to Volkswagen, and survived years of academic skepticism. What it still hadn’t done was the thing that, in Silicon Valley, turns a long-shot technology story into a market story: go public.

Then came the SPAC era.

In 2020 and 2021, special purpose acquisition companies became the shortcut to the public markets—faster and less demanding than a traditional IPO. For a pre-profit company selling a hard-to-explain, hard-to-measure frontier technology, it was an inviting door. D-Wave decided to walk through it.

The plan was to merge with DPCM Capital. The deal was pitched as providing about $340 million in cash and an initial market valuation of roughly $1.6 billion. It was expected to close in the second quarter of 2022, with the combined company trading on the New York Stock Exchange under the symbol QBTS.

Under the transaction terms, D-Wave’s equity value was about $1.2 billion. There was also a twist meant to discourage the biggest SPAC problem of the moment—shareholders redeeming their shares and draining the cash trust before the merger closed. The structure set aside a bonus pool of 5 million shares to be allocated pro rata to DPCM public stockholders who didn’t redeem. In the best case, the combined company would receive $300 million in gross proceeds from DPCM’s trust account—assuming no redemptions—plus another $40 million in gross proceeds via a committed PIPE from strategic and institutional investors.

D-Wave’s CEO, Alan Baratz, framed the moment as more than a financing event. “Today marks an inflection point signaling that quantum computing has moved beyond just theory and government-funded research to deliver commercial quantum solutions for business,” he said. Backing the deal alongside DPCM Capital was a roster of investors that included PSP Investments, Goldman Sachs Asset Management, NEC Corporation, Yorkville Advisors, and Aegis Group Partners. Baratz argued this wasn’t “a moment of hope or science,” but “a moment of practical value creation for customers and investors.”

But the market backdrop shifted under their feet.

By the time D-Wave was trying to close, the SPAC boom had turned into a SPAC winter. And the thing that was supposed to be “assuming no redemptions” became the thing that defined the deal. D-Wave disclosed in a filing with U.S. regulators that DPCM shareholders had redeemed $291 million worth of shares—gutting the expected funding from the trust account. It wasn’t just D-Wave: other quantum companies that had gone public through SPAC mergers, including Rigetti Computing and IonQ, were already struggling to hold their share prices.

Still, the merger crossed the finish line. On August 5, 2022, DPCM Capital announced the completion of its business combination with D-Wave. D-Wave Quantum Inc. became the parent company of both DPCM Capital and D-Wave, and trading began on August 8, 2022 on the NYSE under “QBTS” for common stock and “QBTS WS” for warrants.

The investor pitch was clean and ambitious. D-Wave positioned itself as the only company with a complete, end-to-end quantum solution—hardware, software, a real-time quantum cloud service, developer tools, and hybrid solvers—and said it was the only company building both annealing and gate-model quantum computers. It emphasized five generations of product delivery, a portfolio of more than 200 patents spanning both annealing and gate-based approaches, and commercial relationships that included nearly two dozen Forbes Global 2000 companies.

On the numbers, the story was still early-stage. CFO John Markovich told investors on a webcast that the company had built a bottom-up five-year plan projecting more than 160% revenue growth over the following five years, starting with a targeted $11 million of revenue for 2022. He added that about 40% of that $11 million objective was supported by contracted bookings.

Public markets didn’t reward early-stage narratives for long in 2022. After trading around $10 per share initially, D-Wave’s stock became volatile, with a 52-week range from $3.74 to $46.75.

And the fundamentals were still tough. Revenue remained in the single-digit millions while operating losses grew. D-Wave kept burning cash on R&D and on the expensive work of selling and supporting frontier hardware and cloud services. For a company valued at more than a billion dollars on promise, the present-day business still looked embryonic.

But there was one undeniable milestone: D-Wave was now public. It had access to capital markets, analyst coverage, and the unforgiving discipline of quarterly reporting. Whether that new status would help it survive long enough to reach quantum’s long-promised inflection point was still an open question.


VII. The Inflection Point: AI, Optimization, and Quantum's Second Act (2023–2024)

Then ChatGPT happened.

The AI boom that kicked off in late 2022 didn’t help D-Wave because D-Wave was suddenly doing generative AI. It helped because it changed the conversation about computing. Overnight, the world woke up to a new, bottomless appetite for computation, and a new anxiety about the limits of classical hardware. If the next decade was going to be defined by optimization—of models, of infrastructure, of supply chains, of energy—then D-Wave’s long-running pitch started to sound less like a detour and more like an angle.

D-Wave leaned into that moment and repositioned around what it had always been best at: optimization. Its hardware story had already taken a big step in 2020 with Advantage, an annealing system featuring more than 5,000 qubits and 15-way qubit connectivity. Along the way, D-Wave had continued to attract funding from a mix of backers, including In-Q-Tel, Goldman Sachs, BDC Capital, and PSP Investments. And it wasn’t starting from zero on credibility: its early customer roster still read like a roll call of serious institutions—Google, NASA, Lockheed Martin, and Los Alamos National Laboratory.

Next up was Advantage2. In 2021, D-Wave announced the next-generation system that would become Advantage2, with delivery expected in late 2024 or early 2025. The plan was straightforward: push beyond 7,000 qubits and move to the Zephyr graph topology, increasing per-qubit connections to 20. As of July 2025, D-Wave said a “4,400+ qubit system” was generally available.

On the business side, something finally started to move. D-Wave said its fiscal year 2024 bookings would exceed $23 million—up roughly 120% from fiscal 2023 bookings. It also pointed to a milestone that mattered for more than just the headline: the first-ever customer purchase of a D-Wave Advantage annealing quantum computing system. After years of steering the market toward cloud access and services, this was a sign D-Wave could still sell the big machine, too—broadening the go-to-market beyond “use it in our cloud” to include on-premise system sales.

The company later gave more detail: bookings for the year ended December 31, 2024 were $23.9 million, up from $10.5 million in 2023. The swing was especially sharp in the fourth quarter, when bookings were $18.3 million versus $3.0 million a year earlier. D-Wave also reported a cash position above $300 million.

Then came the scientific claim that D-Wave has been chasing—carefully, controversially—for most of its existence.

In March 2025, D-Wave announced what it described as “quantum supremacy,” saying its latest quantum annealer solved a specialized optimization problem faster than the world’s most powerful classical supercomputers could reproduce the result. The Wall Street Journal noted the performance gap, while also capturing the familiar pushback: skeptics argued the problem was carefully chosen.

D-Wave’s peer-reviewed paper in Science, “Beyond-Classical Computation in Quantum Simulation,” made the more formal case. It reported that D-Wave’s annealing quantum computer outperformed one of the world’s most powerful classical supercomputers on simulations of complex magnetic materials with relevance to materials discovery. The company framed it as the first and only demonstration of quantum computational supremacy on a useful problem. The work came from an international collaboration led by D-Wave, running simulations of quantum dynamics in programmable spin glasses on an Advantage2 prototype and comparing results to the Frontier supercomputer at Oak Ridge National Laboratory.

The team simulated a set of lattice structures and sizes across different evolution times and extracted multiple material properties. D-Wave said its system handled the most complex simulation in minutes, while reproducing the same result on the supercomputer would take nearly a million years—and, as framed by D-Wave, would demand more energy than the world uses in a year given Frontier’s GPU-based architecture.

The reaction was, predictably, not a standing ovation. One researcher attempted to scale the classical calculations to larger problems, though those results hadn’t been published. Another dismissed the dispute as “petty” and argued a tensor-based approach could scale to achieve comparable results. Researchers at École Polytechnique Fédérale de Lausanne told New Scientist that the problems D-Wave solved could be tackled without any need for quantum entanglement.

D-Wave pushed back, saying that while the critics’ work might be an advance, it “does not challenge” the company’s claims of beyond-classical quantum simulation “whatsoever.” And it’s worth remembering: disputed “supremacy” moments are basically a genre now. Google’s 2019 claim was also challenged. D-Wave’s own wording reflected that history; the company used “quantum advantage” rather than leaning exclusively on “supremacy,” signaling something closer to “we beat classical on this” than “case closed forever.” But even a marginal edge can matter—if it shows up on problems that map to the real world.

Public markets certainly treated it like a turning point. The stock surged 992% over the preceding 12 months, putting D-Wave among the standout performers as quantum computing pushed back into mainstream attention.

And big money noticed. After substantial fourth-quarter 2024 buying activity from major funds, D-Wave said institutional ownership reached 55.4%—a striking level of institutional participation for a pre-profit quantum computing company, and a signal that at least some investors were newly willing to underwrite the long game.

VIII. The Annealing vs. Gate Debate: Who Was Right?

Fifteen years of data later, what can we actually say about D-Wave’s foundational bet on quantum annealing?

First, a framing that gets lost in the shouting: D-Wave’s annealers and universal gate-based quantum computers aren’t really trying to be the same product. They share quantum foundations, but they’re built for different jobs and they come with different engineering headaches.

A D-Wave system is a quantum annealer. It runs adiabatic-style algorithms designed to search a huge landscape of possible solutions and settle into a low-energy minimum. That makes it a natural fit for optimization-style work. And in the D-Wave orbit, one of the most cited data points is Google’s claim that, on a specific benchmark, the D-Wave machine was more than 10^8 times faster than simulated annealing running on a single core.

Gate-based quantum computing is the general-purpose vision: build qubits stable enough to run quantum circuits, string gates together, and in principle run arbitrary algorithms. That universality is why gate-model systems get most of the attention. It’s also why they’re so hard: reliability, error rates, and scaling are brutally unforgiving.

The trade-off is clean. Quantum annealing can’t efficiently run Shor’s algorithm, so it’s not the machine that threatens modern cryptography. And it won’t become a universal quantum computer that performs arbitrary quantum computations. It’s a specialist tool, and it always will be.

Which brings us to the honest assessment: D-Wave made the right bet for the moment it was born into. Annealing gave them a path to ship something and commercialize while gate-based systems were still, for the most part, lab-scale experiments. But the bet also boxed them in. If the world’s biggest quantum value ends up being universal, fault-tolerant machines running broad algorithms, D-Wave’s core product will never be “that.”

What’s changed is the tone of the argument.

Seth Lloyd, Professor of Quantum Mechanical Engineering at MIT, put it this way: “Although large-scale, fully error-corrected quantum computers are years in the future, quantum annealers can probe the features of quantum systems today. In an elegant paper, the D-Wave group has used a large-scale quantum annealer to uncover patterns of entanglement in a complex quantum system that lie far beyond the reach of the most powerful classical computer. The D-Wave result shows the promise of quantum annealers for exploring exotic quantum effects in a wide variety of systems.”

That kind of statement would’ve sounded unimaginable during the early years, when the dominant reaction was basically: this isn’t even quantum. Today, the academic posture is far closer to: yes, it’s quantum—now prove it matters.

D-Wave has helped that shift by publishing extensively in peer-reviewed journals, including Nature and Science, and by steadily building evidence that its systems exploit quantum effects. So the debate has matured. The question is no longer “is it quantum?” It’s “is it useful?”

And “useful” is still tricky. The research increasingly suggests the machine really is quantum in its behavior, but it’s not yet settled that this particular kind of quantum computer reliably beats the best classical approaches on broadly practical problems. Annealers solve a narrow class of problems. Even when the machine looks strong on carefully constructed instances, you still have to do the messy real-world work of encoding your business problem into the machine’s format—and it’s not obvious that this translation doesn’t eat the gains. In other words: pragmatic applications may still be some distance away.

From D-Wave’s perspective, the counterargument has always been simple: while the academics debated definitions and benchmarks, D-Wave built. They shipped systems, acquired customers, and generated revenue. Their staying power through years of volatility is, in their telling, the payoff from being early and relentless in quantum annealing.

So who was right?

The gate-based camp was right about universality: annealing is not the road to Shor’s algorithm and general-purpose quantum computing. D-Wave was right about timing: annealing gave them something they could deliver, years before the universal dream could leave the lab. The remaining question—and it’s the one that will decide whether this is a great niche business or a foundational computing company—is whether D-Wave can turn its annealing lead and customer base into durable value beyond the narrowest optimization demos.

IX. Business Model Deep Dive & Strategic Positioning

D-Wave’s business model has had to grow up fast. For years, the company was defined by a single, daunting proposition: if you wanted to work with D-Wave, you bought a very expensive, very specialized machine. Over time, that shifted into a three-part revenue engine: selling on-premises systems, selling access through its Leap cloud platform, and selling professional services to help customers figure out what to do with the technology in the first place.

That evolution shows up in the financials. Revenue for the six months ended June 30, 2025 was $18.1 million, up from $4.6 million for the same period in 2024. During that stretch, D-Wave also completed a $400 million at-the-market equity offering, and said it ended June 30, 2025 with approximately $819 million in cash—its highest quarter-end balance. It also announced general availability of Advantage2, which it described as its most advanced and performant system.

The headline moment inside those results was a throwback to D-Wave’s original model: selling the actual box. In the first quarter of 2025, D-Wave reported $15 million in revenue, up 509% from a year earlier, and said $12.6 million of that came from selling an Advantage2 system to the Jülich Supercomputing Center. The quarter still showed a net loss of $5.4 million, and D-Wave reported a cash balance of more than $304 million at the time—another reminder that even with real revenue, this is still a capital-intensive deep tech company.

What’s strategically interesting about on-premises sales isn’t just the dollars. It’s what the customer gets. Buying an on-premises system gives the customer full access to the Advantage quantum computer, including the ability to modify system parameters and integrate the system in ways that aren’t possible through standard cloud access. D-Wave has positioned this as a complementary offering to Leap, and said demand for on-premises systems has been growing from research centers, academic institutions, high-performance computing centers, and businesses pushing the edge.

At the same time, Leap is the model that makes D-Wave feel like a modern computing company rather than a bespoke hardware vendor. Advantage2 is integrated with hybrid solvers in the Leap quantum cloud service, which D-Wave says support problems with up to two million variables and constraints—framing it as a way to run large-scale, business-critical applications in production. Leap is available in more than 40 countries, and D-Wave emphasizes the enterprise basics: 99.9% availability, sub-second response times for QPUs, and SOC 2 Type 2 compliance.
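
For a sense of what “use our capability” looks like in practice, here is a minimal sketch of submitting a toy problem through D-Wave’s open-source Ocean SDK. The tiny QUBO and the time_limit value are illustrative assumptions; running it requires the dwave-ocean-sdk package and a configured Leap API token, and the exact solver options should be checked against D-Wave’s current documentation.

```python
# A minimal sketch of the Leap workflow via the Ocean SDK (illustrative values).
# Requires the dwave-ocean-sdk package and a configured Leap API token.
import dimod
from dwave.system import LeapHybridSampler

# Toy QUBO: minimize x0 + x1 - 2*x0*x1 (disagreement between x0 and x1 costs energy).
Q = {(0, 0): 1.0, (1, 1): 1.0, (0, 1): -2.0}
bqm = dimod.BinaryQuadraticModel.from_qubo(Q)

# Leap's hybrid solvers accept far larger problems than this toy example;
# D-Wave quotes limits in the millions of variables and constraints.
sampler = LeapHybridSampler()
result = sampler.sample(bqm, time_limit=5)  # seconds

print(result.first.sample, result.first.energy)
```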

The customer list in 2025 reflects the split between research credibility and commercial pull. D-Wave signed new and renewing engagements across both categories, including E.ON, GE Vernova, the UK’s National Quantum Computing Centre, Nikon, NTT Data Corp., NTT DOCOMO, Sharp, and the University of Oxford.

Usage data has become part of the story too—because in cloud businesses, adoption is the product. D-Wave said more than 20.6 million customer problems had been run on Advantage2 prototypes available through Leap since June 2022, with customer use up 134% in the last six months. It also pointed to examples: Japan Tobacco used an Advantage2 prototype in a proof-of-concept project that combined quantum computing and AI for drug discovery, and both Jülich and Los Alamos National Laboratory have used the prototypes in their research programs.

And then there’s the question that hangs over every quantum company that isn’t yet consistently profitable: runway. D-Wave continued to report losses as it invested in R&D and go-to-market. It reported a $27.7 million operating loss in the third quarter alone and an adjusted net loss of $18.1 million, along with negative free cash flow of $55.8 million year to date. Cash burn is a real risk for development-stage companies—but D-Wave also reported $836.2 million in cash on its balance sheet, which it said should fund operations at its current level for several years.

The go-to-market shift might be the cleanest summary of all of this. D-Wave went from “buy a multimillion-dollar system” to “try it first.” Advantage2 is available through Leap in 42 countries, and customers can explore it through D-Wave’s Quantum Launchpad program, which offers three months of free access for testing and development. Trevor Lanting described Launchpad as a “free trial access program” for customers who want to “kick the tires” and see if quantum is right for them.

In other words: D-Wave is still selling moonshot hardware—but it’s finally packaging that moonshot in a way that looks like a product you can actually adopt.

X. Porter's 5 Forces & Hamilton's 7 Powers Analysis

Porter's Five Forces Analysis:

Threat of New Entrants: The barriers are real. Quantum hardware isn’t a “two founders and a laptop” market; it takes deep expertise in superconducting physics, cryogenics, and the messy engineering of keeping qubits stable long enough to be useful. But “hard” doesn’t mean “safe.” The field is swarming with extremely well-funded labs and corporate programs—IBM, Google, and Amazon among them—pursuing different architectures with different claims. D-Wave isn’t fighting off scrappy startups; it’s trying to stay ahead of giants with unlimited patience and enormous R&D budgets.

Bargaining Power of Suppliers: Here, D-Wave has done something quietly important: it has vertically integrated much of what it can, which reduces supplier leverage. Still, the supply chain is not commodity. Exotic components like superconducting materials and dilution refrigerators come from a relatively small set of vendors. D-Wave’s long operating history gives it procurement experience and relationships, but it doesn’t eliminate the fundamental constraint: the parts are specialized, and there aren’t infinite sources for them.

Bargaining Power of Buyers: Buyers in this category are sophisticated—and few. Enterprise and government customers can demand long evaluations, custom work, and proof that a use case is worth the effort. Sales cycles are slow, and even when D-Wave is “working a deal,” that can mean a long lead time before revenue shows up. The company has said it’s encouraged by the interest following its quantum advantage claims and capability demonstrations, but leverage still sits with the customer. The flip side is that once a customer builds workflows and tooling around D-Wave’s systems, switching becomes painful. The catch is that the installed base is still small, so that leverage doesn’t yet scale.

Threat of Substitutes: This is the big one. Classical computing doesn’t stand still. High-performance computing keeps improving, GPUs keep getting better, specialized accelerators keep proliferating, and algorithms keep getting smarter. Even if D-Wave wins the “it’s quantum” argument, it still has to win the business argument against alternatives that are faster to adopt and easier to justify. And because annealing is best suited to a narrow slice of optimization problems, steadily improving classical methods can keep shrinking the window where it is decisively better.

Industry Rivalry: Rivalry is intense, but oddly fragmented—because not everyone is building the same kind of quantum computer. Public markets poured attention back into the sector in 2025, and the “pure-play” names like IonQ and Rigetti became proxy battles for different technical approaches. Their stocks swung wildly, with Rigetti outperforming IonQ year to date. But that stock chart competition reflects something deeper: the market is still trying to decide which approach will matter, and the companies are fighting for mindshare as much as they’re fighting for customers.

Hamilton's 7 Powers Analysis:

  1. Scale Economies: Limited. Quantum systems are still closer to bespoke hardware than mass-manufactured products. The cloud could provide scaling leverage, but only if the user base grows meaningfully.

  2. Network Effects: Weak. Leap has community benefits—tools, examples, shared learning—but customers don’t get dramatically more value simply because other customers are present.

  3. Counter-Positioning: Strong. D-Wave made the contrarian bet: ship annealing systems while the industry’s prestige and attention concentrated on the gate-based path. That created a unique position—and also a binary risk if annealing never becomes broadly valuable.

  4. Switching Costs: Moderate to Strong. If a customer embeds D-Wave workflows into real operations, the cost to unwind that—technical integration, internal skills, organizational process—is meaningful. But because adoption is still early, D-Wave doesn’t yet have switching costs at massive scale.

  5. Branding: Moderate. “First commercial quantum computer” is a powerful story, and peer-reviewed publications in places like Science and Nature help. But years of controversy over speedup claims, plus the post-SPAC stock collapse, have also left reputational scar tissue.

  6. Cornered Resource: Moderate. D-Wave has delivered five generations of systems and built a portfolio of more than 200 patents spanning both annealing and gate-based quantum computing. That’s a real moat in expertise and IP, but it’s not unassailable in a world where giants are investing heavily in competing approaches.

  7. Process Power: Emerging. Fifteen-plus years of building, running, and supporting quantum systems turns into hard-to-copy institutional muscle memory. Leap adds another layer: production-grade access to quantum computers and hybrid solvers, with enterprise expectations around reliability and security. D-Wave has said Leap delivers 99.9% uptime and has supported customers running hundreds of millions of business and research problems since launching in 2018.

Synthesis: D-Wave’s edge comes from being early, contrarian, and operationally experienced—counter-positioning plus emerging process power. What it lacks are the classic compounding advantages of modern tech: massive scale economies and strong network effects. That leaves the company in a precarious but interesting position: defensible enough to matter today, yet still vulnerable if better-funded gate-based competitors deliver truly practical systems. The thing buying D-Wave time is that “practical” still looks years away for much of the gate-based world.


XI. Bull vs. Bear Case

The Bull Case:

D-Wave occupies a strange, potentially valuable corner of the quantum world: it’s one of the few companies that can credibly say it has quantum computers in production and customers actually using them. With the general availability of Advantage2, D-Wave is effectively doubling down on the pitch it’s been making for years: this is a commercial-grade annealing system, designed not for science fair demos, but for real workloads in optimization, materials simulation, and certain AI-adjacent problems.

The Science publication is the kind of milestone D-Wave has chased since the Orion days: peer-reviewed, mainstream validation that its machine can do something classical systems struggle to reproduce. In D-Wave’s framing, the work showed its system modeling the behavior of magnetic materials in a way that pushes beyond what today’s classical supercomputers can practically simulate. The company has described it as the first instance of “quantum supremacy” on a problem with practical relevance—an important distinction from earlier “we beat classical” headlines that revolved around abstract or synthetic benchmarks.

Customer traction is also starting to look more tangible. Advantage2 is now operational at Davidson Technologies’ headquarters in Huntsville, Alabama. D-Wave says the system is expected to tackle mission-critical U.S. government problems, particularly in national defense, and that it will eventually run sensitive applications. The launch is positioned as a key step in a multi-year agreement meant to accelerate quantum adoption across U.S. government agencies.

Internationally, D-Wave announced an agreement with Swiss Quantum Technology SA to deploy an Advantage2 system in Europe. The agreement represents a €10 million commitment from SQT, with an option to purchase the system. D-Wave also said this deployment would support the newly formed Q-Alliance in Italy.

And then there’s the most practical bull point of all: runway. D-Wave completed a $400 million at-the-market equity offering and reported a consolidated cash balance of $819.3 million as of June 30, 2025. For a capital-intensive hardware company that has spent its entire life living between breakthroughs and burn, that kind of liquidity buys time—time for R&D, time to build go-to-market muscle, and time to wait out competitors who are still stuck in the “great prototype” phase.

Meanwhile, gate-based competitors still look years away from reliable commercial utility. Rigetti has a 100-plus-qubit chiplet-based system with 99.5% fidelity and has said it’s targeting an over 1,000-qubit system by 2027. IonQ has reported 99.99% accuracy. But even “high” fidelities can still be too error-prone for practical quantum computing at scale, and many computer scientists argue you need at least 99.9% fidelity before you can even begin layering on error-reduction techniques in a meaningful way. In that context, the bull case is simple: while everyone else is racing toward a future universal machine, D-Wave may have a multi-year window where “useful specialist” beats “promising generalist.”

The Bear Case:

The most existential bear argument is that annealing is a technological cul-de-sac. The industry’s center of gravity is still gate-based quantum computing, and D-Wave’s specialist approach narrows its addressable market. Add to that the limited visibility into D-Wave’s gate-model strategy and progress, and skeptics see a company leaning harder into the one path it already has, not necessarily the one that wins long-term. Even if D-Wave’s commercial success is real, there’s an uncomfortable question underneath it: how big is the market for annealing-based optimization once classical methods keep improving?

Then there’s the mismatch between narrative and revenue. In 2024, D-Wave Quantum reported revenue of $8.83 million, up only slightly from $8.76 million the year before, while losses grew to $143.88 million, 73.9% higher than in 2023. For critics, that’s the warning sign: a company can’t live forever on “the future is coming,” especially when the present-day business remains small and expensive to run.

In April 2025, independent investment research firm Kerrisdale Capital made that critique explicit, arguing D-Wave’s market valuation was “disconnected from its stagnating revenue and lack of broad commercial adoption.” They highlighted concerns about share dilution and questioned the company’s path to profitability.

The scientific claims, too, remain contested in the way quantum claims almost always are. Critics have argued that advances in classical algorithms could narrow or even close the gap on the benchmark D-Wave used, implying that the advantage may be fragile—more about a narrow, carefully constructed problem than a broadly repeatable edge across real-world applications.

Finally, there’s dilution. D-Wave has raised capital repeatedly through at-the-market offerings, expanding the share count and creating a persistent risk for shareholders. The company’s completion of a $175 million equity offering adds working capital, but it also reinforces the bear point: funding the journey has often meant issuing more stock, and there’s no guarantee that future progress won’t require more of the same.


XII. The Competitive Landscape & Future Outlook

D-Wave operates in a quantum computing market that feels paradoxical: it’s crowded with competitors, yet still oddly empty of proven, repeatable commercial applications.

On the gate-based side, IonQ has introduced its Tempo system with an algorithmic qubit score of 64, and it has claimed that its computational space is 260 million times larger than its commercialized Forte system. Rigetti, meanwhile, has laid out an aggressive technology roadmap—targeting a 100-plus qubit system at 99.5% fidelity by the end of 2025, a 150-plus qubit system at 99.7% fidelity in 2026, and a 1,000-plus qubit system at 99.8% fidelity in 2027. The direction of travel is clear. The commercial traction, so far, is not.

The differentiation here isn’t just branding; it’s physics. IonQ’s trapped-ion approach is positioned around accuracy, while Rigetti’s superconducting qubits are positioned around speed—reports have suggested gate operations on the order of 10,000 times faster than trapped ions. These are real technical achievements. They’re just not yet the kind of “it changes how a business runs” achievements that turn quantum computing into a mainstream category.

IonQ and Rigetti share another similarity, too: both are marching toward fault-tolerant quantum computing, both lean on government relationships (including work with partners like DARPA), and both emphasize a future that includes next-generation quantum networking. In other words, they’re building for the long game—because they have to.

And then there’s the looming strategic reality that hangs over all of them: the hyperscalers. It’s hard to build a quantum company in the shadow of Microsoft and Alphabet. These firms have money, talent, distribution, and patience. For public “pure-play” quantum companies whose stock prices often depend as much on market mood as on technical progress, that creates a strange dynamic: the near-term investing case can look fragile, while the long-term payoff—if the technology finally clicks—could be enormous.

D-Wave, for its part, has chosen a different kind of arms race. Instead of chasing universal fault tolerance first, it keeps pushing scale on annealing hardware. Trevor Lanting has been blunt about the ambition: the roadmap points toward 100,000-qubit processors, and Advantage2 is framed as a key step toward that—scaling the processor fabric and developing multichip architectures.

The other pillar of D-Wave’s forward story is the AI wave. The company increasingly positions itself not as “a quantum computer you might need someday,” but as an optimization engine that plugs into the reality of modern machine learning systems. Optimization problems are everywhere in AI—how you allocate compute, how you schedule jobs, how you route data, how you tune models. As AI workloads grow, the hunger for new computational approaches grows with them.

And that’s the cleanest statement of D-Wave’s bet. While more general-purpose quantum processors are still early and experimental, annealing systems are designed for large-scale optimization problems like route planning, scheduling, and certain machine learning tasks—the messy, constraint-heavy problems where classical computing can struggle to find good answers quickly at scale. Whether that niche becomes a bridge to something bigger, or the ceiling of the business, is what the next chapter of quantum will decide.
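To make “constraint-heavy optimization” concrete, here is a minimal, hypothetical sketch of the form these problems take on an annealer: a toy load-balancing problem written as a QUBO (quadratic unconstrained binary optimization) and solved with the classical ExactSolver from D-Wave’s open-source Ocean SDK (the dimod package). The job names and durations are invented for illustration; on actual hardware or the Leap cloud you would swap in a quantum or hybrid sampler, as noted in the comments.

```python
# Minimal illustration of how an optimization problem becomes a QUBO,
# the input format D-Wave's annealers and hybrid solvers consume.
# Uses the Ocean SDK's dimod package; ExactSolver is a small classical
# reference solver, fine for a toy problem of this size.
import itertools
import dimod

# Toy problem: put each job on machine 0 or machine 1 (x_j = 0 or 1)
# so that the two machines carry roughly equal load. Durations are invented.
durations = {"job_a": 3, "job_b": 2, "job_c": 2}
total = sum(durations.values())

# Objective: minimize (load_1 - load_0)**2, where load_1 = sum_j d_j * x_j
# and load_0 = total - load_1. Expanding (2*load_1 - total)**2 over binary
# variables (using x_j**2 == x_j) yields the QUBO coefficients below.
Q = {}
for j, d in durations.items():
    Q[(j, j)] = 4 * d * (d - total)          # linear terms
for (i, di), (j, dj) in itertools.combinations(durations.items(), 2):
    Q[(i, j)] = 8 * di * dj                  # quadratic coupling terms

bqm = dimod.BinaryQuadraticModel.from_qubo(Q, offset=total**2)

# ExactSolver enumerates every assignment -- fine for three variables.
# On D-Wave hardware you would instead use something like:
#   from dwave.system import DWaveSampler, EmbeddingComposite
#   sampleset = EmbeddingComposite(DWaveSampler()).sample(bqm, num_reads=100)
sampleset = dimod.ExactSolver().sample(bqm)
best = sampleset.first
print("assignment:", dict(best.sample), "squared imbalance:", best.energy)
```

The takeaway is the shape, not the toy example: scheduling, routing, and allocation constraints all reduce to this same quadratic-over-binary form, which is the native input of an annealer.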

XIII. Lessons for Founders, Investors, and Technologists

D-Wave’s quarter-century journey offers lessons that reach well beyond quantum computing.

On Deep Tech Entrepreneurship: D-Wave showed that you can commercialize a frontier technology before the scientific establishment is ready to bless it. But the price is living under a microscope. “We’re not in this business to be popular,” Geordie Rose said—easy to declare, harder to endure when your work is publicly dismissed as a “red-flag factory.”

On Market Timing: D-Wave was early—maybe a decade early. In deep tech, “early” doesn’t mean “first-mover advantage” so much as “long, expensive waiting room.” You spend years funding R&D, supporting customers, and building the ecosystem while the killer applications and buying behavior slowly catch up. Along the way, you have to survive more than one quantum winter, when interest drops and capital gets scarce.

On the Credibility Tightrope: D-Wave’s biggest opportunities and biggest problems came from the same place: how it told its story. The adiabatic approach itself rested on genuine, if thin, academic literature; what set off alarms was the go-to-market posture—press events and big claims ahead of the sober, peer-reviewed characterization that much of the quantum community expected. Scott Aaronson, in particular, didn’t hide his view: “They are marketing types who are trying to make the most dramatic claims possible.” Whether you agree or not, the takeaway is clear: in science-adjacent industries, credibility is not a nice-to-have. It’s a product feature.

On Pivoting in Hardware: Software companies can change direction in weeks. Hardware companies do it in years, with factories, supply chains, and physics in the loop. D-Wave’s shift from primarily selling machines, to offering cloud access, to packaging hybrid solvers alongside the hardware is what adaptation looks like when every iteration is capital-intensive and slow.

On Managing Hype Cycles: Quantum computing doesn’t move in a straight line. It moves in waves—breakthrough headlines, then disappointment, then a new narrative. D-Wave survived by finding investors willing to be patient, and by accepting that sustaining losses during down cycles is sometimes the admission price for staying alive long enough to catch the up cycle.

On the “Good Enough” Bet: D-Wave’s core wager was pragmatic: deliver a specialist tool for optimization rather than wait for a universal, fault-tolerant quantum computer. That got them to customers sooner, and it gave them a real product to iterate on. It also narrowed the lane they can dominate. Whether that trade ends up looking like genius or self-limitation still depends on how big the optimization market becomes—and whether annealing keeps winning against improving classical techniques.

On Surviving to the Inflection Point: A lot of early investors have been waiting more than 20 years for meaningful returns. The company has endured near-death moments, leadership changes, and the constant pressure of funding an expensive roadmap—often through dilution. The simplest, hardest lesson might be this: in deep tech, resilience isn’t a virtue. It’s the strategy.

XIV. Epilogue & Recent Developments

As 2025 drew toward the finish line, D-Wave found itself in a place that would’ve sounded absurd back at the Computer History Museum in 2007. The company that got laughed at for claiming it had built a quantum computer had now published peer-reviewed work in Science arguing it could outperform classical supercomputers on a problem with practical relevance. And the stock that once fell to the low single digits had climbed back above $30.

On the product front, D-Wave announced general availability of Advantage2: an annealing quantum computer it described as both more powerful and more energy-efficient, aimed squarely at real workloads in optimization, materials simulation, and AI-adjacent applications. This was D-Wave trying to turn a long-running promise into something closer to a commercial standard: not “look what we can demo,” but “here’s what you can run.”

That message got more concrete with a deployment in Huntsville, Alabama. D-Wave’s Advantage2 system became operational at Davidson Technologies, where it was expected to be used on mission-critical U.S. government problems, particularly in national defense, with an eventual path toward sensitive applications. The company framed the launch as a milestone in a multi-year agreement designed to accelerate quantum adoption across U.S. government agencies.

Financially, D-Wave entered its strongest position yet. It reported a record consolidated cash balance of $819.3 million as of June 30, 2025—enough runway to keep building, keep selling, and, crucially, keep buying time for the hardest part of its thesis: proving that quantum annealing can reach durable product-market fit after two decades of “almost.”

That’s why the lingering question—“zombie company” or “comeback story”—still doesn’t have a clean answer. Revenue remains small relative to the size of the bet. The competitive field keeps tightening as gate-based programs push toward commercial viability. And classical computing continues its relentless improvement, always threatening to shrink the slice of problems where a quantum approach is meaningfully better.

But one thing is no longer debatable: D-Wave outlasted almost everyone’s timeline for its failure. The company that skeptics dismissed as destined for bankruptcy is still here—still building machines, still attracting customers, still making claims that force the industry to respond.

The broader lesson, and maybe the most uncomfortable one, is that being right too early can look indistinguishable from being wrong. Deep tech doesn’t reward conviction quickly. It tests it for years. D-Wave’s story still isn’t finished, but it has already secured a strange kind of legacy in computing history: the company that tried to commercialize the impossible—and then refused to die while waiting for the world to catch up.

Key Performance Indicators to Watch:

For investors tracking D-Wave’s progress from here, three measures matter most:

  1. Quantum Computing as a Service (QCaaS) Revenue Growth — A signal of recurring demand and whether Leap becomes more than a demo pipeline for hardware.

  2. Bookings Growth Rate — A forward-looking indicator of customer momentum, expansion, and near-term revenue potential.

  3. Cash Burn Relative to Revenue Progress — The simplest reality check: whether losses are shrinking as revenue scales, and how long D-Wave can fund the roadmap without needing to return to the market.
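As a back-of-envelope illustration of that third metric, take the figures cited earlier in this piece: roughly $819.3 million in cash and a 2024 net loss of $143.88 million. Treating net loss as a crude proxy for burn (a simplification, since the two diverge in practice) gives a naive runway estimate:

```python
# Naive runway estimate: cash on hand divided by the annual loss rate.
# Assumes net loss approximates cash burn and that neither revenue nor
# spending changes from here -- a simplification, not a forecast.
cash_on_hand_millions = 819.3      # reported consolidated cash, June 30, 2025
annual_net_loss_millions = 143.88  # fiscal 2024 net loss

runway_years = cash_on_hand_millions / annual_net_loss_millions
print(f"Implied runway: ~{runway_years:.1f} years at the 2024 loss rate")
```

By this crude measure the company has several years of room, which is precisely why the metric worth watching is the trend in burn versus revenue rather than the snapshot.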


XV. Further Resources

Top Long-Form Resources:

  1. D-Wave Investor Relations & SEC Filings — The primary source for how D-Wave explains itself to the market: 10-Ks, earnings calls, and SPAC merger documents.

  2. "Quantum Computing: Progress and Prospects" (National Academies, 2019) — A clear, authoritative snapshot of what quantum can do, what it can’t, and what it will take to get from prototypes to impact.

  3. Scott Aaronson’s blog (Shtetl-Optimized) — Often skeptical, always rigorous, and still one of the best ways to understand the arguments around what “counts” as quantum advantage.

  4. D-Wave’s technical publications and white papers — The company’s own evolving case for quantum annealing, written in the language of engineers and customers rather than headlines.

  5. The Science paper: "Beyond-Classical Computation in Quantum Simulation" — D-Wave’s peer-reviewed claim of beyond-classical performance, and the best place to see exactly what was tested and what was compared.

  6. IBM Quantum blog & roadmap — A useful window into the gate-based worldview: how the biggest incumbent thinks about timelines, milestones, and error correction.

  7. MIT Technology Review’s quantum computing coverage — Consistently readable reporting that captures both the breakthroughs and the fine print.

  8. Geordie Rose’s Medium posts — The founder’s perspective, in his own words, on the decisions, trade-offs, and scars that shaped D-Wave.

  9. "Quantum Computing: An Applied Approach" by Jack Hidary — A practitioner-friendly guide that helps connect the physics to real applications.

  10. Quantum Computing Report (quantumcomputingreport.com) — A solid ongoing tracker for the industry: companies, funding, benchmarks, and the shifting competitive landscape.


Last updated: 2025-12-26