A milestone simulation of a 12,635-atom protein complex signals quantum computing’s maturation from experimental technology to a viable scientific tool for drug discovery.
Scientists at IBM (NYSE: IBM), Cleveland Clinic, and Japan’s RIKEN have simulated a protein complex of 12,635 atoms, the largest molecule ever modeled with quantum hardware. The achievement, announced May 5, uses a hybrid quantum-classical approach that could compress drug-development timelines currently stretching beyond a decade.
"This work marks an important advance and underscores quantum computing's emerging role on systems of relevance to drug discovery," said Kenneth Merz, Ph.D., lead author of the study and staff scientist in Cleveland Clinic's Computational Life Sciences Department. "By crossing the 12,000-atom barrier, we have significantly expanded the scale of biologically meaningful molecular simulations possible with quantum computing."
The simulation ran on IBM's 156-qubit Heron processors located at Cleveland Clinic and RIKEN, using up to 94 qubits and nearly 6,000 quantum operations. The quantum work was coordinated with two of the world's most powerful classical supercomputers, Fugaku and Miyabi-G. The resulting simulation handled a molecule roughly 40 times larger than the same method could manage just six months earlier, while accuracy in key calculations improved by up to 210 times.
The research directly addresses a primary bottleneck in life sciences: accurately predicting how a drug candidate binds to a target protein. Today’s computational methods struggle with the complexity of large molecules, leading to expensive and lengthy trial-and-error lab work. This quantum-centric approach offers a path to more accurate energy calculations, potentially saving billions in research and development costs across the pharmaceutical industry.
A Hybrid Approach to Molecular Simulation
The breakthrough was enabled by a framework IBM calls “quantum-centric supercomputing,” which pairs quantum processors with classical supercomputers. In this model, the classical machines—Fugaku at RIKEN and Miyabi-G at the University of Tokyo—deconstructed the massive protein-ligand complexes into smaller, computable fragments.
IBM’s Quantum Heron processors then calculated the quantum-mechanical behavior of these individual pieces. The results were reassembled by the supercomputers to create a complete picture of the 12,635-atom molecule. A novel hybrid algorithm, EWF-TrimSQD, was instrumental in reducing the computational overhead, making it possible to simulate a system of this scale. This work builds on previous milestones, including the simulation of the 303-atom Trp-cage benchmark molecule.
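The divide-and-reassemble workflow described above can be sketched in miniature. The toy Python below is an illustrative classical mock, not IBM's actual method: a stand-in function plays the role of the quantum processor solving each fragment, and the classical "reassembly" step uses a two-body many-body expansion (fragment energies plus pairwise corrections). All function names and the toy energy model are hypothetical.

```python
from itertools import combinations

def solve_fragment(fragment):
    """Stand-in for the quantum processor computing one fragment's energy.
    Toy model: atoms are bare 'charges' with a pairwise attraction."""
    return -sum(a * b for a, b in combinations(fragment, 2))

def assemble_energy(fragments):
    """Classical reassembly via a two-body expansion:
    sum of one-body energies plus pairwise corrections (E_ij - E_i - E_j)."""
    e1 = {i: solve_fragment(f) for i, f in enumerate(fragments)}
    total = sum(e1.values())
    for (i, fi), (j, fj) in combinations(enumerate(fragments), 2):
        total += solve_fragment(fi + fj) - e1[i] - e1[j]
    return total

# Toy "molecule" split into three fragments of charge-only atoms.
fragments = [[1.0, 2.0], [0.5, 1.5], [2.0]]
print(assemble_energy(fragments))  # matches solve_fragment on the whole molecule
```

Because the toy energy is purely pairwise, the two-body expansion here is exact; for real quantum-chemical energies the expansion is approximate, which is why the accuracy of the fragment solver (the quantum step) matters so much.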
From Hardware Metrics to Solved Problems
For years, quantum computing’s progress was measured by qubit counts and error rates. This achievement suggests a new metric: the significance of the problems it can help solve. "Quantum computers are no longer proving they are viable tools – they are proving they can contribute meaningful results in quantum-centric supercomputing architectures," said Jay Gambetta, Director of IBM Research.
For investors, this signals a tangible return on decades of R&D for IBM, which trades at a forward P/E ratio of around 19. While the result will not affect near-term earnings, it demonstrates a clear path to applying quantum computing in the high-value pharmaceutical and biotech sectors. Competitors like Alphabet (NASDAQ: GOOGL) and startups such as PsiQuantum and Infleqtion are pursuing different paths to fault-tolerant quantum computing, but IBM's demonstration on a real-world scientific problem gives it a key proof point. The ability to accurately model molecular interactions could become a significant long-term revenue driver as quantum systems are integrated into standard R&D workflows at major pharmaceutical companies like Pfizer and Merck.
This article is for informational purposes only and does not constitute investment advice.