- Quantum computers require reliable qubits, but qubits are error-prone, and correcting those errors consumes much of a machine's hardware.
- A new study from the University of Sydney describes a 2D error-correction architecture that could spot quantum errors using fewer qubits, making quantum machines more efficient.
- This efficiency could lead to more compact quantum hard drives that can encode quantum information more reliably.
The promise of quantum computers has not been overstated. By using the incredible calculating properties of qubits—the quantum equivalent of bits—these machines could speed up medical breakthroughs, discover exotic materials, create unbreakable cybersecurity, and even help solve climate change. For all the amazing advancements made possible by the classical number-crunching of ones and zeroes, quantum computers would elevate human innovation into a whole new era.
There’s just one problem: qubits have a nasty habit of producing errors. This side effect makes sense once you consider what it takes to sustain these little capsules of computation in the first place (temperatures near absolute zero, superconducting magnets, etc.). Qubits will eventually undergo decoherence when exposed to what’s known as “noise” (created by any temperature fluctuation or electromagnetic interference), causing them to produce errors. And if a qubit can’t be held in the quantum state of superposition, where it can be both one and zero at the same time, then it’s basically just a classical bit.
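To make that “both one and zero at the same time” idea concrete, here is the standard textbook description of a single qubit and of what decoherence does to it; this is general background, not notation taken from the Sydney paper:

```latex
% A qubit's state is a weighted blend of the classical values 0 and 1:
\[
    |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
    \qquad |\alpha|^2 + |\beta|^2 = 1 .
\]
% Decoherence scrambles the delicate relationship (the relative phase) between
% alpha and beta. What is left behaves like a weighted coin flip: the qubit
% reads 0 with probability |alpha|^2 and 1 with probability |beta|^2, which is
% no more powerful than a noisy classical bit.
```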
These errors are one of the primary limiting factors of quantum computers. By one estimate, an error occurs roughly once every 1,000 operations, and error rates need to be something like one in a trillion for quantum computing to really take off. Thankfully, two quantum information theorists from the University of Sydney Nano Institute have developed a new architecture capable of suppressing errors in a quantum system using fewer qubits. This breakthrough not only makes quantum information storage more reliable, but also brings the world one step closer to the long-sought quantum hard drive. The results of the study were published in the journal Nature Communications.
“There remain significant barriers to overcome in the development of a universal quantum computer,” University of Sydney’s Dominic Williamson, a co-author of the study, said in a press statement. “One of the biggest is the fact we need to use most of the qubits—quantum switches at the heart of these machines—to suppress the errors that emerge as a matter of course within the technology. Our proposed quantum architecture will require fewer qubits to suppress more errors, liberating more for useful quantum processing.”
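To get a feel for why error suppression eats up so many qubits, and why the gap between one error in 1,000 operations and one in a trillion matters, here is a deliberately simplified sketch. It uses a plain repetition code with majority voting, which is not the Sydney team's scheme, only a toy illustration of the underlying trade-off: redundancy buys reliability, but every extra copy is hardware that can no longer do useful processing.

```python
# Toy illustration only: a repetition code with majority voting, not the
# topological codes described in the Nature Communications paper.
from math import comb

def logical_error_rate(p: float, n: int) -> float:
    """Probability that a majority vote over n noisy copies is wrong,
    when each copy independently flips with probability p."""
    return sum(
        comb(n, k) * p**k * (1 - p) ** (n - k)
        for k in range(n // 2 + 1, n + 1)
    )

p = 1e-3  # roughly one error per 1,000 operations, as cited above
for n in (1, 3, 5, 7):
    print(f"{n} physical copies -> logical error rate ≈ {logical_error_rate(p, n):.1e}")
```

Even in this toy model, pushing the error rate down by many orders of magnitude requires several physical copies for every logical bit, which is exactly the kind of overhead a more qubit-efficient architecture aims to shrink.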
This new, efficient architecture takes error correction into a whole new dimension, literally. Previous 3D error-correction schemes could only mitigate errors along a single dimension of the qubit lattice. The new technique, according to the paper, uses a 3D lattice built from “topological codes with optimal scaling code parameters,” which essentially allows for error correction across two dimensions within that 3D structure. Where the single-dimension nature of earlier methods limited the system, this architecture could open the door to a new era of quantum machines, including quantum hard drives.
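For readers who want to unpack the phrase “optimal scaling code parameters,” quantum error-correcting codes are conventionally summarized by three numbers. The notation below is standard textbook material; the rough 3D bounds quoted in the comments are widely cited results for geometrically local codes (due to Bravyi, Poulin, and Terhal), offered as general background rather than as figures taken from the paper itself.

```latex
% An [[n, k, d]] code stores k logical qubits inside n physical qubits and can
% correct any error that touches fewer than d/2 of those physical qubits.
\[
    [[\,n,\ k,\ d\,]]
\]
% For codes whose parity checks are local in a 3D lattice, known bounds roughly
% limit the parameters to
\[
    k\,d \lesssim n, \qquad d \lesssim n^{2/3},
\]
% and "optimal scaling" means that k and d grow with n fast enough to saturate
% bounds of this kind, rather than one of them staying fixed as n grows.
```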
“This advancement could help transform the way quantum computers are built and operated, making them more accessible and practical for a wide range of applications, from cryptography to complex simulations of quantum many-body systems,” Stephen Bartlett, director of the University of Sydney Nano Institute, said in a press statement.
Of course, creating this kind of lab-based quantum error correction is one thing—scaling such a breakthrough is another challenge entirely. But this new step toward 2D error correction brings humanity undeniably closer to realizing its quantum potential.
Darren lives in Portland, has a cat, and writes/edits about sci-fi and how our world works. You can find his previous stuff at Gizmodo and Paste if you look hard enough.