All the big tech companies (and at least one U.S. government agency with the acronym NSA) are in a race to be the first to capture computing’s Holy Grail: the qubit. A qubit, or quantum bit, is the basic unit of information in a quantum computer. A qubit is different from a classical bit, which can exist in only one of two states, 0 or 1, at a time.
A bit is used to represent information stored and processed inside computers. It has two possible states: 0 and 1. Think of a bit as a light switch in which the “off” state is represented by a 0 and the “on” state is represented by a 1. That’s how computers process and store information. They are binary systems.
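The light-switch picture maps directly onto how computers store numbers: each bit is one on/off switch, and a row of switches reads as a binary number. A quick illustrative sketch in Python (my example, not from any of the research described here):

```python
# Eight light switches (bits) make up one byte.
bits = [0, 1, 0, 0, 0, 0, 0, 1]

# Read the switch positions as a binary number, leftmost switch first.
value = 0
for b in bits:
    value = value * 2 + b

print(value)       # 65
print(chr(value))  # 'A' -- the same byte interpreted as an ASCII character
```

Every piece of data in a classical computer, from text to video, ultimately reduces to patterns of these on/off states.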
A qubit can exist in either a 0 or 1 state just like a classical bit, but it can also be in a superposition of both 0 and 1 simultaneously. This may not seem like a big deal to us non-quantum-scientist types, but in the realm of information storage and processing it is a very big deal because it will exponentially increase computing capacity and processing speed.
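Superposition can be sketched classically, even though no classical program captures the real thing: a qubit’s state is a pair of amplitudes for 0 and 1, and the squared amplitudes give the odds of measuring each outcome. A minimal, purely illustrative simulation (the function name and representation are my own):

```python
import math

# A qubit's state is a pair of amplitudes (a, b) for the 0 and 1 outcomes,
# with a^2 + b^2 = 1. Measuring yields 0 with probability a^2, 1 with b^2.

zero = (1.0, 0.0)  # behaves like a classical bit set to 0

def hadamard(state):
    """Put a basis state into an equal superposition of 0 and 1."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

plus = hadamard(zero)  # equal superposition: neither 0 nor 1 until measured
print(round(plus[0] ** 2, 3), round(plus[1] ** 2, 3))  # 0.5 0.5
```

The exponential payoff comes from combining qubits: n qubits in superposition carry 2^n amplitudes at once, where n classical bits hold just one n-digit value.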
When quantum computers fully mature as a technology, they will be far superior to today’s fastest computers. Qubits are what could send computing further up the steepening curve of Moore’s Law, which observes that the number of transistors on a chip, and with it processing power, doubles roughly every two years. Moore’s Law, named after Intel co-founder Gordon Moore, who first made the observation in a technical paper in 1965, has held true since 1970.
A computer’s processing capability depends on how many transistors can be packed onto an integrated circuit, the chip that makes up a computer’s “central processing unit,” or CPU. The more transistors, the faster the processing. We’re reaching the end of the road for Moore’s Law, though, because transistors are approaching the physical limits of their materials, mostly silicon and precious metals, and cannot be made much smaller.
In an interview earlier this year, Moore himself predicted the end of his own law. “We won’t have the rate of progress that we’ve had over the last few decades,” he said. “I think that’s inevitable with any technology; it eventually saturates out. I guess I see Moore’s Law dying here in the next decade or so, but that’s not surprising.”
I think Moore is only half right. Every technology does “eventually saturate out.” Like people, a technology goes through a lifespan: it is created, develops, matures, becomes obsolete, and is eventually replaced by something new. But I think Moore is wrong about his own law dying. While current transistor design may hit a plateau, quantum computing promises to increase computer processing speeds beyond our imagination.
Last month, IBM unveiled a square quantum bit circuit design that they claim is “the only physical architecture that could successfully scale to larger dimensions.”
According to an IBM press release, “If a quantum computer could be built with just 50 quantum bits (qubits), no combination of today’s TOP500 supercomputers could successfully outperform it.”
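The “just 50 qubits” figure is about the size of the state space: n qubits in superposition span 2^n amplitudes, so a classical machine simulating them must track all of those numbers at once. A back-of-the-envelope check (my arithmetic, not IBM’s):

```python
n = 50
states = 2 ** n
print(states)  # 1125899906842624, about 1.1 quadrillion amplitudes

# Storing each amplitude as a 16-byte complex number:
bytes_needed = states * 16
print(bytes_needed // 2 ** 50)  # 16 -- i.e., 16 pebibytes of memory
```

Holding roughly 16 pebibytes in memory just to represent one 50-qubit state is beyond any single machine on the TOP500 list, which is why simulation breaks down at around that scale.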
Currently, the world’s fastest supercomputer on the TOP500 list is the Tianhe-2, developed by China’s National University of Defense Technology. The Tianhe-2 comprises 16,000 compute nodes providing over 3 million processor cores, capable of 33.86 petaflops; a petaflop is a quadrillion (thousand trillion) floating-point operations per second.
“Quantum computing could be potentially transformative, enabling us to solve problems that are impossible or impractical to solve today,” said Arvind Krishna, senior vice president and director of IBM Research. “While quantum computers have traditionally been explored for cryptography, one area we find very compelling is the potential for practical quantum systems to solve problems in physics and quantum chemistry that are unsolvable today.”
Another area that quantum computing would have a big impact is in Big Data. The term “big data” refers to collections of data sets so large and complex that they have become difficult to process using traditional data processing systems.
“Big data refers to things one can do at a large scale that cannot be done at a smaller one, to extract new insights or create new forms of value, in ways that change markets, organizations, the relationship between citizens and governments, and more,” wrote Viktor Mayer-Schönberger and Kenneth Cukier in their recent book Big Data: A Revolution That Will Transform How We Live, Work, and Think.
According to IBM, “Quantum computers could quickly sort and curate ever larger databases as well as massive stores of diverse, unstructured data. This could transform how people make decisions and how researchers across industries make critical discoveries.”
IBM’s announcement of its advancement in creating a quantum bit circuit design comes just a month after researchers at Google and University of California, Santa Barbara (UCSB) demonstrated the ability to program groups of qubits to detect certain types of errors that could corrupt a quantum computer’s calculations. That breakthrough comes less than a year after Google formally announced its initiative to build a quantum computer.
Qubits are susceptible to two types of errors: “bit-flip” and “phase-flip.” A bit-flip is when a 0 flips to a 1 or vice-versa. A phase-flip occurs when a qubit is in superposition and the sign of the relationship between its 0 and 1 components is flipped. Both types of errors must be detected and corrected for a quantum computer to perform calculations reliably.
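Using the same amplitude-pair picture of a qubit, the two error types can be sketched classically (an illustration under my own naming, not IBM’s method): a bit-flip swaps the two amplitudes, while a phase-flip negates the sign of the 1 amplitude, which only shows up when the qubit is in superposition.

```python
import math

# A qubit as a pair of amplitudes (a, b) for the 0 and 1 outcomes.
s = 1 / math.sqrt(2)
plus = (s, s)  # superposition of 0 and 1 with the same sign

def bit_flip(state):
    """Bit-flip error: the 0 and 1 amplitudes trade places."""
    a, b = state
    return (b, a)

def phase_flip(state):
    """Phase-flip error: the sign of the 1 amplitude is inverted."""
    a, b = state
    return (a, -b)

print(bit_flip((1.0, 0.0)))  # (0.0, 1.0): a definite 0 flipped to a 1
print(phase_flip(plus))      # same magnitudes, but the relative sign flips
```

Note that a phase-flip leaves a definite 0 or 1 untouched, which is part of why the two error types traditionally required different detection schemes.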
“Up until now, researchers have been able to detect bit-flip or phase-flip quantum errors, but never the two together,” said Jay Gambetta, a manager in the IBM Quantum Computing Group. “Our four qubit results take us past this hurdle by detecting both types of quantum errors and can be scalable to larger systems.”
That’s a big hurdle to clear, and it arguably puts IBM at the front of the pack in a quantum computing race that still has far to go before you have a quantum computer of your own.