Test Tells Real Quantum Computer Chips from Fakes
Quantum computers use quantum mechanics to perform computations at high speed.
CREDIT: University of Southern California, Lockheed Martin
Quantum computers — computers that use quantum mechanics to store data and perform computations — have the potential to be much faster and more powerful than classical computers, scientists believe.
But making a quantum computer is easier said than done. That's why a new paper by scientists at the University of Southern California and defense contractor Lockheed Martin is so significant: They have devised a method to determine whether a quantum computing processor is indeed using quantum mechanics to perform its calculations.
This particular experiment was conducted using a quantum optimization processor with 108 qubits — the quantum equivalent of the classical "bit," the most basic unit of information that computers use to store data. [See also: What is Quantum Computing?]
The chip, nicknamed Rainier, was made by D-Wave, a Canada-based company that specializes in quantum computing.
During the experiment, the USC scientists took various groups of eight qubits from the chip's 108 and tested them together.
The experiment confirmed that Rainier was functioning in a way that was consistent with quantum mechanics and inconsistent with classical mechanics. In other words, a classical understanding of computation was unable to explain the way this chip was operating.
When scientists refer to things as "classical" or "quantum," they're referring to two different approaches to physics. "Classical" refers to the conception of physics based on Newton's laws of motion.
In the late 19th and early 20th centuries, physicists began to realize that Newtonian, or classical, physics didn't apply to certain particles, especially those operating at an extremely small scale.
At this scale, scientists believe, particles don't have precise speeds or locations. Instead of a precise data set, the particle has a series of probable speeds and locations, called a wave function.
Quantum computing is about applying these principles to computation.
At their most basic level, computers store data as long series of ones and zeroes, called binary code. Each individual one or zero is called a bit.
In classical computing, a bit can be either a one or a zero. In quantum computing, a qubit can be both, because in the quantum understanding of particles, a particle has probable data, not precise data.
Because a qubit can hold multiple values at once, a quantum computer can perform multiple computations simultaneously, a form of parallel processing.
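As a rough illustration (and not a description of how D-Wave's hardware actually works), a register of n qubits is described by 2**n probability amplitudes, all present at once, while a classical n-bit register holds only one of its 2**n values at a time:

```python
import itertools
import math

# A classical 3-bit register can take exactly one of 8 values at a time.
classical_states = list(itertools.product([0, 1], repeat=3))
print(len(classical_states))  # 8

# A 3-qubit register is described by 2**3 = 8 complex amplitudes, all at
# once; the squared magnitude of each amplitude is the probability of
# measuring the corresponding classical value.
n = 3
amplitudes = [1 / math.sqrt(2**n)] * (2**n)  # an equal superposition
probabilities = [abs(a) ** 2 for a in amplitudes]
print(round(sum(probabilities), 10))  # 1.0 -- probabilities sum to one
```

The exponential growth of the amplitude list is why simulating even modest numbers of qubits quickly overwhelms classical machines.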
It's possible to think of qubits as the quantum equivalent of the classical bit, but in reality, the way these two systems approach storage of information is entirely different. [See also: Future Quantum Computers Begin to Take Shape]
Quantum computing can also take advantage of a process called quantum tunneling, which allows quantum computers to perform their calculations at drastically lower energy levels.
The idea, explained Federico Spedalieri, a computer scientist at the Information Sciences Institute of the USC Viterbi School of Engineering, is that when performing calculations, "you want to find the lowest energy state of a system." During the process of finding that lowest state, it's possible for the computer to find itself in a dead end, surrounded only by higher energy states. These high-energy states can be thought of as walls.
In classical computing, the computer would have to expend a spike of energy to surmount these walls, but quantum particles can simply "tunnel" through them. In other words, explained Spedalieri, "quantum mechanics allows you to tunnel to the lowest energy state." This entire process is called "quantum annealing" and could help computers perform not only more quickly but also with less energy.
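A toy sketch can make the "walls" picture concrete. The energy landscape below is invented for illustration (it is not D-Wave's annealing procedure): a purely downhill classical search gets stuck in a local minimum because every neighboring state is uphill, even though a lower-energy state exists beyond the wall.

```python
# Hypothetical 1D energy landscape: positions 0-5 with made-up energies.
# There is a local minimum at x=1, walled off from the global minimum at x=4.
energy = {0: 5, 1: 1, 2: 6, 3: 4, 4: 0, 5: 7}

def greedy_descent(x):
    """Classical greedy search: move to a neighbor only if energy drops."""
    while True:
        neighbors = [n for n in (x - 1, x + 1) if n in energy]
        best = min(neighbors, key=energy.get)
        if energy[best] >= energy[x]:
            return x  # stuck: every neighbor is a higher-energy "wall"
        x = best

print(greedy_descent(0))              # 1 -- trapped in the local minimum
print(min(energy, key=energy.get))    # 4 -- the true lowest-energy state
```

Tunneling, in this picture, is what lets a quantum annealer pass through the wall at x=2 instead of having to climb over it.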
Rainier was also able to avoid a problem called "decoherence," which many quantum processors encounter. Decoherence means the qubits stop "cohering," or working together, when systems get too large or there's too much outside interference.
For that reason, previous quantum computers have only been able to use one or two qubits for computation.
Rainier was developed two years ago, and the scientists spent those two years devising a way to show that quantum processes, rather than classical ones, were at play. Part of that effort involved preventing the decoherence problem described above.
The scientists were able to prevent decoherence by putting Rainier in a magnetically shielded box kept at a temperature near absolute zero. At this temperature, which is approximately minus 459 degrees Fahrenheit, thermal motion (the random jostling that disrupts fragile quantum states) slows to a near halt.
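The Fahrenheit figure follows from the standard kelvin-to-Fahrenheit conversion. The 0.02 K value below is an assumed illustrative operating temperature, not a figure from the article:

```python
def kelvin_to_fahrenheit(k):
    """Convert a temperature in kelvins to degrees Fahrenheit."""
    return k * 9 / 5 - 459.67

print(kelvin_to_fahrenheit(0))     # -459.67, absolute zero in Fahrenheit
# An assumed, hypothetical operating temperature of 0.02 K is still
# only a few hundredths of a degree above absolute zero:
print(kelvin_to_fahrenheit(0.02))  # about -459.63
```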
This methodology will be useful for creating and testing future quantum computing devices. [See also: Experiment Demonstrates Possibility of Quantum Internet]
The USC scientists' paper was published June 28 in the journal Nature Communications.
But Rainier has already lost its place as the largest quantum processing chip. The USC scientists have received a new chip from D-Wave, nicknamed "Vesuvius," with 512 qubits. The study of that chip is currently in progress, but Spedalieri said there appears to be no reason to believe Vesuvius won't work just as well as Rainier does.