Delays in the D-Wave: Quantum Computer Not Yet Processing as Expected

Even with the aid of exotic physical phenomena, quantum computers aren't quite performing at the warp speed that scientists and astronauts had hoped for.

Last year, the Canadian-built D-Wave computer was just beginning to make headlines as the first commercial quantum computer, shared between NASA and Google. The machine was intended to revolutionize computing by operating at warp speed while juggling many variables simultaneously.

Unlike standard computers that process bits of information coded as 0s or 1s, the D-Wave computer works with "qubits," short for quantum bits. Where a classical bit must hold a constant value of 0 or 1, a qubit can exist in a superposition of both values at once. This feature could give quantum computers the ability to solve complex problems far faster than their bit-driven predecessors. Scientists at NASA intended to use the new quantum computer to further space exploration while preserving astronaut safety. Theoretically, it could assist by advancing artificial intelligence, 'thinking' more like an actual person capable of weighing multiple scenarios at once.
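As an informal illustration (a textbook sketch, not how the D-Wave's annealing hardware works internally), a single qubit can be described by two amplitudes whose squared magnitudes give the measurement probabilities, and a register of n qubits needs 2^n such amplitudes to describe classically:

```python
import math

# A qubit is described by two amplitudes (a, b) with |a|^2 + |b|^2 = 1;
# measuring it yields 0 with probability |a|^2 and 1 with probability |b|^2.
a = b = 1 / math.sqrt(2)              # an equal superposition of 0 and 1
assert math.isclose(abs(a) ** 2 + abs(b) ** 2, 1.0)

print(f"P(0) = {abs(a)**2:.2f}, P(1) = {abs(b)**2:.2f}")  # each outcome: 0.50

# The scaling argument: a full classical description of n qubits needs
# 2**n amplitudes, which is the sense in which a quantum register can
# "juggle" an enormous number of values at once.
for n in (10, 84):
    print(f"{n} qubits -> {2 ** n} amplitudes")
```

Even the 84 qubits mentioned below would take roughly 1.9 x 10^25 amplitudes to describe exhaustively on a classical machine.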

Initial tests of the D-Wave gave scientist Zhengbing Bian and his colleagues at D-Wave Systems in Vancouver, the machine's birthplace, hope for its potential. Back in 2012, the first D-Wave computer performed the world's largest quantum computation, using only 84 qubits in a mere 270 milliseconds. The problem involved two-color Ramsey numbers, which mathematicians have struggled with for years. In layman's terms, the computer had to calculate the size at which an ordered pattern is guaranteed to appear in a seemingly completely random arrangement. Some of these calculations have been impossible to solve to date, but the D-Wave computer made it look like calculating a tip after dinner.
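To make the flavor of the problem concrete, here is a tiny classical brute-force check (a toy for illustration, not the D-Wave's method) of the smallest two-color Ramsey number, R(3, 3) = 6: however you color the edges between 6 points red or blue, some triangle ends up all one color, while 5 points can still avoid this.

```python
from itertools import combinations, product

def has_mono_triangle(n, coloring):
    """True if some triangle's three edges all share one color."""
    return any(
        coloring[(a, b)] == coloring[(a, c)] == coloring[(b, c)]
        for a, b, c in combinations(range(n), 3)
    )

def forces_triangle(n):
    """True if EVERY 2-coloring of K_n's edges contains a one-color triangle."""
    edges = list(combinations(range(n), 2))
    return all(
        has_mono_triangle(n, dict(zip(edges, colors)))
        for colors in product((0, 1), repeat=len(edges))
    )

print(forces_triangle(5))  # False: 5 points can dodge a one-color triangle
print(forces_triangle(6))  # True: 6 points cannot, so R(3, 3) = 6
```

The brute force here checks 2^15 = 32,768 colorings for 6 points; the count of colorings explodes so quickly with size that larger Ramsey numbers remain out of reach for any known method.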

Scientists at the University of Southern California continued to demonstrate the D-Wave's success through late June of 2013. Daniel Lidar, director of USC's quantum computing center, claimed that the D-Wave processor found lowest-energy solutions much faster than any traditional computer operating on bits.

And yet, despite its promising beginning, development of the quantum computer may have stalled. According to a study published in Science by researchers at the University of Southern California, the quantum computer may not actually be faster than classical models. Scientists gave the D-Wave a series of increasingly complicated problems; the time it took to solve them increased exponentially with the problems' difficulty. Rather than saving time on complex problems, the inconclusive results suggest that quantum computers may behave much like existing ones.

Rather than finding failure in the D-Wave, Lidar speculates that perhaps these weren't the right tests for it. He believes the problems given to the quantum computer were too easy and could have been handled by a traditional computer all along; in other words, the machine's full potential was never exercised. To fully test the quantum computer in the future, researchers must first identify the right type of questions to ask it.

Of course, the 'correct' type of problem for a quantum computer can only be devised by experts with the right resources; Lidar claims that quantum computers are best suited to optimization problems that may take a heavy physics background even to formulate.
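The D-Wave hardware is designed to find low-energy states of an Ising model, the standard mathematical form for this kind of optimization problem. As a rough illustration of the problem class, here is a purely classical simulated-annealing sketch on a made-up toy instance (the couplings and annealing parameters are assumptions chosen for demonstration):

```python
import math
import random

random.seed(0)

# Hypothetical toy instance: a 6-spin antiferromagnetic chain. Minimize
#   E(s) = sum over bonds of J[i, j] * s[i] * s[j],  with s[i] in {-1, +1}.
# The true minimum is -5, reached when neighboring spins alternate.
n = 6
J = {(i, i + 1): 1.0 for i in range(n - 1)}

def energy(s):
    return sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())

def anneal(steps=20000, t_start=2.0, t_end=0.01):
    s = [random.choice((-1, 1)) for _ in range(n)]
    e = energy(s)
    for k in range(steps):
        t = t_start * (t_end / t_start) ** (k / steps)  # geometric cooling
        i = random.randrange(n)
        s[i] *= -1                                      # propose a spin flip
        e_new = energy(s)
        if e_new <= e or random.random() < math.exp((e - e_new) / t):
            e = e_new                                   # accept the flip
        else:
            s[i] *= -1                                  # reject: undo it
    return s, e

s, e = anneal()
print(s, e)
```

Where this sketch flips spins one at a time and cools gradually, the quantum annealer explores the same energy landscape using quantum effects, and the open question is whether that exploration is ever meaningfully faster.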

And then there are the physical problems. First, in order to work properly, quantum processors (commercially called "Rainier" chips) must be kept in a magnetically shielded chamber cooled to near absolute zero, the only environment in which their superconducting circuits can operate. Such a habitat can only be maintained in advanced labs with the resources to run these chambers; NASA's Ames Research Center has one, as does the University of Southern California. Additionally, D-Wave computers still have relatively little memory, which makes them ill-suited for the pattern recognition and prediction scientists had originally hoped for. Finally, with a price tag of approximately $10 million for the computer alone, only a handful of labs and organizations can afford to invest in one, let alone in the extensive research needed to understand them further.

Lidar and his colleagues at USC are hardly discouraged. As they learn more about how the D-Wave operates, they expect to understand which problems suit it and how to keep it in the optimal environment for its quantum abilities. As testing continues, hopefully the initial goal of using this space-age computer to advance artificial intelligence will come to fruition.
