Delays in the D-Wave: Quantum Computer Not Yet Processing as Expected

Even with the aid of physical phenomena, quantum computers aren’t quite performing at warp speed as scientists and astronauts had hoped. 

Last year, the Canadian-built D-Wave computer was just beginning to make headlines as the first commercial quantum computer, shared between NASA and Google. The machine was intended to revolutionize computing power by operating at warp speed while juggling simultaneous variables.

Unlike standard computers that process bits of information coded as 0’s or 1’s, the D-Wave computer analyzes “qubits,” short for quantum bits. Rather than working through a problem with each bit fixed at either 0 or 1, the D-Wave computes as if each qubit holds both values at once. This feature would let quantum computers solve more complex problems even faster than their bit-driven predecessors. Scientists at NASA intended to use the new quantum computer to further space exploration while preserving astronaut safety. Theoretically, it could assist in space exploration by advancing artificial intelligence, ‘thinking’ more like an actual person capable of planning multiple scenarios at once.
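To make the “both values at once” idea concrete, here is a minimal state-vector sketch in Python. It is a toy gate-model illustration, not a model of the D-Wave’s annealing hardware: a classical bit is stored as a definite 0 or 1, while a qubit carries two amplitudes whose squared magnitudes give the odds of measuring each value.

```python
# Toy illustration of a single qubit (not how the D-Wave's annealer works).
# A classical bit is either 0 or 1; a qubit holds amplitudes over |0> and |1>.
import numpy as np

# Start in the definite state |0>: amplitude 1 on |0>, amplitude 0 on |1>.
qubit = np.array([1.0, 0.0], dtype=complex)

# A Hadamard gate puts the qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
qubit = H @ qubit

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(qubit) ** 2
print("P(0), P(1) =", probs)          # -> [0.5, 0.5]

# Measuring collapses the superposition to a single classical outcome.
outcome = np.random.choice([0, 1], p=probs)
print("measured:", outcome)
```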

Initial tests of the D-Wave gave scientist Zhengbing Bian and his colleagues at D-Wave Systems in Vancouver, the machine’s birthplace, hope for the quantum computer’s potential. Back in 2012, the first D-Wave computer performed the world’s largest quantum computation, using only 84 qubits and a mere 270 milliseconds. The problem involved two-color Ramsey numbers, which mathematicians have struggled with for years. In layman’s terms, the quantum computer had to calculate the exact point at which a systematic pattern is guaranteed to emerge from a seemingly completely random setting. Some of these calculations have been impossible to complete to date, but the D-Wave computer made it look like calculating a tip after dinner.
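For a feel of what a two-color Ramsey number asks (a toy case, nowhere near the 84-qubit instance the D-Wave handled), a short brute-force script can confirm the classic result R(3,3) = 6: color every pairing among six people “friends” or “strangers” however you like, and you are guaranteed three mutual friends or three mutual strangers, while five people can still avoid both.

```python
# Brute-force check of the classic two-color Ramsey number R(3,3) = 6,
# a tiny illustration of the kind of problem the D-Wave was given.
from itertools import combinations

def has_monochromatic_triangle(n, coloring):
    """coloring maps each edge (i, j) with i < j to 0 (red) or 1 (blue)."""
    for a, b, c in combinations(range(n), 3):
        if coloring[(a, b)] == coloring[(a, c)] == coloring[(b, c)]:
            return True
    return False

def every_coloring_has_triangle(n):
    """True if every 2-coloring of the complete graph on n vertices
    contains a monochromatic triangle."""
    edges = list(combinations(range(n), 2))
    for bits in range(2 ** len(edges)):
        coloring = {e: (bits >> k) & 1 for k, e in enumerate(edges)}
        if not has_monochromatic_triangle(n, coloring):
            return False
    return True

# Five people can avoid a monochromatic triangle; six cannot, so R(3,3) = 6.
print(every_coloring_has_triangle(5))  # False
print(every_coloring_has_triangle(6))  # True
```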

Scientists at the University of Southern California continued to demonstrate the D-Wave’s success through late June of 2013. Daniel Lidar, director of the Quantum Computing Center, claimed that the D-Wave processor found the lowest-energy solutions much faster than any traditional analytical computer operating on bits.

And yet despite its promising beginning, development of the quantum computer may have stalled. According to a study published in Science by researchers at the University of Southern California, the quantum computer may not actually be faster than classical models. Scientists gave the D-Wave computer a series of increasingly complicated problems; the time it took to solve them increased exponentially with the difficulty of the problem. Rather than saving time on complex problems, the inconclusive results suggest that quantum computers may behave much like already existing computers.

Rather than finding failure in the D-Wave, Lidar speculates that perhaps this simply wasn’t the right test for it. He believes the problems given to the quantum computer were too easy, and could have been calculated by a traditional computer all along. In other words, the full potential of the quantum computer was never realized. To fully test the quantum computer in the future, research must first be conducted to find the right type of questions.

Of course, the ‘correct’ type of problem for a quantum computer can only be devised by experts with the right resources; Lidar claims that quantum computers are best suited for optimization problems that may take a heavy physics background even to formulate.

And then there are the physical problems: First, in order to work properly, quantum processors — commercially called “Rainier chips” — must be kept in a magnetically shielded, supercooled chamber to prevent them from overheating. Such an environment can only be maintained in advanced labs with the power to hold these chambers close to absolute zero; NASA’s Ames Research Center has one, as does the University of Southern California. Additionally, D-Wave computers still have relatively little memory, which makes them ill-suited for recognizing and predicting patterns as scientists had originally hoped. Finally, with a price tag of approximately $10 million for the computer alone, only a handful of labs and organizations can even afford to invest in one, let alone the extensive research needed to understand it further.

Lidar and his colleagues at USC are hardly discouraged; they expect that as they learn more about how the D-Wave computer operates, they will understand how to pose problems suited to it while keeping it in the optimal environment for its quantum abilities. As testing continues, hopefully the initial goal of using this space-age computer to enhance artificial intelligence will come to fruition.

NASA Google Quantum Computer: The World’s Most Expensive Computer Thinks Like a Human

Who would have thought that quantum physics would find its way into a computer?

Just last Friday, NASA teamed up with Google to invest in the world’s first quantum computer. This computer is no MacBook Pro: At the steep price of $15 million, this doozy of a processor will use quantum computing for unheard-of calculation speeds 3600 times faster than those of conventional computers.

The Canadian D-Wave Two is the first commercially available computational system that supposedly utilizes quantum tunneling to solve complex mathematical equations. This process represents a complete overhaul of the way computer scientists have thought about processing.

The first computer, developed in 1948, filled rooms and took hours to process what can now be done in the blink of an eye. In addition to accelerating computational number crunching, engineers have compacted these machines to fit in the palm of your hand. Given that progress, it hardly seems as though we could need even smaller and faster computers.

Most computers function using the simple — albeit complex to the untrained eye — binary system, in which everything can be coded down to zeroes and ones. These codes both store information and execute operations as needed.

The D-Wave System exploits the peculiar nature of incredibly tiny bits of matter. While most things we know are bound to exist in one state at a time, quantum particles have a funny way of being in both simultaneously. Rather than treating information as data stored in zeroes or ones, quantum computing analyzes qubits — short for quantum bits — as both at once. This type of processing allows the computer to predict the outcome of multiple scenarios at once.

Remember those simultaneous equations and quadratics from high school, where x had a couple of possible values? Or think of a time when you ran a scenario through your head and tried to figure out all the possible outcomes, depending on a number of different circumstances. Quantum computing solves these types of problems, except with extremely complex scenarios. This state-of-the-art computer can solve lengthy, multi-variable equations in a fraction of a second, whereas typical software currently used in spaceships takes up to half an hour.
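As a throwback to that high-school case (purely illustrative, and nothing to do with the D-Wave’s own mathematics), a few lines of Python recover both possible values of x from a quadratic at once:

```python
# A quadratic where x has two possible values: solve x^2 - 5x + 6 = 0.
import math

a, b, c = 1, -5, 6                       # coefficients of a*x^2 + b*x + c = 0
disc = b**2 - 4*a*c                      # discriminant
roots = ((-b - math.sqrt(disc)) / (2*a),
         (-b + math.sqrt(disc)) / (2*a))
print(roots)                             # -> (2.0, 3.0)
```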

Though smaller versions of this type of computer have existed and shared qubits of data between them, NASA and Google’s new supercomputer would be the first of its kind. Though the computer will be shared between the two, given its computing power there’s no doubt both organizations will revolutionize technology in their respective fields. Engineers at NASA’s Ames Research Center in California estimate that it could be installed at NASA as early as autumn of this year.

Though the computer’s preliminary applications would be strictly on the ground, for planning and scheduling at NASA, the implications for future artificial intelligence are out of this world. By running a number of different scenarios simultaneously, spacecraft could be programmed to make complicated decisions similar to those made by astronauts. At present, the ability to make decisions based on a number of variables is what separates computers from humans. Imagine if, instead of putting actual humans into space, we had a computer that could analyze all sorts of information — from temperatures to fuel levels to anything else astronauts manage while in space — in order to make decisions for a successful mission. While this computer may put astronauts out of a job, it quite literally stretches the horizons of our space exploration.

The D-Wave Two isn’t perfect: It’s still about the size of a garden shed and requires cooling systems to keep its parts from overheating. And of course, the steep cost of $15 million will keep most prospective buyers at bay. But given that the same was said about the first modern computers, this machine is bound to evolve quickly and advance technology’s capabilities beyond what we can imagine.

Original article available from Policy Mic