February 7, 2018
The Race to Quantum Computers
by Vivek Wadhwa
As progress in the field accelerates at an exponential rate, 2018 should see an avalanche of breakthroughs toward “quantum supremacy.”
While much of the media attention has been focused on the race among nations to develop the most powerful artificial intelligence systems, an equally crucial race has been heating up: the race to build the first working quantum computers.
As progress in the field accelerates at an exponential rate, 2018 should see an avalanche of breakthroughs. It is a race for “quantum supremacy”: the point at which a quantum computer demonstrably and markedly outperforms a classical supercomputer on some class of problems.
Both Google and IBM, two leaders in quantum computing, have laid out plans to achieve this goal. Intel also has a horse in the race, having announced a 49-qubit test chip for quantum computing research.
The stakes are enormous. Quantum computers promise to set a new paradigm for solving some of the hardest math and computing problems today—problems such as analyzing the interactions of multiple genes in health outcomes, modeling the energy states of chemicals and predicting the behavior of atomic particles. They also might make the internet inherently insecure by quickly cracking modern cryptography used to lock our IT infrastructure and the web.
One thing is for sure: The era of quantum computing is coming soon, and the world will never be the same.
Put simply, quantum computers use a unit of computing called a qubit. While regular semiconductors represent information as a series of 1s and 0s, qubits exhibit quantum properties and can represent both a 1 and a 0 simultaneously. That means two qubits can encode the four states 1-0, 1-1, 0-1 and 0-0 at the same moment in time. This computing power grows exponentially with each added qubit. A quantum computer with as few as 50 qubits could, in theory, pack more computing power than the most powerful supercomputers on earth today.
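To see why qubit counts matter, consider what it takes to track a qubit register on an ordinary computer. The short Python sketch below is purely illustrative (it is not how any vendor's hardware works): it stores the amplitudes of an n-qubit state in equal superposition and shows that the bookkeeping doubles with every added qubit.

```python
# Illustrative sketch: classically simulating a tiny quantum register.
# An n-qubit state requires tracking 2**n complex amplitudes, which is
# why classical simulation breaks down around 50 qubits.
import itertools

def uniform_superposition(n):
    """State with equal amplitude on all 2**n basis states, as produced
    by putting each of n qubits into an equal 0/1 superposition."""
    amp = (1 / 2**n) ** 0.5
    return {bits: amp for bits in itertools.product("01", repeat=n)}

state = uniform_superposition(2)  # two qubits -> 4 amplitudes
for bits, amp in state.items():
    # each of the four outcomes 00, 01, 10, 11 has probability 0.25
    print("".join(bits), round(abs(amp) ** 2, 2))

# A 50-qubit state would need 2**50 amplitudes -- about a quadrillion:
print(2 ** 50)  # 1125899906842624
```

Storing a 50-qubit state this way takes roughly 10^15 numbers, which is why around 50 qubits is widely cited as the point where classical simulation of a quantum machine becomes impractical.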
This comes at a timely juncture. Moore’s Law held that computing power per chip would double roughly every 18 months while the price per unit of computing would drop by half. While Moore’s Law has largely held true, the amount of money required to squeeze out each improvement has grown significantly. In other words, semiconductor companies and researchers must spend more and more on R&D to achieve each jump in speed. Quantum computing, on the other hand, is in rapid ascent.
One company, D-Wave Systems, is selling a quantum computer that it says has 2,000 qubits. However, D-Wave computers are controversial. While some researchers have found good uses for D-Wave machines, these quantum computers have not beaten classical computers and are only useful for one class of problems: optimization problems, which involve finding the best possible solution from all feasible solutions. Complex simulation problems with multiple viable outcomes, for example, may not be as easily addressable with a D-Wave machine. D-Wave’s approach to quantum computing, a technique called quantum annealing, is also not considered the most promising path to a true supercomputer-killer.
Google, IBM and a number of startups are working on quantum computers that promise to be more flexible and likely more powerful because they will work on a wider variety of problems. A few years ago, flexible machines of two or four qubits were the norm. During the past year, company after company has announced more powerful quantum computers. In November 2017, IBM announced that it had built a quantum machine that uses 50 qubits, reaching the critical threshold beyond which scientists believe quantum computers will shoot past traditional supercomputers.
The downside? The IBM machine can maintain a quantum computing state for only 90 microseconds at a time. Such instability is, in fact, the general bane of quantum computing. The machines must be super-cooled to work, and a separate set of calculations must be run to correct for errors caused by the instability of these early systems. That said, scientists are making rapid progress on the instability problem and hope to have a working quantum computer running at room temperature within five years.
And here’s where the confluence of quantum computing and AI looks so promising. Just as we are seeing the first major impacts of wide-scale artificial intelligence, we are also realizing that classical semiconductor-based computing limits our ability to solve the biggest problems we had hoped artificial intelligence could tackle. Researchers expect quantum computers to start performing very useful calculations well before they’re ready to leave the freezer.
Quantum computing promises to step into that breach and provide the rocket fuel needed to solve these grand challenges. Precisely targeted medical treatments, radically cheaper energy production and new types of super-strong materials are all breakthroughs that quantum computing could make possible by performing billions and billions of calculations simultaneously in a relatively small package. Google researchers demonstrated the promise when they used quantum computing to simulate the electron structure of a hydrogen molecule, a key step toward moving chemical design from empirical measurement and educated guesses to more proper engineering and simulation. (This will also work for drug discovery.)
The perils of quantum computing are also real. Quantum computers will be able to easily crack most forms of encryption in use today (although security experts are already at work on codes that can resist attack by quantum computers). Should Russia or China, for example, gain quantum computing dominance—which is entirely possible—they could use their advantage for even more sophisticated hacking and decrypting of encoded communications.
Between governments, big companies, startups and university labs, some of the brightest engineering minds are rushing toward quantum supremacy. The outcome could quite literally shift the global balance of power.
This article was written by Vivek Wadhwa and Alex Salkever.