Forget about bits and bytes and silicon chips. Forget about mega-this and giga-that. Discard the impatience you sometimes feel while waiting for your laptop to perform some supposedly “complicated” function. Rid your mind of all those annoying little icons — from primitive-looking wristwatches to twirling coloured spheres — that indicate time is passing, and passing, while your doddering digital device stumbles from one slowpoke application to another.
Where computers are concerned, quantum mechanics is the future.
“It’s not an instantaneous transition,” says Laflamme, 50, formerly of Quebec City and now executive director of the Institute for Quantum Computing at the University of Waterloo. “It’s a process. But a quantum computer is the Holy Grail.”
Comparing a fully realized quantum computing device to even the most powerful classical computers now in existence is a bit like comparing a vast galaxy of swirling planets and radiant stars to a basement bachelor apartment in a transit-starved neighbourhood of Toronto.
In other words, there is really no comparison at all.
But there are two formidable obstacles facing anyone in the fledgling quantum computing business.
First, quantum computers are extremely difficult to build. Second, no matter how difficult they might be to fabricate, they are even more difficult to explain.
Despite these obstacles, quantum computing, or the prospect of quantum computing, is very much of the moment. Last month, two scientists — Frenchman Serge Haroche and American David J. Wineland — gathered in Stockholm to receive this year’s Nobel Prize for physics, honouring work they have done separately to advance the development of quantum computers.
They are far from alone.
In offices and laboratories scattered around the globe — in Russia, China, Singapore, Europe, the United States and Canada — some of the finest scientific minds are busily pursuing the same goal, a computing device whose fundamental components are so small as to be invisible to humans but whose calculating power would far outstrip any conventional computer you could imagine, even one as big as the universe itself.
Here in Canada, the main centre for this area of research — there are others — was founded 10 years ago at the University of Waterloo. Now the Institute for Quantum Computing is in the process of moving into palatial new quarters, the Mike & Ophelia Lazaridis Quantum-Nano Centre, designed by Toronto architectural firm KPMB and built at a cost of $160 million.
Headed by Laflamme, the institute brings 25 faculty members from roughly half a dozen countries together with about 40 post-doctoral fellows and roughly 90 students in the quest for something that is at once almost unimaginably small — on the scale of atoms and electrons — and yet colossally powerful.
Perhaps the best way to begin an explanation of what a quantum computer is might be to explain what it is not.
It is not a classical computer.
“A small quantum computer can do things no classical computer can do,” says Laflamme, a wiry, animated man with mobile facial features and a changeable yet nearly constant grin. “It’s a totally different class.”
Let’s face it, a classical computer is essentially a pretty stupid device. Its guts are a network of electrical circuits that can do just two things. They can be on or they can be off. They can be set at either “one” or “zero.”
Essentially, that’s all that classical computing is, a highly developed narrative of zeros or ones.
Here’s how it works.
A single computer circuit, called a bit, has just two possible states (0 or 1). Double that to two circuits and you get four states (00, 11, 01 or 10). With three circuits, you double the possible number of combinations yet again, this time to eight (001, 011, 111, 110, 100, 000, 101, 010). When you get to eight circuits, you have 256 possible combinations, and this arrangement is called a “byte.”
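The doubling described above is easy to verify directly. A brief sketch (purely illustrative, not part of any real computer's workings) that enumerates the possible states of n bits:

```python
from itertools import product

# n bits can hold 2**n distinct combinations of 0s and 1s.
for n in range(1, 9):
    states = ["".join(bits) for bits in product("01", repeat=n)]
    assert len(states) == 2 ** n

# Eight circuits (one byte): 2**8 = 256 combinations.
print(2 ** 8)  # → 256
```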
The memory and storage capacity of laptop computers are nowadays measured in gigabytes, meaning billions of bytes, and these numbers just keep getting bigger — but the trend is not going to last forever, not for classical computers.
At least since the 1970s, computer scientists have been familiar with a phenomenon known as Moore’s law, the principle that technological innovation causes computing power to double on average every two years. Or, to put the same idea another way, a computer circuit decreases in size, on average, by 50 per cent every two years, meaning you can fit twice as many into the same space.
At this rate, the computer industry is on track to run head-on into an immovable wall in approximately 10 years. Sometime between 2020 and 2030, processing circuits will have become so small — as small as atoms — that they will be regulated by the same physical imperative that governs atoms: the inability to get any smaller.
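The arithmetic behind that timeline can be sketched quickly. The starting figures below are assumptions chosen for illustration (a roughly 2012-era transistor feature size and a rough atomic diameter), applied to the halving-every-two-years rule described above:

```python
# Assumed starting points, for illustration only.
feature_nm = 22.0   # assumed 2012-era transistor feature size, in nanometres
atom_nm = 0.2       # rough diameter of a silicon atom, in nanometres

# Halve the feature size every two years until it reaches atomic scale.
years = 0
while feature_nm > atom_nm:
    feature_nm /= 2
    years += 2

print(years)  # → 14 (i.e., atomic scale within roughly a decade or two)
```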
With any luck, however, this is where quantum computing will come in.
Put in its sparest terms, the difference between classical computing and quantum computing is a simple matter of a single conjunction — the substitution of “and” for “or.”
Recall that a conventional computer circuit, or a bit, can represent a 0 or a 1. It can be switched off or it can be switched on. One or the other.
But a quantum bit is a device of an entirely different kind.
Owing to the bizarre properties that govern the microscopic realm of quantum mechanics, a quantum bit — known as a “qubit” — can represent a 0 and a 1, both of them at the same time. It can be simultaneously off and on.
Known as superposition, this seemingly impossible attribute has no equivalent in the world that we humans can see and touch and feel, the domain of automobiles, maple trees, wide-screen TVs, bachelor apartments, cufflinks, pumpkin pie and classical computers.
But it seems to be critical to the nature of atomic and subatomic particles (including electrons, photons, protons, neutrinos and other minuscule dots of nature) that they exist in a haze of possibility rather than in any fixed state.
A microscopic particle can be both here and there at the same instant of time, which means that it can also be in every intermediate location between here and there, also at the same time.
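The "0 and 1 at once" idea can be caricatured on an ordinary computer. In the standard textbook picture, a qubit is described by two amplitudes whose squared magnitudes give the odds of seeing 0 or 1 when it is measured. A toy simulation (this mimics the statistics of measurement, not real quantum hardware):

```python
import random

def measure(a, b):
    """Simulate measuring a qubit with amplitudes (a, b).

    Returns 0 with probability |a|**2, else 1.
    Assumes |a|**2 + |b|**2 == 1.
    """
    return 0 if random.random() < abs(a) ** 2 else 1

# An equal superposition: amplitudes 1/sqrt(2) each, so 0 and 1
# are equally likely on measurement.
a = b = 2 ** -0.5
counts = [0, 0]
for _ in range(10_000):
    counts[measure(a, b)] += 1

print(counts)  # roughly [5000, 5000]
```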
“It turns out that this is incredibly powerful,” says Laflamme. “Just changing an ‘or’ to an ‘and.’”
Even though an atom-sized classical bit would be extremely small, it would still be restricted to just two possible states — off and on. By contrast, an atom-sized qubit would possess a near infinity of states, simultaneously.
What this means, by a common calculation, is that a quantum computer consisting of just 300 atoms would possess more sheer calculating power than a conventional machine harnessing every last atom in the universe, yielding an almost unimaginable increase in our ability to solve problems of immense complexity.
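That comparison is straightforward to check. The figure of roughly 10^80 atoms in the observable universe is a standard rough estimate, used here only for scale:

```python
# 300 qubits span 2**300 simultaneous basis states.
states = 2 ** 300

# Rough standard estimate of atoms in the observable universe.
atoms = 10 ** 80

print(states > atoms)       # → True
print(len(str(states)))     # → 91 (2**300 is a 91-digit number)
```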
Such a machine could perform in an instant certain difficult calculations — for example, the factoring of very large numbers — that the largest conventional computer would need billions of years to figure out, if it could figure them out at all.
It turns out that classical computers are not very good at factoring large numbers, a weakness that has long been exploited by cryptographers to safeguard data on the Internet. It is easy to multiply two prime numbers in order to produce a much larger number, but it turns out to be horrendously difficult to engineer the same process in reverse, to find the two prime divisors of a large number, a process called factoring.
The only way classical computers can address the challenge is by systematic trial and error — trying out two numbers to see if they work, discarding them, trying out two different numbers, and so on. There’s no shortcut.
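The trial-and-error search described above can be sketched in a few lines. For the small numbers shown it finishes instantly; for the hundreds-of-digits semiprimes used in cryptography, the same loop would run for eons:

```python
def factor_semiprime(n):
    """Find two factors of n by trial division, one candidate at a time.

    Returns (p, q) with p * q == n, or None if n is prime.
    """
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return None

print(factor_semiprime(15))  # → (3, 5)
print(factor_semiprime(21))  # → (3, 7)
```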
This defect in conventional computers is used to secure your banking information on the Internet, along with much else. Even armed with powerful computers, would-be hackers still cannot find a way to expose the key — the two original prime numbers used to secure the code that protects your data. The only method available is trial and error, examining every possible combination of divisors, one pair at a time, which could take forever.
By contrast, a quantum computer could crack such privacy barriers with astonishing speed, in effect testing every possible combination of divisors not one by one but all together, then using quantum interference to make the right answer stand out, something no conventional computer could do.
No wonder governments around the world are pouring huge amounts of money into quantum computing. Whoever is first to build a sophisticated quantum machine would suddenly be able to crack just about every secret code in cyberspace.
But that is far from all.
According to Laflamme, quantum computers will eventually ignite a revolution in a variety of fields other than cryptography. These might well include the development of new drugs, or the creation of new superconducting materials, or a range of other innovations currently impossible to predict.
“We are just scratching at the surface of this,” he says. “Today, we are just dipping our big toe in the shallow end.”
At least in a very basic form, quantum computers do now exist, but they are extremely limited both in complexity and capability, restricted to no more than a dozen qubits and able to perform only the simplest of mathematical feats. The highest number so far factored by a quantum device, using an algorithm developed in the 1990s by U.S. scientist Peter Shor, is 21.
Laflamme takes a visitor on a tour of the laboratory facilities at IQC’s existing premises — windowless rooms cluttered with an endless variety of complicated machines and devices, pipes and wires, including equipment that fabricates qubit chips from aluminum and niobium. The resulting slivers of metal are about the size of a baby’s fingernail and are extremely difficult to produce.
“Before we make one chip that works, we make 1,000 that don’t,” says Laflamme.
Even the chips that do work aren’t up to much in computational terms.
“If you want to add two plus two, we can do it,” he says. “If you want to add very large numbers, we can’t.”
But the journey is made in stages, and Laflamme believes that a quantum machine able to outpace conventional computers is no longer such a distant prospect.
“The boundary is about 50 qubits,” he says. “When we get there, we’re reaching a place where we’re departing the classical world.”
One big obstacle involves the inherent instability of the quantum universe, a territory where superposition — the ability of particles to occupy multiple locations at once — mysteriously collapses the very instant a human observer interferes with the process in any way, even by looking at a particle through an electron microscope.
Somewhat like a performer struck by stage fright, an observed particle will instantly stop behaving in a quantum fashion and instead adopt just one position or state. Somehow, researchers will have to overcome this phenomenon.
Laflamme, for one, is certain they will.