What is so difficult about building a quantum computer anyway?

[Image: NV1.jpg]

When talking about quantum computing, we typically focus on “what is it?”, “why do we want it?” and “when do we get it?”... maybe even “how much will it cost?”. An equally valid question to ask is “why don’t we have one already?” Although it sounds like Veruca Salt has suddenly developed an interest in quantum technology, there are some interesting physics and engineering reasons why building a quantum computer is fundamentally more difficult than any computer or electronic engineering that has gone before. After all, the phone in my pocket has literally billions of transistors, all switching on and off, synchronised to a clock which ticks more than a billion times per second. That is enough ticks and tocks, and bits and blocks, to make even Dr. Seuss proud - and yet we practically take it for granted.

Given this phenomenal ability to design and manufacture such extraordinary devices, how hard could it be to wire up a few thousand qubits in order to discover the next wonder-drug, or read the odd foreign dictator’s email?

Well, there are several important differences between conventional (classical) computers and quantum computers. To understand these differences, let’s first discuss the fundamental components of a quantum computer. During the late 90s, David DiVincenzo proposed a list of “criteria” that any proposed quantum computer would have to fulfil to be useful. These criteria can be summarised as follows:

1. You must be able to build qubits, and build them in a way that allows you to “scale up” to thousands or millions of qubits for a full quantum computer.
2. You must be able to initialise these qubits in some known state.
3. These qubits must have long decoherence times.
4. You must be able to apply operations or gates to these qubits which are “universal”.
5. You must be able to measure (at least some of) the qubits.

(Note for the technical specialist: I know there are sometimes two additional criteria added regarding “flying qubits” but given modern error correction codes, I consider these to now be largely unnecessary or subsumed into the other five).

The first and probably biggest problem is that we don’t even know what to make our quantum computer out of. Unlike classical bits, which can take one of two states (0 or 1), a qubit must be described by two numbers: its population and its phase. In particular, the population can take on any value from 0 to 1. Although the requirements for a qubit sound exotic, there are many examples of such systems in nature and we can now manufacture many more. As a consequence, there are many, many proposals in the literature on how to build a quantum computer. Recently the scientific community has narrowed this down to a few leading candidates, including ions held in electromagnetic traps, superconducting circuits, semiconductor devices (defects, donors or quantum dots) and photonic schemes using modes of light. All these approaches (and more) can in principle satisfy the DiVincenzo criteria, but the devil is in the detail and this is what currently limits progress for each of them.
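For readers who like to see it written down, one common (though not the only) way to express the state of a single qubit in terms of these two numbers is

\[
|\psi\rangle = \sqrt{1-p}\,|0\rangle + e^{i\phi}\sqrt{p}\,|1\rangle, \qquad 0 \le p \le 1,\; 0 \le \phi < 2\pi,
\]

where $p$ is the population (how much of the state is “1”) and $\phi$ is the phase. A classical bit only ever has $p = 0$ or $p = 1$, and no phase at all.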

Qubit scalability.
Most of us have baked a cake at some point in our lives. If you are good at it, it might take 15 minutes to get set up, mix everything, put it in the pan and take it out at the end - plus perhaps 45 minutes cooking time. What about if you wanted to bake two cakes? You don’t have to get all the ingredients out twice, both cakes can probably cook in the oven at the same time, you know the recipe well, and you might even be able to mix both cakes in the same bowl (if they are the same flavour!). All up, two cakes might take a total of 70 minutes - considerably less than 2x(15+45)=120 minutes. Now what about 10 cakes? Bakers do this regularly: they have larger ovens, industrial mixers, cooling racks and so on. What about 100 cakes? 1000? 100000?

This is the issue of large scale manufacturing. When you have to produce thousands or millions of copies of a particular item or product, the manufacturing process has to be redesigned from the ground up. Most of the quantum computing applications which are relevant to humanity right now (quantum chemistry, decryption etc.) require many thousands if not millions of qubits, so any would-be quantum computer manufacturer had better have a credible plan for creating such large numbers of controllable qubits. This doesn’t just mean “stamping” them out one by one: you must be able to fabricate them at large scale, and then calibrate and control them all. All the existing quantum designs are based on “small scale” experiments done in labs around the world. However, turning these small scale experiments into full engineering solutions for a large scale quantum computer takes time, planning, testing and expertise in a technology which is exceptionally new and untested.

Decoherence.
If you buy two cheap plastic clocks, set them to the same time and then place them at opposite ends of your house, over time they will slowly drift out of synchronisation. They have slightly different mechanisms, their batteries might have slightly different levels of charge, they might even be in a warmer or colder part of the house. All these differences result in very slightly different speeds of rotation of the hands, and so the clocks speed up or slow down relative to each other.

Notice, I explicitly said cheap plastic clocks at either end of the house. If they are beautifully built grandfather clocks resting on a common wooden floor, then far more interesting physics is at play.

Now imagine there are 100 cheap clocks. After a while there will be a large range of different times and so all the clocks disagree; however, you might imagine that the average time is still approximately right. During operation of a quantum computer, the phase which is used to describe the state of each qubit varies as a function of time. If two qubits have slightly different environments, their phases vary more quickly or more slowly and they get “out of sync”. Physicists refer to this effect as dephasing or, more generally, decoherence. We say that the “coherence” of the qubits is lost over time.
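To make the clock analogy concrete, here is a small simulation sketch (my own illustration, not part of the original argument) of 100 “clocks” whose tick rates are randomly spread by a couple of percent. The printed “coherence” is the magnitude of the ensemble-averaged phase factor: it starts at 1 (everything in sync) and decays towards 0 (completely dephased).

```python
# Illustrative sketch only: an ensemble of "cheap clocks" with slightly
# different tick rates, drifting out of sync over time (dephasing).
import numpy as np

rng = np.random.default_rng(seed=1)

n_clocks = 100
f0 = 1.0                                  # nominal tick rate (arbitrary units)
spread = 0.02                             # ~2% variation between clocks
freqs = rng.normal(f0, spread * f0, n_clocks)

for t in (0, 5, 10, 50, 100):
    phases = 2 * np.pi * freqs * t
    # |average of e^{i*phase}|: 1 = perfectly in sync, 0 = completely dephased
    coherence = np.abs(np.mean(np.exp(1j * phases)))
    print(f"t = {t:5.1f}   coherence = {coherence:.3f}")
```

The numbers here are invented, but the qualitative behaviour (a steady loss of phase coherence as the spread in frequencies takes hold) is exactly the dephasing described above.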

Unfortunately, coherence is essential for a conventional quantum computer to function (I say conventional here as it’s currently less clear how important coherence is in annealing machines, but that is an entire topic in itself). To build qubits that are perfectly coherent, we would have to control and understand all stray electric and magnetic fields, eliminate all vibrations, even isolate our computer from all light from the ultra-violet down well past the infrared. This is a level of complexity that has never been necessary, or even attempted, in conventional computers. In fact, an important advantage of modern digital computers is that whether the bit is 0 or 1 is all that matters. So if electrical noise or other random influences make the signal go to 1.1 or 0.9... it still counts as a 1. Equivalently, 0.1 or -0.1 are treated as 0. Digital logic is inherently robust to noise, up to a certain level.
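As a toy illustration of that robustness (my own sketch, not a real model of logic circuitry), reading a noisy voltage level through a simple threshold absorbs the noise completely, as long as the noise stays below about half a logic level:

```python
# Toy example: a classical bit read through a threshold ignores moderate noise.
def read_bit(level, threshold=0.5):
    """Interpret a noisy analogue level as a digital 0 or 1."""
    return 1 if level > threshold else 0

for level in (1.1, 0.9, 0.1, -0.1):
    print(f"measured level {level:+.1f}  ->  bit {read_bit(level)}")
```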

Quantum computers (at least with components that exist in labs right now) have no such inherent robustness. Each different type of quantum computer has different decoherence sources which come from the materials used to make the machine and the way in which it is designed. For the last 10-20 years or more, the designs of quantum computers and the materials used to build them have been evolving precisely to try and reduce the amount of decoherence. Great strides have been made, but the errors introduced by decoherence are still millions of times greater than error rates in conventional computers.

However, all (coherence) is not lost. Peter Shor showed that we can use the concept of measurement collapse in quantum mechanics to perform Quantum Error Correction. In short, the idea of quantum error correction is to use several (physical) qubits to represent the state of one “logical” qubit. This method of representing the state of the logical qubit (called an “encoding”) is designed so that if the correct operations and measurements are performed on parts of the encoding, the total system collapses into one of two situations: either no error, or a known error which can be corrected. If this process is performed often enough, the effects of decoherence can be corrected. However, exactly what counts as “often enough” turns out to be one of the critical issues in quantum computer design. It depends on how fast you can apply operations and measure your qubits, how many physical qubits are required to encode one logical qubit, how quickly you can apply the required corrections, and how strong the decoherence was in the first place.
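The simplest textbook illustration of this idea (far simpler than Shor’s original nine-qubit code, and protecting only against bit flips, but it shows the principle) is the three-qubit repetition code:

\[
\alpha|0\rangle + \beta|1\rangle \;\longrightarrow\; \alpha|000\rangle + \beta|111\rangle .
\]

Measuring the two parities $Z_1 Z_2$ and $Z_2 Z_3$ collapses the system into one of four outcomes: $(+1,+1)$ means no error, while $(-1,+1)$, $(-1,-1)$ and $(+1,-1)$ flag a bit flip on qubit 1, 2 or 3 respectively, which can then be undone. Crucially, neither measurement reveals anything about $\alpha$ or $\beta$, so the encoded information itself is untouched.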

Quantum control.
The remaining three DiVincenzo criteria can be loosely grouped under the heading “quantum control”. Control engineering is a vast and important field in the modern age, whether it is keeping people upright on a Segway, sending astronauts (or cosmonauts or taikonauts) into space, preventing your car skidding on a puddle or stopping a washing machine from destroying itself during the spin cycle. The ability to use small computers or electronic circuits to apply minor corrections to a machine to keep it operating correctly is one of the lesser appreciated but fundamentally important aspects of modern technology. A quantum computer will be no different. It will come with a myriad of electronic circuitry and computer systems to initialise each and every qubit at the start of a computation, to apply gate operations to actually perform the calculation, and then to measure out the result of the calculation. (Although it should be said that, due to the magic of quantum mechanics, it’s completely unclear if the calculation has actually been performed until after it has been measured!)

Although initialisation and measurement are generally understood for most major types of quantum computer designs, it is important to emphasise that these need to be performed exceptionally precisely (typically with something like a 99.9999% success rate). Similarly, there are sets of (quantum) logic gates which must be applied with similar precision. If one thinks of the usual computing gate operations (AND, OR, NOT, XOR, NAND, NOR etc.), in the quantum computing world all of these have counterparts, as well as more exotic examples like Hadamard gates, controlled-NOT gates, T-gates, S-gates, Toffoli gates and iSWAP gates. Although we now know quite a lot about how these gates work and how they need to be put together to perform interesting calculations, how to do it optimally is still very much an open question. Is it best to have the fastest gates so that we can beat decoherence? Should we use the gates that are easiest to implement, so they can be applied with greater precision? Do we need all the gates, or just a few of them used often? When trying to implement quantum error correction, do we just introduce more errors from all the gates we have to apply? These questions all need to be answered, but the answer depends on which type of quantum computer you are building and how it performs, both on the drawing-board and in the lab.
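To give a flavour of what these gates actually are, here is a hedged sketch of my own using plain matrix arithmetic (not any particular quantum software library), showing two of the gates mentioned above as the matrices that act on qubit state vectors:

```python
# Sketch only: the Hadamard and controlled-NOT gates as matrices acting on
# qubit state vectors, built with plain numpy.
import numpy as np

H = (1 / np.sqrt(2)) * np.array([[1,  1],
                                 [1, -1]])       # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                  # controlled-NOT gate

zero = np.array([1.0, 0.0])                      # the |0> state
plus = H @ zero                                  # H|0> = (|0> + |1>)/sqrt(2)

# CNOT applied to (H|0>) tensor |0> gives the Bell state (|00> + |11>)/sqrt(2)
bell = CNOT @ np.kron(plus, zero)
print(np.round(bell, 3))                         # -> [0.707 0.    0.    0.707]
```

Just two gate applications are enough to produce an entangled state with no classical counterpart, which is part of why the precision of every single gate matters so much.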

Once all of these questions are settled, we have a type of quantum computer that we can scale up, that has long decoherence times and that we can control - so we are done, right? Well, not quite. There are then software challenges. How does one perform quantum error correction in the most efficient way? How do we design the support hardware and software (the classical computer that will control the quantum computer)? How do we design our qubits so that when we make millions of them, they are all identical (or close enough to identical)?

For a long time, the way forward was unclear and everyone working in quantum computing had their own ideas about what a working quantum computer would look like. Now things are settling, and there are a few leading quantum computer designs. This new focus is not necessarily because we definitely know the way forward, but because a few major ideas have progressed far enough that we now know the principles are sound and it is worth pushing those designs as far as we can. The recent entry of commercial quantum computing efforts has also focused attention much more on the mundane engineering problems required to ultimately get quantum computers to work, without the additional scientific overhead of needing to publish, graduate or get tenure.

Ultimately, the quest to build a quantum computer may well prove to be one of humanity’s most impressive technological feats of the 21st century, simply because it requires such precise control over the materials it is built from and the software used to run it. It is engineering at a scale and at a level of precision that we could only dream of a few decades ago.

- Jared Cole (cole@h-bar.com.au), co-founder, h-bar quantum consultants