
What is so difficult about building a quantum computer anyway?


When talking about quantum computing, we typically focus on “what is it?”, “why do we want it?” and “when do we get it?”... maybe even “how much will it cost?”. An equally valid question to ask is “why don’t we have one already?” Although it sounds like Veruca Salt has suddenly developed an interest in quantum technology, there are some interesting physics and engineering issues behind why building a quantum computer is fundamentally more difficult than any computer or electronic engineering that has gone before. After all, the phone in my pocket has literally billions of transistors, all switching on and off synchronised to a clock which ticks more than a billion times per second. That is enough ticks and tocks, and bits and blocks to make even Dr. Seuss proud - and yet we practically take it for granted.

Given this phenomenal ability to design and manufacture such extraordinary devices, how hard could it be to wire up a few thousand qubits in order to discover the next wonder-drug, or read the odd foreign dictator’s email?

Well, there are several important differences between conventional (classical) computers and quantum computers. To understand these differences, let’s first discuss the fundamental components of a quantum computer. During the late 90s, David DiVincenzo proposed a list of “criteria” that any candidate quantum computer would have to fulfil to be useful. These criteria can be summarised as follows:

1. You must be able to build qubits and build them in a way that allows you to “scale up” to thousands or millions of qubits for a full quantum computer.
2. You must be able to initialise these qubits in some known state.
3. These qubits must have long decoherence times.
4. You must be able to apply operations or gates to these qubits which are “universal”.
5. You must be able to measure (at least some of) the qubits.

(Note for the technical specialist: I know there are sometimes two additional criteria added regarding “flying qubits” but given modern error correction codes, I consider these to now be largely unnecessary or subsumed into the other five).

The first and probably biggest problem is that we don’t even know what to make our quantum computer out of. Unlike classical bits, which can take one of two states (0 or 1), a qubit must be described by two numbers: its population and its phase. In particular, the population can take on any value from 0 to 1. Although the requirements for a qubit sound exotic, there are many examples of such systems in nature and we can now manufacture many more. As a consequence, there are many, many proposals in the literature on how to build a quantum computer. Recently the scientific community has narrowed this down to a few leading candidates, including ions held in electromagnetic traps, superconducting circuits, semiconductor devices (defects, donors or quantum dots) and photonic schemes using modes of light. All these approaches (and more) can in principle satisfy the DiVincenzo criteria, but the devil is in the detail and this is what currently limits progress for each of these approaches.
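For readers who like something concrete, here is a minimal sketch in Python (using NumPy) of this “two numbers” description of a qubit. The function name and the particular numbers are purely illustrative, not any specific hardware’s convention:

```python
import numpy as np

# A single qubit state can be written |psi> = sqrt(1-p)|0> + sqrt(p)e^{i*phi}|1>,
# parameterised by its excited-state population p (anywhere from 0 to 1)
# and its phase phi.
def qubit_state(population, phase):
    amp0 = np.sqrt(1.0 - population)                 # amplitude of |0>
    amp1 = np.sqrt(population) * np.exp(1j * phase)  # amplitude of |1>
    return np.array([amp0, amp1])

# A classical bit is only ever [1, 0] ("0") or [0, 1] ("1");
# a qubit can sit anywhere in between, e.g. an equal superposition:
psi = qubit_state(population=0.5, phase=np.pi / 2)
print(psi)               # approximately [0.707+0j, 0+0.707j]
print(np.abs(psi) ** 2)  # measurement probabilities: [0.5, 0.5]
```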

Qubit scalability.
Most of us have baked a cake at some point in our lives. If you are good at it, it might take 15 minutes to get set up, mix everything, put it in the pan and take it out at the end - plus perhaps 45 minutes cooking time. What if you wanted to bake two cakes? You don’t have to get all the ingredients out twice, both cakes can probably cook in the oven at the same time, you know the recipe well, you might even be able to mix both cakes in the same bowl (if they are the same flavour!). All up, two cakes might take a total of 70 minutes - considerably less than 2x(15+45)=120 minutes. Now what about 10 cakes? Bakers do this regularly; they have larger ovens, industrial mixers, cooling racks etc. What about 100 cakes? 1000? 100000?

This is the issue of large scale manufacturing. When you have to produce thousands or millions of copies of a particular item or product, the manufacturing process has to be redesigned from the ground up. Most of the quantum computing applications which are relevant to humanity right now (quantum chemistry, decryption etc.) require many thousands if not millions of qubits, so any would-be quantum computer manufacturer had better have a credible plan for creating such large numbers of controllable qubits. This doesn’t just mean “stamping” them out one by one: you must be able to fabricate them at scale, and then calibrate and control them all. All the existing quantum designs are based on “small scale” experiments done in labs around the world. However, turning these small scale experiments into full engineering solutions for a large scale quantum computer takes time, planning, testing and expertise in a technology which is exceptionally new and untested.

Decoherence.
If you buy two cheap plastic clocks, set them to the same time and then place them at opposite ends of your house, over time they will slowly drift out of synchronisation. They have slightly different mechanisms, their batteries might have slightly different levels of charge, they might even be in a warmer or colder part of the house. All these differences result in very slightly different speeds of rotation of the hands, and so the clocks speed up or slow down relative to each other.

Notice, I explicitly said cheap plastic clocks at either end of the house. If they are beautifully built grandfather clocks resting on a common wooden floor, then far more interesting physics is at play.

Now imagine there are 100 cheap clocks. After a while there will be a large range of different times and so all the clocks disagree; however, you might imagine that the average time is still approximately right. During operation of a quantum computer, the phase which is used to describe the state of each qubit varies as a function of time. If two qubits have slightly different environments, their phases vary more quickly or more slowly and they get “out of sync”. Physicists refer to this effect as dephasing or, more generally, decoherence. We say that the “coherence” of the qubits is lost over time.
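To make the clock analogy concrete, here is a toy simulation (plain NumPy; the detuning spread and timescales are made-up numbers chosen purely for illustration) of an ensemble of qubits dephasing because each sees a slightly different environment:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# 100 "cheap clocks": qubits whose phases advance at slightly different
# rates because each one sits in a slightly different environment.
n_qubits = 100
detuning = rng.normal(0.0, 0.002, n_qubits)  # small random frequency offsets

for t in [0, 100, 400]:
    phases = 2 * np.pi * detuning * t
    # Ensemble coherence: the magnitude of the averaged phase factor.
    # It starts at 1.0 (all qubits agree) and decays towards 0 as the
    # phases scramble - this is dephasing.
    coherence = np.abs(np.mean(np.exp(1j * phases)))
    print(f"t = {t:4d}   coherence = {coherence:.3f}")
```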

Unfortunately, coherence is essential for a conventional quantum computer to function (I say conventional here as it’s currently less clear how important coherence is in annealing machines, but this is an entire topic in itself). To build qubits that are perfectly coherent, we would have to control and understand all stray electric and magnetic fields, eliminate all vibrations, even isolate our computer from all light from the ultra-violet down well past the infrared. This is a level of complexity that has never been necessary, and never even attempted, in conventional computers. In fact, an important advantage of modern digital computers is that the bit being in 0 or 1 is all that matters. So if electrical noise or other random influences make the signal go to 1.1 or 0.9... it still counts as a 1. Equivalently, 0.1 or -0.1 are treated as 0. It is inherently robust to noise up to a certain level.
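That robustness is easy to demonstrate: classical logic effectively applies a threshold, mapping anything noisy-but-close back to a clean 0 or 1. A trivial sketch (the threshold value here is illustrative):

```python
# Classical bits are robust to bounded noise: anything near 1 reads as 1,
# anything near 0 reads as 0.
def read_bit(signal, threshold=0.5):
    return 1 if signal > threshold else 0

for noisy in [1.1, 0.9, 0.1, -0.1]:
    print(noisy, "->", read_bit(noisy))  # 1, 1, 0, 0
```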

Quantum computers (at least with components that exist in labs right now) have no such inherent robustness. Each different type of quantum computer has different decoherence sources which come from the materials used to make the machine and the way in which it is designed. For the last 10-20 years or more, the designs of quantum computers and the materials used to build them have been evolving precisely to try and reduce the amount of decoherence. Great strides have been made, but the errors introduced by decoherence are still millions of times greater than error rates in conventional computers.

However, all (coherence) is not lost. Peter Shor showed that we can use the concept of measurement collapse in quantum mechanics to perform Quantum Error Correction. In short, the idea of quantum error correction is to use several (physical) qubits to represent the state of one “logical” qubit. This way of representing the state of the logical qubit (called an “encoding”) is constructed such that if the correct operations and measurements are performed on parts of the encoding, the total system collapses into one of two states: either no error, or one known error, which can be corrected. If this process is performed often enough, the effects of decoherence can be corrected. However, the specifics of what counts as “often enough” turn out to be one of the key critical issues in quantum computer design. This depends on how fast you can apply operations and measure your qubits, how many physical qubits are required to encode one logical qubit, how fast you can apply the required corrections, and how strong the decoherence was in the first place.
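Shor’s original scheme uses nine physical qubits per logical qubit. As a flavour of the underlying idea, here is a classical toy model of the simpler three-qubit bit-flip code. Heavy caveats apply: in a real device the syndromes are extracted with quantum circuits and ancilla qubits rather than by reading the data qubits directly, and all the names below are illustrative:

```python
# Three-qubit bit-flip code: one logical qubit encoded across three
# physical qubits, |0_L> = |000> and |1_L> = |111>.

def syndrome(bits):
    # Parity checks on neighbouring qubits; (0, 0) means "no error detected".
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def correct(bits):
    # Each non-trivial syndrome points to exactly one qubit to flip back.
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}
    s = syndrome(bits)
    if s in flip:
        bits[flip[s]] ^= 1
    return bits

encoded = [1, 1, 1]       # logical "1"
encoded[0] ^= 1           # noise flips physical qubit 0 -> [0, 1, 1]
print(syndrome(encoded))  # (1, 0): one known error, on qubit 0
print(correct(encoded))   # [1, 1, 1] - the logical state is restored
```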

Quantum control.
The remaining three DiVincenzo criteria can be loosely grouped under the heading “quantum control”. Control engineering is a vast and important field in the modern age, whether it is keeping people upright on a Segway, sending astronauts (or cosmonauts or taikonauts) into space, preventing your car skidding on a puddle or preventing a washing machine from destroying itself during the spin cycle. The ability to use small computers or electronic circuits to apply minor corrections to a machine to keep it operating correctly is one of the lesser appreciated but fundamentally important aspects of modern technology. A quantum computer will be no different. It will come with a myriad of electronic circuitry and computer systems to initialise each and every qubit at the start of a computation, to apply gate operations to actually perform the calculation and then to measure out the result of the calculation. (Although it should be said that due to the magic of quantum mechanics, it’s completely unclear if the calculation has actually been performed until after it has been measured!)

Although initialisation and measurement are generally understood for most major types of quantum computer designs, it is important to emphasise that these need to be performed exceptionally precisely (typically with something like a 99.9999% success rate). Similarly, there are sets of (quantum) logic gates which must be applied with similar precision. If one thinks of the usual computing gate operations (AND, OR, NOT, XOR, NAND, NOR etc.), in the quantum computing world all of these gates have their equivalents, as well as more exotic examples like Hadamard gates, controlled-NOT gates, T-gates, S-gates, Toffoli gates and iSWAP gates. Although we now know quite a lot about how these gates work and how they need to be put together to perform interesting calculations, how to do it optimally is still very much an open question. Is it best to have the fastest gates so that we can beat decoherence? Should we use the gates that are easiest to implement so they can be done with greater precision? Do we need all the gates, or just a few of them used often? When trying to implement quantum error correction, do we just introduce more errors from all the gates we have to apply? These questions all need to be answered, but the answer depends on which type of quantum computer you are building and how it performs, both on the drawing-board and in the lab.
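To give a flavour of what these gates actually are, here is a minimal NumPy sketch using standard textbook matrix conventions. It composes a Hadamard and a controlled-NOT to entangle two qubits; the example is illustrative and not tied to any particular machine’s native gate set:

```python
import numpy as np

# Two standard quantum gates written as matrices:
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)  # Hadamard: creates superposition
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])       # controlled-NOT: entangles two qubits

# A tiny circuit: H on qubit 0, then CNOT, starting from |00>.
ket00 = np.array([1, 0, 0, 0], dtype=complex)
state = CNOT @ np.kron(H, np.eye(2)) @ ket00

# The result is the Bell state (|00> + |11>)/sqrt(2): the two qubits are
# now perfectly correlated in a way no classical gate sequence can produce.
print(np.round(state, 3))  # [0.707 0 0 0.707]
```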

Once all of these questions are settled, we have a type of quantum computer that we can scale up, that has long decoherence times and that we can control - we are done, right? Well, not quite. There are then software challenges. How does one perform quantum error correction in the most efficient way? How do we design the support hardware and software (the classical computer that will control the quantum computer)? How do we design our qubits so that when we make millions of them, they are all identical (or close enough to identical)?

For a long time, the way forward was unclear and everyone working in quantum computing had their own ideas about what a working quantum computer would look like. Now, things are settling and there are a few leading quantum computer designs. This new focus is not necessarily because we know definitively which way is forward, but because a few major ideas have progressed far enough that we now know the principles are sound and it is worth pushing the designs as far as we can. The recent entry of commercial quantum computing efforts has also focused attention much more on the mundane engineering problems required to ultimately get quantum computers to work, without the additional scientific overhead of needing to publish, graduate or get tenure.

Ultimately, the quest to build a quantum computer may well prove to be one of humanity’s most impressive technological feats of the 21st century, simply because it requires such precise control over the materials it is built from and the software used to run it. It is engineering at a scale and at a level of precision that we could only dream of a few decades ago.

- Jared Cole (cole@h-bar.com.au), co-founder, h-bar quantum consultants

Quantum Schmantum in Australia: The surprising depth of quantum technology research Downunder

Australia is a relatively small country in terms of research culture and influence on the world stage. The idealised self-image of Australians is that we “punch above our weight” and achieve great things with scarce resources - a romantic ideal which dates from when we were an isolated outpost of British colonial expansion. It can certainly be argued that Australian scientific contributions compare favourably to anything being done in other parts of the world. However, statistically speaking, we are still small compared to the scientific powerhouses of the United States, United Kingdom, Germany, Japan and, in the last 20 years, China. Per capita we perform better, but still lag behind the Nordic countries. With a population of just over 24 million and an economy strongly reliant on primary industry (mining, agriculture etc.), the country’s scientific research tends to focus on “areas of critical mass”. Some areas of focus are understandable from a social and economic point of view (mining, agriculture, medical research). Others are more coincidental; for example, astrophysics is particularly strong due to our Southern Hemisphere location and a strong history of support from the Commonwealth Scientific and Industrial Research Organisation (CSIRO). Therefore, when looking from an outside perspective, it may seem surprising that a major strength in Australian physics research is Quantum Technology.

To understand what I mean by strength, let’s discuss the quantum technology research landscape in Australia in 2017. The Australian Research Council (ARC) Centres of Excellence programme is considered the premier funding vehicle for fundamental and applied research. This programme focuses on groups of 10-20 lead investigators who typically already have a tenured position within an Australian university. A position within a successful Centre of Excellence is hotly contested as it typically funds postdoctoral researchers, equipment, graduate student places, travel etc. for each of the lead researchers (or “Chief Investigators”). The focus is on big goals, collaborative and interdisciplinary research and a unified research effort, beyond the usual 1-5 person research teams funded through the “Discovery” programme - the standard ARC grant. The time period over which a Centre of Excellence is funded (7 years, with possible renewal) is also more than twice as long as a Discovery grant. More than anything else, a Centre of Excellence (CoE) gives stability to a scientist’s research.

What is surprising is how many of the Centres of Excellence that are currently funded (or have been funded in the past) have a Quantum Technology aspect. The CoE for Quantum Computation and Communication Technology (CQC2T) is obviously both the most visible and best funded of these Centres. It has existed in a similar form since 1999 and in fact predates the Centre of Excellence scheme. As well as obtaining the highest level of ARC funding, it has additional government, industry and military funding - over $10 million AUD per year at last count. The vast majority of this investment is focused on the singular goal of designing and building a silicon based quantum computer. Given the collaborative nature of a CoE, this has resulted in an exceptionally high level of output in all areas of quantum computing that the Centre focuses on, both theory and experiment.

Although CQC2T gains most of the attention, there is an impressive depth of quantum technology research in other CoEs. The CoE for Engineered Quantum Systems (EQUS) includes several lead investigators who are CQC2T alumni. However, EQUS is focused on quantum technology more broadly. This includes quantum measurement, control, sensing and simulation. In short, everything except quantum computing specifically.

There are also a series of other CoEs with significant quantum physics research focused on technology and applications, but that do not specifically badge themselves as quantum technology centres. These include:

  • the Centre for Ultrahigh Bandwidth devices for optical systems (CUDOS) which focuses on photonic engineering and optical devices for communication and other technology applications.
  • the Centre for Nanoscale BioPhotonics (CNBP) which researches biomedical imaging applications and the control of light at the single photon level for medical imaging, diagnosis, and single cell manipulation.
  • the Centre for Future Low-energy Electronics Technologies (FLEET) focusing on low-energy electronics using novel materials, including two-dimensional films and topological insulators.
  • the Centre for Exciton Science (ACEx) researching the generation, manipulation and control of excitons in molecular and nanoscale materials for solar energy harvesting, lighting and security applications.

You may notice two things immediately from that list. One, it is necessary to have an acronym for your Centre - the more memorable the better. Two, the focus and selling point of these Centres is far from quantum computing and quantum technology in general. Yet a closer look at the investigator list for each of these Centres will turn up many examples of former Centre for Quantum Computation members.

Dig a little deeper, and in the ARC fellowships for early-career, mid-career and senior researchers (DECRA, Future and Laureate Fellowships respectively) you will also find many examples of quantum technology research - often also Centre for Quantum Computation alumni (or members of other closely related groups). In the most recent round, notable examples include Dr. Marcus Doherty (ANU), Dr. Gerardo Paz-Silva (Griffith), Dr. Lachlan Rogers (Macquarie), Dr. Christopher Ferrie (USyd), Dr. Fabio Costa (UQ), Dr. Peter Rohde (UTS) and Prof. Andrew Greentree (RMIT). This again reflects the great strength of quantum technology research in Australia.

The fact that such a strong quantum technology research focus appears in many different guises is very much a result of the way the CoE programme functions and how physics research in Australia evolves to fit the funding model imposed upon it. Each lead investigator has their own interests and focus, but where these interests best fit in the CoE scheme varies as a function of time and as a function of the CoE groupings. We see young researchers who “grew up” in one Centre move on with their research interests, eventually rejoining or forming a new cluster that accretes researchers until sufficient critical mass is achieved to become a funded CoE. This in itself is not so surprising for such a collaborative, long term scheme. What is unusual is the large number of Australian investigators who could currently be described as working in the quantum technology space, yet are not part of the two big quantum technology based Centres.

Beyond ARC funded schemes, there are other examples of large scale investment in research in the quantum computing space in Australia. Microsoft have for quite some time had a strong presence in quantum information and computing theory via their StationQ research team. Recently this effort has stepped up a gear and moved strongly into experimental realisations of quantum computing, incorporating Prof. David Reilly's lab at the University of Sydney (who is also a member of EQUS). Just down the road, the University of Technology Sydney has formed the UTS Centre for Quantum Software and Information using a combination of UTS and ARC funding. Although these efforts are still technically University based, it is indicative of the worldwide pivot towards commercialisation of quantum computing technology - by the university, government and private sectors.

The reason for this strong focus on quantum physics and quantum technology in Australia is due to a range of factors including historical precedent, governmental policy and playing to the Australian psyche. Since at least the 1980s, Australia and New Zealand have had exceptionally strong representation in the field of quantum optics. A standard collection of textbooks on quantum optics includes the names of many antipodean authors such as Walls, Gardiner, Carmichael, Bachor, Milburn and Wiseman. This is partly the influence of the great Dan Walls on New Zealand physics, and by extension Australia. However, it is also an artefact of a time when the fields of particle physics and condensed matter were dominated by the USA and USSR. Quantum optics was a “cheap and cheerful” science where real progress could be made with the limited resources available south of the equator.

With the advent of quantum computing in the mid 90s, the tools used in quantum optics were perfectly suited to this new and exciting field. For the first time in many decades, brand new concepts and results in quantum physics were appearing monthly, sometimes weekly. For the quantum optics specialists of New Zealand and Australia and their students, it was an easy jump into this new field. Twenty years later, it is no coincidence that we have an entire generation of established physicists with a sound knowledge of quantum technology. 

Add to this strong quantum technology research environment several quirks of the Australian system. First, in Australia PhD students are essentially “free”. They are paid by government scholarships which cover both their fees and a stipend, and therefore don’t cost the doctoral supervisor’s grants anything other than conference travel or computer resources. The result of this funding arrangement is that the secret to getting high quality PhD students is not necessarily to have large grants, but to have interesting projects and a stimulating research culture - something that quantum computing and technology has had right from the start. Secondly, due to the high cost of living and good working conditions, Australian postdoctoral positions are well paid and therefore expensive. This means that once a student completes their PhD, the number of local positions is very limited and going overseas for more experience is necessary if one wants to make a career as a physicist. The result is that many labs around the world have an Australian working in quantum technology. Even burgeoning commercial quantum computing efforts such as Google and IBM have key members who learned their trade in the Centre for Quantum Computation during its formative years.

These quirks have resulted in an effective system for training specialists in quantum technology and spreading them throughout the world. However, there are two more ingredients which have contributed to the exceptionally strong focus of Australian quantum physics research. One is that the Australian diaspora, by and large, are still trying to come home. A strong sense of national identity and in general excellent living conditions (and weather) make Australia an attractive proposition, even for those who weren’t born here. It is an effect also seen in Australian actors and business leaders. Even after spending many decades in either Europe or North America, they will often take a position back in Australia at some time before retirement. This means that academic positions at Australian universities are increasingly hard-fought rarities which attract a raft of exceptional candidates. Each newly formed Centre of Excellence or collaborative research group has no space for weak members.

Of course, the return of highly trained expats applies to all branches of science and academia in general. What seems to be different about physics and quantum technology in Australia is that physicists are adaptable. Sitting somewhere between the intellectual safety-harness of formal logic in mathematics, and the application driven focus of engineering, chemistry and biology - physics in the 21st century is often about being able to tell a good story to explain your work’s significance. As this has become paramount to obtaining acceptance from our peers, it is a relatively straightforward step to apply this to convincing grant agencies of the importance of the research.

In addition, the last decade or so has seen an almost blind faith in publication metrics. Job applications include total citations, h-index and lists of high impact journal publications as a matter of course. The short-listing of job applicants by HR departments and Research & Innovation offices has removed the subtlety of judging research potential. Now, sheer numbers of high impact journal papers which gain many citations are the key to the elusive tenured position. This is a game for which quantum technology is perfectly suited. New tools, new applications and new concepts appear all the time. A junior researcher can make a name for herself with just a couple of key results that spark a new flurry of activity in the research community. Contrast this with the slow and steady incremental work in many other branches of physics and it is little wonder that, since the turn of the century, quantum technology research has had such a grip on physics.

This of course brings us to pontificating about the future. Can this expansion continue? Well, in terms of quantum computing, in 2017 we really are at the pointy end of the business. Quantum computing is now a research reality in commercially funded labs. It is just a matter of time before enough qubits are wired together to perform a calculation that cannot be simulated classically, even in its simplest form. Quantum cryptographic systems can be purchased from several companies worldwide. Quantum metrology and sensing is becoming more mainstream in the scientific community and will eventually cross over to become mundane in the commercial sector as well. However, the pace of discovery in academia is slowing. The problems are harder, the progress is more incremental. Having said that, the foundation of quantum physics knowledge that has been built in Australia will not disappear any time soon. Physicists are adaptable, always looking for unsolved problems to hit with shiny new hammers. Whether it is new problems or new tools, the career incentives continue to favour those who find them. The question is simply whether the quantum technology community can focus its energy on problems of enough significance to mankind to continue justifying taxpayer funding. Finding things to do is never difficult for an academic; finding worthwhile things to do is the challenge.

- Jared Cole, co-founder, h-bar quantum consultants

Postscript: Please email me if you believe I have left out a significant quantum technology research effort within Australia. Also, special thanks to A/Prof. Tom Stace for providing the inspiration for the title of this article.

Full disclosure: A/Prof. Jared Cole is currently a chief investigator within ACEx and an associate investigator within FLEET. His PhD was in quantum computing within the CoE for Quantum Computer Technology (the precursor to CQC2T) from 2003-2006.


Quantum technologies and the launch of h-bar quantum consultants

It is with great pleasure that I am writing the first blog post for the official launch of h-bar quantum consultants. h-bar aims to provide professional advice services to the burgeoning quantum technology industry, liaising between academia, government and business to provide detailed and up-to-date advice on a new technology with a very steep learning curve.

The translation of quantum technology from the laboratory to commercial devices promises to be one of the great challenges of 21st century science and engineering. The United Kingdom’s National Quantum Technologies Programme and the recent announcement by the European Commission of a €1 billion Quantum Technologies Flagship are both targeted specifically at developing commercial quantum technologies. Worldwide we also have large scale investment in quantum technology research by government agencies in the United States, Canada, China, Japan, South Korea and Australia. Despite all this public investment in quantum technology commercialisation, for many applications we are still at the very start of the long road from research and development to market. It is a very exciting time to be in the field.

So what is quantum technology? Quantum physics is often referred to as “modern” physics, yet the principle of quantised energy which underlies quantum mechanics was first discussed more than 100 years ago. Throughout the 20th century a range of new technologies was developed, each relying in some way on this fundamental understanding of the universe. The operation of transistors, LEDs, MRI machines, lasers and many more is understood using the principles of quantum mechanics. However, these technologies are now increasingly referred to as “first generation” quantum technologies.

This of course raises the question: what is a “second generation” quantum technology? A useful definition is given by Georgescu and Nori - technologies harnessing quantum superposition, uncertainty or quantum correlations are “second-generation” quantum technologies. However, this is quite a technical definition which doesn’t help non-specialists understand what such a distinction means, or why it is important to differentiate at all.

For me, a simpler definition is that second generation quantum technologies are those which require (or benefit from) control over the quantum mechanical wavefunction of a system. The wavefunction is a central concept of quantum theory. It provides a mathematical description of the state of a system, i.e. what is a quantum mechanical system doing right now? However, many of the counter-intuitive results of quantum mechanics that can be confusing at first sight come from the fact that measuring the wavefunction directly is particularly difficult. Rather, we infer the value of the wavefunction from the probabilities of measurement outcomes.
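A tiny sketch illustrates the point. The code below (plain NumPy; the state and shot count are made up for illustration) hides a “true” wavefunction, samples measurement outcomes with the usual Born-rule probabilities, and then estimates the state statistically - which is all an experimenter can ever do:

```python
import numpy as np

rng = np.random.default_rng(seed=7)

# We never read a wavefunction directly. We only sample measurement
# outcomes, which occur with probability |amplitude|^2 (the Born rule),
# and infer the state from many repetitions.
psi = np.array([np.sqrt(0.3), np.sqrt(0.7)])  # hidden "true" state
probs = np.abs(psi) ** 2                      # [0.3, 0.7]

shots = rng.choice([0, 1], size=10_000, p=probs)  # repeated measurements
print("estimated |1> population:", shots.mean())  # ~0.7
```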

The reason that control (or lack thereof) of the wavefunction provides a good definition is that most first generation technologies can be understood using a mathematical theory based on the probabilities only. This is also why they have been quickly incorporated into existing technologies over the last 50 years. It is only in the last 10-20 years that we have developed the technology to control the wavefunction itself. With this enhanced control we have discovered a raft of new applications including quantum cryptography, quantum computing, quantum metrology and quantum sensing. These technologies promise to allow us to hide our data more completely, solve tough mathematical problems more efficiently and sense the world around us with higher precision than ever before. However, we are still just at the very beginning.

The late 19th century developments in electromagnetism led to large-scale technology applications in radio and electronic engineering after the First World War. The discovery and harnessing of nuclear physics during the Second World War led to the field of nuclear engineering following declassification of the field in the 1950s and 60s. We are only now starting to see the first generation of “quantum engineers”.

So where does h-bar fit in with all of this? As quantum technology becomes more of a commercial reality, it will be essential to have good information flow between scientists, engineers, business and government. Here at h-bar, we provide this service, linking stakeholders and helping translate between the very different wants and needs of the fledgling quantum technology industry. We provide frank and impartial advice on all aspects of quantum technologies. Having played our part in the development of these technologies, we now aim to shepherd them through to full commercial applications.

- Jared Cole, co-founder, h-bar quantum consultants