#quantum

What is so difficult about building a quantum computer anyway?


When talking about quantum computing, we typically focus on “what is it?”, “why do we want it?” and “when do we get it?”... maybe even “how much will it cost?”. An equally valid question to ask is “why don’t we have one already?” Although it sounds like Veruca Salt has suddenly developed an interest in quantum technology, there are some interesting physics and engineering issues behind why building a quantum computer is fundamentally more difficult than any computer or electronic engineering that has gone before. After all, the phone in my pocket has literally billions of transistors, all switching on and off, synchronised to a clock which ticks more than a billion times per second. That is enough ticks and tocks, and bits and blocks to make even Dr. Seuss proud - and yet we practically take it for granted.

Given this phenomenal ability to design and manufacture such extraordinary devices, how hard could it be to wire up a few thousand qubits in order to discover the next wonder-drug, or read the odd foreign dictator’s email?

Well, there are several important differences between conventional (classical) computers and quantum computers. To understand these differences, let’s first discuss the fundamental components of a quantum computer. During the late 90s, David DiVincenzo proposed a list of “criteria” that any proposed quantum computer would have to fulfil to be useful. These criteria can be summarised as follows:

1. You must be able to build qubits and build them in a way that allows you to “scale up” to thousands or millions of qubits for a full quantum computer.
2. You must be able to initialise these qubits in some known state.
3. These qubits must have long decoherence times.
4. You must be able to apply operations or gates to these qubits which are “universal”.
5. You must be able to measure (at least some of) the qubits.

(Note for the technical specialist: I know there are sometimes two additional criteria added regarding “flying qubits” but given modern error correction codes, I consider these to now be largely unnecessary or subsumed into the other five).

The first and probably biggest problem is that we don’t even know what to make our quantum computer out of. Unlike classical bits, which can take one of two states (0 or 1), a qubit must be described by two continuous numbers: its population and its phase. In particular, the population can take on any value from 0 to 1. Although the requirements for a qubit sound exotic, there are many examples of such systems in nature and we can now manufacture many more. As a consequence, there are many, many proposals in the literature on how to build a quantum computer. Recently the scientific community has narrowed this down to a few leading candidates, including ions held in electromagnetic traps, superconducting circuits, semiconductor devices (defects, donors or quantum dots) and photonic schemes using modes of light. All these approaches (and more) can in principle satisfy the DiVincenzo criteria, but the devil is in the detail and this is what currently limits progress for each of these approaches.
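
To make this concrete, here is a minimal sketch (in Python, with a function name of my own invention) of the two numbers that describe a qubit, in contrast to the single 0-or-1 value of a classical bit:

```python
import numpy as np

def qubit_state(population, phase):
    """A single qubit state, described by two continuous numbers.

    population -- probability (between 0 and 1) of finding the qubit in |1>
    phase      -- relative phase (in radians) between the |0> and |1> parts
    """
    amp0 = np.sqrt(1 - population)                   # amplitude of |0>
    amp1 = np.sqrt(population) * np.exp(1j * phase)  # amplitude of |1>
    return np.array([amp0, amp1])

# A classical bit can only be [1, 0] ("0") or [0, 1] ("1");
# a qubit can sit anywhere in between, e.g. an equal superposition:
psi = qubit_state(0.5, np.pi / 4)
print(np.abs(psi) ** 2)  # measurement probabilities: [0.5, 0.5]
```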

Qubit scalability.
Most of us have baked a cake at some point in our lives. If you are good at it, it might take 15 minutes to get set up, mix everything, put it in the pan and take it out at the end - plus perhaps 45 minutes cooking time. What if you wanted to bake two cakes? You don’t have to get all the ingredients out twice, both cakes can probably cook in the oven at the same time, you know the recipe well, you might even be able to mix both cakes in the same bowl (if they are the same flavour!). All up, two cakes might take a total of 70 minutes - considerably less than 2x(15+45)=120 minutes. Now what about 10 cakes? Bakers do this regularly: they have larger ovens, industrial mixers, cooling racks etc. What about 100 cakes? 1000? 100000?

This is the issue of large-scale manufacturing. When you have to produce thousands or millions of copies of a particular item or product, the manufacturing process has to be redesigned from the ground up. Most of the quantum computing applications which are relevant to humanity right now (quantum chemistry, decryption etc.) require many thousands if not millions of qubits, so any would-be quantum computer manufacturer had better have a credible plan for creating such large numbers of controllable qubits. This doesn’t just mean “stamping” them out one by one: you must be able to fabricate them in a large-scale way, as well as calibrate and control them. All the existing quantum designs are based on “small scale” experiments done in labs around the world. However, turning these small-scale experiments into full engineering solutions for a large-scale quantum computer takes time, planning, testing and expertise in a technology which is exceptionally new and untested.

Decoherence.
If you buy two cheap plastic clocks, set them to the same time and then place them at opposite ends of your house, over time they will slowly drift out of synchronisation. They have slightly different mechanisms, their batteries might have slightly different levels of charge, they might even be in a warmer or colder part of the house. All these differences result in very slightly different speeds of rotation of the hands, and so they speed up or slow down relative to each other.

Notice, I explicitly said cheap plastic clocks at either end of the house. If they are beautifully built grandparent clocks resting on a common wooden floor, then far more interesting physics is at play.

Now imagine there are 100 cheap clocks. After a while there will be a large range of different times and so all the clocks disagree; however, you might imagine that the average time is still approximately right. During operation of a quantum computer, the phase which is used to describe the state of each qubit varies as a function of time. If there are two qubits which have slightly different environments, their phases vary more quickly or more slowly and they get “out of sync”. Physicists refer to this effect as dephasing or, more generally, decoherence. We say that the “coherence” of the qubits is lost over time.
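
The clock analogy translates almost directly into a toy simulation. The numbers below (100 clocks, a roughly 1% spread in ticking rates) are invented for illustration, but the punchline is generic: the average phase agreement, which is exactly what a quantum computation relies on, decays over time.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# 100 "clocks" (qubits), each ticking at a slightly different rate
# because each one sees a slightly different environment.
n_clocks = 100
rates = 1.0 + 0.01 * rng.standard_normal(n_clocks)  # ~1% spread in speed

for t in (0, 10, 100, 1000):
    phases = rates * t
    # Coherence: 1 means all phases still agree, 0 means completely random.
    coherence = np.abs(np.mean(np.exp(1j * phases)))
    print(f"time = {t:4}: coherence = {coherence:.3f}")
```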

Unfortunately, coherence is essential for a conventional quantum computer to function (I say conventional here as it’s currently less clear how important coherence is in annealing machines, but that is an entire topic in itself). To build qubits that are perfectly coherent, we would have to control and understand all stray electric and magnetic fields, eliminate all vibrations, even isolate our computer from all light from the ultraviolet down to well past the infrared. This is a level of complexity that has never been necessary, and never even attempted, in conventional computers. In fact, an important advantage of modern digital computers is that the bit being 0 or 1 is all that matters. So if electrical noise or other random influences make the signal go to 1.1 or 0.9... it still counts as a 1. Equivalently, 0.1 or -0.1 are treated as 0. It is inherently robust to noise up to a certain level.
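
That robustness is so simple it fits in a couple of lines. A sketch of the thresholding that every digital logic stage effectively performs:

```python
def read_classical_bit(signal_level):
    # Anything above the halfway point counts as 1, anything below as 0,
    # so moderate noise on the signal simply vanishes at each step.
    return 1 if signal_level > 0.5 else 0

print([read_classical_bit(v) for v in (1.1, 0.9, 0.1, -0.1)])  # [1, 1, 0, 0]
```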

Quantum computers (at least with components that exist in labs right now) have no such inherent robustness. Each different type of quantum computer has different decoherence sources which come from the materials used to make the machine and the way in which it is designed. For the last 10-20 years or more, the designs of quantum computers and the materials used to build them have been evolving precisely to try and reduce the amount of decoherence. Great strides have been made, but the errors introduced by decoherence are still millions of times greater than error rates in conventional computers.

However, all (coherence) is not lost. Peter Shor showed that we can use the concept of measurement collapse in quantum mechanics to perform Quantum Error Correction. In short, the idea of quantum error correction is to use several (physical) qubits to represent the state of one “logical” qubit. This method of representing the state of the logical qubit (called an “encoding”) is performed in such a way that if the correct operations and measurements are performed on parts of this encoding, then the total system is collapsed into one of two states: either no error, or one known error, which can be corrected. If this process is performed often enough, the effects of decoherence can be corrected. However, the specifics of what counts as “often enough” turn out to be one of the key critical issues in quantum computer design. It depends on how fast you can apply operations and measure your qubits, how many physical qubits are required to encode one logical qubit, how fast you can apply the required corrections, and how strong the decoherence was in the first place.
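
The simplest example of this idea is the three-qubit bit-flip code, and its classical skeleton can be sketched in a few lines. This is a toy version only (real proposals use far larger codes), but it shows the key trick: the parity checks reveal where an error occurred without ever reading out the logical value itself.

```python
# Toy sketch of the simplest error correcting code: the three-qubit
# bit-flip code. One logical qubit is encoded in three physical qubits.

def encode(bit):
    return [bit, bit, bit]  # logical 0 -> 000, logical 1 -> 111

def syndrome(qubits):
    # Parity checks compare neighbouring qubits WITHOUT reading the
    # logical value itself; measuring them collapses the system into
    # "no error" or "one known error".
    return (qubits[0] ^ qubits[1], qubits[1] ^ qubits[2])

def correct(qubits):
    which_flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}  # syndrome -> faulty qubit
    s = syndrome(qubits)
    if s in which_flip:
        qubits[which_flip[s]] ^= 1
    return qubits

encoded = encode(1)       # [1, 1, 1]
encoded[0] ^= 1           # the environment flips physical qubit 0
print(correct(encoded))   # [1, 1, 1] -- the error has been undone
```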

Quantum control.
The remaining three DiVincenzo criteria can be loosely grouped under the heading “quantum control”. Control engineering is a vast and important field in the modern age, whether it is keeping people upright on a Segway, sending astronauts (or cosmonauts or taikonauts) into space, preventing your car skidding on a puddle or preventing a washing machine from destroying itself during the spin cycle. The ability to use small computers or electronic circuits to apply minor corrections to a machine to keep it operating correctly is one of the lesser-appreciated but fundamentally important aspects of modern technology. A quantum computer will be no different. It will come with a myriad of electronic circuitry and computer systems to initialise each and every qubit at the start of a computation, to apply gate operations to actually perform the calculation and then to measure out the result of the calculation. (Although it should be said that due to the magic of quantum mechanics, it’s completely unclear whether the calculation has actually been performed until after it has been measured!)

Although initialisation and measurement are generally understood for most major types of quantum computer designs, it is important to emphasise that these need to be performed exceptionally precisely (typically with something like a 99.9999% success rate). Similarly, there are sets of (quantum) logic gates which must be applied with similar precision. If one thinks of the usual computing gate operations (AND, OR, NOT, XOR, NAND, NOR etc.), in the quantum computing world all of these have quantum counterparts, as well as more exotic examples like Hadamard gates, controlled-NOT gates, T-gates, S-gates, Toffoli gates and iSWAP gates. Although we now know quite a lot about how these gates work and how they need to be put together to perform interesting calculations, how to do it optimally is still very much an open question. Is it best to have the fastest gates so that we can beat decoherence? Should we use the gates that are easiest to implement, so they can be done with greater precision? Do we need all the gates, or just a few of them used often? When trying to implement quantum error correction, do we just introduce more errors from all the gates we have to apply? These questions all need to be answered, but the answer depends on which type of quantum computer you are building and how it performs, both on the drawing-board and in the lab.
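
For the curious, two of these gates are small enough to write down and multiply out by hand. A sketch using plain matrices (numpy here, purely for illustration): a Hadamard followed by a controlled-NOT turns two freshly initialised qubits into an entangled Bell pair.

```python
import numpy as np

H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)   # Hadamard: creates superposition
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])        # controlled-NOT: entangles two qubits

ket00 = np.array([1, 0, 0, 0])         # both qubits initialised to |0>
state = CNOT @ np.kron(H, np.eye(2)) @ ket00
print(state)  # [0.707, 0, 0, 0.707]: the Bell state (|00> + |11>)/sqrt(2)
```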

Once all of these questions are settled, we have a type of quantum computer that we can scale up, that has long decoherence times and that we can control - we are done, right? Well, not quite. There are then software challenges. How does one perform quantum error correction in the most efficient way? How do we design the support hardware and software (the classical computer that will control the quantum computer)? How do we design our qubits so that when we make millions of them, they are all identical (or close enough to identical)?

For a long time, the way forward was unclear and everyone working in quantum computing had their own ideas about what a working quantum computer would look like. Now, things are settling and there are a few leading quantum computer designs. This new focus is not necessarily because we definitely know the way forward, but because a few major ideas have progressed far enough that we now know the principles are sound and it is worth pushing the designs as far as we can. The recent entry of commercial quantum computing efforts has also focused attention much more on the mundane engineering problems required to ultimately get quantum computers to work, without the additional scientific overhead of needing to publish, graduate or get tenure.

Ultimately, the quest to build a quantum computer may well prove to be one of humanity’s most impressive technological feats of the 21st century, simply because it requires such precise control over the materials it is built from and the software used to run it. It is engineering at a scale and at a level of precision that we could only dream of a few decades ago.

- Jared Cole (cole@h-bar.com.au), co-founder, h-bar quantum consultants

Quantum Schmantum in Australia: The surprising depth of quantum technology research Downunder

Australia is a relatively small country in terms of research culture and influence on the world stage. The idealised self-image of Australians is that we “punch above our weight” and achieve great things with scarce resources - a romantic ideal which dates from when we were an isolated outpost of British colonial expansion. It can certainly be argued that Australian scientific contributions compare favourably to anything being done in other parts of the world. However, statistically speaking, we are still small compared to the scientific powerhouses of the United States, United Kingdom, Germany, Japan and, in the last 20 years, China. Per capita we perform better, but still lag behind the Nordic countries. With a population of just over 24 million and an economy strongly reliant on primary industry (mining, agriculture etc.), the country’s scientific research tends to focus on “areas of critical mass”. Some areas of focus are understandable from a social and economic point of view (mining, agriculture, medical research). Others are more coincidental; for example, astrophysics is particularly strong due to our Southern Hemisphere location and a strong history of support from the Commonwealth Scientific and Industrial Research Organisation (CSIRO). Therefore, from an outside perspective it may seem surprising that a major strength in Australian physics research is Quantum Technology.

To understand what I mean by strength, let’s discuss the quantum technology research landscape in Australia in 2017. The Australian Research Council (ARC) Centres of Excellence programme is considered the premier funding vehicle for fundamental and applied research. This programme focuses on groups of 10-20 lead investigators who typically already have a tenured position within an Australian university. A position within a successful Centre of Excellence is hotly contested as it typically funds postdoctoral researchers, equipment, graduate student places, travel etc. for each of the lead researchers (or “Chief Investigators”). The focus is on big goals, collaborative and interdisciplinary research, and a unified research effort, beyond the usual 1-5 person research teams funded through the “Discovery” programme - the standard ARC grant. The time period over which a Centre of Excellence is funded (7 years, with possible renewal) is also more than twice as long as a Discovery grant. More than anything else, a Centre of Excellence (CoE) gives stability to a scientist’s research.

What is surprising is how many of these Centres of Excellence with a Quantum Technology aspect are currently funded or have been funded in the past. The CoE for Quantum Computation and Communication Technology (CQC2T) is obviously both the most visible and the best funded of these Centres. It has existed in a similar form since 1999 and in fact predates the Centre of Excellence scheme. As well as obtaining the highest level of ARC funding, it has additional government, industry and military funding - over $10 million AUD per year at last count. The vast majority of this investment is focused on the singular goal of designing and building a silicon-based quantum computer. Given the collaborative nature of a CoE, this has resulted in an exceptionally high level of output in all areas of quantum computing that the Centre focuses on, both theory and experiment.

Although CQC2T gains most of the attention, there is an impressive depth of quantum technology research in other CoEs. The CoE for Engineered Quantum Systems (EQUS) includes several lead investigators that are CQC2T alumni. However, EQUS is focused on quantum technology more broadly: quantum measurement, control, sensing and simulation. In short, everything except quantum computing specifically.

There are also a series of other CoEs with significant quantum physics research focused on technology and applications, but which do not specifically badge themselves as quantum technology centres. These include:

  • the Centre for Ultrahigh Bandwidth Devices for Optical Systems (CUDOS), which focuses on photonic engineering and optical devices for communication and other technology applications.
  • the Centre for Nanoscale BioPhotonics (CNBP), which researches biomedical imaging applications and the control of light at the single-photon level for medical imaging, diagnosis and single-cell manipulation.
  • the Centre for Future Low-energy Electronics Technologies (FLEET), which focuses on low-energy electronics using novel materials including two-dimensional films and topological insulators.
  • the Centre for Exciton Science (ACEx), which researches the generation, manipulation and control of excitons in molecular and nanoscale materials for solar energy harvesting, lighting and security applications.

You may notice two things immediately from that list. One, it is necessary to have an acronym for your Centre - the more memorable the better. Two, the focus and selling point of these Centres is far from quantum computing and quantum technology in general. Yet a closer look at the investigator list for each of these Centres will find many examples of former Centre for Quantum Computation members.

Dig a little deeper into the ARC fellowships for early-career, mid-career and senior researchers (DECRA, Future and Laureate Fellowships respectively) and you will also find many examples of quantum technology research - often also Centre for Quantum Computation alumni (or other closely related groups). In the most recent round, notable examples include Dr. Marcus Doherty (ANU), Dr. Gerardo Paz-Silva (Griffith), Dr. Lachlan Rogers (Macquarie), Dr. Christopher Ferrie (USyd), Dr. Fabio Costa (UQ), Dr. Peter Rohde (UTS) and Prof. Andrew Greentree (RMIT). This again reflects the great strength of quantum technology research in Australia.

The fact that such a strong quantum technology research focus appears in many different guises is very much a result of the way the CoE programme functions and how physics research in Australia evolves to fit the funding model imposed upon it. Each lead investigator has their own interests and focus, but where these interests best fit in the CoE scheme varies as a function of time and as a function of the CoE groupings. We see young researchers who “grew up” in one Centre move on with their research interests, eventually rejoining or forming a new cluster that starts to accrete researchers until sufficient critical mass is achieved to become a funded CoE. This itself is not so surprising for such a collaborative, long term scheme. What is unusual is the large number of Australian investigators that currently could be referred to as working in the quantum technology space, yet they are not part of the two big quantum technology based Centres.

Beyond ARC funded schemes, there are other examples of large scale investment in research in the quantum computing space in Australia. Microsoft have for quite some time had a strong presence in quantum information and computing theory via their StationQ research team. Recently this effort has stepped up a gear and moved strongly into experimental realisations of quantum computing, incorporating Prof. David Reilly's lab at the University of Sydney (who is also a member of EQUS). Just down the road, the University of Technology Sydney has formed the UTS Centre for Quantum Software and Information using a combination of UTS and ARC funding. Although these efforts are still technically University based, it is indicative of the worldwide pivot towards commercialisation of quantum computing technology - by the university, government and private sectors.

The reason for this strong focus on quantum physics and quantum technology in Australia is due to a range of factors, including historical precedent, governmental policy and playing to the Australian psyche. Since at least the 1980s, Australia and New Zealand have had exceptionally strong representation in the field of quantum optics. A standard collection of textbooks on quantum optics includes the names of many antipodean authors such as Walls, Gardiner, Carmichael, Bachor, Milburn and Wiseman. This is partly the influence of the great Dan Walls on New Zealand physics, and by extension Australia. However, it is also an artefact of a time when the fields of particle physics and condensed matter were dominated by the USA and USSR. Quantum optics was a “cheap and cheerful” science where real progress could be made with the limited resources available south of the equator.

With the advent of quantum computing in the mid 90s, the tools used in quantum optics were perfectly suited to this new and exciting field. For the first time in many decades, brand new concepts and results in quantum physics were appearing monthly, sometimes weekly. For the quantum optics specialists of New Zealand and Australia and their students, it was an easy jump into this new field. Twenty years later, it is no coincidence that we have an entire generation of established physicists with a sound knowledge of quantum technology. 

Add to this strong quantum technology research environment several quirks of the Australian system. First, in Australia PhD students are essentially “free”. They are paid by government scholarships which cover both their fees and a stipend, and therefore don’t cost the doctoral supervisor’s grants anything other than conference travel or computer resources. The result of this funding arrangement is that the secret to getting high-quality PhD students is not necessarily to have large grants, but to have interesting projects and a stimulating research culture - something that quantum computing and technology has had right from the start. Secondly, due to the high cost of living and good working conditions, Australian postdoctoral positions are well paid and therefore expensive. This means that once a student completes their PhD, the number of local positions is very limited and going overseas for more experience is necessary if one wants to make a career as a physicist. The result is that many labs around the world have an Australian working in quantum technology. Even burgeoning commercial quantum computing efforts such as Google and IBM have key members who learned their trade in the Centre for Quantum Computation during its formative years.

These quirks have resulted in an effective system for training specialists in quantum technology and spreading them throughout the world. However, there are two more ingredients which have contributed to the exceptionally strong focus of Australian quantum physics research. One is that the Australian diaspora, by and large, is still trying to come home. A strong sense of national identity and in general excellent living conditions (and weather) make Australia an attractive proposition, even for those who weren’t born here. It is an effect also seen in Australian actors and business leaders: even after spending many decades in Europe or North America, they will often take a position back in Australia at some time before retirement. This means that academic positions at Australian universities are increasingly hard-fought rarities which attract a raft of exceptional candidates. Each newly formed Centre of Excellence or collaborative research group has no space for weak members.

Of course, the return of highly trained expats applies to all branches of science and academia in general. What seems to be different about physics and quantum technology in Australia is that physicists are adaptable. Sitting somewhere between the intellectual safety-harness of formal logic in mathematics and the application-driven focus of engineering, chemistry and biology, physics in the 21st century is often about being able to tell a good story to explain your work’s significance. As this has become paramount to obtaining acceptance from our peers, it is a relatively straightforward step to apply this to convincing grant agencies of the importance of the research.

In addition, the last decade or so has seen an almost blind faith in publication metrics. Job applications include total citations, h-index and lists of high-impact journal publications as a matter of course. The short-listing of job applicants by HR departments and Research & Innovation offices has removed the subtlety of judging research potential. Now, sheer numbers of high-impact journal papers which gain many citations are the key to the elusive tenured position. This is a game for which quantum technology is perfectly suited. New tools, new applications and new concepts appear all the time. A junior researcher can make a name for herself with just a couple of key results that spark a new flurry of activity in the research community. Contrast this with the slow and steady incremental work in many other branches of physics and it is little wonder that since the turn of the century quantum technology research has had such a grip on physics.

This of course brings us to pontificating about the future. Can this expansion continue? Well, in terms of quantum computing, in 2017 we really are at the pointy end of the business. Quantum computing is now a research reality in commercially funded labs. It is just a matter of time before enough qubits are wired together to perform a calculation that cannot be simulated classically, even in its simplest form. Quantum cryptographic systems can be purchased from several companies worldwide. Quantum metrology and sensing is becoming more mainstream in the scientific community and will eventually cross over to become mundane in the commercial sector as well. However, the pace of discovery in academia is slowing. The problems are harder, the progress is more incremental. Having said that, the foundation of quantum physics knowledge that has been built in Australia will not disappear any time soon. Physicists are adaptable, always looking for unsolved problems to hit with shiny new hammers. Whether it is new problems or new tools, the career incentives continue to favour those who find them. The question is simply whether the quantum technology community can focus its energy on problems of enough significance to mankind to continue justifying taxpayer funding. Finding things to do is never difficult for an academic; finding worthwhile things to do is the challenge.

- Jared Cole, co-founder, h-bar quantum consultants

Postscript: Please email me if you believe I have left out a significant quantum technology research effort within Australia. Also, special thanks to A/Prof. Tom Stace for providing the inspiration for the title of this article.

Full disclosure: A/Prof. Jared Cole is currently a chief investigator within ACEx and an associate investigator within FLEET. His PhD was in quantum computing within the CoE for Quantum Computation Technology (the precursor to CQC2T) from 2003-2006.


Blueprint for an ion-trap quantum computer

Science Advances, Vol. 3, no. 2, e1601540 (2017)

Today in the journal Science Advances, researchers from the ion trapping group of the University of Sussex in the U.K., Aarhus University in Denmark, Siegen University in Germany, Google Inc. and RIKEN in Japan have proposed a fundamentally new architecture for an ion-trap quantum computer. I was a part of this research and am very excited to work on a method for ion-trap quantum computing that can form the basis of a large-scale machine.

Ion-trap quantum computers have been one of the leading technologies for large-scale quantum computing. The underlying technology is very mature and was developed initially for very accurate atomic clocks. When quantum computing was first developed in the 1980s and 1990s, ion traps were one of the first technologies to experimentally demonstrate individual quantum bits (qubits) and, since then, the technology has developed substantially.

In an ion-trap quantum computer, individual qubits are ionised atoms. Some systems use calcium, some use beryllium and some use ytterbium. As the atom is ionised (i.e. carries a net positive charge), it can be trapped by an electromagnetic field, holding it in place. The ion qubit is held in this electromagnetic field inside an ultra-high-vacuum container. This vacuum is required to make sure that the ion is not knocked out of the trap by collisions with other atoms flying around inside the system. The qubit itself is defined by the quantum state of a single electron of the ion. Two stable electronic states are chosen to represent the binary zero and one states, and these states can be manipulated via lasers or by manipulating the magnetic field environment of the ion.

Manipulation of a single ion qubit is now routine in laboratories around the world.  Injecting and trapping an ion, performing single qubit quantum gates and reading out individual qubits can be done with extremely low error rates, in multiple systems, and many small-scale tests and protocols have been demonstrated over the past decade and a half.  

Operations on multiple qubits are also possible, by coupling two (or more) trapped ions through their shared motional degrees of freedom. Because individual ions are positively charged, if they are placed in the same trap they will experience a mutual repulsion. This mutual repulsion changes slightly when the electronic configuration of each individual ion changes, and hence can be used to enact quantum logic gates between two qubits. Again, through careful control of the system, experimentalists have enacted logic operations between qubits and realised small-scale programmable ion-trap quantum computers.

The question that physicists and engineers are now addressing is scalability: how do we increase the number of qubits in the system to enact the complex error correction protocols that are required, and scale the system to a size sufficient to perform quantum algorithms that cannot be realised on even the most powerful classical supercomputers?

An ion-trap X-junction, the building block of an ion-trap quantum computer. The gold-coloured base plate consists of a series of electrodes that are used to manipulate the electromagnetic field used to trap individual ions. This allows us to trap ions in separate regions of the machine to "load" ions (injecting qubits into the computer), to measure the quantum state of ions and to entangle ions together (performing gate operations between two ions).

Scaling ion-trap computers to the level of millions (if not billions) of qubits requires very careful design. Luckily, ion-trap computers have a rather unique property: qubits can be moved (shuttled) around, they are not fixed in place. By manipulating the electromagnetic fields that are used to trap individual ions, they can be moved and shuttled around the computer. This allows us to trap ions separately and move them around to inject or "load" them into the computer, measure them in dedicated readout zones, and entangle them with other ions in the computer, all quickly and with very low error rates.

X-junctions are fabricated together in a grid. Each X-junction contains a single ion qubit that can be initialised, interacted with its four neighbours to the north, east, south and west, and measured. Repeating this structure allows for an arbitrarily large error-corrected quantum computer, capable of implementing any algorithm.
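
As a toy model of this layout (the class and method names below are my own, not from the paper), one can picture the grid as a lookup table of occupied junctions, with shuttling as a move between neighbouring grid positions:

```python
# Toy model of a grid of X-junctions: each junction holds at most one
# ion, and ions are shuttled between neighbouring junctions.
class JunctionGrid:
    MOVES = {"north": (0, 1), "east": (1, 0), "south": (0, -1), "west": (-1, 0)}

    def __init__(self, size):
        self.size = size
        self.ions = {}  # (x, y) -> ion label

    def load(self, pos, ion):
        assert pos not in self.ions, "junction already occupied"
        self.ions[pos] = ion

    def shuttle(self, pos, direction):
        dx, dy = self.MOVES[direction]
        new = (pos[0] + dx, pos[1] + dy)
        assert 0 <= new[0] < self.size and 0 <= new[1] < self.size
        assert new not in self.ions, "cannot shuttle into an occupied junction"
        self.ions[new] = self.ions.pop(pos)
        return new

grid = JunctionGrid(36)
grid.load((0, 0), "ion-A")
grid.shuttle((0, 0), "east")  # ion-A now sits at junction (1, 0)
```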

Even with the very low error rates that experimentalists can achieve with ion-trap technology, they are still not good enough for large-scale algorithms such as Shor's factoring algorithm or Grover's search algorithm. Active error correction codes are still needed. The ion-trap architecture is consequently designed around a class of topological error correction codes known as surface codes. Surface codes are a desirable method for large-scale, error-corrected quantum computers as they are amenable to system design and have very good performance. Surface codes only require the error rate for each physical operation in our computer to be below approximately 1% before they begin working effectively. Error rates at 1% or lower are already experimentally achievable in ion-trap systems.
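
A commonly quoted rule of thumb (an approximation with illustrative constants, not a result from this paper) is that below threshold the logical error rate is suppressed exponentially in the code distance d, roughly as (p/p_th)^((d+1)/2). A quick calculation shows why operating below the 1% threshold matters so much:

```python
# Rule-of-thumb surface code scaling (illustrative only): the logical
# error rate falls as p_L ~ (p / p_th) ** ((d + 1) / 2), where p is the
# physical error rate, p_th ~ 1% the threshold and d the code distance.
p_th = 0.01

def logical_error_rate(p, d):
    return (p / p_th) ** ((d + 1) / 2)

for p in (0.005, 0.001):        # physical error rates below threshold
    for d in (5, 15, 25):
        print(f"p = {p}, d = {d}: p_L ~ {logical_error_rate(p, d):.1e}")
```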

In other designs for ion-trap computers, physicists have imagined building small mini-computers, each containing anywhere between 10 and 100 physical ion qubits. These mini-computers would then be linked together with photons and optical fiber. This would allow scale-up by connecting together separate and comparatively small ion-traps to form a larger computer. Unfortunately, the downside to this approach is that establishing an optical connection between separated ion-traps is both very slow and very noisy - two things that are detrimental to a functional and useful quantum computer.

In our approach, we decided that a monolithic design for an ion trap is better. The X-junction shown above allows an individual ion to interact with its four neighbours; hence, to scale the computer to arbitrary size, we just physically connect many X-junctions together and shuttle ion qubits between X-junctions to perform gates.

A module is a 36x36 array of X-junctions fabricated with the necessary control electronics and mounted on a steel frame with piezo-actuators that allow modules to be aligned together. Each module houses 36x36 qubits in our quantum computer.

We define a module that consists of an array of 36x36 X-junctions, each junction containing a single qubit in our quantum computer. This module contains all the control structures necessary to manipulate the qubits in the ion-trap. Below the surface of the trap (where each individual qubit hovers about 100 micrometres above the electrodes) there are layers of electronic control and cooling. Finally, the module is mounted on a set of piezo-actuators, which are in turn attached to a support frame. The piezo-actuators are used so that two modules can be aligned together and ions transported across the junction between two modules. Our analysis showed that provided each module is aligned to better than 10 micrometres in the x, y and z directions, we can still reliably shuttle ions between modules.

If this module can be built, scaling the quantum computer to arbitrary size simply requires fabricating more and more modules and connecting them together. In this way, the ion-trap quantum computer can operate as fast as possible with very low error rates, and does not require us to build and integrate additional quantum technology such as photonic interconnects, which have so far proven difficult to build reliably with good performance.

By connecting modules together we can scale the computer to arbitrary size. Shown are several connected vacuum systems containing approximately 2.2 million X-junctions. This system would occupy the space of a mid-sized office and be able to run fully error-corrected quantum algorithms. The entire computer is housed in an ultra-high vacuum, to eliminate any stray atoms that could collide with ion qubits.

Scaling an ion-trap quantum computer will require some very high quality engineering. Each module contains enough X-junctions to accommodate 36x36 ion qubits and occupies a physical space of 90mm x 90mm - a comparatively large footprint for a quantum computer. We can envisage a much larger system, as illustrated, which contains 2.2 million X-junctions in a series of connected vacuum chambers (and hence 2.2 million qubits). The size of each chamber is 4.5m x 4.5m, about the size of a mid-sized office. Additionally, the entire quantum computer must maintain an ultra-high vacuum inside for the length of time necessary to run a quantum algorithm (which may be anywhere from seconds to weeks).
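
The headline numbers can be checked on the back of an envelope. Assuming one ion qubit per X-junction and the 36x36 junctions per 90mm module quoted above:

```python
import math

junctions_per_module = 36 * 36   # = 1296 junctions (one qubit each)
module_size_m = 0.09             # each module is 90 mm x 90 mm
total_junctions = 2.2e6

modules = math.ceil(total_junctions / junctions_per_module)
side = math.ceil(math.sqrt(modules)) * module_size_m  # square arrangement
print(f"{modules} modules, roughly {side:.1f} m x {side:.1f} m of trap area")
# -> about 1700 modules covering ~3.8 m x 3.8 m, which indeed fits
#    inside a 4.5 m x 4.5 m mid-sized office with room to spare.
```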

While the engineering challenges are significant, they are not impossible, and much of the research in the ion-trap community is focused on these issues. One major adaptation that we made in this architecture is the elimination of a significant amount of laser control. In more traditional ion-trap quantum computers, every operation on ion qubits (except for shuttling) is mediated by precisely focused laser beams. For a system containing millions of qubits, the amount of laser control would be significant and potentially very costly to the design of a large-scale machine.

We remove costly and difficult laser control for each ion qubit by using a microwave pulse that is broadcast over the entire computer. Ions that we wish to address with the pulse are "tuned in" by manipulating the local magnetic field environment with control wires embedded under the surface of the ion trap.

In 2016, the ion-trap group at Sussex University (who led the work on this paper) demonstrated a new technique to control and manipulate ion qubits. Instead of using tightly focused laser beams, the group used a microwave pulse that was broadcast over the entire ion trap. The ions that they wanted to respond to this microwave pulse were "tuned in" via precise control of the magnetic field environment around each particular ion. In this way, one microwave pulse can enact operations on large numbers of qubits simultaneously, by tuning in the relevant qubits through changes to the local magnetic fields. This eliminates the need for selective laser control of every ion qubit in the machine. Controlling the local magnetic field at each X-junction is performed with wires embedded underneath the surface of the ion trap. By controlling the electrical current through these wires, we can alter the magnetic field near a particular ion and "tune it in" to global microwave control pulses applied over the entire computer.
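
The physics of this "tuning in" is the standard two-level resonance result: an ion detuned from the microwave drive barely responds. A quick illustration using the textbook Rabi formula (units and numbers below are illustrative, not taken from the paper):

```python
import numpy as np

# Rabi formula for a two-level system driven off-resonance:
# P(flip) = (Omega / Omega_eff)^2 * sin^2(Omega_eff * t / 2),
# with Omega_eff = sqrt(Omega^2 + Delta^2) and Delta the detuning.
def flip_probability(detuning, rabi_rate=1.0, t=np.pi):
    omega_eff = np.sqrt(rabi_rate**2 + detuning**2)
    return (rabi_rate / omega_eff) ** 2 * np.sin(omega_eff * t / 2) ** 2

for detuning in (0.0, 5.0, 50.0):   # in units of the Rabi rate
    print(f"detuning = {detuning:4}: flip probability = "
          f"{flip_probability(detuning):.4f}")
```

On resonance the ion flips with certainty; at large detuning the flip probability is vanishingly small, which is exactly what lets one global pulse address only the selected ions.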

We believe that this model of an ion-trap quantum computer may be significantly easier to engineer and ultimately build than other designs. Many of the components of this monolithic design have already been demonstrated experimentally, and much of the challenge left is to put all these pieces together and to slowly scale the system first to tens of qubits, then to hundreds, thousands and hopefully millions in the not-too-distant future.

The future of ion-trap quantum computing looks very bright and this technology is a direct competitor to superconducting quantum computing designs pioneered by places like IBM and Google.  Both technologies maturing at the same time gives us tremendous flexibility in how we adapt quantum computing technology to specific commercial tasks in this new and exciting technology sector.

- Simon Devitt, co-founder of h-bar quantum consultants.