Journal Articles

CVu Journal Vol 12, #1 - Jan 2000


Title: Quantum Computing

Author: Administrator

Date: Mon, 03 January 2000 13:15:35 +00:00

Summary: 

Body: 

I attended a mini-conference on Quantum Computing on October 30th and 31st 1998. It consisted of five lectures (there were supposed to be six but a family emergency whisked one lecturer away) and I wrote 38 A4 pages of notes. It was sponsored by National University of Ireland Maynooth's Departments of Mathematical Physics; Computer Science; Chemistry; and Mathematics; the Dublin Institute for Advanced Studies; the Irish Computer Society; and Trinity College Dublin's Mathematics Department. It was organised by NUI Maynooth's Dr. Jason Twamley. It was held in Trinity, and followed some months later by another in NUI Maynooth which I didn't manage to attend.

The fundamental difference between classical bits and qubits (quantum bits) is that qubits rely on superposition: until its state is measured, a qubit is on, off, and everything in between at the same time. This gives us a different breed of computer.
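The standard mathematical picture of this (not covered at the conference, but a helpful sketch) represents a qubit as a two-component vector of complex amplitudes; the Hadamard gate below turns a definite state into an equal superposition, and the Born rule turns amplitudes into measurement probabilities:

```python
import math

# A qubit as a 2-component vector of amplitudes: |0> = [1, 0], |1> = [0, 1].
ket0 = [1.0, 0.0]

def hadamard(q):
    """Hadamard gate: sends a basis state into an equal superposition."""
    s = 1 / math.sqrt(2)
    return [s * (q[0] + q[1]), s * (q[0] - q[1])]

q = hadamard(ket0)
probs = [abs(a) ** 2 for a in q]  # Born rule: |amplitude|^2 gives probabilities
print(probs)  # approximately [0.5, 0.5]
```

Until measured, the qubit genuinely carries both amplitudes; measurement picks one outcome with the probabilities shown.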

The first lecture, "Quantum Computation: theory and experiments", was given by Prof. Artur Ekert (Oxford University). He said that computational complexity could be P(olynomial); NP (difficult to solve, but once you have the answer it's easy to double-check in polynomial time); or EXP(onential). No matter how fast your classical computer is, an exponential algorithm is still exponential.

He presented an example of factoring a one hundred decimal digit number. An exponential classical solution was meant to take about 10^44 seconds. This could be cut by a mammoth proportion by reducing factoring to a period-finding problem that quantum computers could handle in a few seconds. Artur Ekert said that there's no way of improving classical technology in order to make difficult problems easy, but with superpositions quantum programs can implement a new algorithm with no classical counterpart, thus giving polynomial-time resolution to factoring.
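The number-theoretic core of that reduction can be sketched classically (the quantum speed-up lies entirely in finding the period r efficiently, which this brute-force loop does not do):

```python
from math import gcd

def find_period(a, n):
    """Smallest r > 0 with a**r % n == 1. Brute force here;
    Shor's quantum algorithm finds r in polynomial time."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor_via_period(n, a=2):
    """If the period r is even, gcd(a**(r/2) - 1, n) often yields
    a non-trivial factor of n."""
    r = find_period(a, n)
    if r % 2 == 0:
        f = gcd(pow(a, r // 2) - 1, n)
        if 1 < f < n:
            return f
    return None

print(factor_via_period(15))  # 3
```

For n = 15 and a = 2 the period is 4, and gcd(2^2 - 1, 15) = 3, a genuine factor.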

Some of us were eager to see the quantum algorithm for factorising. Artur Ekert didn't want to go through a worked example there and then, but was going to show us in his next lecture. However, this is the man who had to leave because of a family emergency. Two of the other lecturers showed us instead at the end of the last day. Part way through the workings-out (of factorising the number four), one of them asked whether the American lecturing mathematics in Cork (on the south coast of Ireland), who had asked for the demonstration, understood it so far: "Everything except the quantum."

Ekert stated that simulating quantum algorithms on classical computers can be done but is exponentially inefficient. Someone had spent two years of his PhD in Switzerland only to find this out - pretty unfortunate. Ekert claimed that this is an obvious result to a physicist but not to a poor computer scientist.

Ekert showed us the formula D = T/t, where D is the figure of merit. This tells us roughly how many coherent steps can be taken before the whole system decays. If there is low noise then there's a long decoherence time, T. The figure t is referred to as the fast coherent switching time.
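As a quick illustration of the formula (the T and t values below are made up for the example, not measured figures from the lecture):

```python
# Figure of merit D = T / t: roughly how many coherent steps fit
# inside the decoherence time before the system decays.
T = 1.0e-1   # decoherence time in seconds (illustrative assumption)
t = 1.0e-5   # fast coherent switching time in seconds (illustrative)
D = T / t
print(D)     # about 10^4, the order Ekert quoted for ion traps
```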

There are three technologies currently in use for qubits: the ion trap (Boulder, Colorado; Los Alamos; Oxford; Innsbruck; and elsewhere); Cavity QED (Paris; and Caltech); and Nuclear Magnetic Resonance (Oxford; Los Alamos; and elsewhere). Cavity QED was only briefly mentioned - I don't know what it is.

Ekert gave a figure of merit of ten thousand to ion traps. In these, some ions are confined in an electric field. They are separated by a distance greater than the wavelength of the laser light used to probe transitions. When one ion wobbles there is an electrostatic interaction which causes others to wobble and we can probe in this way.

Dr. Andrew M. Steane (Oxford) said in his lecture, "Experiments in quantum information processing", that the pros of ion traps are that they produce the highest precision qubits; the experimental complexity can be scaled well; and that the same qubit can be repeatedly measured. On the downside these are demanding experiments; and there's no billion-dollar industry directly related to the field.

Steane imagined that it would take one or two years (remember: this was a year ago) for "a sort of magician" called David Wineland to have a five qubit computer in his lab in Boulder, Colorado. For himself, he reckoned three years. Steane is trying to catch up with him: but second place isn't too bad. He said that he certainly sees a way to ten, even one hundred, ions in a computer.

Cavity QED systems were barely covered, but as well as ion traps Nuclear Magnetic Resonance (NMR - also known as nuclear induction) was explained. Dr. Jonathan A. Jones (Oxford) spoke about NMR in his lecture "NMR Quantum Computing". He said that NMR is familiar to chemists but has been neglected by physicists.

An NMR quantum computer needs: 1. qubits; 2. an adequate set of logic gates; 3. an initialisation operation ("CLEAR"); and 4. a readout mechanism. Parts 1 and 2 are simple: people have been doing them since the 1960s but only realised this in the 1990s. Jones said that NMR was never at an incoherent phase like optics. He said that this is like not starting with candles but with lasers. Spin-0 nuclei can be completely ignored (as qubits). They can be used to build a framework for the spin-bearing particles you care about.
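An "adequate set of logic gates" means single-qubit rotations plus a two-qubit gate such as CNOT. As a minimal sketch (not the NMR pulse sequences Jones was describing, just the abstract gate), CNOT on a two-qubit state stored as four amplitudes simply swaps the pair where the control bit is 1:

```python
# Two-qubit state as amplitudes [a00, a01, a10, a11].
# CNOT flips the target (second) bit whenever the control (first) bit is 1.
def cnot(state):
    a00, a01, a10, a11 = state
    return [a00, a01, a11, a10]

print(cnot([0, 0, 1, 0]))  # |10> becomes |11>: [0, 0, 0, 1]
```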

CLEAR is non-unitary, and so must be achieved by a non-computational process. Typically it is achieved by cooling to the ground state but this doesn't work at reasonable temperatures for NMR.

In 1996 Cory et al. showed how to use NMR techniques to distil an effective pure ground state from the equilibrium Boltzmann distribution. This process is exponentially inefficient and so limits current NMR computers to about twenty qubits.
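The reason for that limit is the scaling of the usable signal. A commonly quoted high-temperature result (exact prefactors depend on the experiment; the function below is a sketch of the scaling only) is that the pseudo-pure signal falls off roughly as n / 2^n with qubit count n:

```python
# Usable signal from a pseudo-pure state scales roughly as n / 2**n,
# which collapses rapidly: the distillation trick becomes hopeless
# beyond a few tens of qubits.
def signal_fraction(n):
    return n / 2 ** n

for n in (2, 10, 20, 30):
    print(n, signal_fraction(n))
```

At twenty qubits the fraction is already below two parts in a hundred thousand, consistent with the "about twenty qubits" ceiling above.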

Surprisingly, the state of an NMR readout can be monitored during the readout! (I'll take it that plenty of you won't know all that much about quantum mechanics and point out that measuring or observing a physical system on the molecular and smaller scales affects - i.e. alters - that system. Brought to the normal macro scale, if you tried to read the number displayed on a bus, it could change just because you started to read it. And this is one of the family of principles on which our CDs; barcode scanners; and transistors rely. Oh dear...)

Jones said that the exponential inefficiency of the preparation of pseudo-pure states is the most widely discussed problem yet the least important. More important, rarely discussed problem areas he mentioned are: decoherence; pulse sequence complexity; and selective excitation problems.

Early papers pretended that decoherence wasn't a problem... they lied. It's not a big problem when you're doing nothing; it's when you compute that decoherence kicks in. It may limit NMR quantum computers to about five to ten qubits.

With "linear" systems where the spins are only coupled to near neighbours, the complexity will only grow linearly. However, there will be a polynomial overhead making the effects of decoherence more serious.

Jones said that there are only six non-radioactive spin nuclei and there's a very small frequency range to address individual qubits. Again this imposes a limit of five to ten qubits.

Jones claimed that NMR can't go as far as ion traps which he feels could reach twelve qubits. Even if nuclear induction has a limited future in the quantum computing field, it's still something for researchers to play with. Not that NMR is the only avenue with a limited lifespan. If there's enough will and money put into the field, a proper quantum computer could be built. For now researchers do what they can (even if badly) to show that quantum computing can be done. Some day engineers will do it properly.

Nearly everyone agrees that a solid-state system would work. Dubbed "essentially forever", such systems' decoherence times would be of the order of 10^7 seconds (about 16.5 weeks). Half a dozen things would be needed that are very difficult and expensive.

Three types of algorithms were identified for quantum computers: 1. what can be regarded as period finding; 2. algorithms like Dr. Lov K. Grover's (Bell Labs) quantum search algorithm (which, for half a million phone numbers, would take about a thousand queries instead of a quarter of a million) - quantum computers are useful for NP problems; and 3. simulating quantum mechanical processes - this excites chemists.
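A back-of-the-envelope check of Grover's numbers (the π/4·√N iteration count is the standard optimal figure; the article's "about a thousand" is the same order of magnitude):

```python
from math import pi, sqrt

# Unstructured search over N items: a classical scan needs about N/2
# queries on average, Grover's algorithm about (pi/4) * sqrt(N).
N = 500_000
classical_avg = N // 2
grover = round((pi / 4) * sqrt(N))
print(classical_avg, grover)  # 250000 versus roughly 555
```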

As well as the three previously mentioned lectures, "Few Particles - Lots of Physics" which was to go under the more descriptive title of "Entanglement and Quantum Information" was given by Dr. Martin Plenio (Imperial College) and "Teleportation of continuous variables" was given by Dr. Samuel Braunstein (Bangor).

Further reading: Dr Mark Harman[1], Department of Mathematical and Computing Sciences, Goldsmiths College, University of London wrote an article called "How small is a bit?" which appeared in EXE: The Software Developers' Magazine, September 1999 issue.

I am quite happy to publish reports of conferences that members have attended. In this case I noticed that several of the speakers were Oxford based which made me wonder how popular talks from them would be at one of our events. If you want me to see if I can get one or more to talk at a future JaCC please let me know.

Notes: 
