More Progress towards Quantum Computing

Stephen DeAngelis

January 19, 2015

By almost any standard, the quantum world is weird. One of the weirder things about the subatomic world is that particles can behave either as waves or as particles, a phenomenon called wave-particle duality. Referring to this phenomenon, the late Albert Einstein wrote, “It seems as though we must use sometimes the one theory and sometimes the other, while at times we may use either. We are faced with a new kind of difficulty. We have two contradictory pictures of reality; separately neither of them fully explains the phenomena of light, but together they do.” In the 1920s, Werner Heisenberg added to the confusion when he developed his famous “uncertainty principle,” which states that it is possible to know the location of a quantum particle, or the speed and direction in which it is travelling, but not both. Three researchers associated with the Centre for Quantum Technologies at the National University of Singapore, Patrick J. Coles, Jedrzej Kaniewski, and Stephanie Wehner, have shown that the uncertainty principle is the key to understanding wave-particle duality, thus simplifying quantum mechanics. [“Equivalence of wave–particle duality to entropic uncertainty,” Nature, 19 December 2014] They explain:

“A single particle can exhibit wave behaviour, yet that wave behaviour disappears when one tries to determine the particle’s path inside the interferometer. This idea has been formulated quantitatively as an inequality, for example, by Englert and Jaeger, Shimony and Vaidman, which upper bounds the sum of the interference visibility and the path distinguishability. Such wave-particle duality relations (WPDRs) are often thought to be conceptually inequivalent to Heisenberg’s uncertainty principle, although this has been debated. Here we show that WPDRs correspond precisely to a modern formulation of the uncertainty principle in terms of entropies, namely, the min- and max-entropies. This observation unifies two fundamental concepts in quantum mechanics.”

If that still sounds a bit confusing to you, you are not alone. Richard Feynman, a Nobel Prize-winning theoretical physicist, once stated that wave-particle duality is “a phenomenon which is impossible, absolutely impossible, to explain in any classical way, and which has in it the heart of quantum mechanics. In reality, it contains the only mystery [in quantum mechanics].” [“Strangest things about quantum physics may stem from overconfidence,” by Kevin Fogarty (@KevinFogarty), Computerworld, 31 December 2014] Fogarty provides a great explanation of what this all means for quantum computing. He writes:

“The authors have found similar connections between uncertainty and other fundamental quantum-mechanical phenomena, including quantum entanglement – the apparent ability of two particles to communicate across vast distances with no time delay, which Einstein described as ‘spooky action at a distance.’ If the connection is real and the authors’ conclusions are accurate, it could resolve much of the perversely counterintuitive behavior that makes quantum mechanics bizarre as well as simply complicated. It could also point the way toward explanations about the behavior of the universe that don’t assume the smallest, most fundamental bits of it behave in ways completely inconsistent with all the rest. ‘Simplify’ might not be the right description for physics based on a wider application of uncertainty, however. It might not explain the discovery this week of what researchers insist are particles that are half matter and half light energy – one popular description of wave-particle duality in photons, for example. … If wave-particle duality is an error of uncertainty, it’s likely the light/matter particle is a mistake – and so would a large chunk of the research on the quantum world during the past century. It would, however, make much more straightforward investigations into ways to trap and manipulate individual atoms for use in subatomic-scale computing devices, for example, or precise control of the flight path, or shape of individual photons to make them more useful in quantum or optical computing.”

The half-light/half-matter particles to which Fogarty refers were discovered by a team of City College of New York physicists led by Dr. Vinod Menon. [“Half-Light, Half-Matter Quantum Particles a Step toward Practical Quantum Computing Platforms,” Scientific Computing, 5 January 2015] According to the article, “Professor Menon and his team were able to discover half-light, half-matter particles in atomically thin semiconductors (thickness ~ a millionth of a single sheet of paper) consisting of a two-dimensional (2-D) layer of molybdenum and sulfur atoms arranged similar to graphene. They sandwiched this 2-D material in a light trapping structure to realize these composite quantum particles.” Professor Menon asserts, “Besides being a fundamental breakthrough, this opens up the possibility of making devices which take the benefits of both light and matter.” The article explains, because of the discovery, “one can start envisioning logic gates and signal processors that take on best of light and matter. The discovery is also expected to contribute to developing practical platforms for quantum computing.” If Fogarty is correct and a conflict does exist between the findings of the two studies discussed above, there should be some interesting discussions taking place in the months ahead. The fact that two apparently breakthrough studies have just been released demonstrates two things. First, progress is still being made; and, second, understanding the world at the quantum level remains difficult.

Given all of the challenges that must be overcome, it is little wonder that a universal quantum computer has been so difficult to build. In addition to understanding the theories and principles behind quantum mechanics, researchers trying to create quantum computers need to develop fundamental building blocks (like stable qubits), assemble the hardware, and develop special algorithms to run on the completed system. As I’ve pointed out in previous posts, progress is being made in each of those areas. Below are some of the latest advances.


A team of researchers from Harvard University, the University of California, Santa Barbara, and the University of Chicago “has taken a major step forward in effectively enhancing the fluorescent light emission of diamond nitrogen vacancy centers — a key step to using the atom-sized defects in future quantum computers.” [“A qubit candidate shines brighter,” R&D Magazine, 2 January 2015] The article explains:

“In the race to design the world’s first universal quantum computer, a special kind of diamond defect called a nitrogen vacancy (NV) center is playing a big role. NV centers consist of a nitrogen atom and a vacant site that together replace two adjacent carbon atoms in diamond crystal. The defects can record or store quantum information and transmit it in the form of light, but the weak signal is hard to identify, extract and transmit unless it is intensified.”

Alexander Hellemans (@scienceandwords) writes, “If a practical quantum computer is going to become an everyday thing, qubits have to remain in two states at one time for much longer than they do now. One of the possible candidates for a longer lasting qubit is a copper ion embedded in a large molecule.” He reports that a group from the University of Stuttgart “has developed a way to protect the spin of a copper ion by placing it inside a molecule that has relatively few spin-carrying atoms and keeping it far away from hydrogen atoms that carry spins.” [“Long Live the Copper Qubit!,” IEEE Spectrum, 24 October 2014] If the use of diamonds and copper hasn’t convinced you that a variety of particles can be used as qubits, a group from the University of Oxford has demonstrated “how trapped calcium ions can be manipulated and prepared to store qubits with record high fidelity.” [“The quest for the quality qubit: quantum computer based on trapped ions has error rate of only 0.07%,” by Tibi Puiu, ZME Science, 25 November 2014]
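To put that 0.07% figure in perspective, the probability that a circuit of N operations runs without a single error is roughly (1 − p)^N. The back-of-the-envelope arithmetic below is my own illustration, not the Oxford group’s analysis, but it shows why even tiny per-operation error rates matter once circuits grow long:

```python
# If each qubit operation fails with probability p, a circuit of N gates
# succeeds (no errors anywhere) with probability roughly (1 - p)**N.
# Illustrative arithmetic only -- not the Oxford group's own analysis.

p = 0.0007  # the reported 0.07% error rate per operation

for n_gates in (100, 1_000, 10_000):
    success = (1 - p) ** n_gates
    print(f"{n_gates:>6} gates -> {success:.1%} chance of an error-free run")
```

Even at this record-setting fidelity, a ten-thousand-gate circuit almost certainly hits an error somewhere, which is why quantum error correction remains essential.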


Jacob Aron reports, “The first piece of software to show the potential of quantum computing has finally been run on a real machine, 20 years after it was initially dreamed up. Although it doesn’t do anything useful on its own, implementing the algorithm could lead to more practical computers powered by the strange properties of quantum mechanics.” [“Historic quantum software is run for the first time,” New Scientist, 23 October 2014] The algorithm in question was designed by Daniel Simon. Aron writes, “Designing an algorithm that takes advantage of a quantum computer is tricky, so there aren’t many around. In 1994, Daniel Simon, then at the University of Montreal, Canada, came up with one of the earliest examples. Crucially, his was the first that showed a quantum computer could solve a problem exponentially faster than an ordinary computer. Previous algorithms had only shown a slight speed boost, or none at all.”
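To make that exponential gap concrete, here is a minimal classical sketch of Simon’s problem (the names and structure are my own illustration, not Simon’s code or the experiment itself). The oracle hides an XOR mask s such that f(x) = f(x XOR s); classically, recovering s means hunting for a collision, which in the worst case takes exponentially many queries, whereas Simon’s quantum algorithm needs only about n:

```python
import itertools
import random

def make_simon_oracle(n, s):
    """Build a random 2-to-1 function on n-bit strings with a hidden
    XOR mask s (s != 0), so that f(x) == f(x ^ s) for every x."""
    table = {}
    outputs = list(range(2 ** n))
    random.shuffle(outputs)
    fresh = iter(outputs)
    for x in range(2 ** n):
        if x not in table:
            v = next(fresh)
            table[x] = v
            table[x ^ s] = v  # pair x with its partner x XOR s
    return lambda x: table[x]

def find_mask_classically(f, n):
    """Recover s by brute-force collision search: in the worst case this
    needs on the order of 2**(n/2) queries; Simon's quantum algorithm
    needs only about n."""
    seen = {}
    for x in range(2 ** n):
        y = f(x)
        if y in seen:
            return seen[y] ^ x  # (x ^ s) XOR x == s
        seen[y] = x
    return 0

n, s = 6, 0b101101
f = make_simon_oracle(n, s)
print(bin(find_mask_classically(f, n)))  # recovers the hidden mask s
```

The brute-force search above is exactly what a classical computer is stuck with; the quantum version extracts linear equations about s from superposition queries instead.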

Another article reports that researchers from the Centre for Quantum Photonics (CQP) at the University of Bristol, together with collaborators from the University of Queensland (UQ) and Imperial College London, have developed “a new way to run a quantum algorithm using much simpler methods than previously thought.” [“New Way to run Quantum Algorithm uses Much Simpler Methods,” Scientific Computing, 15 September 2014] The algorithm in question is called Boson Sampling and was developed by researchers at MIT. “Unlike other quantum algorithms, Boson Sampling has the benefit of being practical for near-term implementations, with the only experimental drawback being the difficulty of generating the dozens of single photons required for the important quantum victory.” The University of Bristol researchers discovered a way to “chain together many standard two-photon sources in such a way as to give a dramatic boost to the number of photons generated.”
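The reason Boson Sampling is believed to be classically intractable is that each output probability is proportional to the squared magnitude of a matrix permanent, a quantity that is #P-hard to compute. The tiny sketch below (my own illustration, not the Bristol team’s code) computes the permanent the naive way, summing over all n! permutations, which is why the cost explodes so quickly:

```python
import itertools

def permanent(matrix):
    """Matrix permanent via the naive sum over all permutations, O(n * n!).
    In Boson Sampling, output probabilities are proportional to |Perm(A)|^2,
    and computing permanents is #P-hard -- the source of the expected
    quantum advantage. Illustrative only; real experiments use photons."""
    n = len(matrix)
    total = 0
    for perm in itertools.permutations(range(n)):
        product = 1
        for row, col in enumerate(perm):
            product *= matrix[row][col]
        total += product
    return total

A = [[1, 2],
     [3, 4]]
print(permanent(A))  # 1*4 + 2*3 = 10
```

Unlike the determinant, the permanent has no sign cancellations to exploit, so no known classical shortcut brings the cost down to polynomial time.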


On the hardware front, Sharon Gaudin (@sgaudin) reports, “Researchers in Australia have developed silicon-wrapped quantum technology that could solve problems that have held back the development of powerful quantum computers.” [“New tech could let quantum machines tackle huge problems,” Computerworld, 23 October 2014] The technology uses silicon as a shield to protect finicky qubits from outside interference. And James Maynard reports that researchers at Princeton University have developed a maser the size of a grain of rice that could be used in quantum computers. [“Rice Grain-Sized Maser Invention a Significant Step Forward for Quantum Computing,” Tech Times, 17 January 2015] What’s a maser? Maynard explains:

“Masers are similar to lasers, except they use microwaves instead of light to create a beam. Princeton University researchers linked together pairs of artificial molecules, made of semiconducting material, known as quantum dots. Extremely thin nanowires, constructed from indium arsenide, were used to connect the tiny units to each other. The pairs were then placed inside a tiny well of niobium, a superconductor, about one-quarter inch apart from each other. That cavity was lined with mirrors on two sides. When a stream of individual electrons was fed to the system, the quantum dots produced microwave radiation, which bounced between the mirrors, creating a focused beam, similar to a laser.”

The next question you might ask is: How will a rice-grain-sized maser help researchers develop a quantum computer? Maynard explains, “Princeton researchers created the new device in order to study how these pairs could be used to manage quantum bits, the basic unit of computation in quantum computers.”

Despite these steady strides, the “wow moment” in quantum computing has yet to arrive. Although there are skeptics who believe that developing a universal quantum computer is impossible, there have never been so many organizations and researchers applying themselves to the task of building one. That’s why I expect to see increasingly large strides being made toward that goal.