
The Race to a Programmable Quantum Computer Will Be Won with Qubits

April 23, 2021


There is a furious international race to develop programmable quantum computers. James Norman, President of QS Investors, explains, “Quantum computers can be game changers because they can solve important problems no existing computer can. While conventional computing scales linearly, QC scales exponentially when adding new bits. Exponential scaling always wins, and it’s never close.”[1] Quantum computers scale exponentially because they use quantum bits (aka qubits). In traditional computers, a bit can represent either a one or a zero. Because of the weirdness found at the quantum level of physics, a qubit can simultaneously represent both a one and a zero thanks to a state known as “superposition.” The term “qubit” was coined by Benjamin Schumacher, a theoretical physicist, who asserts the term was created in jest during a conversation with another theoretical physicist, William Wootters. Dr. Jürgen Graf, from the University of Konstanz, asserts, “The race for the quantum computer will most likely be decided at the quantum bit (qubit) ― the smallest information unit of the quantum computer.”[2] Graf adds, “The coupling of several qubits into a computing system is currently one of the greatest challenges in the development of quantum computers. A key question is which physical system and which material are best suited for qubits.”


The Fragility of Qubits


Concerning qubits in a superposition state, Graf observes, “Only at the moment of measurement is this intermediate state brought to a fixed value. In other words: Whereas normal bits have a defined value at any given time, qubits take on a defined value only at the respective moment of measurement. This property is the basis for the massive computing power that quantum computers can harness for some problems.” The problem with qubits is that they are difficult to create and just as difficult to maintain in their superposition state. Most of us have a difficult time comprehending the size of particles at the sub-atomic level. Because they are so tiny and fragile, they can be easily disturbed (i.e., knocked out of their superposition states). To mitigate this challenge, most efforts to create qubits require well-shielded systems operating at extremely cold temperatures. And, as Graf points out, “An even more difficult task is interconnecting quantum bits because a single quantum bit is not sufficient to carry out an arithmetic operation.”
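Graf’s point about measurement can be illustrated with a toy simulation. The sketch below is a hypothetical illustration, not a real quantum model: before measurement the qubit is described only by probabilities; the act of measuring forces a definite zero or one.

```python
import random

# Toy illustration of superposition and measurement collapse.
# An equal superposition assigns probability 0.5 to each outcome;
# only the act of measuring yields a definite bit value.
def measure(prob_zero: float = 0.5) -> int:
    """Collapse a qubit to 0 with probability prob_zero, else 1."""
    return 0 if random.random() < prob_zero else 1

# Before measurement: no defined value, just probabilities (0.5, 0.5).
# After measurement: a fixed classical bit.
outcome = measure()
print(outcome)  # 0 or 1, decided only at the moment of measurement
```

As the quote notes, a normal bit has a defined value at all times; in this sketch the “value” simply does not exist until `measure` is called.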


Demonstrating both the fragility of qubits and how it can be overcome, a UK-based company, Quantum Motion, recently announced what it considers a breakthrough. “By cooling [a silicon] chip down to a temperature just above absolute zero (−273°C), and by using tiny transistors, the Quantum Motion team [was] able to isolate a single electron and measure its quantum state for an astounding nine seconds.”[3] As journalist Maija Palmer writes, “Nine seconds may not sound very impressive to the layperson — but in the world of quantum computing, where quantum states are more commonly measured in nanoseconds, it is an unimaginably long stretch. Even chipmaker Intel, which is testing a similar silicon-based approach in collaboration with Delft-based startup QuTech, talks about times of 1 second — and this is several orders of magnitude longer than what has been achieved by quantum companies using the superconducting approach.” Palmer adds, “Quantum Motion has demonstrated just one qubit, which may not seem very impressive compared to the 50+ qubits that have been achieved by Google and IBM. But, says [John Morton, professor of nanoelectronics at UCL and cofounder of Quantum Motion], if the goal is to reach 1m qubits — the level at which it is believed quantum computers will start to become truly usable — the difference is minimal.” Professor David Reilly, who holds a joint position with Microsoft and the University of Sydney, agrees. He states, “To realize the potential of quantum computing, machines will need to operate thousands if not millions of qubits.”[4]


Nevertheless, as science writer Dennis Overbye (@overbye) explains, even a few hundred qubits can be powerful. He writes, “Eight bits make a byte; the active working memory of a typical smartphone might employ something like 2 gigabytes, or two times 8 billion bits. That’s a lot of information, but it pales in comparison to the information capacity of only a few dozen qubits. Because each qubit represents two states at once, the total number of states doubles with each added qubit. One qubit is two possible numbers, two is four possible numbers, three is eight and so forth. It starts slow but gets huge fast. ‘Imagine you had 100 perfect qubits,’ said Dario Gil, the head of IBM’s research lab in Yorktown Heights, N.Y., in a recent interview. ‘You would need to devote every atom of planet Earth to store bits to describe that state of that quantum computer. By the time you had 280 perfect qubits, you would need every atom in the universe to store all the zeros and ones.’ How this is accomplished is an engineer’s dream and nightmare.”[5]
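Overbye’s doubling argument is easy to check numerically. The sketch below is a rough illustration; the atom-count constant is an order-of-magnitude estimate for the observable universe, not a figure from the article.

```python
# Each added qubit doubles the number of classical basis states a register
# can hold in superposition: an n-qubit register spans 2**n states.
def num_states(n: int) -> int:
    return 2 ** n

ATOMS_IN_OBSERVABLE_UNIVERSE = 10 ** 80  # rough order-of-magnitude estimate

# "One qubit is two possible numbers, two is four, three is eight..."
for n in (1, 2, 3, 10, 50):
    print(f"{n} qubit(s) -> {num_states(n):,} basis states")

# Describing a 280-qubit state takes more numbers than there are atoms
# in the observable universe, matching Gil's illustration.
print(num_states(280) > ATOMS_IN_OBSERVABLE_UNIVERSE)  # prints True
```

This is why the growth “starts slow but gets huge fast”: each line of the loop doubles the count of the line before it.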


Is Silicon the Future of Qubits?


Morton asserts silicon is a promising source of qubits. “Silicon, he said, was already ideal for making qubits, as it was a ‘quiet material’ — 98% of silicon atoms have nuclei without spin that would interfere with the spin of the electron being measured. But more importantly, if the silicon approach works, the quantum computer industry would not have to build a new set of chip foundries — they could use the infrastructure that is already there. It would also be easier to combine quantum and classical computers if both use the same silicon chip and transistor architecture.” Graf adds, “Silicon-based quantum bits have the advantage that, being only a few nanometers in size, they are decidedly smaller than superconductor systems. Consequently, many more of them can be put into a computer chip — potentially millions. ‘Moreover, industry already has decades of experience with silicon semiconductor technology. The development and production of silicon-based qubits benefits enormously from this — which is no small advantage,’ Guido Burkard [professor of theoretical condensed matter physics and quantum information at the University of Konstanz] explains.”


Silicon can also be used to make artificial atoms that have demonstrated potential usefulness in creating qubits. A press release from the University of New South Wales explains, “UNSW quantum computing researchers [have] created artificial atoms in a silicon ‘quantum dot’, a tiny space in a quantum circuit where electrons are used as qubits (or quantum bits), the basic units of quantum information. Scientia Professor Andrew Dzurak explains that unlike a real atom, an artificial atom has no nucleus, but it still has shells of electrons whizzing around the center of the device, rather than around the atom’s nucleus. … ‘What really excites us about our latest research,’ [Dzurak stated,] ‘is that artificial atoms with a higher number of electrons turn out to be much more robust qubits than previously thought possible, meaning they can be reliably used for calculations in quantum computers. This is significant because qubits based on just one electron can be very unreliable.'”[6] Dzurak adds, “By using silicon CMOS technology we can significantly reduce the development time of quantum computers with the millions of qubits that will be needed to solve problems of global significance, such as the design of new medicines, or new chemical catalysts to reduce energy consumption.”


The ARC Centre of Excellence in Future Low-Energy Electronics Technologies recently reported, “A new study demonstrates a path towards scaling individual qubits to a mini-quantum computer, using holes. The study identifies a ‘sweet spot’ where the qubit is least sensitive to noise (ensuring longer retention of information) and simultaneously can be operated the fastest.”[7] Although the work is currently theoretical, it’s another step towards creating a quantum computer of the future. Professor Sven Rogge of the Centre for Quantum Computing and Communication Technology (CQC2T) states, “This theoretical prediction is of key importance for scaling up quantum processors and first experiments have already been carried out.” Assistant Professor Joe Salfi, from the University of British Columbia, adds, “Our recent experiments on hole qubits using acceptors in silicon already demonstrated longer coherence times than we expected. It is encouraging to see that these observations rest on a firm theoretical footing. The prospects for hole qubits are bright indeed.”


With so many breakthroughs being made with qubits, the future of quantum computing is looking ever brighter.


Footnotes
[1] James Norman, “Quantum Computing Will Revolutionize Data Analysis. Maybe Soon,” Seeking Alpha, 14 March 2018.
[2] Jürgen Graf, “The road to quantum computing is paved in qubits,” Phys.org, 29 March 2021.
[3] Maija Palmer, “Quantum Motion unveils 9-second silicon qubit,” Sifted, 31 March 2021.
[4] University of Sydney, “Beyond qubits: Sydney takes next big step to scale up quantum computing,” EurekAlert! 1 February 2021.
[5] Dennis Overbye, “Quantum Computing Is Coming, Bit by Qubit,” The New York Times, 21 October 2019.
[6] University of New South Wales, “Artificial atoms create stable qubits for quantum computing,” Phys.org, 11 February 2020.
[7] ARC Centre of Excellence in Future Low-Energy Electronics Technologies, “Qubits composed of holes could be the trick to build faster, larger quantum computers,” Science Daily, 2 April 2021.
