
Separating the Truth from the Quantum Computing Hype

IEEE Quantum Week provides an opportunity to address problems and celebrate progress.

This guest article is published in honour of IEEE Quantum Week 2022. The opinions presented here are the author’s alone and do not reflect those of IEEE Spectrum or the IEEE.

Few industries attract as much irrational hype as quantum computing. Most people’s knowledge of quantum physics extends no further than the idea that it is unpredictable, powerful, and almost existentially strange. A few years ago, I gave IEEE Spectrum an update on the progress of quantum computing and examined both the positive and the negative claims being made across the industry. And just as in 2019, I remain fervently optimistic today. Although the hype has outrun the results, there has been significant progress in recent years.

Let’s talk about the hype first.

There has been undeniable hype about quantum computing over the past five years: hype about methods, timelines, applications, and more. As recently as 2017, suppliers were claiming that commercialisation was just a few years away, announcing that a 5,000-qubit device would be available by 2020 (it never materialised). There has even been what I would call antihype, with some people doubting that quantum computers will ever become a reality (I hope they turn out to be wrong).

Companies now talk in terms of ten years rather than a few, although they still publish road maps showing commercially viable systems as early as 2029. The Department of Homeland Security has even developed a road map to protect against the dangers posed by quantum computing, in an effort to help institutions transition to new security systems. This is an example of how hype-fuelled expectations are becoming institutionalised, fostering an “adapt or you’ll fall behind” mentality for both post-quantum cryptography and quantum computing applications.

According to Gartner, the market research company famous for its five-phase “Hype Cycle,” the hype surrounding quantum computing may have already peaked at phase two, which means the business sector is poised to hit “the trough of disillusionment.” According to McKinsey & Company, “based on announced hardware roadmaps for gate-based quantum computing players, fault tolerant quantum computing is predicted between 2025 and 2030.” Since we still have a long way to go before we reach quantum practicality, the point at which quantum computers can do something uniquely useful to transform our lives, I don’t think this timeline is entirely feasible.

Quantum practicality, in my view, is still ten to fifteen years off. But the pace of progress toward that goal is accelerating rather than holding constant. As we saw with Moore’s Law and semiconductor development, the more we learn, the faster we advance. Semiconductor technology took decades to reach its current state, with each step quickening the next. We can expect comparable acceleration in quantum computing.

In fact, we are finding that the lessons we learned designing transistors at Intel are also helping to accelerate our work on quantum computing. We can use the existing transistor-manufacturing infrastructure, for instance, to develop silicon spin qubits faster and with higher quality. We have begun fabricating qubits on 300-millimeter silicon wafers in a high-volume fab, enabling us to pack more than 10,000 quantum dots onto a single wafer. Our semiconductor expertise has also gone into Horse Ridge, a cryogenic quantum-control chip that helps address quantum computing’s interconnect problem by reducing the cabling now clogging the dilution refrigerator. And thanks to the cryoprober, an instrument born of our experience testing semiconductors, our team can now get test results from quantum devices in hours rather than the days or weeks it previously took.

It’s likely that others are also benefiting from their own prior research and experience. A recent study by Quantinuum, for instance, demonstrated the entanglement of logical qubits in a fault-tolerant circuit using real-time quantum error correction. While still in its infancy, this work illustrates the kind of progress required in this crucial area. Google’s open-source library Cirq, for writing and simulating quantum circuits, along with comparable libraries from IBM, Intel, and other companies, is driving the improvement of quantum algorithms. And as a final illustration, IBM’s 127-qubit Quantum Eagle processor demonstrates steady progress toward higher qubit counts.
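To give a sense of what libraries like Cirq provide, here is a minimal sketch, assuming a recent Cirq release, that prepares a two-qubit entangled Bell state and samples it on the library’s built-in simulator:

```python
import cirq

# Two qubits on a line.
q0, q1 = cirq.LineQubit.range(2)

# Prepare a Bell state: H on q0, then CNOT, then measure both qubits.
circuit = cirq.Circuit(
    cirq.H(q0),
    cirq.CNOT(q0, q1),
    cirq.measure(q0, q1, key="m"),
)

# Sample the circuit 1,000 times on Cirq's built-in simulator.
result = cirq.Simulator().run(circuit, repetitions=1000)

# Expect counts split roughly evenly between 00 and 11,
# the signature of entanglement.
print(result.histogram(key="m"))
```

The same circuit description can later be targeted at simulators of different fidelity or at real hardware, which is exactly the kind of algorithm iteration these libraries are meant to enable.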

Several significant challenges also remain.

First, we still need more advanced hardware and better qubits. Even though the best one- and two-qubit gates now reach the fidelities required for fault tolerance, the community has not yet achieved this on a significantly larger system.

Second, no one has yet proposed an interconnect scheme for quantum computers as elegant as the way modern microprocessors are wired. Currently, each qubit needs several dedicated control connections, an approach that is unworkable as we try to build a large-scale quantum computer.
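A back-of-envelope calculation shows why. Assuming, purely for illustration, three dedicated control lines per qubit, brute-force wiring grows linearly with qubit count, while a multiplexed row/column (crossbar) scheme, one direction researchers are exploring, grows only with its square root:

```python
import math

# Illustrative assumption only: three control lines per qubit is not
# a spec from any real machine.
LINES_PER_QUBIT = 3

def brute_force_lines(n_qubits: int) -> int:
    """Dedicated control lines for every individual qubit."""
    return n_qubits * LINES_PER_QUBIT

def crossbar_lines(n_qubits: int) -> int:
    """Shared row/column addressing of a square array: about 2*sqrt(N)."""
    return 2 * math.isqrt(n_qubits)

for n in (100, 10_000, 1_000_000):
    print(f"{n:>9} qubits: {brute_force_lines(n):>9} dedicated lines "
          f"vs {crossbar_lines(n):>5} multiplexed lines")
```

At a million qubits, that is three million dedicated lines versus a couple of thousand shared ones, which is why a more elegant interconnect is essential.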

Third, we need fast feedback and control loops for the qubits. Horse Ridge is a step in this direction: placing the control chip inside the refrigerator, and therefore closer to the qubit chip, should improve latency.

Finally, there is error correction. Despite some recent signs of progress in error correction and mitigation, no error-correction scheme has yet been run across a substantial number of qubits.
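To illustrate the principle at its simplest, here is a sketch of the textbook three-qubit bit-flip repetition code, written in Cirq for continuity with the example above. A deliberately injected error shows up in the measured syndrome, which tells a decoder which qubit to fix:

```python
import cirq

# Three data qubits plus two ancillas for syndrome extraction.
data = cirq.LineQubit.range(3)
anc = cirq.LineQubit.range(3, 5)

circuit = cirq.Circuit(
    # Encode one logical bit redundantly: |0> -> |000>.
    cirq.CNOT(data[0], data[1]),
    cirq.CNOT(data[0], data[2]),
    # Deliberately inject a bit-flip error on the middle qubit.
    cirq.X(data[1]),
    # Measure the parity of each neighbouring pair onto an ancilla.
    cirq.CNOT(data[0], anc[0]), cirq.CNOT(data[1], anc[0]),
    cirq.CNOT(data[1], anc[1]), cirq.CNOT(data[2], anc[1]),
    cirq.measure(*anc, key="syndrome"),
)

result = cirq.Simulator().run(circuit, repetitions=10)
# Syndrome 0b11 (= 3) means both parities disagree, so the middle
# qubit flipped; a real decoder would now apply X to data[1].
print(result.histogram(key="syndrome"))
```

Practical schemes such as the surface code follow the same measure-the-syndrome-and-correct principle, but they must also handle phase errors and require many physical qubits per logical qubit, which is why running error correction at scale remains an open challenge.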

These are obstacles we will overcome, as new research consistently demonstrates innovative techniques and advances. For instance, many in the industry are attempting to build quantum systems-on-chip (SoCs) by integrating the qubits and their controller on the same die.

But a fault-tolerant quantum computer is still quite a ways off. Intel expects to be competitive (or ahead of others) in qubit count and performance over the next ten years, but as I’ve already said, nobody will have a system big enough to deliver compelling value for another ten to fifteen years. The industry must keep growing its qubit counts and raising the bar on quality. The next step, still a few years away, is producing thousands of high-quality qubits, followed by scaling that up to millions.

Let’s not forget that it took Google 53 qubits to demonstrate a single task beyond the reach of a supercomputer. If we want to explore novel applications that exceed the capabilities of today’s supercomputers, we will need systems orders of magnitude larger.

Quantum computing has advanced significantly over the past five years, but there is still a long way to go, and getting there will require long-term investment. Important advances are happening in the lab that hold great promise for the future. For now, it is crucial that we not fall for the hype and instead concentrate on practical results.
