
Chetan Nayak, Microsoft technical fellow and professor at the University of California, Santa Barbara.
Photo by John Brecher for Microsoft
Why the new qubit could be a step toward ultrafast quantum computing
Microsoft’s innovation appears to be a more stable, dependable alternative
Microsoft announced last month that it had developed a “topological qubit,” which the company says can run a quantum computer more reliably than existing qubits, and which it expects will speed the development of ultrafast quantum computers capable of tackling the most difficult computing problems, far beyond the reach of even the most powerful conventionally built supercomputers.
The decades-old field of quantum computing seeks to harness the unusual forces at work at the subatomic level. Central to it is “superposition,” the idea that something can exist in two states at once.
In classical computation, data is held as bits, either a 1 or a 0. In quantum computation, superposition means that data can be stored in a qubit as a 1, a 0, or a mixture of both. This exponentially enhances the machine’s capacity.
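To make that concrete, here is a standard textbook way to write it (the notation is added for illustration and is not part of the article): a single qubit’s state is a weighted combination of 0 and 1, and a register of n qubits carries a weight for every one of the 2^n possible bit strings at once.

\[
\lvert\psi\rangle = \alpha\lvert 0\rangle + \beta\lvert 1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,
\]
\[
\lvert\Psi\rangle = \sum_{x \in \{0,1\}^n} c_x \lvert x\rangle \quad \text{(one amplitude } c_x \text{ for each of the } 2^n \text{ strings } x\text{).}
\]

So while 300 classical bits hold a single 300-bit string, 300 qubits can in principle carry amplitudes over \(2^{300}\) strings, which is the sense in which the capacity grows exponentially.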
For instance, in December, Google unveiled a quantum chip that completed in just five minutes a computation that would take a conventional supercomputer 10 septillion years.
Microsoft’s topological qubit is made from indium arsenide and aluminum, a combination that becomes superconducting when cooled to extremely low temperatures. It is the culmination of nearly twenty years of work by a Microsoft team led by Chetan Nayak, Microsoft technical fellow and professor at the University of California, Santa Barbara.
In this edited conversation, Nayak, who began his path in physics as an undergraduate at Harvard College in the late 1980s and early 1990s, spoke with the Gazette about the breakthrough and about navigating the often bumpy terrain of innovation.
What distinguishes Microsoft’s new qubit — the topological qubit — from conventional ones?
A qubit is a quantum mechanical two-level system. It can represent a 0 or a 1 like an ordinary bit, but due to the principles of quantum mechanics, it can also exist in a superposition of 0 and 1.
This happens at sufficiently small scales, and as components on microprocessors are miniaturized, we are approaching the point where quantum mechanics becomes significant even for classical computing. That is a challenge, because you want 0s and 1s to be precisely defined, without unwanted fluctuations. But it is also an opportunity.
Richard Feynman and others recognized as early as the 1980s that the natural world is ultimately governed by quantum mechanics; hence, to simulate nature, one must use what is known as a quantum computer.
As a result, problems that are quantum mechanical at heart, such as simulating materials like high-temperature superconductors or, in chemistry, modeling catalysts for nitrogen fixation to produce fertilizers or to break down microplastics, are today tackled mostly by experiment, through high-throughput trial and error. That is both costly and time-consuming.
With a quantum computer, these simulations could be run, because the machine operates according to the same underlying physical principles that nature uses.
The danger, however, is that your qubits end up like Schrödinger’s cat. A real cat cannot actually exist in a superposition of dead and alive, because the environment effectively entangles with it and collapses the wave function.
Likewise, qubits will eventually, and in some cases quite rapidly, lose their superposition, and with it all the extra power that quantum mechanics provides. That is exactly what quantum error correction is designed to address.
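As a rough sketch of that decoherence story (again, the notation is illustrative rather than taken from the interview): when a qubit in an equal superposition interacts with its environment, the two become entangled, the environment in effect records which state the qubit is in, and the usable superposition is lost.

\[
\tfrac{1}{\sqrt{2}}\bigl(\lvert 0\rangle + \lvert 1\rangle\bigr)\lvert E\rangle \;\longrightarrow\; \tfrac{1}{\sqrt{2}}\bigl(\lvert 0\rangle\lvert E_0\rangle + \lvert 1\rangle\lvert E_1\rangle\bigr).
\]

Once the environment states \(\lvert E_0\rangle\) and \(\lvert E_1\rangle\) are essentially distinguishable, so that \(\langle E_0 \vert E_1\rangle \approx 0\), the qubit on its own behaves like a classical random bit. Quantum error correction works by spreading the information over many physical qubits so that such errors can be detected and undone before the encoded state is lost.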
“Actually holding that physical processor in my hand, and feeling its tangible reality, was quite remarkable.”
A topological qubit is based on the premise that, since error correction is necessary and quantum states are fragile, the more of that protection you can build into the hardware itself, the better off you are.
The idea is that quantum mechanical states, the quantum mechanical wave functions, can have a topological structure in the mathematical sense. If you can engineer, or find in nature, a physical system that organizes itself into quantum states whose wave functions have that topological structure, then the information you encode in them will be extraordinarily stable: not infinitely stable, but remarkably so, and potentially without other severe compromises.
Maybe it doesn’t have to be big; it doesn’t have to be slow; and it can be relatively easy to control, because the number of control signals required tends to be small. It hits a sweet spot, building a great deal of stability and rigidity into the wave functions without severe drawbacks.
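One concrete realization of this idea, and the route Microsoft has publicly pursued, though it is not named in this conversation, stores a qubit nonlocally in Majorana zero modes. Two Majorana operators \(\gamma_1\) and \(\gamma_2\) combine into a single ordinary fermion mode, and the logical 0 and 1 are the two values of its parity:

\[
c = \tfrac{1}{2}\,(\gamma_1 + i\,\gamma_2), \qquad i\,\gamma_1\gamma_2 = \pm 1.
\]

Because \(\gamma_1\) and \(\gamma_2\) can sit at opposite ends of a device, no local disturbance can read or flip the encoded parity on its own, which is one way of realizing the stability built into the wave function described above.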
So it’s a more stable, more resilient system than the qubits in use today. How close is this to running an actual computer?
Our ultimate ambition is to realize a million-qubit quantum computer. That is the scale at which quantum computers will be able to solve important problems, such as developing new materials and chemistry.
It was with that scale in mind that we designed the roadmap we have. We didn’t want any solutions or technologies that would top out at 100 or 1,000 qubits. Today we have only a handful of qubits, as demonstrated on the chip we have been showing, but we have a roadmap to much larger systems.
We have a contract with DARPA, the Defense Advanced Research Projects Agency. The specifics are confidential, but we have committed to delivering something significant, with fault tolerance, on a firm timeline. It may not be a million qubits yet, but it will be far enough along the way.
It’s going to be abundantly clear that we can get there.
Life is short. This is something I want to see in years, not decades, and our CEO feels the same way.
It appears that there were several significant obstacles. What did you find to be the most daunting?
In trying to create topological qubits, we were in a situation much like the early days of classical computing, when people were building machines out of vacuum tubes.
Semiconductors were not well understood, and a lot of fundamental research went into figuring out what they really were. Sometimes they looked like metals; at other times they looked like insulators. The real power turned out to be that you can tune them between the two states: that capacity for switching and control.
People had to work out which properties were intrinsic and which arose because some devices were dirtier than others. That understanding helped lead to the invention of the transistor, though major applications were still years away; it took time before transistors made their way into computers, then integrated circuits, and then things took off.
We knew that having the right material was crucial to reaching this new phase of matter. We also understood fairly early on what properties that material needed: it had to be a hybrid of a superconductor and a semiconductor, combining many of the useful features of semiconductors with the remarkable features of superconductors. And we had to achieve that while introducing as few impurities and defects as possible along the way.
Once we identified that as the primary challenge, the foundational problem we had to solve before anything else, and focused a great deal of effort on it, we were in a much better position.
“Our ultimate ambition is to realize a million-qubit quantum computer. … We didn’t want any solutions or technologies that would top out at 100 or 1,000 qubits.”
Of course, in the early stages there was a lot of exploration just to figure out what was going on. But I believe the first step toward solving a problem is to state it clearly. If you can’t define the problem precisely, you are unlikely to solve it. And a precise statement of the problem depended heavily on our ability to simulate these devices.
But we couldn’t use the standard, off-the-shelf simulations common in the semiconductor world. Ideally we would have had a quantum computer capable of simulating materials, which of course we didn’t.
So we had to build custom, in-house simulations that helped us identify the right combinations of materials and, of course, work out how to develop the synthesis and fabrication techniques to produce these new kinds of materials.
The third piece was testing. Having those three pieces in place didn’t guarantee success, but it at least meant we had a sound strategy and the ability to get started.
What was your reaction when you actually held the chip in your hand?
It was pretty amazing, but the moment I really felt a rush was when I started seeing data from one of these chips that matched our expectations. That happened within the past year, one of those moments where I felt a real thrill and said, “Oh, wow.”
Over my 19 years on this there were setbacks, but especially in the last four years there were many moments when I thought, “We really seem to know what we’re doing here, and I can see a clear path forward.”
There were times when we surprised ourselves with how fast we were moving. Still, actually holding that physical processor in my hand, and feeling its tangible reality, was quite remarkable.
When you completed your studies at Harvard College in the early 1990s, your degree was in physics, correct?
Yes, my undergraduate degree was in physics, from Harvard. I was there from ’88 to ’92, and it was an incredible experience. I lived in Dunster House. I went back last year to visit one of the labs. I went for a run along the Charles that morning, and just walking from the hotel through Harvard Square to the Jefferson Lab brought back a flood of great memories, although the Square has changed considerably.
I still maintain contact with my college roommates and close friends from my time at Harvard. We have a WhatsApp group that keeps us connected.
Many of the faculty from my time as a student are no longer there, but a few remain as emeritus professors, along with many outstanding newer faculty whom I didn’t know as a student but have come to know professionally as a physicist over the past 10 to 15 years.
Your journey into this specific field began with your doctoral research at Princeton?
It really goes back to some experiences toward the end of my final year at Princeton; that’s when I started down this path. As an undergraduate I had a vague interest in related topics, but quantum computing was not yet an established field.
There has been some skepticism from certain quarters about your data. How do you respond to those who doubt your findings?
First of all, skepticism is healthy in science. It’s a normal part of the process; whenever something groundbreaking comes along, skepticism is warranted.
We shared many new results at the Station Q conference, an annual meeting in Santa Barbara that brings together more than 100 people from across the field, from both academia and industry. There were one or two scientists from Harvard there, as well as people from Google and Intel.
The people at the conference got a 90-minute presentation, had chances to ask questions, and talked informally over coffee breaks and dinners. But the broader community hasn’t yet heard those results or seen our paper, so there are a lot of questions.
The attendees who were familiar with the latest results were excited and had positive things to say about both the work and the findings. People who haven’t yet seen those results remain skeptical, which is completely natural.
I am set to give a talk at the American Physical Society Global Summit, which marks the 100th anniversary of quantum mechanics and the centenary of Schrödinger’s equation. There, an even larger audience will get to hear about our latest results.
We are also preparing to release a paper around the same time, which will give many more people the chance to examine the latest data and draw their own conclusions.
What is the next step?
We published a paper last week that lays out a roadmap. It doesn’t cover everything we have shared with DARPA, but it is the part we are comfortable making public. We are moving ahead at full speed.
We are focused on tackling big problems that ultimately come down to understanding nature at a deeper level.
Some of my earliest work in physics was on trying to understand high-temperature superconductors. Their discovery was a big deal because superconductivity had previously been thought to occur only at extremely low temperatures.
Then it turned out that materials can become superconducting above the boiling point of liquid nitrogen. We still don’t really understand why, which limits our ability to make better versions or to push superconductivity to even higher temperatures, because we don’t know where to begin looking.
So I’m excited that some of these pressing scientific problems from early in my career, problems I knew were important but struggled to make progress on, are ones we will now be able to attack with the help of a quantum computer.