Collaboration between NSF quantum centers finds path to fault tolerance in neutral atom qubits

Like the classical computers we use every day, quantum computers can make mistakes when manipulating and storing the quantum bits (qubits) used to perform quantum algorithms. Theoretically, a quantum error correction protocol can correct these errors in a similar manner to classical error correction. However, quantum error correction is more demanding than its classical counterpart and has yet to be fully demonstrated, putting a limit on the functionality of quantum computers.

In a theory paper published in Nature Communications, UW–Madison physicist Shimon Kolkowitz and colleagues show a new way that quantum errors could be identified in one type of qubit known as neutral atoms. Because the approach pinpoints which qubit experienced an error, the study suggests that the requirements for quantum error correction can be significantly relaxed, approaching a level that neutral atom quantum computers have already achieved.

Shimon Kolkowitz

The study is a collaboration between two National Science Foundation Quantum Leap Challenge Institutes, Hybrid Quantum Architectures and Networks (HQAN) and Robust Quantum Simulation (RQS). UW–Madison is a member of HQAN.

“In quantum computing, a lot of the overhead in an error-correcting code is figuring out which qubit had the error. If I know which qubit it is, then the amount of redundancy needed for the code is reduced,” Kolkowitz says. “Neutral atom qubits are right on the edge of what you would call this fault-tolerant threshold, but no one has been able to fully realize it yet.”

Neutral atom qubits are made up of single atoms trapped with light. Logic gates between the atoms are performed by exciting them to “Rydberg” states, in which an electron is promoted far beyond its normal orbit. This quantum computing technique was first pioneered and experimentally demonstrated at UW–Madison by physics professors Mark Saffman and Thad Walker.

Currently, one error-corrected “logical” qubit is expected to require around 1000 physical qubits, exceeding the maximum number of qubits anyone has managed to wire together in any quantum computing system. Researchers have been studying Rydberg-based qubits made from different elements for decades, gradually improving their performance but not yet reaching the level required for error correction.

Schematic of a neutral atom quantum computer, where the physical qubits are individual 171Yb atoms.

Knowing that eliminating qubit errors entirely is not practical, Kolkowitz and colleagues instead asked whether there might be a way to convert the errors into a type known as erasure errors. Named for the fact that the qubit has effectively vanished, or been erased, erasure errors can be beneficial because it is much easier to tell that a qubit is missing than to tell whether it is in the wrong state.
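
A classical analogy shows why knowing an error’s location is so valuable. A repetition code of length n can recover from up to n−1 erasures (lost copies at known positions) but only from fewer than n/2 flips at unknown positions. The minimal Python sketch below illustrates just that counting argument; it is not the quantum error-correcting code analyzed in the paper.

def decode_repetition(received):
    """Decode a repetition code. Bits are 0/1; erased positions are marked
    None, so the decoder knows exactly which copies to ignore."""
    surviving = [bit for bit in received if bit is not None]
    if not surviving:
        return None  # every copy erased: unrecoverable
    # Majority vote over the surviving copies only
    return 1 if sum(surviving) * 2 > len(surviving) else 0

logical_bit = 1

# Four of the five copies erased at known locations: the lone survivor decides.
print(decode_repetition([logical_bit, None, None, None, None]))  # -> 1

# Four of the five copies flipped at unknown locations: majority vote fails.
print(decode_repetition([logical_bit, 0, 0, 0, 0]))  # -> 0 (decoding error)

The same asymmetry holds for quantum codes: a code of distance d can correct up to d−1 erasures but only about half as many errors at unknown locations, which is why flagging where an error happened buys so much.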

They investigated ytterbium, an element largely unstudied in this context, because it has a relatively simple outer electron structure and its nucleus has only two spin states, +1/2 and -1/2. When the atom is manipulated into a metastable electronic state — a long-lived state distinct from the true ground state — the qubit is encoded in the +1/2 and -1/2 nuclear spin states, and quantum gates are performed by coupling one of these two qubit states to the Rydberg state.

“We knew that if an atom falls out of the metastable electronic state to the true ground state, we can detect that with a laser and still not screw up any of the other qubits at all,” Kolkowitz says. “But what that means is if I can set up a situation where errors manifest themselves as falling down into the ground state, I can just shine this laser and constantly check for errors and identify the qubit it happened in.”

In their first calculations using this metastable ytterbium platform, they showed that although errors still occur in the qubits as expected, around 98% of them would be converted into detectable erasure errors.

But is a 98% conversion rate good enough for a quantum computer in practice? The answer depends on the error threshold, a value that differs depending on the gates and qubits being used. The researchers next ran simulations with varying numbers of quantum gates, assuming either a 98% conversion rate to erasure errors or no conversion at all (as is currently standard).
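
One way to picture the bookkeeping behind such a simulation, with numbers chosen purely for illustration rather than taken from the paper’s circuit-level numerics: each faulty gate is either flagged as a located erasure (98% of the time) or left as an undetected error at an unknown location (the remaining 2%).

import random

def sample_faults(n_gates, p_error=0.01, conversion=0.98, seed=1):
    """Toy fault model: each gate fails with probability p_error; a failing
    gate becomes a located erasure with probability `conversion`, and is
    otherwise an error whose location the decoder does not know."""
    rng = random.Random(seed)
    erasures, undetected = [], []
    for gate in range(n_gates):
        if rng.random() < p_error:
            (erasures if rng.random() < conversion else undetected).append(gate)
    return erasures, undetected

erasures, undetected = sample_faults(100_000)
print(len(erasures), "faults flagged as erasures")     # roughly 98% of all faults
print(len(undetected), "faults at unknown locations")  # the hard ~2% remainder

The thresholds reported below come from the authors’ full simulations, not from a toy sampler like this one.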

Threshold error rate as a function of the fraction of errors detected as erasures. With no erasure conversion, the error threshold is just under 1%; with 98% erasure conversion (green star), the threshold increases 4.4-fold to just over 4%.

Without erasure error conversion, the error threshold is just shy of 1%, meaning each quantum gate would have to operate at over a 99% success probability to be fault tolerant. With erasure error conversion, however, the error threshold increases 4.4-fold, to just over 4%. And, for example, Saffman’s group has already demonstrated gate error rates below 4% in neutral atom qubits.

The higher error threshold also means that there can be a higher ratio of logical qubits to physical qubits. Instead of one logical qubit per 1000 physical qubits, there could be around ten.
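
A back-of-the-envelope sketch of where such overhead estimates come from, assuming a surface-code-like scaling in which a distance-d code uses roughly 2d² physical qubits and suppresses logical errors roughly as (p/p_th) raised to the power (d+1)/2. The physical error rate and target logical error rate below are illustrative assumptions, not figures from the paper, and the sketch ignores the extra decoding advantage that located erasures give.

def physical_per_logical(p, p_th, target=1e-10):
    """Smallest odd code distance d with (p/p_th)**((d+1)/2) <= target,
    then roughly 2*d**2 physical qubits per logical qubit."""
    if p >= p_th:
        return None  # error correction only helps below the threshold
    d = 3
    while (p / p_th) ** ((d + 1) / 2) > target:
        d += 2
    return 2 * d * d

p = 0.005  # assumed physical gate error rate of 0.5%
print(physical_per_logical(p, p_th=0.01))  # ~1% threshold: thousands of physical qubits
print(physical_per_logical(p, p_th=0.04))  # ~4% threshold: close to an order of magnitude fewer

The absolute numbers depend strongly on these assumptions; the robust point is that raising the threshold shrinks both the required code distance and the number of physical qubits per logical qubit.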

“And that’s the point of this work,” Kolkowitz says. “We show that if you can do this erasure conversion, it relaxes the requirements on both the error rates and on the number of atoms that you need.”

Co-author Jeff Thompson, an RQS investigator at Princeton, is already working to demonstrate these results experimentally in his lab. While other elements have been used in neutral atom quantum computing experiments before, until recently ytterbium was largely unexplored. Thompson’s group did much of the preliminary work to characterize the element and show that it could work well as the building block for a quantum computer.

“There were many open questions about ytterbium, and I’d say a lot of people who were using other elements are now moving into ytterbium based on Jeff’s groundwork,” Kolkowitz says. “And I expect that this study will accelerate that trend.”

Yue Wu and Shruti Puri at Yale University collaborated on this study. This work was supported by the National Science Foundation QLCI grants OMA-2120757 and OMA-2016136. Kolkowitz’s part of the work was additionally supported by ARO W911NF-21-1-0012.

Margaret Fortman awarded Google quantum computing fellowship

This post was adapted from a story posted by the UW–Madison Graduate School

Two UW–Madison graduate students, including physics grad student Margaret Fortman, have been awarded 2022 Google Fellowships to pursue cutting-edge research. Fortman received the 2022 Google Fellowship in Quantum Computing, one of only four awarded.

Margaret Fortman

Google created the PhD Fellowship Program to recognize outstanding graduate students doing exceptional and innovative research in areas relevant to computer science and related fields. The fellowship attracts highly competitive applicants from around the world.

“These awards have been presented to exemplary PhD students in computer science and related fields,” Google said in its announcement. “We have given these students unique fellowships to acknowledge their contributions to their areas of specialty and provide funding for their education and research. We look forward to working closely with them as they continue to become leaders in their respective fields.”

The program begins in July when students are connected to a mentor from Google Research. The fellowship covers full tuition, fees, and a stipend for the academic year. Fellows are also encouraged to attend Google’s annual Global Fellowship Summit in the summer.

Fortman works to diagnose noise interference in quantum bits

Fortman, whose PhD research in Victor Brar’s group specializes in quantum computing, will use the fellowship support to develop a diagnostic tool to probe the source of noise in superconducting quantum bits, or qubits.

Quantum computing has the potential to solve problems that are difficult for standard computers, Fortman said, but the field has challenges to solve first.

“The leading candidate we have for making a quantum computer right now is superconducting qubits,” Fortman said. “But those are currently facing unavoidable noise that we get in those devices, which can actually come from the qubit material itself.”

Fortman works with a low-temperature, ultra-high-vacuum scanning tunneling microscope on the UW–Madison campus to develop a microscopic understanding of the origins of noise in qubits. She fabricates superconductors to examine under the microscope, with the goal of identifying the source of the noise and, ultimately, developing a solution for that interference.

In her time as a graduate student at UW–Madison, Fortman said she has enjoyed collaborating with colleagues in her lab and across campus.

“It’s pretty cool to be somewhere where world-renowned research is happening and to be involved with that,” she said. “My PI and I work in collaborations with other PIs at the university and they’re all doing very important research, and so it’s really cool to be a part of that.”

Fortman is excited to have a mentor at Google through the PhD Fellowship, having been paired with someone who has a similar disciplinary background and who is a research scientist with Google Quantum AI.

“He can be a resource in debugging some parts of my project, as well as general mentorship and advice on being a PhD student, and advice for future career goals,” Fortman said.

The second UW–Madison student who earned this honor is computer sciences PhD student Shashank Rajput, who received the 2022 Google Fellowship in Machine Learning.

The future of particle physics is also written from the South Pole

This post was originally published by the IceCube collaboration. Several UW–Madison physicists are part of the collaboration and are featured in this story

A month ago, the Seattle Community Summer Study Workshop—July 17-26, 2022, at the University of Washington—brought together over a thousand scientists in one of the final steps of the Particle Physics Community Planning Exercise. The meetings and accompanying white papers put the cherry on top of a period of collaborative work setting a vision for the future of particle physics in the U.S. and abroad. Later this year, the final report identifying research priorities in this field will be presented. Its main purpose is to advise the Department of Energy and the National Science Foundation on research for their agendas during the next decade.

As new and old detectors once again prepare to expand the frontiers of knowledge, we asked some IceCube collaborators about the role the South Pole neutrino observatory should play in the bright future that lies ahead for particle physics.

Q: What type of neutrinos are currently detected in IceCube? And will that change with the future extensions?

The vast majority of the neutrinos we detect are generated in the atmosphere by cosmic rays, but we also have on the order of 1,000 cosmic neutrinos at energies above 10 TeV. We use the atmospheric neutrinos for a wide range of science, first of all to study the neutrinos themselves.

IceCube has detected more than a million neutrinos to date. That’s already a big number for neutrino scientists, and we will detect even more in the future. The deployment of the IceCube Upgrade, an extension of our facility targeting neutrinos at lower energies, will increase the density of sensors in IceCube’s inner subdetector, DeepCore, by a factor of 10. And a second, larger extension is also in the works. With IceCube-Gen2, we will improve the detection at the highest energies, too: the IceCube volume will increase by almost a factor of 10, and our event rate for high-energy cosmic neutrinos will also grow by an order of magnitude.

Albrecht Karle, IceCube associate director for science and instrumentation and a professor of physics at the University of Wisconsin–Madison

Q: Are the futures of IceCube and that of particle physics intrinsically linked?

Absolutely! Many open questions in particle physics have neutrinos at the center. What’s their mass? What is the behavior of neutrino flavor mixing? Are there right-handed (sterile) neutrinos? Neutrinos are particularly attractive in the search for new physics. We can answer all these questions, to varying levels, within IceCube and especially moving forward with the IceCube Upgrade and IceCube-Gen2.

Erin O’Sullivan, an associate professor of physics at Uppsala University

IceCube, the IceCube Upgrade, and IceCube-Gen2 can all uniquely contribute to the study of particle physics, in particular neutrino physics, beyond Standard Model (BSM) physics, and indirect searches for dark matter. The IceCube Upgrade provides complementary and independent measurements of neutrino oscillation in addition to the long-baseline experiments. And IceCube-Gen2 will be crucial to exploring BSM features, such as sterile neutrinos and secret neutrino interactions, at energies that cannot be reached by underground facilities. It will also be a discovery machine for heavy dark matter particles.

Ke Fang, an assistant professor of physics at the University of Wisconsin–Madison

Q: Talking about discoveries, now that both IceCube and Super-Kamiokande have reported definitive observations of tau neutrinos in atmospheric and astrophysical neutrino data, why should the international particle physics community continue to improve their detection?  

The tau neutrino was discovered at Fermilab in an emulsion experiment that observed double-bang events with a distance on the order of 1 mm separating production and decay. Since tau neutrinos are the least studied neutrino and, in fact, among the least studied particles, improved measurements of their properties may reveal that the 3×3 neutrino mixing matrix is not unitary and expose the first indication of physics beyond the 3-flavor oscillation scenario.

Francis Halzen, IceCube PI and a professor of physics at the University of Wisconsin–Madison
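
For context on what such a unitarity test means, here is the standard three-flavor relation (general textbook background, not a result from the paper): if the 3×3 PMNS mixing matrix U is unitary, each of its rows is normalized; for the tau row,

\[
|U_{\tau 1}|^2 + |U_{\tau 2}|^2 + |U_{\tau 3}|^2 = 1 ,
\]

so a statistically significant deficit in precision tau neutrino measurements would point to physics beyond three-flavor mixing, for example mixing with a sterile neutrino.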

We are the only experiment operating currently (and in the foreseeable future) that is able to identify tau neutrinos on an event-by-event basis. We can do so by looking at the distinct morphological features they produce in our data at the highest energies. And with the IceCube Upgrade, we will also be the experiment that collects the most tau neutrinos.  I suspect that these neutrinos will surprise us again and point us towards new physics.

Carlos Argüelles, an assistant professor of physics at Harvard University

Four hundred years from now, people may see IceCube the way we see Galileo’s telescope, not as an end but as the beginning of a new branch of science. The astrophysical observation of tau neutrinos is but one piece in a large number of studies that IceCube can conduct, including the study of fundamental physics using astrophysical neutrinos.

Ignacio Taboada, IceCube spokesperson and a professor of physics at the Georgia Institute of Technology

Q: In 2019, the Wisconsin IceCube Particle Astrophysics Center joined the Interactions Collaboration, which includes all major particle physics laboratories around the globe. The IceCube letter of introduction to this community detailed some of the most accurate results to date in neutrino physics. What’s unique about IceCube neutrino science?

One unique aspect of IceCube is the breadth of neutrino energies we can measure, all the way down to the MeV scale in the case of a galactic supernova and up to a few PeV, the highest energy neutrinos ever detected. Therefore, IceCube provides us with different windows to study the neutrino and understand its properties. Especially in the context of searching for new physics, this is important because these processes can manifest at a particular energy scale but not be visible at other energy scales.

Erin O’Sullivan, an associate professor of physics at Uppsala University

Q:  Let’s focus on high-energy neutrinos for a moment. What are the needs for their detection and why is the South Pole ice the perfect place for those searches? 

The highest energy neutrinos can be directly linked to the most powerful accelerators in the universe but also allow us to test the Standard Model at energies inaccessible to current or future planned colliders.

And why the South Pole? Well, what makes the South Pole such an optimal location are the exceptional optical and radio properties of its ice sheet, which is also the largest pool of ice on Earth. Neutrino event rates are very low at these energies and, thus, we need a huge detector to measure them.

Deep-ice Cherenkov optical sensors have already been proven as high-performing detectors for TeV and PeV neutrinos when deployed at depths of 1.4 km and greater below the surface. And radio technology is promising because radio waves can travel much farther than optical photons in the ice, and they work at shallow depths. So, when searching for the highest energy neutrinos using the South Pole ice sheet, radio neutrino detectors might be the only solution that scales up. Radio waves are able to travel farther in South Pole ice than in Greenland ice, for example. It’s a gift from nature to have this giant, pure block of ice to catch elusive neutrinos from the most powerful accelerators.

Lu Lu, an assistant professor of physics at the University of Wisconsin–Madison

Q: And what about the lowest energies? How does IceCube perform there? 

IceCube’s DeepCore detector was designed especially for that: a denser layout of photodetectors embedded in the center of IceCube at about 2 km depth, it uses the surrounding IceCube sensors to eliminate essentially all background from the otherwise dominant cosmic-ray muons. This means that DeepCore can now be analyzed as if it were at 10 km depth, deeper than any mine on Earth. In the near future, the IceCube Upgrade will add seven strings of new sensors inside DeepCore, which will greatly increase its precision for measuring neutrino properties.

Albrecht Karle, IceCube associate director for science and instrumentation and a professor of physics at the University of Wisconsin–Madison 

IceCube’s low energies are what all other neutrino experiments would call high energies. This is a regime where the neutrino interactions are well predicted from accelerator experiments, which means that if deviations are found in the data we can claim new physics. Thus, IceCube and the upcoming IceCube Upgrade are not only going to yield some of the most precise measurements of the neutrino oscillation parameters but also—and more importantly—test the neutrino oscillation framework.

Carlos Argüelles, an assistant professor of physics at Harvard University  

Q: And, last but not least, we should think about the people that will make all this possible. What efforts are underway to diversify who does science and make the field more equitable?

Four years ago, IceCube invited a few collaborations to join efforts to increase diversity, equity, inclusion, and accessibility (DEIA) in multimessenger astrophysics. With support from the NSF, that effort grew into the Multimessenger Diversity Network (MDN). The network now includes a dozen participating collaborations, which is an indication of the growing awareness and action to increase DEIA across the field. Set up as a community of practice, where people share their knowledge and experiences with each other, the MDN is a reproducible and scalable model for other fields. We are excited to see this community of practice grow, to contribute resources and experiences, and to learn from others.

For the first time in an official capacity, DEIA efforts were included in the Snowmass planning process and were also incorporated into the Astro2020 Decadal Survey. One take-away from these processes is that more resources and accountability are needed to speed up DEIA efforts.

Ellen Bechtol, MDN community manager and an outreach specialist at the Wisconsin IceCube Particle Astrophysics Center

Read more about IceCube and its future contributions to particle physics

  • Snowmass Neutrino Frontier: NF04 Topical Group Report, “Neutrinos from natural sources” (Jul 2022)
  • Snowmass Cosmic Frontier: CF7 Topical Group Report, “Cosmic Probes of Fundamental Physics” (Jul 2022)
  • “High-Energy and Ultra-High-Energy Neutrinos: A Snowmass White Paper,” M. Ackermann et al., arxiv.org/abs/2203.08096
  • “Tau Neutrinos in the Next Decade: from GeV to EeV,” R. S. Abraham et al., arxiv.org/abs/2203.05591
  • “Snowmass White Paper: Beyond the Standard Model effects on Neutrino Flavor,” C. Argüelles et al., arxiv.org/abs/2203.10811
  • “Snowmass 2021 White Paper: Cosmogenic Dark Matter and Exotic Particle Searches in Neutrino Experiments,” J. Berger et al., arxiv.org/abs/2207.02882
  • “White Paper on Light Sterile Neutrino Searches and Related Phenomenology,” M. A. Acero et al., arxiv.org/abs/2203.07323
  • “Ultra-High-Energy Cosmic Rays: The Intersection of the Cosmic and Energy Frontiers,” A. Coleman et al., arxiv.org/abs/2205.05845
  • “Advancing the Landscape of Multimessenger Science in the Next Decade,” K. Engle et al., arxiv.org/abs/2203.10074

Zweibel receives Astronomical Society of the Pacific’s most prestigious award

This post is adapted from an Astronomical Society of the Pacific press release

The Astronomical Society of the Pacific (ASP) has awarded the 2022 Catherine Wolfe Bruce Gold Medal to Ellen Zweibel. It is the most prestigious award given by ASP.

Ellen Zweibel, W. L. Kraushaar professor of astronomy and physics (Photo by Althea Dotzour / UW–Madison)

Zweibel, the William L. Kraushaar professor of astronomy and physics at UW–Madison, was recognized for her contributions to the understanding of astrophysical plasmas, especially those associated with the Sun, stars, galaxies, and galaxy clusters. She has also made major contributions in linking plasma characteristics and behaviors observed in laboratories to astrophysical plasma phenomena occurring in the universe.

Most plasma effects in astrophysical systems are due to an embedded magnetic field. Many of them can be grouped into a small number of basic physical processes: how magnetic fields are generated, how they exchange energy with their environments (sometimes on explosively fast timescales), their role in global instabilities, how they cause a tiny fraction of thermal particles to be accelerated to relativistic energies, and how they mediate the interaction of these relativistic particles (cosmic rays) with their gaseous environments through waves and instabilities on microscales. Although all these processes occur in laboratory plasmas, it is in natural plasmas that they take their most extreme forms. Zweibel and her students and postdocs have used analytical theory and numerical simulations to study the generation and evolution of magnetic fields in the Sun and other stars, in galaxies, and in galaxy clusters, and have researched the effects of high energy cosmic ray particles in all of these environments. Their most recent work centers on the role of cosmic rays in star formation feedback: the self-regulation of the star formation rate in galaxies through energy and momentum input to the ambient medium by the stars themselves.

The Catherine Wolfe Bruce Gold Medal (photo from the Astronomical Society of the Pacific)

Zweibel has authored more than 240 refereed publications with over 8,000 citations. In 2016 she was awarded the American Physical Society’s James Clerk Maxwell Prize for Plasma Physics “For seminal research on the energetics, stability, and dynamics of astrophysical plasmas, including those related to stars and galaxies, and for leadership in linking plasma and other astrophysical phenomena.” She is a member of the National Academy of Sciences.

The Astronomical Society of the Pacific’s Catherine Wolfe Bruce Gold Medal was established in 1898 by Catherine Wolfe Bruce, an American philanthropist and patroness of astronomy. The ASP presents the medal annually to a professional astronomer in recognition of a lifetime of outstanding achievement and contributions to astrophysics research. It was first awarded in 1898 to Simon Newcomb. Previous recipients of the Bruce Medal include Giovanni V. Schiaparelli (1902), Edwin Hubble (1938), Fred Hoyle (1970), and Vera Rubin (2003).