News Archives
Plasma astrophysicist and emeritus professor Don Cox has passed away
Don Cox
Professor Emeritus Donald P. Cox passed away October 26, 2022. He was 79. A plasma astrophysicist, Cox contributed many years to research in his scientific field, to students with whom he worked, and to the department’s teaching mission.
Cox came to the UW–Madison physics department in 1969 with the promise of a faculty position a year before receiving his PhD from the University of California, San Diego. Except for an extended leave of absence at Rice University while his wife completed her degree in Houston, Cox spent his entire professional career here.
He arrived in the era of a cold and quiet interstellar medium and a newly discovered and unexplained soft X-ray background. For the next four years, he and his students did much of the original work on X-ray plasma emissions from supernova remnants, combining a broad physical insight into global processes with laborious and careful compilations of the necessary atomic physics. At this time, astronomers were still searching for the source of the X-ray background, having apparently eliminated all viable production mechanisms.
Cox looked beyond his remnants and realized that the uniform cold medium that he had been producing them in was incompatible with their collective effects on it. He proceeded to turn astronomy’s conventional picture on its head, proposing the hot, violent, and dynamic picture of the interstellar medium that is taught today as a matter of fact. His subsequent work was marked by a lack of respect for convention and a desire to apply basic physics principles to the complexities of interstellar dynamics. His insight that star formation must have a negative feedback effect on future star formation is today a central tenet of research on galactic evolution.
In following his own path, Cox developed an international reputation as the most original thinker in his field. His legacy of fundamentally new ideas is supplemented by two generations of his students who continue his work.
The other side of Cox’s career at Wisconsin was his dedication to teaching, attested to by his many years as leader of the department’s undergraduate program, his election as a fellow of the Teaching Academy, and numerous unsolicited testimonials from students. His interest in teaching was clearly fueled by a desire to share his own joy and fascination with the ideas of physics. He spent hours with pencil and paper, solving a problem that had nothing to do with his research, just to show that some seemingly complex behavior can be derived from basic principles. He did this out of personal curiosity, but his willingness to share his enjoyment of the result was well known.
Modified from Department Archives, with special thanks to Prof. Dan McCammon for contributing to this piece
The U.S. Department of Energy (DOE) announced $4.3 million in funding for 16 projects in artificial intelligence (AI) research for high energy physics (HEP), including one from UW–Madison physics professor Gary Shiu for his work on applying knowledge gained from string theory research to improving AI techniques.
Gary Shiu
These awards support the DOE Office of Science initiative in artificial intelligence research to use AI techniques to deliver scientific discoveries that would not otherwise be possible, and to broaden participation in high energy physics research.
“AI and Machine Learning (ML) techniques in high energy physics are vitally important for advancing the field,” said Regina Rameika, DOE Associate Director of Science for High Energy Physics. “These awards represent new opportunities for university researchers that will enable the next discoveries in high energy physics.”
String theory addresses one of the deepest problems of contemporary physics, namely the reconciliation of gravity and quantum theory. It presents an enormously complex system that is well suited as a testbed for advancing AI techniques. The space of string theory solutions is vast, and the associated energy landscape is high-dimensional, computationally complex, and in general non-convex with unknown hidden structures. In recent years, a variety of AI methods have been used to tackle the string landscape.
“Now, the time is ripe for the pendulum to swing back: through studies of the string landscape, novel optimizers as well as AI techniques for discovering hidden structures of complex multi-dimensional data spaces are emerging,” Shiu says. “We propose to export lessons from string theory to advance AI algorithms generally for computationally complex constrained systems.”
Shiu expects that lessons drawn from this work would then be applied generally to AI applications in other large-scale computational problems.
Clint Sprott makes 2022 list of highly cited researchers
Sixteen UW–Madison researchers — including emeritus professor of physics Clint Sprott — were recently recognized on the Institute for Scientific Information™ list of Highly Cited Researchers 2022. The list identifies scientists and social scientists who have demonstrated significant influence through publication of multiple highly-cited papers during the last decade.
Department of Energy grant to train students at the interface of high energy physics and computer science
To truly understand our physical world, scientists look to the very small, subatomic particles that make up everything. Particle physics generally falls under the discipline of high energy physics (HEP), where higher and higher energy collisions — tens of teraelectronvolts, or about ten trillion times the energy of visible light — lead to the detection and characterization of particles and how they interact.
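As a rough order-of-magnitude check (assuming a visible-light photon energy of about 2 electronvolts, a representative value rather than a figure from the announcement):

\[
\frac{20\ \text{TeV}}{2\ \text{eV}} = \frac{2\times 10^{13}\ \text{eV}}{2\ \text{eV}} = 10^{13},
\]

that is, roughly ten trillion times the energy of a visible photon.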
These collisions also lead to the accumulation of inordinate amounts of data, and HEP is increasingly becoming a field where researchers must be experts in both particle physics and advanced computing technologies. HEP graduate students, however, rarely enter graduate school with backgrounds in both fields.
Physicists from UW–Madison, Princeton University, and the University of Massachusetts-Amherst are looking to address the science goals of the HEP experiments by training the next generation of software and computing experts with a 5-year, ~$4 million grant from the U.S. Department of Energy (DOE) Office of Science, known as Training to Advance Computational High Energy Physics in the Exascale Era, or TAC-HEP.
“The exascale era is upon us in HEP and the complexity, computational needs and data volumes of current and future HEP experiments will increase dramatically over the next few years. A paradigm shift in software and computing is needed to tackle the data onslaught,” says Tulika Bose, a physics professor at UW–Madison and TAC-HEP principal investigator. “TAC-HEP will help train a new generation of software and computing experts who can take on this challenge head-on and help maximize the physics reach of the experiments.”
Tulika Bose
In total, DOE announced $10 million in funding today for three projects providing classroom training and research opportunities in computational high energy physics to train the next generation of computational scientists and engineers needed to deliver scientific discoveries.
At UW–Madison, TAC-HEP will annually fund four to six two-year training positions for graduate students working on a computational HEP research project with Bose or physics professors Keith Bechtol, Kevin Black, Kyle Cranmer, Sridhara Dasu, or Brian Rebel. Their research must broadly fit into the categories of high-performance software and algorithms, collaborative software infrastructure, or hardware-software co-design.
Bose’s research group, for example, focuses on proton-proton collisions in the Compact Muon Solenoid (CMS) at the CERN Large Hadron Collider (LHC). The high luminosity run of the LHC, starting in 2029, will bring unprecedented physics opportunities — and computing challenges, challenges that TAC-HEP graduate students will tackle firsthand.
“The annual data volume will increase by 30 times while the event reconstruction time will increase by nearly 25 times, requiring modernization of the software and computing infrastructure to handle the demands of the experiments,” Bose says. “Novel algorithms using modern hardware and accelerators, such as Graphics Processing Units, or GPUs, will need to be exploited together with a transformation of the data analysis process.”
TAC-HEP will incorporate targeted coursework and specialized training modules that will enable the design and development of coherent hardware and software systems, collaborative software infrastructure, and high-performance software and algorithms. Structured R&D projects, undertaken in collaboration with DOE laboratories (Fermilab and Brookhaven National Lab) and integrated within the program, will provide students from all three participating universities with hands-on experience with cutting-edge computational tools, software and technology.
The training program will also include professional development, such as training in oral and written science communication, along with cohort-building activities. Together, these components are intended to strengthen recruitment and retention of a diverse group of graduate students.
“Future high energy physics discoveries will require large accurate simulations and efficient collaborative software,” said Regina Rameika, DOE Associate Director of Science for High Energy Physics. “These traineeships will educate the scientists and engineers necessary to design, develop, deploy, and maintain the software and computing infrastructure essential for the future of high energy physics.”
UW–Madison physicists key in revealing neutrinos emanating from galactic neighbor with a gigantic black hole
On Earth, billions of subatomic particles called neutrinos pass through us every second, but we never notice because they rarely interact with matter. Because of this, neutrinos can travel straight paths over vast distances unimpeded, carrying information about their cosmic origins.
Although most of these aptly named “ghost” particles detected on Earth originate from the Sun or our own atmosphere, some neutrinos come from the cosmos, far beyond our galaxy. These neutrinos, called astrophysical neutrinos, can provide valuable insight into some of the most powerful objects in the universe.
For the first time, an international team of scientists has found evidence of high-energy astrophysical neutrinos emanating from the galaxy NGC 1068 in the constellation Cetus.
The detection was made by the National Science Foundation-supported IceCube Neutrino Observatory, a 1-billion-ton neutrino telescope made of scientific instruments and ice situated 1.5-2.5 kilometers below the surface at the South Pole.
These new results, to be published tomorrow (Nov. 4, 2022) in Science, were shared in a presentation given today at the Wisconsin Institute for Discovery.
“One neutrino can single out a source. But only an observation with multiple neutrinos will reveal the obscured core of the most energetic cosmic objects,” says Francis Halzen, a University of Wisconsin–Madison professor of physics and principal investigator of the IceCube project. “IceCube has accumulated some 80 neutrinos of teraelectronvolt energy from NGC 1068, which are not yet enough to answer all our questions, but they definitely are the next big step toward the realization of neutrino astronomy.”
Congratulations to Shimon Kolkowitz on his promotion to Associate Professor of Physics with tenure! Professor Kolkowitz is an AMO physicist whose research focuses on ultraprecise atomic clocks and nitrogen vacancy (NV) centers in diamond, both of which have applications in quantum sensing. He joined the UW–Madison physics faculty as an assistant professor in January 2018. Since then, he has published numerous articles in top journals, including exceptionally precise comparisons of the rates at which clocks run, published this year in the journal Nature.
Department Chair Mark Eriksson emphasizes Kolkowitz’s contributions across all aspects of his work: “Shimon, graduate students, and postdocs here at Wisconsin, have set records with their atomic clock, and at the same time, Shimon has played critically important roles in teaching and service, including guiding our graduate admissions through the pandemic and all that entails.”
Kolkowitz has been named a Packard Fellow and a Sloan Fellow and has earned an NSF CAREER award, among other honors. He is also the Education, Workforce Development, and Outreach Major Activities Lead for Hybrid Quantum Architectures and Networks (HQAN), an NSF QLCI Institute of which UW–Madison is a member.
IceCube analysis indicates there are many high-energy astrophysical neutrino sources
Back in 2013, the IceCube Neutrino Observatory—a cubic-kilometer neutrino detector embedded in Antarctic ice—announced the first observation of high-energy (above 100 TeV) neutrinos originating from outside our solar system, spawning a new age in astronomy. Four years later, on September 22, 2017, a high-energy neutrino event was detected coincident with a gamma-ray flare from a cosmic particle accelerator, a blazar known as TXS 0506+056. The coincident observation provided the first evidence for an extragalactic source of high-energy neutrinos.
The identification of this source was possible thanks to IceCube’s real-time high-energy neutrino alert program, which notifies the community of the directions and energies of individual neutrinos that are most likely to have come from astrophysical sources. These alerts trigger follow-up observations across the electromagnetic spectrum, from radio up to gamma rays, aimed at pinpointing a possible astrophysical source of high-energy neutrinos. However, the sources of the vast majority of the measured diffuse flux of astrophysical neutrinos remain a mystery, as does the number of sources that exist. Another open question is whether the neutrino sources are steady or variable over time and, if variable, whether they vary over long or short time scales.
In a paper recently submitted to The Astrophysical Journal, the IceCube Collaboration presents a follow-up search that looked for additional, lower-energy events in the direction of the high-energy alert events. The analysis examined low- and high-energy events from 2011 to 2020 and searched for coincidences on time scales ranging from 1,000 seconds up to one decade. Although the researchers did not find an excess of low-energy events on any of the searched time scales, they were able to constrain the abundance of astrophysical neutrino sources in the universe.
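In outline, such a search amounts to counting lower-energy events that arrive close to an alert both on the sky and in time, for each of several window lengths. The sketch below is an illustrative toy in Python, not the collaboration’s actual likelihood analysis; the event lists, angular radius, and window choices are assumptions for demonstration only.

    import math

    def angular_separation(ra1, dec1, ra2, dec2):
        """Great-circle separation in degrees between two sky directions."""
        r = math.radians
        cos_sep = (math.sin(r(dec1)) * math.sin(r(dec2))
                   + math.cos(r(dec1)) * math.cos(r(dec2)) * math.cos(r(ra1 - ra2)))
        return math.degrees(math.acos(max(-1.0, min(1.0, cos_sep))))

    def count_coincidences(alerts, events, window_seconds, radius_deg=1.0):
        """Count events within +/- window_seconds and radius_deg of each alert.

        alerts, events: lists of dicts with 'time' (seconds), 'ra' and 'dec' (degrees).
        """
        counts = []
        for alert in alerts:
            n = sum(
                1 for ev in events
                if abs(ev["time"] - alert["time"]) <= window_seconds
                and angular_separation(alert["ra"], alert["dec"],
                                       ev["ra"], ev["dec"]) <= radius_deg
            )
            counts.append(n)
        return counts

    # Example usage, scanning windows from 1,000 seconds up to roughly a decade:
    # for window in (1e3, 30 * 86400, 10 * 365.25 * 86400):
    #     print(window, count_coincidences(alerts, events, window))

A statistically significant excess of such coincidences over the number expected from atmospheric backgrounds, at any window length, would point to additional detectable neutrinos from the alert directions.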
Map of high-energy neutrino candidates (“alert events”) detected by IceCube. The map is in celestial coordinates, with the Galactic plane indicated by a line and the Galactic center by a dot. Two contours are shown for each event, for 50% and 90% confidence in the localization on the sky. The color scale shows the “signalness” of each event, which quantifies the likelihood that each event is an astrophysical neutrino rather than a background event from Earth’s atmosphere. Credit: IceCube Collaboration
This research also delves into the question of whether the astrophysical neutrino flux measured by IceCube is produced by a large number of weak sources or a small number of strong sources. To distinguish between the two possibilities, the researchers developed a statistical method that used two different sets of neutrinos: 1) alert events that have a high probability of being from an astrophysical source and 2) the gamma-ray follow-up (GFU) sample, where only about one to five out of 1,000 events per day are astrophysical.
“If there are a lot of GFU events in the direction of the alerts, that’s a sign that neutrino sources are producing a lot of detectable neutrinos, which would mean there are only a few, bright sources,” explained recent UW–Madison PhD student Alex Pizzuto, a lead on the analysis who is now a software engineer at Google. “If you don’t see a lot of GFU events in the direction of alerts, this is an indication of the opposite, that there are many, dim sources that are responsible for the flux of neutrinos that IceCube detects.”
Constraints on the luminosity (power) of each individual source as a function of the number density of astrophysical neutrino sources (horizontal axis). Previous IceCube measurements of the total astrophysical neutrino flux indicate that the true combination of the two quantities must lie within the diagonal band marked “diffuse.” The results of the new analysis are shown as an upper limit, compared to the sensitivity, which shows the range of results expected from background alone (no additional signal neutrinos associated with the directions of alert events). The upper limit is above the sensitivity because there is a statistical excess in the result (p = 0.018). Credit: IceCube Collaboration
They interpreted the results using a simulation tool called FIRESONG, which generates populations of neutrino sources and calculates the flux from each of these sources. The simulation was then used to determine whether such source populations could have produced the observed neutrino alert events.
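The underlying intuition can be captured with a toy calculation (a deliberately simplified sketch, not FIRESONG itself; the total expected event count and the source counts below are made-up illustrative numbers): split a fixed total number of expected astrophysical events evenly among N sources and ask how often a source that produced one detected event produces a second.

    import math

    def multiplet_fraction(n_sources, total_expected_events=100.0):
        """Toy model: divide a fixed expected number of detected astrophysical
        events among n_sources equally bright sources (Poisson counts per source)
        and return the probability that a source which yielded at least one event
        yielded two or more. All numbers are illustrative only."""
        mu = total_expected_events / n_sources   # expected events per source
        p_ge1 = 1.0 - math.exp(-mu)              # P(k >= 1)
        p_eq1 = mu * math.exp(-mu)               # P(k == 1)
        return 1.0 - p_eq1 / p_ge1               # P(k >= 2 | k >= 1)

    # A handful of bright sources: repeat events near alerts should be common.
    print(multiplet_fraction(10))        # close to 1
    # Very many dim sources: almost no source yields more than one event.
    print(multiplet_fraction(1_000_000)) # close to 0

If a few bright sources produced the whole flux, IceCube should see clusters of events around its alerts; the absence of such clusters is what pushes the interpretation toward many dim sources.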
“We did not find a clear excess of low-energy events associated with the high-energy alert events on any of the three time scales we analyzed,” said Justin Vandenbroucke, a physics professor at UW–Madison and colead of the analysis. “This implies that there are many astrophysical neutrino sources because, if there were few, we would detect additional events accompanying the high-energy alerts.”
Future analyses will take advantage of larger IceCube data sets and higher-quality data from improved calibration methods. With the completion of the larger next-generation telescope, IceCube-Gen2, researchers will be able to detect even dimmer neutrino sources. Even knowing only the abundance of sources could provide important constraints on their identity.
“The future is very exciting as this analysis shows that planned improvements might reveal more astrophysical sources and populations,” said Abhishek Desai, postdoctoral fellow at UW–Madison and co-lead of the analysis. “This will be due to better event localization, which is already being studied and should be optimized in the near future.”
+ info “Constraints on populations of neutrino sources from searches in the directions of IceCube neutrino alerts,” The IceCube Collaboration: R. Abbasi et al. Submitted to The Astrophysical Journal. arxiv.org/abs/2210.04930.
Alex Levchenko, Mark Rzchowski elected Fellows of the American Physical Society
Congratulations to Profs. Alex Levchenko and Mark Rzchowski, who were elected 2022 Fellows of the American Physical Society!
Levchenko was elected for “broad contributions to the theory of quantum transport in mesoscopic, topological, and superconducting systems.” He was nominated by the Division of Condensed Matter Physics.
Rzchowski was elected for “pioneering discoveries and understanding of physical principles governing correlated complex materials and interfaces, including superconductors, correlated oxide systems, multiferroic systems, and spin currents in noncollinear antiferromagnets.” He was nominated by the Division of Materials Physics.
APS Fellowship is a distinct honor signifying recognition by one’s professional peers for outstanding contributions to physics. Each year, no more than one half of one percent of the Society’s membership is recognized by this honor.
Two dwarf galaxies circling our Milky Way, the Large and Small Magellanic Clouds, are losing a trail of gaseous debris called the Magellanic Stream. New research shows that a shield of warm gas is protecting the Magellanic Clouds from losing even more debris — a conclusion that caps decades of investigation, theorizing and meticulous data-hunting by astronomers working and training at the University of Wisconsin–Madison.
The findings, published recently in the journal Nature, come courtesy of quasars at the center of 28 distant galaxies. These extremely bright parts of galaxies shine through the gas that forms a buffer, or corona, that protects the Magellanic Clouds from the pull of the Milky Way’s gravity.
“We use a quasar as a light bulb,” says Bart Wakker, senior scientist in UW–Madison’s Astronomy Department. “If there is gas at a certain place between us and the quasar, the gas will produce an absorption line that tells us the composition of the clouds, their velocity and the amount of material in the clouds. And, by looking at different ions, we can study the temperature and density of the clouds.”
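For context, the velocity measurement rests on the Doppler shift of the absorption line, a standard relation rather than anything specific to this study: for a line with rest wavelength \(\lambda_0\) observed at wavelength \(\lambda\),

\[
v \approx c\,\frac{\lambda - \lambda_0}{\lambda_0} \qquad (v \ll c),
\]

while the depth and width of the line (its equivalent width) track how much absorbing material lies along the sightline.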
The temperature, location and composition — silicon, carbon and oxygen — of the gases that shadow the passing light of the quasars are consistent with the gaseous corona theorized in another study published in 2020 by UW–Madison physics graduate student Scott Lucchini, Astronomy professors Elena D’Onghia and Ellen Zweibel and UW–Madison alumni Andrew Fox and Chad Bustard, among others.
That work explained the expected properties of the Magellanic Stream by including the effects of dark matter: “The existing models of the formation of the Magellanic Stream are outdated because they can’t account for its mass,” Lucchini said in 2020.
“Our first Nature paper showed the theoretical developments, predicting the size, location and movement of the corona’s gases,” says Fox, now an astronomer at the Space Telescope Science Institute and, with Lucchini, a co-author of both studies.
The new discovery is a collaboration with a team that includes its own stream of former UW–Madison researchers pulled out into the world through the 1990s and 2000s — former graduate students Dhanesh Krishnarao, who is leading the work and is now a professor at Colorado College; David French, now a scientist at the Space Telescope Science Institute; and Christopher Howk, now a professor at the University of Notre Dame — and former UW–Madison postdoctoral researcher Nicolas Lehner, also a Notre Dame professor.
UW–Madison research leading to the new discovery dates back at least to an inkling of hot gases seen in a study of stars in the Magellanic Cloud published in 1980 by the late astronomy professor Blair Savage and his then-postdoc Klaas de Boer.
“All that fell into place to allow us to look for data from the Hubble Space Telescope and a satellite called the Far Ultraviolet Spectroscopic Explorer, FUSE — which UW also played an important role in developing,” Wakker says. “We could reinterpret that old data, collected for many different reasons, in a new way to find what we needed to confirm the existence of a warm corona around the Magellanic Clouds.”
“We solved the big questions. There are always details to work out, and people to convince,” D’Onghia says. “But this is a real Wisconsin achievement. There aren’t many times where you can work together to predict something new and then also have the ability to spot it, to collect the compelling evidence that it exists.”
Collaboration between NSF quantum centers finds path to fault tolerance in neutral atom qubits
Like the classical computers we use every day, quantum computers can make mistakes when manipulating and storing the quantum bits (qubits) used to perform quantum algorithms. Theoretically, a quantum error correction protocol can correct these errors in a similar manner to classical error correction. However, quantum error correction is more demanding than its classical counterpart and has yet to be fully demonstrated, putting a limit on the functionality of quantum computers.
In a theory paper published in Nature Communications, UW–Madison physicist Shimon Kolkowitz and colleagues show a new way that quantum errors could be identified in one type of qubit known as neutral atoms. By pinpointing which qubit experienced an error, the study suggests that the requirements on quantum error correction can be significantly relaxed, approaching a level that neutral atom quantum computers have already achieved.
“In quantum computing, a lot of the overhead in an error-correcting code is figuring out which qubit had the error. If I know which qubit it is, then the amount of redundancy needed for the code is reduced,” Kolkowitz says. “Neutral atom qubits are right on the edge of what you would call this fault-tolerant threshold, but no one has been able to fully realize it yet.”
Neutral atom qubits are made up of single atoms trapped with light. The logical gates between the atoms are performed by exciting the atoms to “Rydberg” states where the atom’s electron is excited far beyond its normal location. This quantum computing technique was first pioneered and experimentally demonstrated at UW–Madison by physics professors Mark Saffman and Thad Walker.
Currently, one error-corrected “logical” qubit is expected to require around 1000 physical qubits, exceeding the maximum number of qubits anyone has managed to wire together in any quantum computing system. Researchers have been studying different elements in Rydberg form as qubits for decades, gradually increasing their performance but not yet reaching the level required for error correction.
Schematic of a neutral atom quantum computer, where the physical qubits are individual 171Yb atoms.
Knowing that eliminating qubit errors is not practical, Kolkowitz and colleagues instead asked if there might be a way to convert the errors into a type known as erasure errors. Named for the fact that the qubit has effectively vanished, or been erased, erasure errors can be beneficial because it is much easier to tell if a qubit is missing than if it is in the correct state or not.
They investigated ytterbium, an element largely unstudied in this context, because it has a relatively simple outer electron structure and its nucleus can only exist in two quantum spin states, +1/2 and -1/2. When the atom is manipulated into a metastable electronic state — a temporary state different from the true ground state — the qubit states are then set by the nuclear +1/2 and -1/2 spin states, and the quantum gates involve coupling one of these two qubit states to the Rydberg state.
“We knew that if an atom falls out of the metastable electronic state to the true ground state, we can detect that with a laser and still not screw up any of the other qubits at all,” Kolkowitz says. “But what that means is if I can set up a situation where errors manifest themselves as falling down into the ground state, I can just shine this laser and constantly check for errors and identify the qubit it happened in.”
In their first calculations using this metastable ytterbium platform, they showed that though errors still occur in the qubits as expected, around 98% of them would be converted to the detectable erasure errors.
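A toy Monte Carlo makes the benefit concrete (an illustrative sketch with assumed rates, not the paper’s detailed error model): if each gate fails with some probability and 98% of those failures are flagged as erasures, only the unflagged remainder are “silent” errors that the code must diagnose the hard way.

    import random

    def simulate_gate_errors(n_gates=1_000_000, p_error=0.04, p_erasure=0.98, seed=0):
        """Toy Monte Carlo: each gate fails with probability p_error; a failure is
        flagged as a detectable erasure with probability p_erasure, otherwise it
        remains an undetected error. The rates are assumptions for illustration."""
        rng = random.Random(seed)
        erasures = silent = 0
        for _ in range(n_gates):
            if rng.random() < p_error:
                if rng.random() < p_erasure:
                    erasures += 1   # flagged: the affected qubit's location is known
                else:
                    silent += 1     # unflagged: must be caught by the code itself
        return erasures / n_gates, silent / n_gates

    flagged, silent = simulate_gate_errors()
    print(f"flagged erasure rate ~ {flagged:.4f}, silent error rate ~ {silent:.5f}")
    # With a 4% gate error rate, the silent error rate falls to roughly 0.08%,
    # which is why a much higher physical error rate becomes tolerable.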
But is that 98% error conversion rate good enough for a quantum computer in practice? The answer depends on the error threshold, a value that differs depending on the gates and qubits being used. The researchers next ran simulations of different numbers of quantum gates with a 98% erasure error conversion rate, or no conversion to erasure errors at all (as is currently standard).
Without erasure conversion, the error threshold is just under 1%. With 98% erasure conversion (green star), the threshold increases 4.4-fold to just over 4%.
Without erasure error conversion, the error threshold is just shy of 1%, meaning each quantum gate would have to operate at over a 99% success probability to be fault tolerant. With erasure error conversion, however, the error threshold increases 4.4-fold, to just over 4%. And, for example, Saffman’s group has already demonstrated better than 4% error rates in neutral atom qubits.
The higher error threshold also means that there can be a higher ratio of logical qubits to physical qubits. Instead of one logical qubit per 1000 physical qubits, there could be around ten.
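A rough scaling argument shows why (this uses the standard surface-code relation, which may not match the specific code analyzed in the paper): the logical error rate of a distance-\(d\) code falls roughly as

\[
p_L \sim \left(\frac{p}{p_{\mathrm{th}}}\right)^{(d+1)/2},
\]

so raising the threshold \(p_{\mathrm{th}}\) from about 1% to about 4% at a fixed physical error rate \(p\) reaches the same logical error rate with a much smaller distance \(d\), and hence with far fewer (roughly \(d^2\)) physical qubits per logical qubit.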
“And that’s the point of this work,” Kolkowitz says. “We show that if you can do this erasure conversion, it relaxes the requirements on both the error rates and on the number of atoms that you need.”
Co-author Jeff Thompson, an RQS investigator at Princeton, is already working to demonstrate these results experimentally in his lab. While other elements have been used in neutral atom quantum computing experiments before, until recently ytterbium was largely an unknown. Thompson’s group did much of the preliminary work to characterize the element and show that it could work well as the building block for a quantum computer.
“There were many open questions about ytterbium, and I’d say a lot of people who were using other elements are now moving into ytterbium based on Jeff’s groundwork,” Kolkowitz says. “And I expect that this study will accelerate that trend.”
Yue Wu and Shruti Puri at Yale University collaborated on this study. This work was supported by the National Science Foundation QLCI grants OMA-2120757 and OMA-2016136. Kolkowitz’s part of the work was additionally supported by ARO W911NF-21-1-0012.