IceCube analysis indicates there are many high-energy astrophysical neutrino sources

This story was originally published by WIPAC

Back in 2013, the IceCube Neutrino Observatory—a cubic-kilometer neutrino detector embedded in Antarctic ice—announced the first observation of high-energy (above 100 TeV) neutrinos originating from outside our solar system, ushering in a new era in astronomy. Four years later, on September 22, 2017, a high-energy neutrino event was detected coincident with a gamma-ray flare from a cosmic particle accelerator, a blazar known as TXS 0506+056. That coincident observation provided the first evidence for an extragalactic source of high-energy neutrinos.

The identification of this source was possible thanks to IceCube’s real-time high-energy neutrino alert program, which notifies the community of the directions and energies of individual neutrinos that are most likely to have come from astrophysical sources. These alerts trigger follow-up observations of electromagnetic waves from radio up to gamma rays, aimed at pinpointing a possible astrophysical source of high-energy neutrinos. However, the sources of the vast majority of the measured diffuse flux of astrophysical neutrinos remain a mystery, as does the number of such sources. Another open question is whether the neutrino sources are steady or variable over time and, if variable, whether they vary over long or short time scales.

In a paper recently submitted to The Astrophysical Journal, the IceCube Collaboration presents a follow-up search for additional, lower-energy events arriving from the directions of the high-energy alert events. The analysis examined low- and high-energy events from 2011 to 2020, searching for coincidences on time scales ranging from 1,000 seconds up to one decade. Although the researchers did not find an excess of low-energy events on any of the searched time scales, they were able to constrain the abundance of astrophysical neutrino sources in the universe.

Map of high-energy neutrino candidates (“alert events”) detected by IceCube. The map is in celestial coordinates, with the Galactic plane indicated by a line and the Galactic center by a dot. Two contours are shown for each event, for 50% and 90% confidence in the localization on the sky. The color scale shows the “signalness” of each event, which quantifies the likelihood that each event is an astrophysical neutrino rather than a background event from Earth’s atmosphere. Credit: IceCube Collaboration

This research also delves into the question of whether the astrophysical neutrino flux measured by IceCube is produced by a large number of weak sources or a small number of strong sources. To distinguish between the two possibilities, the researchers developed a statistical method that used two different sets of neutrinos: 1) alert events that have a high probability of being from an astrophysical source and 2) the gamma-ray follow-up (GFU) sample, where only about one to five out of 1,000 events per day are astrophysical.

“If there are a lot of GFU events in the direction of the alerts, that’s a sign that neutrino sources are producing a lot of detectable neutrinos, which would mean there are only a few, bright sources,” explained Alex Pizzuto, a recent UW–Madison PhD graduate and a lead on the analysis who is now a software engineer at Google. “If you don’t see a lot of GFU events in the direction of alerts, this is an indication of the opposite, that there are many, dim sources that are responsible for the flux of neutrinos that IceCube detects.”

Constraints on the luminosity (power) of each individual source as a function of the number density of astrophysical neutrino sources (horizontal axis). Previous IceCube measurements of the total astrophysical neutrino flux indicate that the true combination of the two quantities must lie within the diagonal band marked “diffuse.” The results of the new analysis are shown as an upper limit, compared to the sensitivity, which shows the range of results expected from background alone (no additional signal neutrinos associated with the directions of alert events). The upper limit is above the sensitivity because there is a statistical excess in the result (p = 0.018). Credit: IceCube Collaboration

They interpreted the results using a simulation tool called FIRESONG, which generates populations of neutrino sources and calculates the flux from each source in the population. The simulations were then used to determine whether a given population could have produced the observed neutrino events.
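The logic of the test can be sketched in a few lines of Python. This is a toy calculation for illustration only, not the collaboration’s FIRESONG code: it assumes an equal-luminosity population and an arbitrary total event count, and asks how many sources would be expected to contribute two or more detected events when the measured diffuse flux is shared among progressively more sources.

    import numpy as np

    TOTAL_EVENTS = 1000  # assumed total of detected astrophysical events (illustrative)

    def expected_doublet_sources(n_sources):
        """Expected number of sources contributing >= 2 events when a
        fixed diffuse flux is shared equally among n_sources sources."""
        mu = TOTAL_EVENTS / n_sources               # mean events per source
        p_doublet = 1.0 - np.exp(-mu) * (1.0 + mu)  # Poisson P(k >= 2)
        return n_sources * p_doublet

    for n in [10, 1_000, 100_000, 10_000_000]:
        print(f"{n:>10,} sources -> {expected_doublet_sources(n):10.3f} expected doublets")

A population of a few bright sources would produce many such doublets (repeated events from the same directions), while millions of dim sources would produce almost none.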

“We did not find a clear excess of low-energy events associated with the high-energy alert events on any of the three time scales we analyzed,” said Justin Vandenbroucke, a physics professor at UW–Madison and co-lead of the analysis. “This implies that there are many astrophysical neutrino sources because, if there were few, we would detect additional events accompanying the high-energy alerts.”

Future analyses will take advantage of larger IceCube data sets and higher-quality data from improved calibration methods. With the completion of the larger next-generation telescope, IceCube-Gen2, researchers will be able to detect even dimmer neutrino sources. Even just knowing the abundance of sources could provide important constraints on their identity.

“The future is very exciting as this analysis shows that planned improvements might reveal more astrophysical sources and populations,” said Abhishek Desai, postdoctoral fellow at UW–Madison and co-lead of the analysis. “This will be due to better event localization, which is already being studied and should be optimized in the near future.”

More info: “Constraints on populations of neutrino sources from searches in the directions of IceCube neutrino alerts,” The IceCube Collaboration: R. Abbasi et al. Submitted to The Astrophysical Journal. arxiv.org/abs/2210.04930.

Alex Levchenko, Mark Rzchowski elected Fellows of the American Physical Society

Profile photos of Alex Levchenko (left) and Mark Rzchowski (right)

Congratulations to Profs. Alex Levchenko and Mark Rzchowski, who were elected 2022 Fellows of the American Physical Society!

Levchenko was elected for “broad contributions to the theory of quantum transport in mesoscopic, topological, and superconducting systems.” He was nominated by the Division of Condensed Matter Physics.

Rzchowski was elected for “pioneering discoveries and understanding of physical principles governing correlated complex materials and interfaces, including superconductors, correlated oxide systems, multiferroic systems, and spin currents in noncollinear antiferromagnets.” He was nominated by the Division of Materials Physics.

APS Fellowship is a distinct honor signifying recognition by one’s professional peers for outstanding contributions to physics. Each year, no more than one half of one percent of the Society’s membership is recognized by this honor.

See the full list of 2022 honorees at the APS Fellows archive.

Decades of work at UW–Madison underpin discovery of corona protecting Milky Way’s neighboring galaxies

A long-exposure photograph of a domed observatory at night, the stars above drawn into circular trails

This story was originally posted by UW–Madison News

Two dwarf galaxies circling our Milky Way, the Large and Small Magellanic Clouds, are shedding a trail of gaseous debris called the Magellanic Stream. New research shows that a shield of warm gas is protecting the Magellanic Clouds from losing even more material — a conclusion that caps decades of investigation, theorizing and meticulous data-hunting by astronomers working and training at the University of Wisconsin–Madison.

The findings, published recently in the journal Nature, come courtesy of quasars at the centers of 28 distant galaxies. The light of these extremely bright galactic cores shines through the gas that forms a buffer, or corona, protecting the Magellanic Clouds from the pull of the Milky Way’s gravity.

“We use a quasar as a light bulb,” says Bart Wakker, senior scientist in UW–Madison’s Astronomy Department. “If there is gas at a certain place between us and the quasar, the gas will produce an absorption line that tells us the composition of the clouds, their velocity and the amount of material in the clouds. And, by looking at different ions, we can study the temperature and density of the clouds.”

The temperature, location and composition — silicon, carbon and oxygen — of the gases that shadow the passing light of the quasars are consistent with the gaseous corona theorized in another study published in 2020 by UW–Madison physics graduate student Scott Lucchini, Astronomy professors Elena D’Onghia and Ellen Zweibel and UW–Madison alumni Andrew Fox and Chad Bustard, among others.

That work explained the expected properties of the Magellanic Stream by including the effects of dark matter: “The existing models of the formation of the Magellanic Stream are outdated because they can’t account for its mass,” Lucchini said in 2020.

“Our first Nature paper showed the theoretical developments, predicting the size, location and movement of the corona’s gases,” says Fox, now an astronomer at the Space Telescope Science Institute and, with Lucchini, a co-author of both studies.

The new discovery comes from a collaboration whose team includes its own stream of former UW–Madison researchers pulled out into the world through the 1990s and 2000s: former graduate students Dhanesh Krishnarao, who is leading the work and is now a professor at Colorado College; David French, now a scientist at the Space Telescope Science Institute; and Christopher Howk, now a professor at the University of Notre Dame; as well as former UW–Madison postdoctoral researcher Nicolas Lehner, also a Notre Dame professor.

UW–Madison research leading to the new discovery dates back at least to an inkling of hot gases seen in a study of stars in the Magellanic Cloud published in 1980 by the late astronomy professor Blair Savage and his then-postdoc Klaas de Boer.

“All that fell into place to allow us to look for data from the Hubble Space Telescope and a satellite called the Far Ultraviolet Spectroscopic Explorer, FUSE — which UW also played an important role in developing,” Wakker says. “We could reinterpret that old data, collected for many different reasons, in a new way to find what we needed to confirm the existence of a warm corona around the Magellanic Clouds.”

“We solved the big questions. There are always details to work out, and people to convince,” D’Onghia says. “But this is a real Wisconsin achievement. There aren’t many times where you can work together to predict something new and then also have the ability to spot it, to collect the compelling evidence that it exists.”

Read more about the research on NASA’s website.

Collaboration between NSF quantum centers finds path to fault tolerance in neutral atom qubits

Like the classical computers we use every day, quantum computers can make mistakes when manipulating and storing the quantum bits (qubits) used to perform quantum algorithms. Theoretically, a quantum error correction protocol can correct these errors in a similar manner to classical error correction. However, quantum error correction is more demanding than its classical counterpart and has yet to be fully demonstrated, putting a limit on the functionality of quantum computers.

In a theory paper published in Nature Communications, UW–Madison physicist Shimon Kolkowitz and colleagues show a new way that quantum errors could be identified in one type of qubit known as neutral atoms. By pinpointing which qubit experienced an error, the study suggests that the requirements on quantum error correction can be significantly relaxed, approaching a level that neutral atom quantum computers have already achieved.

Shimon Kolkowitz

The study is a collaboration between two National Science Foundation Quantum Leap Challenge Institutes, Hybrid Quantum Architectures and Networks (HQAN) and Robust Quantum Simulation (RQS). UW–Madison is a member of HQAN.

“In quantum computing, a lot of the overhead in an error-correcting code is figuring out which qubit had the error. If I know which qubit it is, then the amount of redundancy needed for the code is reduced,” Kolkowitz says. “Neutral atom qubits are right on the edge of what you would call this fault-tolerant threshold, but no one has been able to fully realize it yet.”

Neutral atom qubits are made up of single atoms trapped with light. The logical gates between the atoms are performed by exciting the atoms to “Rydberg” states, in which the atom’s electron is excited far beyond its normal location. This quantum computing technique was pioneered and first experimentally demonstrated at UW–Madison by physics professors Mark Saffman and Thad Walker.

Currently, one error-corrected “logical” qubit is expected to require around 1000 physical qubits, exceeding the maximum number of qubits anyone has managed to wire together in any quantum computing system. Researchers have been studying different elements in Rydberg form as qubits for decades, gradually increasing their performance but not yet reaching the level required for error correction.

Schematic of a neutral atom quantum computer, where the physical qubits are individual 171Yb atoms.

Knowing that eliminating qubit errors is not practical, Kolkowitz and colleagues instead asked if there might be a way to convert the errors into a type known as erasure errors. Named for the fact that the qubit has effectively vanished, or been erased, erasure errors can be beneficial because it is much easier to tell if a qubit is missing than if it is in the correct state or not.
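A classical analogy makes the advantage concrete. In the sketch below (illustrative only; the paper’s actual codes are quantum, but the counting argument carries over), a five-copy repetition code survives four losses at known positions, while just three flips at unknown positions already defeat a majority vote.

    def decode_majority(bits):
        """Errors at unknown positions: all we can do is take a majority vote."""
        return int(sum(bits) > len(bits) / 2)

    def decode_erasure(bits):
        """Errors at known positions (None = erased): ignore the losses;
        any single surviving copy reveals the stored value."""
        survivors = [b for b in bits if b is not None]
        return survivors[0] if survivors else None

    codeword = [1] * 5                      # logical "1" stored in five copies
    corrupted = [0, 0, 0, 1, 1]             # three flips at unknown positions
    erased = [None, None, None, None, 1]    # four erasures at known positions

    print(decode_majority(corrupted))  # -> 0: the vote fails, a logical error
    print(decode_erasure(erased))      # -> 1: still decodable

Knowing where the errors happened is what buys the redundancy back, which is why flagging erasures relaxes the demands on the code.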

They investigated the largely unstudied (in this context) element ytterbium because it has a relatively simple outer electron structure and its nucleus can only exist in two quantum spin states, +1/2 and -1/2. When the atom is manipulated into a metastable electronic state — a temporary state different from the true ground state — the qubit states are set by the nuclear +1/2 and -1/2 spin states, and the quantum gates involve coupling one of these two qubit states to the Rydberg state.

“We knew that if an atom falls out of the metastable electronic state to the true ground state, we can detect that with a laser and still not screw up any of the other qubits at all,” Kolkowitz says. “But what that means is if I can set up a situation where errors manifest themselves as falling down into the ground state, I can just shine this laser and constantly check for errors and identify the qubit it happened in.”

In their first calculations using this metastable ytterbium platform, they showed that though errors still occur in the qubits as expected, around 98% of them would be converted to detectable erasure errors.

But is that 98% error conversion rate good enough for a quantum computer in practice? The answer depends on the error threshold, a value that differs depending on the gates and qubits being used. The researchers next ran simulations of different numbers of quantum gates with a 98% erasure error conversion rate, or no conversion to erasure errors at all (as is currently standard).

Threshold error rate as a function of the fraction of errors detected as erasures. With no erasure conversion, the error threshold is just under 1%; with 98% erasure conversion (green star), the threshold increases 4.4-fold to just over 4%.

Without erasure error conversion, the error threshold is just shy of 1%, meaning each quantum gate would have to operate with better than 99% success probability to be fault tolerant. With erasure error conversion, however, the error threshold increases 4.4-fold, to just over 4%. And, for example, Saffman’s group has already demonstrated error rates below 4% in neutral atom qubits.

The higher error threshold also means that there can be a higher ratio of logical qubits to physical qubits. Instead of one logical qubit per 1,000 physical qubits, there could be around ten.

“And that’s the point of this work,” Kolkowitz says. “We show that if you can do this erasure conversion, it relaxes the requirements on both the error rates and on the number of atoms that you need.”

Co-author Jeff Thompson, an RQS investigator at Princeton, is already working to demonstrate these results experimentally in his lab. While other elements have been used in neutral atom quantum computing experiments before, until recently ytterbium was largely an unknown. Thompson’s group did much of the preliminary work to characterize the element and show that it could work well as the building block for a quantum computer.

“There were many open questions about ytterbium, and I’d say a lot of people who were using other elements are now moving into ytterbium based on Jeff’s groundwork,” Kolkowitz says. “And I expect that this study will accelerate that trend.”

Yue Wu and Shruti Puri at Yale University collaborated on this study. This work was supported by the National Science Foundation QLCI grants OMA-2120757 and OMA-2016136. Kolkowitz’s part of the work was additionally supported by ARO W911NF-21-1-0012.

Margaret Fortman awarded Google quantum computing fellowship

This post was adapted from a story posted by the UW–Madison Graduate School

Two UW–Madison graduate students, including physics grad student Margaret Fortman, have been awarded 2022 Google Fellowships to pursue cutting-edge research. Fortman received the 2022 Google Fellowship in Quantum Computing, one of only four awarded.

Margaret Fortman

Google created the PhD Fellowship Program to recognize outstanding graduate students doing exceptional and innovative research in areas relevant to computer science and related fields. The fellowship attracts highly competitive applicants from around the world.

“These awards have been presented to exemplary PhD students in computer science and related fields,” Google said in its announcement. “We have given these students unique fellowships to acknowledge their contributions to their areas of specialty and provide funding for their education and research. We look forward to working closely with them as they continue to become leaders in their respective fields.”

The program begins in July when students are connected to a mentor from Google Research. The fellowship covers full tuition, fees, and a stipend for the academic year. Fellows are also encouraged to attend Google’s annual Global Fellowship Summit in the summer.

Fortman works to diagnose noise interference in quantum bits

Fortman, whose PhD research in Victor Brar’s group specializes in quantum computing, will use the fellowship support to develop a diagnostic tool to probe the source of noise in superconducting quantum bits, or qubits.

Quantum computing has the potential to solve problems that are difficult for standard computers, Fortman said, but the field has challenges to solve first.

“The leading candidate we have for making a quantum computer right now is superconducting qubits,” Fortman said. “But those are currently facing unavoidable noise that we get in those devices, which can actually come from the qubit material itself.”

Fortman works with a low-temperature, ultra-high-vacuum scanning tunneling microscope on the UW–Madison campus to develop a microscopic understanding of the origins of noise in qubits. She fabricates superconductors and examines them under the microscope to identify the source of the noise, with the hope of developing a solution for that interference.

In her time as a graduate student at UW–Madison, Fortman said she has enjoyed collaborating with colleagues in her lab and across campus.

“It’s pretty cool to be somewhere where world-renowned research is happening and to be involved with that,” she said. “My PI and I work in collaborations with other PIs at the university and they’re all doing very important research, and so it’s really cool to be a part of that.”

Fortman is excited to have a mentor at Google through the PhD Fellowship, having been paired with someone who has a similar disciplinary background and who is a research scientist with Google Quantum AI.

“He can be a resource in debugging some parts of my project, as well as general mentorship and advice on being a PhD student, and advice for future career goals,” Fortman said.

The second UW–Madison student who earned this honor is computer sciences PhD student Shashank Rajput, who received the 2022 Google Fellowship in Machine Learning.

The future of particle physics is also written from the South Pole

This post was originally published by the IceCube collaboration. Several UW–Madison physicists are part of the collaboration and are featured in this story

A month ago, the Seattle Community Summer Study Workshop—July 17-26, 2022, at the University of Washington—brought together over a thousand scientists in one of the final steps of the Particle Physics Community Planning Exercise. The meetings and accompanying white papers put the cherry on top of a period of collaborative work setting a vision for the future of particle physics in the U.S. and abroad. Later this year, the final report identifying research priorities in this field will be presented. Its main purpose is to advise the Department of Energy and the National Science Foundation on research for their agendas during the next decade.

As new and old detectors once again prepare to expand the frontiers of knowledge, we asked some IceCube collaborators about the role the South Pole neutrino observatory should play in the bright future that lies ahead for particle physics.

Q: What types of neutrinos are currently detected in IceCube? And will that change with the future extensions?

The vast majority of the neutrinos we detect are generated in the atmosphere by cosmic rays, but we also have on the order of 1,000 cosmic neutrinos at energies above 10 TeV. We use the atmospheric neutrinos for a wide range of science, first of all to study the neutrinos themselves.

IceCube has detected more than a million neutrinos to date. That’s already a big number for neutrino scientists, and we will detect even more in the future. The deployment of the IceCube Upgrade, an extension of our facility targeting neutrinos at lower energies, will increase the density of sensors in IceCube’s inner subdetector, DeepCore, by a factor of 10. And a second, larger extension is also in the works. With IceCube-Gen2, we will improve the detection at the highest energies, too: the IceCube volume will increase by almost a factor of 10, and our event rate for high-energy cosmic neutrinos will also grow by an order of magnitude.

Albrecht Karle, IceCube associate director for science and instrumentation and a professor of physics at the University of Wisconsin–Madison

Q: Are the futures of IceCube and that of particle physics intrinsically linked?

Absolutely! Many open questions in particle physics have neutrinos at the center. What’s their mass? What is the behavior of neutrino flavor mixing? Are there right-handed (sterile) neutrinos? Neutrinos are particularly attractive in the search for new physics. We can answer all these questions, to varying levels, within IceCube and especially moving forward with the IceCube Upgrade and IceCube-Gen2.

Erin O’Sullivan, an associate professor of physics at Uppsala University

IceCube, the IceCube Upgrade, and IceCube-Gen2 can all uniquely contribute to the study of particle physics, in particular, neutrino physics, beyond Standard Model (BSM) physics, and indirect searches for dark matter. The IceCube Upgrade provides complementary and independent measurements of neutrino oscillation in addition to the long-baseline experiments. And IceCube-Gen2 will be crucial to exploring BSM features, such as sterile neutrinos and secret neutrino interactions, at energies that cannot be reached by underground facilities. It will also be a discovery machine for heavy dark matter particles.

Ke Fang, an assistant professor of physics at the University of Wisconsin–Madison

Q: Talking about discoveries, now that both IceCube and Super-Kamiokande have reported definitive observations of tau neutrinos in atmospheric and astrophysical neutrino data, why should the international particle physics community continue to improve their detection?  

The tau neutrino was discovered at Fermilab in an emulsion experiment where they observed double-bang events with a distance on the order of 1 mm separating production and decay. Since they represent the least studied neutrino and, in fact, one of the least studied particles, improved measurements of tau properties may reveal that the 3×3 [neutrino mixing] matrix is not unitary and expose the first indication of physics beyond the 3-flavor oscillation scenario.

Francis Halzen, IceCube PI and a professor of physics at the University of Wisconsin–Madison

We are the only experiment operating currently (and in the foreseeable future) that is able to identify tau neutrinos on an event-by-event basis. We can do so by looking at the distinct morphological features they produce in our data at the highest energies. And with the IceCube Upgrade, we will also be the experiment that collects the most tau neutrinos.  I suspect that these neutrinos will surprise us again and point us towards new physics.

Carlos Argüelles, an assistant professor of physics at Harvard University

Four hundred years from now, people may see IceCube the way we see Galileo’s telescope, not as an end but as the beginning of a new branch of science. The astrophysical observation of tau neutrinos is but one piece in a large number of studies that IceCube can conduct, including the study of fundamental physics using astrophysical neutrinos.

Ignacio Taboada, IceCube spokesperson and a professor of physics at the Georgia Institute of Technology

Q: In 2019, the Wisconsin IceCube Particle Astrophysics Center joined the Interactions Collaboration, which includes all major particle physics laboratories around the globe. The IceCube letter of introduction to this community detailed some of the most accurate results to date in neutrino physics. What’s unique about IceCube neutrino science?

One unique aspect of IceCube is the breadth of neutrino energy that we can measure, all the way down to the MeV energy scale in the case of a galactic supernova and up to a few PeV, the highest-energy neutrinos ever detected. Therefore, IceCube provides us with different windows to study the neutrino and understand its properties. Especially in the context of searching for new physics, this is important as these processes can manifest at a particular energy scale but not be visible at other energy scales.

Erin O’Sullivan, an associate professor of physics at Uppsala University

Q: Let’s focus on high-energy neutrinos for a moment. What is needed to detect them, and why is the South Pole ice the perfect place for those searches?

The highest energy neutrinos can be directly linked to the most powerful accelerators in the universe but also allow us to test the Standard Model at energies inaccessible to current or future planned colliders.

And why the South Pole? Well, what makes the South Pole such an optimal location are the exceptional optical and radio properties of its ice sheet, which is also the largest pool of ice on Earth. Neutrino event rates are very low at these energies and, thus, we need a huge detector to measure them.

Deep-ice Cherenkov optical sensors have already been proven as high-performing detectors for TeV and PeV neutrinos when deployed at depths of 1.4 km and greater below the surface. And radio technology is promising because radio waves can travel much further than optical photons in the ice, plus they work at shallow depths. So, when searching for the highest energy neutrinos using the South Pole ice sheet, radio neutrino detectors might be the only solution that scales up. Radio waves are able to travel further in the ice at the South Pole than in Greenland, for example. It’s a gift from nature to have this giant, pure block of ice to catch elusive neutrinos from the most powerful accelerators.

Lu Lu, an assistant professor of physics at the University of Wisconsin–Madison

Q: And what about the lowest energies? How does IceCube perform there? 

IceCube’s DeepCore detector was especially designed for that: a denser layout of photodetectors embedded in the center of IceCube at about 2 km depth, it uses the surrounding IceCube sensors to eliminate essentially all background from the otherwise dominant cosmic-ray muons. This means that DeepCore can be analyzed as if it were at 10 km depth, deeper than any mine on Earth. In the near future, the IceCube Upgrade will add seven strings of new sensors inside DeepCore, which will hugely increase its precision for measuring neutrino properties.

Albrecht Karle, IceCube associate director for science and instrumentation and a professor of physics at the University of Wisconsin–Madison 

IceCube’s low energies are what all other neutrino experiments would call high energies. This is a regime where the neutrino interactions are well predicted from accelerator experiments, which means that if deviations are found in the data we can claim new physics. Thus, IceCube and the upcoming IceCube Upgrade results are not only going to yield some of the most precise measurements of the neutrino oscillation parameters but also—and more importantly—test the neutrino oscillation framework.

Carlos Argüelles, an assistant professor of physics at Harvard University  

Q: And, last but not least, we should think about the people that will make all this possible. What efforts are underway to diversify who does science and make the field more equitable?

Four years ago, IceCube invited a few collaborations to join efforts to increase diversity, equity, inclusion, and accessibility (DEIA) in multimessenger astrophysics. With support from NSF, this was the birth of the Multimessenger Diversity Network (MDN). This network now includes a dozen participating collaborations, which is an indication of the growing awareness and action to increase DEIA across the field. Set up as a community of practice, where people share their knowledge and experiences with each other, the MDN is a reproducible and scalable model for other fields. We are excited to see this community of practice grow, to contribute with resources and experiences, and to learn from others.

For the first time in an official capacity, DEIA efforts were included in the Snowmass planning process and were also incorporated into the Astro2020 Decadal Survey. One take-away from these processes is that more resources and accountability are needed to speed up DEIA efforts.

Ellen Bechtol, MDN community manager and an outreach specialist at the Wisconsin IceCube Particle Astrophysics Center

Read more about IceCube and its future contributions to particle physics:

  • “Snowmass Neutrino Frontier: NF04 Topical Group Report. Neutrinos from Natural Sources” (Jul 2022)
  • “CF7: Cosmic Probes of Fundamental Physics. Topical Group Report” (Jul 2022)
  • “High-Energy and Ultra-High-Energy Neutrinos: A Snowmass White Paper,” M. Ackermann et al. arxiv.org/abs/2203.08096
  • “Tau Neutrinos in the Next Decade: from GeV to EeV,” R. S. Abraham et al. arxiv.org/abs/2203.05591
  • “Snowmass White Paper: Beyond the Standard Model effects on Neutrino Flavor,” C. Argüelles et al. arxiv.org/abs/2203.10811
  • “Snowmass 2021 White Paper: Cosmogenic Dark Matter and Exotic Particle Searches in Neutrino Experiments,” J. Berger et al. arxiv.org/abs/2207.02882
  • “White Paper on Light Sterile Neutrino Searches and Related Phenomenology,” M. A. Acero et al. arxiv.org/abs/2203.07323
  • “Ultra-High-Energy Cosmic Rays: The Intersection of the Cosmic and Energy Frontiers,” A. Coleman et al. arxiv.org/abs/2205.05845
  • “Advancing the Landscape of Multimessenger Science in the Next Decade,” K. Engel et al. arxiv.org/abs/2203.10074

Zweibel receives Astronomical Society of the Pacific’s most prestigious award

This post is adapted from an Astronomical Society of the Pacific press release

The Astronomical Society of the Pacific (ASP) has awarded the 2022 Catherine Wolfe Bruce Gold Medal to Ellen Zweibel. It is the most prestigious award given by ASP.

Ellen Zweibel, W. L. Kraushaar professor of astronomy and physics (Photo by Althea Dotzour / UW–Madison)

Zweibel, the William L. Kraushaar professor of astronomy and physics at UW–Madison, was recognized for her contributions to the understanding of astrophysical plasmas, especially those associated with the Sun, stars, galaxies, and galaxy clusters. She has also made major contributions in linking plasma characteristics and behaviors observed in laboratories to astrophysical plasma phenomena occurring in the universe.

Most plasma effects in astrophysical systems are due to an embedded magnetic field. Many of them can be grouped into a small number of basic physical processes: how magnetic fields are generated, how they exchange energy with their environments (sometimes on explosively fast timescales), their role in global instabilities, how they cause a tiny fraction of thermal particles to be accelerated to relativistic energies, and how they mediate the interaction of these relativistic particles (cosmic rays) with their gaseous environments through waves and instabilities on microscales. Although all these processes occur in laboratory plasmas, it is in natural plasmas that they take their most extreme forms. Zweibel and her students and postdocs have used analytical theory and numerical simulations to study the generation and evolution of magnetic fields in the Sun and other stars, in galaxies, and in galaxy clusters, and have researched the effects of high energy cosmic ray particles in all of these environments. Their most recent work centers on the role of cosmic rays in star formation feedback: the self-regulation of the star formation rate in galaxies through energy and momentum input to the ambient medium by the stars themselves.

The Catherine Wolfe Bruce Gold Medal (photo from the Astronomical Society of the Pacific)

Zweibel has authored more than 240 refereed publications with over 8,000 citations. In 2016 she was awarded the American Physical Society’s James Clerk Maxwell Prize for Plasma Physics “For seminal research on the energetics, stability, and dynamics of astrophysical plasmas, including those related to stars and galaxies, and for leadership in linking plasma and other astrophysical phenomena.” She is a member of the National Academy of Sciences.

The Astronomical Society of the Pacific’s Catherine Wolfe Bruce Gold Medal was established in 1898 by Catherine Wolfe Bruce, an American philanthropist and patroness of astronomy, and was first awarded that year to Simon Newcomb. The ASP presents the medal annually to a professional astronomer in recognition of a lifetime of outstanding achievement and contributions to astrophysics research. Previous recipients of the Bruce Medal include Giovanni V. Schiaparelli (1902), Edwin Hubble (1938), Fred Hoyle (1970), and Vera Rubin (2003).

Cross-institutional collaboration leads to new control over quantum dot qubits


This story was originally published by the Chicago Quantum Exchange

Qubits are the building blocks of quantum computers, which have the potential to revolutionize many fields of research by solving problems that classical computers can’t.

But creating qubits that have the perfect quality necessary for quantum computing can be challenging.

Researchers at the University of Wisconsin–Madison, HRL Laboratories LLC, and the University of New South Wales (UNSW) collaborated on a project to better control silicon quantum dot qubits, allowing for higher-quality fabrication and use in wider applications.

All three institutions are affiliated with the Chicago Quantum Exchange. The work was published in Physical Review Letters, and the lead author, J. P. Dodson, has recently transitioned from UW–Madison to HRL.

“Consistency is the thing we’re after here,” says Mark Friesen, Distinguished Scientist of Physics at UW–Madison and an author on the paper. “Our claim is that there is actually hope to create a very uniform array of dots that can be used as qubits.”

Sensitive quantum states

While classical computer bits use electric circuits to represent two possible values (0 and 1), qubits use two quantum states to represent 0 and 1, which allows them to take advantage of quantum phenomena like superposition to do powerful calculations.

Qubits can be constructed in different ways. One way to build a qubit is by fabricating a quantum dot, or a very, very small cage for electrons, formed within a silicon crystal. Unlike qubits made of single atoms, which are all naturally identical, quantum dot qubits are man-made—allowing researchers to customize them to different applications.

But one common wrench in the metaphorical gears of these silicon qubits is competition between different kinds of quantum states. Most qubits use “spin states” to represent 0 and 1, which rely on a uniquely quantum property called spin. But if the qubit has other kinds of quantum states with similar energies, those other states can interfere, making it difficult for scientists to effectively use the qubit.

In silicon quantum dots, the states that most often compete with the ones needed for computing are “valley states,” named for their locations on an energy graph—they exist in the “valleys” of the graph.

To have the most effective quantum dot qubit, the valley states of the dot must be controlled such that they do not interfere with the quantum information-carrying spin states. But the valley states are extremely sensitive; the quantum dots sit on a flat surface, and if there is even one extra atom on the surface underneath the quantum dot, the energies of the valley states change.

The study’s authors say these kinds of single-atom defects are pretty much “unavoidable,” so they found a way to control the valley states even in the presence of defects. By manipulating the voltage across the dot, the researchers found they could physically move the dot around the surface it sits on.

“The gate voltages allow you to move the dot across the interface it sits on by a few nanometers, and by doing that, you change its position relative to atomic-scale features,” says Mark Eriksson, John Bardeen Professor and chair of the UW–Madison physics department, who worked on the project. “That changes the energies of valley states in a controllable way.

“The take-home message of this paper,” he says, “is that the energies of the valley states are not determined forever once you make a quantum dot. We can tune them, and that allows us to make better qubits that are going to make for better quantum computers.”

Building on academic and industry expertise

The host materials for the quantum dots are “grown” with precise layer composition. The process is extremely technical, and Friesen notes that Lisa Edge at HRL Laboratories is a world expert.

“It requires many decades of knowledge to be able to grow these devices properly,” says Friesen. “We have several years of collaborating with HRL, and they’re very good at making really high-quality materials available to us.”

The work also benefitted from the knowledge of Susan Coppersmith, a theorist previously at UW–Madison who moved to UNSW in 2018. Eriksson says the collaborative nature of the research was crucial to its success.

“This work, which gives us a lot of new knowledge about how to precisely control these qubits, could not have been done without our partners at HRL and UNSW,” says Eriksson. “There’s a strong sense of community in quantum science and technology, and that is really pushing the field forward.”

Opening doors to quantum research experiences with the Open Quantum Initiative

This past winter, Katie Harrison, then a junior physics major at UW–Madison, started thinking about which areas of physics she was interested in studying more in-depth.

“Physics is in general so broad, saying you want to research physics doesn’t really cut it,” Harrison says.

She thought about which classes she enjoyed the most and talked to other students and professors to help figure out what she might focus on. Quantum mechanics was high on her list. During her search for additional learning opportunities, she saw the email about the Open Quantum Initiative (OQI), a new fellowship program run by the Chicago Quantum Exchange (CQE).

“This could be something I’m interested in, right?” Harrison thought. “I’ll apply and see what happens.”

What happened was that Harrison was one of 12 undergraduate students accepted into the inaugural class of OQI Fellows. These students were paired with mentors at CQE member institutions, where they conducted research in quantum information science and engineering. OQI has a goal of connecting students with leaders in academia and industry and increasing their awareness of quantum career opportunities. The ten-week fellowship ran through August 19.

OQI students attend a wrap-up at the University of Chicago on August 17. Each student presented at a research symposium that day, which also included a career panel from leaders across academia, government, and industry and an opportunity to network. | Photo provided by the Chicago Quantum Exchange

OQI also places an emphasis on establishing diversity, equity, and inclusion as priorities central to the development of the quantum ecosystem. Almost 70% of this year’s fellowship students are Hispanic, Latino, or Black, and half are the first in their family to go to college. In addition, while the field of quantum science and engineering is generally majority-male, the 2022 cohort is half female.

This summer, UW–Madison and the Wisconsin Quantum Institute hosted two students: Harrison with physics professor Baha Balantekin and postdoc Pooja Siwach; and MIT physics and electrical engineering major Kate Arutyunova with engineering physics professor Jennifer Choy, postdoc Maryam Zahedian and graduate student Ricardo Vidrio.

Harrison and Arutyunova met at OQI orientation at IBM’s quantum research lab in New York, and they hit it off immediately. (“We have the most matching energies (of the fellows),” Arutyunova says, with Harrison adding, “The synergy is real.”)

OQI Fellow Kate Arutyunova with her research mentors. (L-R) Engineering Physics professor Jennifer Choy, graduate student Ricardo Vidrio, Kate Arutyunova, and postdoc Maryam Zahedian. | Photo provided by Kate Arutyunova

Despite their very different research projects — Harrison’s was theoretical and strongly focused on physics, whereas Arutyunova’s was experimental and with an engineering focus — they leaned on each other throughout the summer in Madison. They met at Union South nearly every morning at 7am to read and bounce ideas off each other. Then, after a full day with their respective research groups, they’d head back to Union South until it closed.

Modeling neutrino oscillations

Harrison’s research with Balantekin and Siwach investigated the neutrinos that escape collapsing supernova cores. Because neutrinos are electrically neutral and rarely interact with other particles, they make it out of the cores without interacting with much — and therefore without changing much — so studying them helps physicists understand what is happening inside those stars. This is a difficult task, however, because neutrinos oscillate between flavors as they travel, and modeling those oscillations requires a lot of time and resources on a classical computer.
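For intuition, the simplest two-flavor vacuum case reduces to a single oscillating probability, which is also why one qubit maps so naturally onto one neutrino’s flavor state. The sketch below uses the textbook formula with illustrative, roughly atmospheric parameters; the collective supernova oscillations studied in this project couple many neutrinos together and are far more demanding to compute.

    import numpy as np

    def survival_probability(L_km, E_GeV, sin2_2theta=0.85, dm2_eV2=2.5e-3):
        """P(nu -> nu) = 1 - sin^2(2 theta) * sin^2(1.27 * dm^2 * L / E),
        with L in km, E in GeV, dm^2 in eV^2 (parameter values illustrative)."""
        return 1.0 - sin2_2theta * np.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

    # Example: a 1 GeV atmospheric neutrino crossing the Earth's diameter.
    print(survival_probability(12_700, 1.0))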

Harrison’s project, then, was to investigate two quantum computing methods, pulse-based versus circuit-based, and determine whether one might fit the problem better than the other. Previous studies suggest that the pulse-based approach is likely better, but the circuit-based approach involves less complicated input calculations.

“I’ve been doing calibrations and calculating the frequencies of the pulses we’ll need to send to our qubits in order to get data that’s as accurate as a classical computer,” Harrison says. “I’m working with the circuit space, the mathematical versions of them, and then I’ll send my work to IBM’s quantum computers and they’ll calculate it and give results back.”

While she didn’t fully complete the project, she did make significant progress.

“(Katie) is very enthusiastic and she has gone a lot further than one would have expected an average undergraduate could have,” Balantekin says. “She started an interesting project, she started getting interesting results. But we are nowhere near the completion of the project, so she will continue working with us next academic year, and hopefully we’ll get interesting results.”

Developing better quantum sensors 

Over on the engineering side of campus, Arutyunova was studying different ways to introduce nitrogen vacancy (NV) centers in diamonds. These atomic-scale defects are useful in quantum sensing and have applications in magnetometry. Previous work in Choy’s group made the NV centers by a method known as nitrogen ion beam implantation. Arutyunova’s project was to compare how a different method, electron beam irradiation, formed the NV centers under different starting nitrogen concentrations in diamond.

Briefly, she would mark an edge of a very tiny (2 x 2 x 0.5 millimeter), nitrogen-containing diamond and irradiate the sample with a scanning electron microscope. She used confocal microscopy to record the initial distribution of NV centers, then moved the sample to the annealing step, in which the diamond is heated to 1200 degrees Celsius in a vacuum annealing furnace. The diamonds are then acid washed and reexamined with the confocal microscope to see whether additional NV centers have formed.

“It’s a challenging process as it requires precise coordinate-by-coordinate calculation for exposed areas and extensive knowledge of how to use the scanning electron microscope,” says Arutyunova, who will go back to MIT after the fellowship wraps. “I think I laid down a good foundation for future steps so that the work can be continued in my group.”

Choy adds:

Kate made significant strides in her project and her work has put us on a great path for our continued investigation into effective ways of generating color centers in diamond. In addition to her research contributions, our group has really enjoyed and benefited from her enthusiasm and collaborative spirit. It’s wonderful to see the relationships that Kate has forged with the rest of the group and in particular her mentors, Maryam and Ricardo. We look forward to keeping in touch with Kate on matters related to the project as well as her academic journey.

Beyond the summer fellowship

Both Harrison and Arutyunova think that this experience has drawn them toward the graduate school track, likely with a focus on quantum science. More importantly, it has helped them both learn what they like about research.

“I would prefer to work on a problem and see the final output rather than a question where I do not have an idea of the application,” Arutyunova says. “And I realized how much I like to collaborate with people, exchange ideas, propose something, and listen to people and what they think about research.”

They also offer similar advice to other undergraduate students who are interested in research: do it, and start early.

“No matter when you start, you’re going to start knowing nothing,” Harrison says. “And if you start sooner, even though it’s scary and you feel like you know even less, you have more time to learn, which is amazing. And get in a research group where they really want you to learn.”

Search for neutrino emission associated with LIGO/Virgo gravitational waves

Gravitational waves (GWs), ripples in space-time that travel at the speed of light, are signatures of some of the most energetic phenomena in the universe. The events that produce them, spurred by massive accelerating objects, act as cosmic messengers that carry clues to their origins. They are also probable sources of highly energetic neutrinos, nearly massless cosmic messengers hurtling through space unimpeded. Because neutrinos rarely interact with surrounding matter, they can reveal phenomena that are otherwise unobservable with electromagnetic waves. These high-energy neutrinos are detected by the IceCube Neutrino Observatory, a cubic-kilometer detector embedded in Antarctic ice at the South Pole.

Both GWs and neutrinos are recently introduced messengers in astronomy, and the two have yet to be detected from the same source. Such a major discovery would not only shed light on the sources of cosmic rays but would also help in understanding the most energetic processes in the universe. By coordinating traditional observations (from radio to gamma rays) with these new messengers, researchers can gain deeper insights into astrophysical sources that were unobtainable before.

Previously, the IceCube Collaboration looked for joint emission of GWs and high-energy neutrinos with data collected by IceCube, the Laser Interferometer Gravitational-Wave Observatory (LIGO), and the Virgo gravitational wave detector. Those results used GWs observed during the first two observing runs (O1 and O2) of LIGO and Virgo. IceCube researchers from the University of Wisconsin–Madison and Columbia University have now conducted an updated analysis using GWs from the third observing run (O3) of the LIGO/Virgo detectors. The larger number of GW events improved the sensitivity of the analysis. Their findings were recently submitted to The Astrophysical Journal.

Read the full story by WIPAC