IceCube analysis indicates there are many high-energy astrophysical neutrino sources

This story was originally published by WIPAC

Back in 2013, the IceCube Neutrino Observatory—a cubic-kilometer neutrino detector embedded in Antarctic ice—announced the first observation of high-energy (above 100 TeV) neutrinos originating from outside our solar system, spawning a new age in astronomy. Four years later, on September 22, 2017, a high-energy neutrino event was detected coincident with a gamma-ray flare from a cosmic particle accelerator, a blazar known as TXS 0506+056. The coincident observation provided the first evidence for an extragalactic source of high-energy neutrinos.

The identification of this source was possible thanks to IceCube’s real-time high-energy neutrino alert program, which notifies the community of directions and energies of individual neutrinos that are most likely to have come from astrophysical sources. These alerts trigger follow-up observations of electromagnetic waves from radio up to gamma-ray, aimed at pinpointing a possible astrophysical source of high-energy neutrinos. However, the sources of the vast majority of the measured diffuse flux of astrophysical neutrinos still remain a mystery, as does the number of such sources. Another mystery is whether the neutrino sources are steady or variable over time and, if variable, whether they vary over long or short time scales.

In a paper recently submitted to The Astrophysical Journal, the IceCube Collaboration presents a follow-up search that looked for additional, lower-energy events in the direction of the high-energy alert events. The analysis examined low- and high-energy events from 2011 to 2020, searching for coincidences on time scales ranging from 1,000 seconds up to one decade. Although the researchers did not find an excess of low-energy events across the searched time scales, they were able to constrain the abundance of astrophysical neutrino sources in the universe.

a map of celestial coordinates with ovoid lines shown as a heatmap of locations where neutrino candidate events likely originated
Map of high-energy neutrino candidates (“alert events”) detected by IceCube. The map is in celestial coordinates, with the Galactic plane indicated by a line and the Galactic center by a dot. Two contours are shown for each event, for 50% and 90% confidence in the localization on the sky. The color scale shows the “signalness” of each event, which quantifies the likelihood that each event is an astrophysical neutrino rather than a background event from Earth’s atmosphere. Credit: IceCube Collaboration

This research also delves into the question of whether the astrophysical neutrino flux measured by IceCube is produced by a large number of weak sources or a small number of strong sources. To distinguish between the two possibilities, the researchers developed a statistical method that used two different sets of neutrinos: 1) alert events that have a high probability of being from an astrophysical source and 2) the gamma-ray follow-up (GFU) sample, where only about one to five out of 1,000 events per day are astrophysical.

“If there are a lot of GFU events in the direction of the alerts, that’s a sign that neutrino sources are producing a lot of detectable neutrinos, which would mean there are only a few, bright sources,” explained recent UW–Madison PhD student Alex Pizzuto, a lead on the analysis who is now a software engineer at Google. “If you don’t see a lot of GFU events in the direction of alerts, this is an indication of the opposite, that there are many, dim sources that are responsible for the flux of neutrinos that IceCube detects.”
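The counting logic Pizzuto describes can be sketched as a toy Poisson significance test: count GFU events falling near the alert directions and compare against the expected atmospheric background. All numbers below are illustrative, not from the paper.

```python
import math

def poisson_p_value(n_observed: int, mu_background: float) -> float:
    """P(N >= n_observed) for a Poisson background with mean mu_background."""
    # p = 1 - P(N <= n_observed - 1)
    cdf = sum(math.exp(-mu_background) * mu_background**k / math.factorial(k)
              for k in range(n_observed))
    return 1.0 - cdf

# Toy numbers: suppose 12.0 background GFU events are expected, summed
# over all alert directions and a chosen time window, and 19 are observed.
p = poisson_p_value(19, 12.0)
print(f"p-value: {p:.3f}")
```

A small p-value would indicate extra neutrinos clustered around the alerts (few, bright sources); a p-value consistent with background points toward many dim sources.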

a graph with the power of each individual source on the y-axis and the number density of astrophysical neutrino sources on the x-axis. there is a clear inverse relationship, with the lines starting in the upper left and moving toward the lower right of the graph. three "lines" are shown: an upper blue band labeled "diffuse," a middle black line labeled "upper limit; this analysis" and a blue-green band labeled "+/- 1 sigma sensitivity"
Constraints on the luminosity (power) of each individual source as a function of the number density of astrophysical neutrino sources (horizontal axis). Previous IceCube measurements of the total astrophysical neutrino flux indicate that the true combination of the two quantities must lie within the diagonal band marked “diffuse.” The results of the new analysis are shown as an upper limit, compared to the sensitivity, which shows the range of results expected from background alone (no additional signal neutrinos associated with the directions of alert events). The upper limit is above the sensitivity because there is a statistical excess in the result (p = 0.018). Credit: IceCube Collaboration

They interpreted the results using a simulation tool called FIRESONG, which simulates populations of neutrino sources and calculates the flux from each source. The simulations were then used to determine whether a given population could have produced the observed neutrino events.
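A deliberately simplified sketch of what such a population simulation does, assuming a uniform Euclidean distribution of sources (the real FIRESONG tool uses cosmological distances, source evolution, and IceCube's detector response; the densities and luminosities below are illustrative, arbitrary units):

```python
import math
import random

def simulate_population(n_density, luminosity, r_max=1000.0, seed=1):
    """Toy population sketch: place sources uniformly in a Euclidean sphere
    of radius r_max and compute the flux received from each one."""
    rng = random.Random(seed)
    volume = 4.0 / 3.0 * math.pi * r_max**3
    n_sources = int(n_density * volume)
    fluxes = []
    for _ in range(n_sources):
        # uniform in volume: r = r_max * u^(1/3) for u uniform in [0, 1)
        r = r_max * rng.random() ** (1.0 / 3.0)
        fluxes.append(luminosity / (4.0 * math.pi * r**2))
    return fluxes

# Two populations with the same total flux budget (n_density * luminosity):
few_bright = simulate_population(n_density=1e-8, luminosity=1e4)
many_dim   = simulate_population(n_density=1e-6, luminosity=1e2)
print(len(few_bright), "bright sources vs", len(many_dim), "dim sources")
```

Comparing how often each simulated population would place a detectable source at an alert position is the kind of question the analysis asks of the data.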

“We did not find a clear excess of low-energy events associated with the high-energy alert events on any of the three time scales we analyzed,” said Justin Vandenbroucke, a physics professor at UW–Madison and colead of the analysis. “This implies that there are many astrophysical neutrino sources because, if there were few, we would detect additional events accompanying the high-energy alerts.”

Future analyses will take advantage of larger IceCube data sets and higher quality data from improved calibration methods. With the completion of the larger next-generation telescope, IceCube-Gen2, researchers will be able to detect even dimmer neutrino sources. Even just knowing the abundance of sources could provide important constraints on their identity.

“The future is very exciting as this analysis shows that planned improvements might reveal more astrophysical sources and populations,” said Abhishek Desai, postdoctoral fellow at UW–Madison and co-lead of the analysis. “This will be due to better event localization, which is already being studied and should be optimized in the near future.”

+ info “Constraints on populations of neutrino sources from searches in the directions of IceCube neutrino alerts,” The IceCube Collaboration: R. Abbasi et al. Submitted to The Astrophysical Journal. arxiv.org/abs/2210.04930.

Decades of work at UW–Madison underpin discovery of corona protecting Milky Way’s neighboring galaxies

a domed observatory with the night sky as a backdrop. the long exposure makes the stars look like they're rotating, with long blurry tails

This story was originally posted by UW–Madison News

Two dwarf galaxies circling our Milky Way, the Large and Small Magellanic Clouds, are losing a trail of gaseous debris called the Magellanic Stream. New research shows that a shield of warm gas is protecting the Magellanic Clouds from losing even more debris — a conclusion that caps decades of investigation, theorizing and meticulous data-hunting by astronomers working and training at the University of Wisconsin–Madison.

The findings, published recently in the journal Nature, come courtesy of quasars at the center of 28 distant galaxies. These extremely bright parts of galaxies shine through the gas that forms a buffer, or corona, that protects the Magellanic Clouds from the pull of the Milky Way’s gravity.

“We use a quasar as a light bulb,” says Bart Wakker, senior scientist in UW–Madison’s Astronomy Department. “If there is gas at a certain place between us and the quasar, the gas will produce an absorption line that tells us the composition of the clouds, their velocity and the amount of material in the clouds. And, by looking at different ions, we can study the temperature and density of the clouds.”

The temperature, location and composition — silicon, carbon and oxygen — of the gases that shadow the passing light of the quasars are consistent with the gaseous corona theorized in another study published in 2020 by UW–Madison physics graduate student Scott Lucchini, Astronomy professors Elena D’Onghia and Ellen Zweibel and UW–Madison alumni Andrew Fox and Chad Bustard, among others.

That work explained the expected properties of the Magellanic Stream by including the effects of dark matter: “The existing models of the formation of the Magellanic Stream are outdated because they can’t account for its mass,” Lucchini said in 2020.

“Our first Nature paper showed the theoretical developments, predicting the size, location and movement of the corona’s gases,” says Fox, now an astronomer at the Space Telescope Science Institute and, with Lucchini, a co-author of both studies.

The new discovery is a collaboration with a team that includes its own stream of former UW–Madison researchers pulled out into the world through the 1990s and 2000s — former graduate students Dhanesh Krishnarao, who is leading the work and is now a professor at Colorado College, David French, now a scientist at the Space Telescope Science Institute, and Christopher Howk, now a professor at the University of Notre Dame — and former UW–Madison postdoctoral researcher Nicolas Lehner, also a Notre Dame professor.

UW–Madison research leading to the new discovery dates back at least to an inkling of hot gases seen in a study of stars in the Magellanic Cloud published in 1980 by the late astronomy professor Blair Savage and his then-postdoc Klaas de Boer.

“All that fell into place to allow us to look for data from the Hubble Space Telescope and a satellite called the Far Ultraviolet Spectroscopic Explorer, FUSE — which UW also played an important role in developing,” Wakker says. “We could reinterpret that old data, collected for many different reasons, in a new way to find what we needed to confirm the existence of a warm corona around the Magellanic Clouds.”

“We solved the big questions. There are always details to work out, and people to convince,” D’Onghia says. “But this is a real Wisconsin achievement. There aren’t many times where you can work together to predict something new and then also have the ability to spot it, to collect the compelling evidence that it exists.”

Read more about the research on NASA’s website.

Margaret Fortman awarded Google quantum computing fellowship

This post was adapted from a story posted by the UW–Madison Graduate School

Two UW–Madison graduate students, including physics grad student Margaret Fortman, have been awarded 2022 Google Fellowships to pursue cutting-edge research. Fortman received the 2022 Google Fellowship in Quantum Computing, one of only four awarded.

profile picture of Margaret Fortman
Margaret Fortman

Google created the PhD Fellowship Program to recognize outstanding graduate students doing exceptional and innovative research in areas relevant to computer science and related fields. The fellowship attracts highly competitive applicants from around the world.

“These awards have been presented to exemplary PhD students in computer science and related fields,” Google said in its announcement. “We have given these students unique fellowships to acknowledge their contributions to their areas of specialty and provide funding for their education and research. We look forward to working closely with them as they continue to become leaders in their respective fields.”

The program begins in July when students are connected to a mentor from Google Research. The fellowship covers full tuition, fees, and a stipend for the academic year. Fellows are also encouraged to attend Google’s annual Global Fellowship Summit in the summer.

Fortman works to diagnose noise interference in quantum bits

Fortman, whose PhD research in Victor Brar’s group specializes in quantum computing, will use the fellowship support to develop a diagnostic tool to probe the source of noise in superconducting quantum bits, or qubits.

Quantum computing has the potential to solve problems that are difficult for standard computers, Fortman said, but the field has challenges to solve first.

“The leading candidate we have for making a quantum computer right now is superconducting qubits,” Fortman said. “But those are currently facing unavoidable noise that we get in those devices, which can actually come from the qubit material itself.”

Fortman works with a low-temperature ultra-high vacuum scanning tunneling microscope on the UW–Madison campus to develop a microscopic understanding of the origins of noise in qubits. She fabricates superconductors to examine under the microscope, aiming to identify the source of the noise and, ideally, to develop a solution for that interference.

In her time as a graduate student at UW–Madison, Fortman said she has enjoyed collaborating with colleagues in her lab and across campus.

“It’s pretty cool to be somewhere where world-renowned research is happening and to be involved with that,” she said. “My PI and I work in collaborations with other PIs at the university and they’re all doing very important research, and so it’s really cool to be a part of that.”

Fortman is excited to have a mentor at Google through the PhD Fellowship, having been paired with someone who has a similar disciplinary background and who is a research scientist with Google Quantum AI.

“He can be a resource in debugging some parts of my project, as well as general mentorship and advice on being a PhD student, and advice for future career goals,” Fortman said.

The second UW–Madison student who earned this honor is computer sciences PhD student Shashank Rajput, who received the 2022 Google Fellowship in Machine Learning.

X(ray) marks the spot in elemental analysis of 15th century printing press methods

a woman (left, bending down) and a man (right, crouched) position an ancient manuscript into a machine

This story was originally published by University Communications

In 15th century Germany, Johannes Gutenberg developed a printing press, a machine that allowed for mass production of texts. It is considered by many to be one of the most significant technological advancements of the last millennium.

Though Gutenberg often receives credit as the inventor of the printing press, sometime earlier, roughly 5,000 miles away, Koreans had already developed a movable-type printing press.

There is no question that East Asians were first. There is also no question that Gutenberg’s invention in Europe had a far greater impact.

“What is not known is whether Gutenberg knew about the Korean printing or not. And if we could shed light on that question, that would be earth shattering,” says Uwe Bergmann, a professor of physics at the University of Wisconsin–Madison who, with UW–Madison physics graduate student Minhal Gardezi, is part of a large, interdisciplinary team that is analyzing historical texts.

He adds: “But even if we don’t, we can learn a lot about early printing methods, and that will already be a big insight.”

These texts include pages from a Gutenberg Bible and Confucian texts, and they’re helping investigate these questions. The team includes experts in 15th century Korean texts, Gutenberg experts, paper experts, ink experts and many more.

a person, with essentially just their hands visible, holds a wooden box that is wrapped with a leather tie and has Korean text on the side
One of the leaves scanned was printed by a Korean movable type printing press in 1442. One of the team members from UNESCO, Angelica Noh, traveled with the preserved documents from Korea to SLAC. IMAGE PROVIDED BY MINHAL GARDEZI

How did two physicists end up participating in a seemingly very non-physics cultural heritage project? Bergmann had previously worked on other historical text analyses, where he pioneered the application of a technique known as X-ray fluorescence (XRF) imaging.

In XRF imaging, a powerful machine called a synchrotron sends an intense and very small X-ray beam — about the diameter of a human hair — at a page of text at a 45-degree angle. The beam knocks electrons out of the atoms that make up the text, and another electron must fill the space left by the first (all matter is made up of atoms, which contain even smaller components called electrons).

The second electron loses energy in the process, and that energy is released as a small flash of light. A detector placed strategically nearby picks up that light, or its X-ray fluorescence, and measures both its intensity and the part of the light spectrum to which it belongs.

“Every single element on the periodic table emits an X-ray fluorescence spectrum that is unique to that atom when hit with a high-energy X-ray. Based on its ‘color,’ we know exactly which element is present,” says Gardezi. “It’s a very high-precision instrument that tells you all the elements that are at every location in a sample.”

With this information, researchers can effectively create an elemental map of the document. By rapidly scanning a page across the X-ray beam, they can create a record of the XRF spectrum at each pixel. One page can produce several million XRF spectra.
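Building an elemental map from those per-pixel spectra amounts to summing each pixel's counts near an element's known emission line. A minimal sketch, using a toy two-by-two-pixel "page" and a handful of illustrative line energies (real analyses use full tabulated fluorescence lines, not single values):

```python
import numpy as np

# Approximate emission-line energies in keV, for illustration only
LINES_KEV = {"Cu": 8.05, "Pb": 10.55, "Ca": 3.69}

def elemental_map(spectra, energies, element, window=0.15):
    """spectra: (ny, nx, n_bins) photon counts; energies: (n_bins,) bin
    centers in keV. Returns an (ny, nx) map of counts within +/- window
    of the element's emission line."""
    line = LINES_KEV[element]
    mask = np.abs(energies - line) < window
    return spectra[:, :, mask].sum(axis=2)

# Toy data: 2x2 pixels, 1000 energy bins spanning 0-15 keV
energies = np.linspace(0.0, 15.0, 1000)
spectra = np.zeros((2, 2, 1000))
spectra[0, 0, np.abs(energies - 8.05) < 0.05] = 50.0  # one copper-rich pixel
cu_map = elemental_map(spectra, energies, "Cu")
print(cu_map)
```

Repeating the filter for each element of interest yields the stacked elemental maps described in the article.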

This summer, Bergmann and Gardezi were part of a team that used XRF scanning at the SLAC National Accelerator Laboratory in California to produce elemental maps of several large areas from original pages of a first-edition, 42-line Gutenberg Bible (dating to 1450–1455 A.D.) and from Korean texts dating back to the early part of that century.

They scanned the texts at a rate of around one pixel every 10 milliseconds, then filtered the data by elemental signature, providing high-resolution maps of which elements are present and in what relative quantities.

a three panel image. The top shows a regular photograph of the Korean text, with a dotted white line around two lines of 6 characters. The second panel shows the XRF of those characters, with a blue background, a yellow-green hue around the characters, and red in the characters themselves. The bottom panel shows a second XRF scan, but here almost the entire panel is blue, except for the circles of the characters, which are red, indicating the element being filtered was only present in the circles of the text.
A photograph of a scanned Korean text. The white dotted box indicates the areas shown in the middle and bottom panels. Each element produces a unique X-ray fluorescence. After scanning the text, the researchers applied filters for the known XRF patterns of different elements and created a color-coded heat-map of their abundance, from lowest (blue) to highest (red). An element found in only small quantities is in the red circles in the bottom part of the image. IMAGE PROVIDED BY MINHAL GARDEZI

In a way, the work is like digging for treasure from an old map — Gardezi says the researchers do not know exactly what they are looking for, but they are most interested in the unexpected.

For example, she recently presented early results of scans to the team, to demonstrate the approach had worked and that the researchers could separate out different elements. It turns out this isn’t what the team found most interesting.

“Instead, these scholars spent 15-to-20 minutes talking about, ‘Why is (this element) present?’ and coming up with hypotheses,” Gardezi says. “As physicists, we wouldn’t even recognize if something is surprising or not. It’s really this interdisciplinary aspect that tells us what to look for, what the smoking gun is.”

As more questions arise based on the elemental analyses, Bergmann and Gardezi will help guide the team to address those questions quantitatively. They are already planning to recreate some early printings in the lab — with known types, papers and inks — then compare these XRF scans with the originals.

The research may never definitively determine if Gutenberg knew about the Korean presses or if he developed his press independently. But without access to the original presses themselves, these texts hold the only clues to understanding the nature of these transformative machines.

“The more you read about it, the more you learn that there is less certainty about several things related to early printing presses,” Bergmann says. “Maybe this technique will allow us to view these prints as a time capsule and gain invaluable insight into this watershed moment in human history.”

Watch Minhal Gardezi show off XRF at SLAC National Accelerator Laboratory.

The UW–Madison efforts in the project are supported by the Overseas Korean Cultural Heritage Foundation.

Coherent light production found in very low optical density atomic clouds

No atom is an island, and scientists have known for decades that groups of atoms form communities that “talk” to each other. But there is still much to learn about how atoms — particularly energetically excited ones — interact in groups.

In a study published in PRX Quantum, physicists from the University of Wisconsin–Madison observed communication between atoms at lower and lower densities. They found that the atoms influence each other at 100 times lower densities than probed before, exhibiting slow decay rates and emitting coherent light.

“It seems that (low-density) groups of excited atoms spontaneously organize to then produce light that is coherent,” says David Gold, a postdoctoral fellow in Deniz Yavuz’s group and lead author of the study. “These findings are pretty interesting from a basic science standpoint, and in terms of quantum computing, the takeaway is that even with very low numbers of atoms, you can see significant amounts of (these effects).”

A well-established property of atoms is found in electron excitation: when a specific wavelength of light hits an atom of a specific element, an electron is excited to a higher orbital level. As that electron decays back to its initial state, a photon of a specific wavelength is emitted. A single atom has a characteristic decay rate for that process. When groups of atoms are studied, their interactions are observed: the initial decay rate is very fast, or superradiant, then transitions to a slower, or subradiant, rate.
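That group behavior can be caricatured as a two-component decay: a superradiant term that decays faster than the single-atom rate plus a subradiant term that decays slower. A toy model with made-up amplitudes and rate factors (not fitted to the study's data):

```python
import math

def two_component_decay(t, gamma0, a_super=0.7, a_sub=0.3,
                        f_super=3.0, f_sub=0.2):
    """Illustrative emitted-intensity model: a superradiant component
    decaying faster than the single-atom rate gamma0, plus a subradiant
    component decaying slower. All parameters are made up."""
    return (a_super * math.exp(-f_super * gamma0 * t)
            + a_sub * math.exp(-f_sub * gamma0 * t))

gamma0 = 1.0  # single-atom decay rate (arbitrary units)
for t in [0.0, 1.0, 5.0]:
    group = two_component_decay(t, gamma0)
    single = math.exp(-gamma0 * t)
    print(f"t={t}: group={group:.4f}, single-atom={single:.4f}")
```

Early on the group intensity falls below the single-atom curve (fast, superradiant decay); at late times it sits well above it (the slow, subradiant tail).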

A schematic of the experimental setup. (Top) the overall apparatus used. (A) shows the setup for the first part of the experiment, where the researchers were measuring decay rates in lower and lower density clouds. (B) shows the setup for the second part of the paper, with the addition of an interferometer

Though well-established in dense clouds, this group-talk has never been studied in less dense clouds of atoms, which could have impacts on applications such as quantum computing.

In their first set of experiments, Gold and colleagues asked what the decay rate of lower-density clouds looked like. They supercooled the atoms in a cloud, hit them with an excitation laser, and recorded the decay as the intensity of emitted light over time. They observed the characteristic subradiance. In this case, they did not always see superradiance, likely due to the reduced number of atoms available to measure.

profile picture of David Gold
David Gold

Next, they asked what happened if they let the cloud expand — or decrease in density — for varying periods of time before repeating their experiment. They found that as the cloud became less and less dense, the amount of subradiance decreased, until eventually a density was reached where the atoms stopped behaving like a group and instead displayed single-atom decay rates.

“The most subradiance that we observed was at around a hundred times lower optical density than it had previously been observed,” Gold says.

Now that the researchers knew that a less dense cloud still decays subradiantly to a point, they asked if the decay was happening in an isolated manner, or if the atoms were really acting as a group. If acting as a group, the emitted light would be coherent, or more laser-like, with some structure between the atoms.

They used the same experimental setup but added an interferometer, where light is split and recombined before the photons are detected. They first set the baseline interference pattern by moving the mirror closer or further away from the splitter — changing the path length of one of the beams — and mapping the interference pattern of the split light waves that were emitted from the same atom.

If there were no relationship between different atoms and the light they emit, then they would have expected to see no interference pattern. Instead, they saw that for some distance of mirror displacement, the lightwaves did interfere, indicating that different atoms being measured were nonetheless producing coherent light.
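The degree of coherence in such an interferometer scan is commonly summarized by the fringe visibility. A minimal sketch with a synthetic, partially coherent signal (the 0.6 coherence level is illustrative, not a measured value):

```python
import numpy as np

def fringe_visibility(intensities):
    """Visibility V = (Imax - Imin) / (Imax + Imin). V = 0 for fully
    incoherent light; V > 0 indicates (partial) coherence."""
    i_max, i_min = np.max(intensities), np.min(intensities)
    return (i_max - i_min) / (i_max + i_min)

# Toy scan over mirror displacement (in units of phase):
displacement = np.linspace(0.0, 4.0 * np.pi, 200)
coherent_part = 0.6  # assumed degree of coherence
intensity = 1.0 + coherent_part * np.cos(displacement)
print(f"visibility: {fringe_visibility(intensity):.2f}")
```

A flat scan (no fringes) gives zero visibility, the expected outcome if the atoms emitted completely independently.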

“I think this is the more exciting thing we found: that the light that’s being emitted is coherent and it has more of the properties of a laser than you would expect,” Gold says. “The atoms are influenced by each other and not in a way we would have expected.”

Aside from the interesting physics seen in the study, Gold says the work is also applicable to quantum computing, particularly as those computers grow bigger in the future.

“Even if everything in a quantum computer is running perfectly and the system was completely isolated, there’s still this inherent thing of, well, the atoms just might decay down from [the computational] state,” Gold says.

This work was supported by National Science Foundation (NSF) Grant No. 2016136 for the QLCI center Hybrid Quantum Architectures and Networks.

Coral skeleton formation rate determines resilience to acidifying oceans

A new University of Wisconsin–Madison study has implications for predicting coral reef survival and for developing strategies to mitigate the weakening of coral skeletons by ocean acidification.

Though coral reefs make up less than one percent of the ocean floor, these ecosystems are among the most biodiverse on the planet — with over a million species estimated to be associated with reefs.

The coral species that make up these reefs are known to be differently sensitive or resilient to ocean acidification — the result of increasing atmospheric carbon dioxide levels. But scientists are not sure why.

In the study, researchers show that the crystallization rate of coral skeletons differs across species and is correlated with their resilience to acidification.

A woman holding two coral species stands in front of a body of water
“Finding solutions that are science-based is a priority,” says physics professor Pupa Gilbert, shown here with samples of scleractinian coral along the Lake Monona shoreline in Madison. | Photo: Jeff Miller

“Many agencies keep putting out reports in which they say, ‘Yes, coral reefs are threatened,’ with no idea what to do,” says Pupa Gilbert, a physics professor at UW–Madison and senior author of the study that was published Jan. 17 in the Journal of the American Chemical Society. “Finding solutions that are science-based is a priority, and having a quantitative idea of exactly what’s happening with climate change to coral reefs and skeletons is really important.”

Reef-forming corals are marine animals that produce a hard skeleton made up of the mostly insoluble crystalline material aragonite. Aragonite forms when precursors made up of a more soluble form, amorphous calcium carbonate, are deposited onto the growing skeleton and then crystallize.

The team studied three genera of coral and took an in-depth look at the components of their growing skeletons. They used a technique that Gilbert pioneered called PEEM spectromicroscopy, which detects the different forms of calcium carbonate with the greatest sensitivity to date.

When they used these spectromicroscopy images to compare the thickness of amorphous precursors to the crystalline form, they found that Acropora, which is more sensitive to acidification, had a much thicker band of amorphous calcium carbonate than Stylophora, which is less sensitive.

A third genus of unknown sensitivity, Turbinaria, had an even thinner amorphous precursor layer than Stylophora, suggesting it should be the most resilient of the three to ocean acidification.

two bright colored images assign a color to the form of calcium present in coral skeletons. On the left there is a thicker band of non-blue (blue is crystalline aragonite) compared to the image on the right where there is almost all blue, indicating the skeleton on the right crystallizes to aragonite more quickly
Coral skeletons were studied with PEEM spectromicroscopy, which identifies the calcium spectrum associated with each imaging pixel, then renders it in false color depending on the form of calcium. Blue is aragonite, the insoluble, crystalline form of calcium carbonate; the other colors represent one of the two amorphous precursor forms, a mix of the two, or a mix of aragonite and precursor form. Acropora spp. (left), has more non-blue pixels compared to Turbinaria spp. (right), indicating that Acropora has more of the soluble, non-crystalline form in its growing skeleton. | Pupa Gilbert and team in JACS

The thicker the band of uncrystallized minerals, the slower the crystallization process.

“If the surface of the coral skeleton, where all this amorphous calcium carbonate is being deposited by the living animal, crystallizes quickly, then that particular species is resilient to ocean acidification; if it crystallizes slowly, then it’s vulnerable,” Gilbert says. “For once, it’s a really simple mechanism.”

The mechanism may have worked out to be simple, but the data analysis required to process and interpret the PEEM images is anything but. Each pixel of imaging data acquired has a calcium spectrum that needs to be analyzed, which results in millions of data points. Processing the data includes many decision-making points, plus massive computing power.
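One step in that kind of analysis, fitting each pixel's calcium spectrum as a mixture of reference spectra, can be sketched as a simple least-squares fit. The reference spectra below are synthetic Gaussians, not real calcium spectra, and this toy fit omits the per-pixel judgment calls the team relied on:

```python
import numpy as np

def fit_pixel_spectrum(spectrum, references):
    """Least-squares mixture of reference spectra (e.g. crystalline
    aragonite vs. amorphous precursor) for one pixel. Returns the
    normalized fractional weight of each reference."""
    coeffs, *_ = np.linalg.lstsq(references.T, spectrum, rcond=None)
    coeffs = np.clip(coeffs, 0.0, None)  # negative mixtures are unphysical
    return coeffs / coeffs.sum()

# Synthetic reference spectra over 50 energy bins (illustrative shapes)
bins = np.linspace(0.0, 1.0, 50)
aragonite = np.exp(-((bins - 0.3) ** 2) / 0.01)
amorphous = np.exp(-((bins - 0.6) ** 2) / 0.02)
refs = np.vstack([aragonite, amorphous])

pixel = 0.8 * aragonite + 0.2 * amorphous  # a mostly crystalline pixel
weights = fit_pixel_spectrum(pixel, refs)
print(weights)
```

Repeating this for every pixel yields the false-color maps shown in the figures; it is the millions of such fits, and the ambiguous cases among them, that demanded human decision-making.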

The team has tried to automate the analysis or use machine-learning techniques, but those methods have not worked out. Instead, Gilbert has found that humans making decisions are the best data processors.

Gilbert did not want to base conclusions on the decision-making of just one or two people. So she hired a group of UW–Madison undergraduates, most of whom came from the Mercile J. Lee Scholars Program, which works to attract and retain talented students from underrepresented groups. This team provided a large and diverse group of decision makers.

a zoom screen showing several of the people who conducted the study
Gilbert and her research team met several times a week via Zoom, where students were assigned the same dataset to process in parallel and discuss at their next meeting. The Cnidarians — named after the phylum to which corals belong — include current and former UW–Madison undergraduates: Celeo Matute Diaz, Jorge Rivera Colon, Asiya Ahmed, Virginia Quach, Gabi Barreiro Pujol, Isabelle LeCloux, Sydney Davison, Connor Klaus, Jaden Sengkhamee, Evan Walch and Benjamin Fordyce; and graduate students Cayla Stifler and Connor Schmidt. Schmidt was also the lead author of the study. | Provided by Pupa Gilbert

Dubbed the Cnidarians — from the phylum to which corals, anemones and jellyfish belong — this group of students became integral members of the team. They met several times a week via Zoom, when Gilbert would assign multiple students the same dataset to process in parallel and discuss at their next meeting.

“If multiple people come up with precisely the same solution even though they make different decisions, that means our analysis is robust and reliable,” Gilbert says. “The Cnidarians’ contributions were so useful that 13 of them are co-authors on this study.”

This study was supported by the Department of Energy (DE-FG02-07ER15899 and DE-AC02-05CH11231), the National Science Foundation (DMR-1603192) and the European Research Council (755876).

Magellanic Stream arcing over Milky Way may be five times closer than previously thought

Our galaxy is not alone. Swirling around the Milky Way are several smaller, dwarf galaxies — the biggest of which are the Small and Large Magellanic Clouds, visible in the night sky of the Southern Hemisphere.

Scott Lucchini

During their dance around the Milky Way over billions of years, the Magellanic Clouds’ gravity has ripped an enormous arc of gas from each of them — the Magellanic Stream. The stream helps tell the history of how the Milky Way and its closest galaxies came to be and what their future looks like.

New astronomical models developed by scientists at the University of Wisconsin–Madison and the Space Telescope Science Institute recreate the birth of the Magellanic Stream over the last 3.5 billion years. Using the latest data on the structure of the gas, the researchers discovered that the stream may be five times closer to Earth than previously thought.

The findings suggest that the stream may collide with the Milky Way far sooner than expected, helping fuel new star formation in our galaxy.

“The Magellanic Stream origin has been a big mystery for the last 50 years. We proposed a new solution with our models,” says Scott Lucchini, a graduate student in physics in Elena D’Onghia’s group at UW–Madison and lead author of the paper. “The surprising part was that the models brought the stream much closer to the Milky Way.”

Lucchini, D’Onghia, and Space Telescope Science Institute scientist Andrew Fox published their findings in The Astrophysical Journal Letters on Nov. 8.


The Large and Small Magellanic Clouds as they would appear if the gas around them was visible to the naked eye. | Credits: Scott Lucchini (simulation), Colin Legg (background)

Study of high-energy particles leads PhD student Alex Wang to Department of Energy national lab

This story, by Meghan Chua, was originally published by the Graduate School

In 2012, scientists at CERN’s Large Hadron Collider announced they had observed the Higgs boson particle, verifying many of the theories of physics that rely on its existence.

Alex Wang

Since then, scientists have continued to search for the properties of the Higgs boson and for related particles, including an extremely rare case where two Higgs boson particles appear at the same time, called di-Higgs production.

“We’ve had some searches for di-Higgs right now, but we don’t see anything significant yet,” said Alex Wang, a PhD student in experimental high energy physics at UW–Madison. “It could be because it doesn’t exist, which would be interesting. But it also could just be because, according to the Standard Model theory, it’s very rare.”

Wang will have a chance to aid in the search for di-Higgs production in more ways than one. Starting in November, he will spend a year at the SLAC National Accelerator Laboratory as an awardee in the Department of Energy Office of Science Graduate Student Research Program.

The program funds outstanding graduate students to pursue thesis research at Department of Energy (DOE) laboratories. Students work with a DOE scientist on projects addressing societal challenges at the national and international scale.

At the SLAC National Accelerator Laboratory, Wang will primarily work on hardware for a planned upgrade of the ATLAS detector, one of the many detectors that record properties of collisions produced by the Large Hadron Collider. Right now, ATLAS collects an already massive amount of data, including some events related to the Higgs boson particle. However, Higgs boson events are extremely rare.

In the future, the upgraded High-Luminosity Large Hadron Collider (HL-LHC) will enable ATLAS to collect even more data and help physicists to study particles like the Higgs boson in more detail. This will make it more feasible for researchers to look for extremely rare events such as di-Higgs production, Wang said. The ATLAS detector itself will also be upgraded to adjust for the new HL-LHC environment.

This image of a signal-like event in the ATLAS detector comes from one of the Higgs boson-related analyses Wang works on. The red cones and cyan towers indicate particles which may have originated from the decay of two Higgs boson particles. (Photo credit: ATLAS Experiment © 2021 CERN)

“I’m pretty excited to go there because SLAC is essentially where they’ll be assembling the innermost part of the ATLAS detector for the future upgrade,” Wang said. “So, I think it’s going to be a really central place in the future years, at least for this upgrade project.”

Increasing the amount of data a sensor collects can also cause problems, such as radiation damage to the sensors and more challenges sorting out meaningful data from background noise. Wang will help validate the performance of some of the sensors destined for the upgraded ATLAS detector.

“I’m also pretty excited because for the data analysis I’m doing right now, it’s mainly working in front of a computer, so it will be nice to have some experience working with my hands,” Wang said.

At SLAC, he will also spend time searching for evidence of di-Higgs production.

Wang’s thesis research at UW–Madison also revolves around the Higgs boson particle. He sifts through data from the Large Hadron Collider to tease out which events are “signals” related to the Higgs boson, versus events that are “backgrounds” irrelevant to his work.

One approach Wang uses is to predict how many signal events researchers expect to see, and then determine if the number of events recorded in the Large Hadron Collider is consistent with that prediction.

“If we get a number that’s consistent with our predictions, then that supports the existing model of physics that we have,” Wang said. “But for example, if you see that the theory predicts we’d have 10 events, but in reality, we see 100 events, then that could be an indication that there’s some new physics going on. So that would be a potential for discoveries.”
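
The comparison Wang describes, counting observed events against a predicted rate, can be sketched as a simple one-sided Poisson test. This is only an illustration of the general statistical idea, not code from any actual ATLAS analysis, and the event counts below are hypothetical:

```python
import math

def poisson_excess_pvalue(observed: int, expected: float) -> float:
    """Chance of seeing at least `observed` events when theory
    (signal plus background) predicts `expected` on average,
    assuming simple Poisson counting statistics."""
    # P(N >= observed) = 1 - sum_{k=0}^{observed-1} e^{-mu} mu^k / k!
    cumulative = sum(
        math.exp(-expected) * expected**k / math.factorial(k)
        for k in range(observed)
    )
    return 1.0 - cumulative

# Hypothetical counts: theory predicts 10 events, 15 are observed.
p = poisson_excess_pvalue(15, 10.0)
print(f"P(N >= 15 | mu = 10) = {p:.3f}")  # ~0.083: a mild, insignificant excess
```

A tiny p-value (say, for the quote’s 100 observed events against 10 predicted) would flag a potential sign of new physics, while a value like the one above is consistent with the existing model.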

The Department of Energy formally approved the U.S. contribution to the High-Luminosity Large Hadron Collider accelerator upgrade project earlier this year. The HL-LHC is expected to start producing data in 2027 and continue through the 2030s. Depending on what the future holds, Wang may be able to use data from the upgraded ATLAS detector to find evidence of di-Higgs production. If that happens, he also will have helped build the machine that made it possible.

Magnetic fields implicated in the mysterious midlife crisis of stars

Artist’s impression of the spinning interior of a star, generating the stellar magnetic field. This image combines a dynamo simulation of the Sun’s interior with observations of the Sun’s outer atmosphere, where storms and plasma winds are generated. | Credit: CESSI / IISER Kolkata / NASA-SVS / ESA / SOHO-LASCO

This post was originally published by the Royal Astronomical Society. UW–Madison physics graduate student Bindesh Tripathi is the lead author of the scientific publication.

Middle-aged stars can experience their own kind of midlife crisis, undergoing dramatic breaks in their activity and rotation rates at about the same age as our Sun, according to new research published today in Monthly Notices of the Royal Astronomical Society: Letters. The study provides a new theoretical underpinning for the unexplained breakdown of established techniques for measuring the ages of stars past middle age, and for the transition of solar-like stars to a magnetically inactive future.

Astronomers have long known that stars experience a process known as ‘magnetic braking’: a steady stream of charged particles, known as the stellar wind, escapes from the star over time, carrying away small amounts of the star’s angular momentum. This slow drain causes stars like our Sun to gradually slow their rotation over billions of years.

In turn, the slower rotation leads to altered magnetic fields and less stellar activity – the numbers of sunspots, flares, outbursts, and similar phenomena in the atmospheres of stars, which are intrinsically linked to the strengths of their magnetic fields.

Bindesh Tripathi

This decrease in activity and rotation rate over time is expected to be smooth and predictable because of the gradual loss of angular momentum. The idea gave birth to the tool known as ‘stellar gyrochronology’, which has been widely used over the past two decades to estimate the age of a star from its rotation period.
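
The idea can be made concrete with a toy version of the classic Skumanich relation, in which rotation slows roughly as the square root of age, so age grows as the square of the rotation period. The calibration below uses round solar values and ignores the dependence on stellar mass and color that real gyrochronology includes; it is a sketch, not the method used in the study:

```python
SUN_PERIOD_DAYS = 25.4  # approximate solar rotation period
SUN_AGE_GYR = 4.6       # approximate solar age in billions of years

def gyro_age_gyr(period_days: float) -> float:
    """Toy gyrochronology estimate: with rotation rate ~ age^(-1/2)
    (the Skumanich relation), age scales as period squared, anchored
    to the Sun's period and age."""
    return SUN_AGE_GYR * (period_days / SUN_PERIOD_DAYS) ** 2

# A solar-like star spinning twice as fast as the Sun comes out young:
print(round(gyro_age_gyr(12.7), 2))  # 1.15 Gyr
```

The breakdown discussed below is the failure of exactly this kind of period-to-age mapping for stars at and beyond the Sun’s age.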

However, recent observations indicate that this intimate relationship breaks down around middle age. The new work, carried out by Bindesh Tripathi at UW–Madison and the Indian Institute of Science Education and Research (IISER) Kolkata, India, provides a novel explanation for this mysterious ailment. Profs. Dibyendu Nandy and Soumitro Banerjee of IISER are co-authors.

Using dynamo models of magnetic field generation in stars, the team shows that at about the age of the Sun, the magnetic field generation mechanism of stars suddenly becomes sub-critical, or less efficient. This allows stars to exist in two distinct activity states: a low activity mode and an active mode. A middle-aged star like the Sun can often switch to the low activity mode, resulting in drastically reduced angular momentum losses by magnetized stellar winds.

Prof. Nandy comments: “This hypothesis of sub-critical magnetic dynamos of solar-like stars provides a self-consistent, unifying physical basis for a diversity of solar-stellar phenomena, such as why stars beyond their midlife do not spin down as fast as in their youth, the breakdown of stellar gyrochronology relations, and recent findings suggesting that the Sun may be transitioning to a magnetically inactive future.”

The new work provides key insights into the existence of low activity episodes in the recent history of the Sun known as grand minima – when hardly any sunspots are seen. The best known of these is perhaps the Maunder Minimum around 1645 to 1715, when very few sunspots were observed.

The team hopes the new work will also shed light on recent observations indicating that the Sun is comparatively inactive, with crucial implications for the potential long-term future of our own stellar neighbor.

Welcome, incoming MSPQC students! 

The UW–Madison Physics Department is pleased to welcome 18 students to the M.S. in Physics – Quantum Computing program. These students make up the third cohort to begin the program and are the largest entering class to date.  

“We are really pleased and proud that the MSPQC program continues to grow and prosper in its third year,” says Bob Joynt, MSPQC Program Director and professor of physics. “We look forward to providing a great experience for the class of 2021. A particular focus this year will be the formation of collaborative teams that will push forward research in quantum computing.” 

 Of note, three women are in the entering class, marking the first time that women have enrolled in MSPQC. Other facts and figures about this year’s cohort include: 

  • 11 students are coming directly from completing their bachelor’s degrees
  • Three students have master’s degrees
  • Six students have at least four years of professional experience, and four of those students have over 10 years of professional experience
  • 15 are international students, and seven of those students have attended U.S. institutions for previous studies
  • The students’ academic backgrounds include physics, astronomy, engineering, and business administration.

The department is following University guidelines and is planning for students to join us in Madison this fall, with in-person instruction. Over the summer, students can attend optional virtual orientation sessions to prepare for the program.  

“The pandemic imposed restrictions on our admissions and recruitment activities which forced us to work virtually, but I believe these barriers made our programming more accessible and led to the most diverse and determined incoming cohort of MSPQC students to date,” says Jackson Kennedy, MSPQC coordinator. “Although I have been able to meet our incredibly talented students virtually, I cannot wait to greet them in-person this Fall as we celebrate a long-awaited return to campus.” 

In addition to Joynt, the department thanks the other faculty who serve on the MSPQC admissions committee — Alex Levchenko, Robert McDermott, Maxim Vavilov and Deniz Yavuz — for application review. We also thank Michelle Holland and Jackson Kennedy for organizing recruiting efforts.  

 The MSPQC program welcomed its first students in Fall 2019 – the first-ever class of students in the U.S. to enroll in a quantum computing M.S. degree program. The accelerated program was born out of a recognized need to rapidly train students for the quantum computing workforce and is designed to be completed in 12 months. It provides students with a thorough grounding in the new discipline of quantum information and quantum computing.  

Students, with undergraduate institution and degree:

  • Brooke Becker, UW–Madison (Computer Engineering)
  • Soyeon Choi, Vanderbilt University (Physics, Computer Science)
  • Manish Chowdhary, Indian Institute of Technology Dhanbad (Computer Application)
  • Hua Feng, Dalian University of Technology (Atomic and Molecular Physics)
  • Jacob Frederick, University of Washington (Computer Engineering)
  • Amol Gupta, Delhi Technological University (Computer Engineering)
  • Yucheng He, Zhengzhou University (Automation)
  • Xunyao Luo, Lafayette College (Physics and Neuroscience)
  • Arjun Puppala, Indian Institute of Technology Roorkee (Power Systems Engineering)
  • Evan Ritchie, University of St Thomas - Minnesota (Physics & Math)
  • Mubinjon Satymov, New York City College of Technology - CUNY (Applied Computational Physics)
  • Yen-An Shih, National Cheng Kung University (Computer Science)
  • Qianxu Wang, University of Michigan (Physics)
  • Jiaxi Xu, UC-Berkeley (Physics)
  • Anirudh Yadav, Indian Institute of Technology Dhanbad (Computer Science)
  • Yukun Yang, Nanjing University (Astronomy)
  • Jin Zhang, UW–Madison (Physics & Philosophy)
  • Lin Zhao, UW–Madison (Computer Science and Physics)
The incoming 2021 class of MSPQC students