Department of Energy grant to train students at the interface of high energy physics and computer science

a long row of stacked computer servers

To truly understand our physical world, scientists look to the very small: the subatomic particles that make up everything. Particle physics generally falls under the discipline of high energy physics (HEP), where collisions at higher and higher energies — tens of teraelectronvolts, or roughly ten trillion times the energy of a photon of visible light — lead to the detection and characterization of particles and how they interact.

These collisions also lead to the accumulation of inordinate amounts of data, and HEP is increasingly becoming a field where researchers must be experts in both particle physics and advanced computing technologies. HEP graduate students, however, rarely enter graduate school with backgrounds in both fields.

Physicists from UW–Madison, Princeton University, and the University of Massachusetts Amherst are looking to address the science goals of HEP experiments by training the next generation of software and computing experts, supported by a five-year, approximately $4 million grant from the U.S. Department of Energy (DOE) Office of Science. The program is known as Training to Advance Computational High Energy Physics in the Exascale Era, or TAC-HEP.

“The exascale era is upon us in HEP and the complexity, computational needs and data volumes of current and future HEP experiments will increase dramatically over the next few years. A paradigm shift in software and computing is needed to tackle the data onslaught,” says Tulika Bose, a physics professor at UW–Madison and TAC-HEP principal investigator. “TAC-HEP will help train a new generation of software and computing experts who can take on this challenge head-on and help maximize the physics reach of the experiments.”

Tulika Bose

In total, DOE today announced $10 million in funding for three projects that provide classroom instruction and research opportunities in computational high energy physics, preparing the next generation of computational scientists and engineers needed to deliver scientific discoveries.

At UW–Madison, TAC-HEP will annually fund four to six two-year training positions for graduate students working on a computational HEP research project with Bose or physics professors Keith Bechtol, Kevin Black, Kyle Cranmer, Sridhara Dasu, or Brian Rebel. Their research must broadly fit into the categories of high-performance software and algorithms, collaborative software infrastructure, or hardware-software co-design.

Bose’s research group, for example, studies proton-proton collisions with the Compact Muon Solenoid (CMS) detector at the CERN Large Hadron Collider (LHC). The high-luminosity run of the LHC, starting in 2029, will bring unprecedented physics opportunities — and computing challenges that TAC-HEP graduate students will tackle firsthand.

“The annual data volume will increase by 30 times while the event reconstruction time will increase by nearly 25 times, requiring modernization of the software and computing infrastructure to handle the demands of the experiments,” Bose says. “Novel algorithms using modern hardware and accelerators, such as Graphics Processing Units, or GPUs, will need to be exploited together with a transformation of the data analysis process.”
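CMS’s actual reconstruction software is far more involved, but the general pattern Bose describes, offloading array-heavy computations to accelerators, can be shown with a short, hypothetical sketch. The NumPy/CuPy example below is a generic illustration of a reconstruction-style task (computing invariant masses for many particle pairs at once), not CMS code.

```python
import numpy as np

try:
    import cupy as xp   # uses an NVIDIA GPU if CuPy and CUDA are available
except ImportError:
    xp = np             # otherwise fall back to the CPU

def invariant_mass(p1, p2):
    """Invariant mass m = sqrt((E1 + E2)^2 - |p1 + p2|^2) for each particle pair.

    p1, p2: arrays of shape (N, 4) holding (E, px, py, pz) per particle, in GeV.
    """
    s = p1 + p2
    m2 = s[:, 0] ** 2 - (s[:, 1:] ** 2).sum(axis=1)
    return xp.sqrt(xp.maximum(m2, 0.0))   # clamp tiny negative values from rounding

# Build a million toy muon pairs: random momenta, energies fixed by E^2 = p^2 + m^2.
m_mu = 0.105  # muon mass in GeV
mom = xp.asarray(np.random.randn(2, 1_000_000, 3))         # (pair member, event, px/py/pz)
energy = xp.sqrt((mom ** 2).sum(axis=-1) + m_mu ** 2)[..., None]
four_vectors = xp.concatenate([energy, mom], axis=-1)       # shape (2, N, 4)

masses = invariant_mass(four_vectors[0], four_vectors[1])   # one vectorized call
print(float(masses.mean()))
```

Because CuPy mirrors much of the NumPy API, the same vectorized function can run on either a CPU or a GPU, one kind of portability that such modernization efforts pursue.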

TAC-HEP will incorporate targeted coursework and specialized training modules that will enable the design and development of coherent hardware and software systems, collaborative software infrastructure, and high-performance software and algorithms. Structured R&D projects, undertaken in collaboration with DOE laboratories (Fermilab and Brookhaven National Lab) and integrated within the program, will provide students from all three participating universities with hands-on experience with cutting-edge computational tools, software and technology.

The training program will also include professional development, such as training in oral and written science communication, along with cohort-building activities intended to increase the recruitment and retention of a diverse group of graduate students.

“Future high energy physics discoveries will require large accurate simulations and efficient collaborative software,” said Regina Rameika, DOE Associate Director of Science for High Energy Physics. “These traineeships will educate the scientists and engineers necessary to design, develop, deploy, and maintain the software and computing infrastructure essential for the future of high energy physics.”

Opening doors to quantum research experiences with the Open Quantum Initiative

This past winter, Katie Harrison, then a junior physics major at UW–Madison, started thinking about which areas of physics she was interested in studying more in-depth.

“Physics is in general so broad, saying you want to research physics doesn’t really cut it,” Harrison says.

She thought about which classes she enjoyed the most and talked to other students and professors to help figure out what she might focus on. Quantum mechanics was high on her list. During her search for additional learning opportunities, she saw the email about the Open Quantum Initiative (OQI), a new fellowship program run by the Chicago Quantum Exchange (CQE).

“This could be something I’m interested in, right?” Harrison thought. “I’ll apply and see what happens.”

What happened was that Harrison was one of 12 undergraduate students accepted into the inaugural class of OQI Fellows. These students were paired with mentors at CQE member institutions, where they conducted research in quantum information science and engineering. OQI has a goal of connecting students with leaders in academia and industry and increasing their awareness of quantum career opportunities. The ten-week fellowship ran through August 19.

11 students pose on a rock wall, all students are wearing the same Chicago Quantum Exchange hooded sweatshirt
OQI students attend a wrap-up at the University of Chicago on August 17. Each student presented at a research symposium that day, which also included a career panel from leaders across academia, government, and industry and an opportunity to network. | Photo provided by the Chicago Quantum Exchange

OQI also places an emphasis on establishing diversity, equity, and inclusion as priorities central to the development of the quantum ecosystem. Almost 70% of this year’s fellowship students are Hispanic, Latino, or Black, and half are the first in their family to go to college. In addition, while the field of quantum science and engineering is generally majority-male, the 2022 cohort is half female.

This summer, UW–Madison and the Wisconsin Quantum Institute hosted two students: Harrison with physics professor Baha Balantekin and postdoc Pooja Siwach; and MIT physics and electrical engineering major Kate Arutyunova with engineering physics professor Jennifer Choy, postdoc Maryam Zahedian and graduate student Ricardo Vidrio.

Harrison and Arutyunova met at OQI orientation at IBM’s quantum research lab in New York, and they hit it off immediately. (“We have the most matching energies [of the fellows],” Arutyunova says, with Harrison adding, “The synergy is real.”)

Four people stand in a lab in front of electronics equipment
OQI Fellow Kate Arutyunova with her research mentors. (L-R) Engineering Physics professor Jennifer Choy, graduate student Ricardo Vidrio, Kate Arutyunova, and postdoc Maryam Zahedian. | Photo provided by Kate Arutyunova

Despite their very different research projects — Harrison’s was theoretical and strongly focused on physics, whereas Arutyunova’s was experimental and with an engineering focus — they leaned on each other throughout the summer in Madison. They met at Union South nearly every morning at 7am to read and bounce ideas off each other. Then, after a full day with their respective research groups, they’d head back to Union South until it closed.

Modeling neutrino oscillations

Harrison’s research with Balantekin and Siwach investigated the neutrinos that escape the collapsing cores of supernovae. Because neutrinos are electrically neutral and rarely interact with other matter, they make it out of the core without changing much, so studying them helps physicists understand what is happening inside those stars. The task is difficult, however, because neutrinos oscillate between flavors — the electron, muon, and tau types — as they travel, and modeling those oscillations takes a lot of time and resources on a classical computer.

Harrison’s project, then, was to investigate two types of quantum computing methods, pulse-based and circuit-based, and determine whether one fits the problem better than the other. Previous studies suggest that the pulse-based approach is likely to perform better, but the circuit-based approach involves less complicated input calculations.

“I’ve been doing calibrations and calculating the frequencies of the pulses we’ll need to send to our qubits in order to get data that’s as accurate as a classical computer,” Harrison says. “I’m working with the circuit space, the mathematical versions of them, and then I’ll send my work to IBM’s quantum computers and they’ll calculate it and give results back.”
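The group’s calculations ultimately run on IBM’s quantum hardware, but the underlying physics target can be sketched classically. Below is a minimal, hypothetical NumPy example (not the group’s code) of the standard two-flavor vacuum oscillation probability, using approximate solar-sector mixing parameters purely for illustration.

```python
# Minimal, illustrative sketch (not the group's code): the standard two-flavor
# vacuum oscillation probability that such simulations aim to reproduce.
import numpy as np

def oscillation_probability(L_km, E_GeV, theta=0.58, dm2_eV2=7.5e-5):
    """P(nu_e -> nu_mu) for two-flavor mixing in vacuum.

    theta   : mixing angle in radians (~33 degrees, roughly the solar-sector value)
    dm2_eV2 : mass-squared splitting in eV^2 (roughly the solar-sector value)
    The 1.267 factor makes the phase dimensionless for L in km and E in GeV.
    """
    phase = 1.267 * dm2_eV2 * L_km / E_GeV
    return np.sin(2.0 * theta) ** 2 * np.sin(phase) ** 2

# Oscillation pattern for 10 MeV neutrinos (a typical supernova energy) over a
# few hundred kilometers, enough to see a full oscillation cycle.
L = np.linspace(0.0, 1000.0, 6)          # baseline in km
print(oscillation_probability(L, 0.01))  # E = 0.01 GeV = 10 MeV
```

Roughly speaking, this two-state flavor evolution maps onto rotations of a single qubit on a quantum computer, which is where the circuit- and pulse-level descriptions Harrison compared come in; the coupled, many-neutrino problem inside a supernova is what makes the full calculation so expensive classically.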

While she didn’t fully complete the project, she did make significant progress.

“(Katie) is very enthusiastic and she has gone a lot further than one would have expected an average undergraduate could have,” Balantekin says. “She started an interesting project, she started getting interesting results. But we are nowhere near the completion of the project, so she will continue working with us next academic year, and hopefully we’ll get interesting results.”

Developing better quantum sensors 

Over on the engineering side of campus, Arutyunova was studying different ways to introduce nitrogen-vacancy (NV) centers into diamonds. These atomic-scale defects are useful in quantum sensing and have applications in magnetometry. Previous work in Choy’s group made NV centers by a method known as nitrogen ion beam implantation. Arutyunova’s project was to compare how a different method, electron beam irradiation, forms NV centers at different starting nitrogen concentrations in the diamond.

Briefly, she would mark an edge of a very small (2 × 2 × 0.5 millimeter), nitrogen-containing diamond and irradiate the sample with a scanning electron microscope. She would use confocal microscopy to record the initial distribution of NV centers, then move the sample to the annealing step, in which the diamond is heated to 1200 degrees Celsius in a vacuum annealing furnace. The diamond is then acid washed and reexamined with the confocal microscope to see whether additional NV centers have formed.

“It’s a challenging process as it requires precise coordinate-by-coordinate calculation for exposed areas and extensive knowledge of how to use the scanning electron microscope,” says Arutyunova, who will go back to MIT after the fellowship wraps. “I think I laid down a good foundation for future steps so that the work can be continued in my group.”

Choy adds:

Kate made significant strides in her project and her work has put us on a great path for our continued investigation into effective ways of generating color centers in diamond. In addition to her research contributions, our group has really enjoyed and benefited from her enthusiasm and collaborative spirit. It’s wonderful to see the relationships that Kate has forged with the rest of the group and in particular her mentors, Maryam and Ricardo. We look forward to keeping in touch with Kate on matters related to the project as well as her academic journey.

Beyond the summer fellowship

 Both Harrison and Arutyunova think that this experience has drawn them to the graduate school track, likely with a focus on quantum science. More importantly, it has helped them both to learn what they like about research.

“I would prefer to work on a problem and see the final output rather than a question where I do not have an idea of the application,” Arutyunova says. “And I realized how much I like to collaborate with people, exchange ideas, propose something, and listen to people and what they think about research.”

They also offer similar advice to other undergraduate students who are interested in research: do it, and start early.

“No matter when you start, you’re going to start knowing nothing,” Harrison says. “And if you start sooner, even though it’s scary and you feel like you know even less, you have more time to learn, which is amazing. And get in a research group where they really want you to learn.”

Machine Learning meets Physics

Machine learning and artificial intelligence are certainly not new to physics research — physicists have been using and improving these techniques for several decades.

In the last few years, though, machine learning has been having a bit of an explosion in physics, which makes it a perfect topic on which to collaborate within the department, the university, and even across the world. 

“In the last five years in my field, cosmology, if you look at how many papers are posted, it went from practically zero to one per day or so,” says assistant professor Moritz Münchmeyer. “It’s a very, very active field, but it’s still in an early stage: There are almost no success stories of using machine learning on real data in cosmology.”

Münchmeyer, who joined the department in January, arrived at a good time. Professor Gary Shiu was a driving force in starting the virtual seminar series “Physics ML” early in the pandemic, which now has thousands of people on the mailing list and hundreds attending the weekly or bi-weekly seminars by Zoom. As it turned out, physicists across fields were eager to apply their methods to the study of machine learning techniques. 

“So it was natural in the physics department to organize the people who work on machine learning and bring them together to exchange ideas, to learn from each other, and to get inspired,” Münchmeyer says. “Gary and I decided to start an initiative here to more efficiently focus department activities in machine learning.”

Currently, that initiative includes Münchmeyer, Shiu, Tulika Bose, Sridhara Dasu, Matthew Herndon, and Pupa Gilbert, and their research group members. They watch the Physics ML seminar together, then discuss it afterwards. On weeks that the virtual seminar is not scheduled, the group hosts a local speaker — from physics or elsewhere on campus — who is doing work in the realm of machine learning. 

In the next few years, the Machine Learning group in physics looks to build on the momentum the field currently has. For example, they hope to secure funding to hire postdoctoral fellows who can work within a group or across groups in the department. Also, the hiring of Kyle Cranmer — one of the best-known researchers in machine learning for physics — as Director of the American Family Data Science Institute and as a physics faculty member will immediately connect machine learning activities in this department with those in computer sciences, statistics, and the Information School, as well as other areas on campus.

“There are many people [on campus] actively working on machine learning for the physical sciences, but there was not a lot of communication so far, and we are trying to change that,” Münchmeyer says.

Machine Learning Initiatives in the Department (so far!)

Kevin Black, Tulika Bose, Sridhara Dasu, Matthew Herndon and the CMS collaboration at CERN use machine learning techniques to improve the sensitivity of new physics searches and increase the accuracy of measurements.

Pupa Gilbert uses machine learning to understand patterns in nanocrystal orientations (detected with her synchrotron methods) and fracture mechanics (detected at the atomic scale with molecular dynamics methods developed by her collaborator at MIT).

Moritz Münchmeyer develops machine learning techniques to extract information about fundamental physics from the massive amount of complicated data of current and upcoming cosmological surveys. 

Gary Shiu develops data science methods to tackle computationally complex systems in cosmology, string theory, particle physics, and statistical mechanics. His work suggests that Topological Data Analysis (TDA) can be integrated into machine learning approaches to make AI interpretable — a necessity for learning physical laws from complex, high-dimensional data.

CHIME telescope detects more than 500 mysterious fast radio bursts in its first year of operation

This post has been modified from the original post, published by MIT News

To catch sight of a fast radio burst is to be extremely lucky in where and when you point your radio dish. Fast radio bursts, or FRBs, are oddly bright flashes of light, registering in the radio band of the electromagnetic spectrum, that blaze for a few milliseconds before vanishing without a trace.

These brief and mysterious beacons have been spotted in various and distant parts of the universe, as well as in our own galaxy. Their origins are unknown, and their appearance is unpredictable. Since the first was discovered in 2007, radio astronomers have only caught sight of around 140 bursts in their scopes.

Now, a large stationary radio telescope in British Columbia has nearly quadrupled the number of fast radio bursts discovered to date. The telescope, known as CHIME, for the Canadian Hydrogen Intensity Mapping Experiment, has detected 535 new fast radio bursts during its first year of operation, between 2018 and 2019.

Profile photo of Moritz Münchmeyer
Moritz Münchmeyer

Scientists with the CHIME Collaboration, including researchers at the University of Wisconsin–Madison, have assembled the new signals in the telescope’s first FRB catalog, which they will present this week at the American Astronomical Society Meeting.

UW–Madison physics professor Moritz Münchmeyer is a member of CHIME-FRB and contributed to the statistical analysis of the new FRB catalog. He joined UW–Madison this spring and a part of his new group is continuing this work, with the goal of using FRBs as a novel probe of the physics of the universe.

“This is only the beginning of FRB research. For the first time we now have enough FRBs to study their statistical distribution. It turns out that FRBs come from all over the universe, from relatively nearby to half way back to the Big Bang,” Münchmeyer says. “They are also quite frequent, about 800 per day if we were to see them all. They are extremely powerful light sources at cosmological distances and thus provide a new window into the physics of the universe.”

For the full story, please visit https://news.mit.edu/2021/chime-telescope-fast-radio-bursts-0609

The large radio telescope CHIME, pictured here, has detected more than 500 mysterious fast radio bursts in its first year of operation, MIT researchers report. | Image Courtesy of CHIME

Highest-energy Cosmic Rays Detected in Star Clusters

For decades, researchers assumed the cosmic rays that regularly bombard Earth from the far reaches of the galaxy are born when stars go supernova — when massive stars exhaust the fuel supporting fusion at their cores, collapse, and explode.

Those gigantic explosions do indeed propel atomic particles to nearly the speed of light across great distances. However, new research suggests even supernovae — capable of engulfing entire solar systems — are not strong enough to imbue particles with the sustained energies needed to reach petaelectronvolts (PeV), the amount of kinetic energy attained by the highest-energy cosmic rays.

And yet cosmic rays have been observed striking Earth’s atmosphere at exactly those energies, their passage marked, for example, by the detector tanks at the High-Altitude Water Cherenkov (HAWC) observatory near Puebla, Mexico. Instead of supernovae, the researchers — including UW–Madison’s Ke Fang — posit that star clusters like the Cygnus Cocoon serve as PeVatrons — PeV accelerators — capable of boosting particles to such enormous energies and sending them across the galaxy.

Their paradigm-shifting research, published in two recent papers in Nature Astronomy and The Astrophysical Journal Letters, provides compelling evidence that star-forming regions act as PeVatrons.

For the full news story, please visit https://www.mtu.edu/news/stories/2021/march/not-so-fast-supernova-highestenergy-cosmic-rays-detected-in-star-clusters.html.

 

Welcome, Professor Moritz Münchmeyer!

Profile photo of Moritz Münchmeyer
Moritz Münchmeyer

On January 1, assistant professor Moritz Münchmeyer joined the UW–Madison physics department. He specializes in theoretical and computational cosmology. His research combines theoretical investigation, the analysis of data from different observatories, and the development of machine learning techniques to probe fundamental physics with cosmological data. He joins us from the Perimeter Institute for Theoretical Physics in Waterloo, where he was a Senior Postdoctoral Fellow. To welcome Münchmeyer to the department and to learn more about him and his research, we sat down for a (virtual) interview.

What are your research interests? 

I work at the intersection of theory and observation in cosmology. On the one hand we have the mathematical theories of how the universe works, and then we have observations made by telescopes and detectors. The universe, of course, is incredibly complicated. There are many forces and particles and radiation that all interact with each other. And that makes it often hard to go from observational data to the theory that you’re interested in. We want to know, for example, what were the laws of physics in the very early universe? Or how does the universe evolve? And so, I develop new methods to use the data to probe the theories.

One thing that I’m very excited about now is using techniques from data science and machine learning for cosmology. As everybody knows, there’s a machine learning revolution going on which is having an impact on many fields, including cosmology. But the techniques in machine learning are often developed to do things like object recognition in images. They do not necessarily work well for the kind of data that we have, which has very different properties and is described by physical theories. So, I’m trying to adapt these machine learning techniques, or find new ones, that are specifically suited for the problems of cosmology.

I also work on new theoretical ideas to use observational data. There will be a huge influx of new cosmological data in the next decade: many experiments are being built and they are often much better than previous experiments. We’ll get amazing new data of the universe and I’m thinking about how to use this data to learn more about fundamental physics, for example by combining different data sources in new ways that have not been explored before.

What is the source of the data you use in your research?

When I started in cosmology, I became a member of the Planck satellite collaboration, which was a Cosmic Microwave Background (CMB) experiment. Many of the best measurements of cosmological parameters, such as the age of the universe, come from Planck. Of course, now we are building even better CMB experiments, such as the Simons Observatory, which I am a member of. In about two years it will start to take precision measurements of the radiation from the early universe. I am also a member of the CHIME experiment, which is detecting Fast Radio Bursts, an exciting new source of data for cosmology and astrophysics. In Madison I am looking to also become involved with the Vera Rubin Observatory, one of the major upcoming galaxy surveys, which can be combined with CMB experiments. Prof. Keith Bechtol in the physics department is a leading contributor to this experiment. As a theorist, I am not involved much in the data-taking process, but once the data is taken, my group will work on its analysis with the methods we have developed.

Once you settle into your new role here, what are the first research projects your group will start on? 

The broad subject we’ll work on is to learn about the initial conditions of the universe from CMB and galaxy data. We will develop new statistical tools and machine learning methods towards this goal. We will also think about new ideas to use cosmological data, such as the Fast Radio Bursts I mentioned before.

What hobbies and interests do you have?  

I have a family with two young children, so I like to go on adventures with them. I also play piano, especially to get my mind off physics. My current favorite sport is Brazilian Jiu-Jitsu. I’ve also always been interested in entrepreneurship. A few years ago, I co-founded a small company, Wolution, which uses machine learning — not in cosmology, but for image analyses in bio sciences, agriculture, and other fields.

What is your favorite element and/or elementary particle? 

My favorite elementary particle is the photon, because it’s extremely versatile: it spans the entire electromagnetic spectrum, from radio waves to X-rays to, of course, visible light. All the experiments I mentioned above fundamentally detect photons.