High energy physicist Sridhara Dasu was recently named a member of the International Committee for Future Accelerators (ICFA), where he will serve a three-year term. ICFA was created to facilitate international collaboration in the construction and use of accelerators for high energy physics. The Committee has 16 members, selected primarily from the regions most deeply involved in high-energy physics. Dasu will represent the United States on the committee.
This story, by Meghan Chua, was originally published by the Graduate School.
In 2012, scientists at CERN’s Large Hadron Collider announced they had observed the Higgs boson particle, verifying many of the theories of physics that rely on its existence.
Since then, scientists have continued to search for the properties of the Higgs boson and for related particles, including an extremely rare case where two Higgs boson particles appear at the same time, called di-Higgs production.
“We’ve had some searches for di-Higgs right now, but we don’t see anything significant yet,” said Alex Wang, a PhD student in experimental high energy physics at UW–Madison. “It could be because it doesn’t exist, which would be interesting. But it also could just be because, according to the Standard Model theory, it’s very rare.”
Wang will have a chance to aid in the search for di-Higgs production in more ways than one. Starting in November, he will spend a year at the SLAC National Accelerator Laboratory as an awardee in the Department of Energy Office of Science Graduate Student Research Program.
The program funds outstanding graduate students to pursue thesis research at Department of Energy (DOE) laboratories. Students work with a DOE scientist on projects addressing societal challenges at the national and international scale.
At the SLAC National Accelerator Laboratory, Wang will primarily work on hardware for a planned upgrade of the ATLAS detector, one of the many detectors that record properties of collisions produced by the Large Hadron Collider. Right now, ATLAS collects an already massive amount of data, including some events related to the Higgs boson particle. However, Higgs boson events are extremely rare.
In the future, the upgraded High-Luminosity Large Hadron Collider (HL-LHC) will enable ATLAS to collect even more data and help physicists to study particles like the Higgs boson in more detail. This will make it more feasible for researchers to look for extremely rare events such as di-Higgs production, Wang said. The ATLAS detector itself will also be upgraded to adjust for the new HL-LHC environment.
“I’m pretty excited to go there because SLAC is essentially where they’ll be assembling the innermost part of the ATLAS detector for the future upgrade,” Wang said. “So, I think it’s going to be a really central place in the future years, at least for this upgrade project.”
Increasing the amount of data a sensor collects can also cause problems, such as radiation damage to the sensors and more challenges sorting out meaningful data from background noise. Wang will help validate the performance of some of the sensors destined for the upgraded ATLAS detector.
“I’m also pretty excited because for the data analysis I’m doing right now, it’s mainly working in front of a computer, so it will be nice to have some experience working with my hands,” Wang said.
At SLAC, he will also spend time searching for evidence of di-Higgs production.
Wang’s thesis research at UW–Madison also revolves around the Higgs boson particle. He sifts through data from the Large Hadron Collider to tease out which events are “signals” related to the Higgs boson, versus events that are “backgrounds” irrelevant to his work.
One approach Wang uses is to predict how many signal events researchers expect to see, and then determine if the number of events recorded in the Large Hadron Collider is consistent with that prediction.
“If we get a number that’s consistent with our predictions, then that supports the existing model of physics that we have,” Wang said. “But for example, if you see that the theory predicts we’d have 10 events, but in reality, we see 100 events, then that could be an indication that there’s some new physics going on. So that would be a potential for discoveries.”
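The counting comparison Wang describes can be illustrated with a toy calculation. The sketch below is only an illustration, not the actual statistical machinery used at the LHC; it applies the standard asymptotic formula Z = sqrt(2·(n·ln(n/b) − (n − b))) for the significance of observing n events when b are predicted:

```python
from math import log, sqrt

def counting_significance(observed, expected):
    """Asymptotic significance (in sigma) of an excess in a simple
    counting experiment: Z = sqrt(2*(n*ln(n/b) - (n - b)))."""
    if observed <= expected:
        return 0.0  # no excess over the prediction
    q0 = 2.0 * (observed * log(observed / expected) - (observed - expected))
    return sqrt(q0)

# Wang's example: theory predicts 10 events, but 100 are seen --
# far beyond the conventional 5-sigma discovery threshold.
excess = counting_significance(100, 10)
```

In this toy, 100 observed events against 10 predicted yields a significance well above 5 sigma, while an observation matching the prediction yields zero.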
The Department of Energy formally approved the U.S. contribution to the High-Luminosity Large Hadron Collider accelerator upgrade project earlier this year. The HL-LHC is expected to start producing data in 2027 and continue through the 2030s. Depending on what the future holds, Wang may be able to use data from the upgraded ATLAS detector to find evidence of di-Higgs production. If that happens, he also will have helped build the machine that made it possible.
Note: This story has been modified slightly from the original, which was published by the CMS Collaboration. Their version has some nice interactive graphics to check out, too!
The standard model of particle physics is our current best theory of the most basic building blocks of the universe, the elementary particles, and the interactions among them. At the heart of the standard model is a hypothesis describing how all the elementary particles acquire mass. Importantly, this scheme also predicts the existence of a new type of particle, the Higgs boson. It took nearly 50 years from its postulation to observe the Higgs boson at the LHC experiments at CERN. It is strongly believed that the Higgs boson, the only scalar particle known to date, holds a key to some of the questions the standard model cannot answer, so a detailed study of its properties is the order of the day. Often, especially at the LHC, one of the essential observables is the probability that a certain unstable particle is produced momentarily, in accordance with the laws of nature. In experiments this production cross section is estimated from the number of events, recorded over a given amount of time, in a specific decay final state of the transient particle. The standard model predicts both the cross section for Higgs boson production and the decay rates very precisely. The frequency distribution of a given type of event, as a function of variables measured in the experiment, helps us better understand various aspects of the interactions involved; these details are typically lost in the summed, or total, cross section. Measurement of this differential cross section is therefore a powerful tool to vindicate the standard model; conversely, any deviation from the standard model predictions in data would indicate the presence of new physics.
The Higgs boson is about 125 times more massive than a proton and decays to lighter particles, in some cases through cascade processes. Physicists use the signatures of stable particles in the detector to trace back the decay chains of the Higgs boson. The tau lepton is the heaviest lepton known, and as such it is the lepton with the strongest coupling to the Higgs boson. The probability of a Higgs boson decaying to a pair of tau leptons is reasonably high (about 6%) compared, for example, to a pair of muons (about 0.02%). But the tau lepton is itself unstable and decays quickly to lighter particles, always accompanied by its partner, the tau neutrino. Often the decay products of the tau lepton are hadrons, producing a shower of particles, or jet, in the calorimeter system. The tau neutrino goes undetected, limiting the accuracy with which the tau lepton energy can be measured. It is nevertheless interesting to study the detailed characteristics of Higgs boson events through the decay to tau leptons, whose rest mass is only about 1.4% that of the parent.
A recent study from the CMS Collaboration focuses on events where the Higgs boson decays into a pair of tau leptons, using data collected by the experiment between 2016 and 2018. The analysis measures the Higgs boson production cross section as a function of three key variables: the Higgs boson momentum in the direction transverse to the beam, the number of jets produced along with the Higgs boson, and the transverse momentum of the leading jet. New physics could manifest as an excess of events in the frequency distributions of these variables when compared with the standard model predictions.
Says Andrew Loeliger, a UW–Madison physics grad student and one of the lead authors on the study:
The Higgs Boson is the most recent addition to the standard model of particle physics, discovered jointly between the CMS and ATLAS collaborations in 2012, so a big goal of the High Energy Physics field is to make very detailed measurements of its properties, to understand if our predictions are all confirmed, or if there is some kind of new physics or strange properties that might foreshadow or necessitate further discoveries. This work provides, what amounts to, a very fine grained consistency check (alternatively, a search for deviations in the amount) that the Higgs Boson is produced with the amounts/strengths we would expect when categorizing alongside some second interesting property (the transverse momentum of the Higgs Boson is a big one). This type of analysis had not been performed before using the particles we used, so it may open the door for far more precise measurements in places we may not have been able to do before, and a better overall confirmation of the Higgs Boson’s properties.
The analysis employs deep neural networks that exploit a variety of tau lepton properties simultaneously, identifying taus with high efficiency. Then, to ensure that the selected tau lepton pair comes from the decay of a Higgs boson and to discard pairs from other processes, such as Z boson decay, the mass of the selected tau pair (m𝝉𝝉) is scrutinized. Reconstructing m𝝉𝝉, after accounting for the neutrinos involved in the decay as mentioned earlier, requires a dedicated algorithm that computes, for each event, a likelihood function P(m𝝉𝝉) quantifying the compatibility with the Higgs boson hypothesis.
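The idea of using the di-tau mass to separate Higgs boson decays from Z boson decays can be shown with a deliberately simplified toy (the real analysis uses a far more sophisticated per-event likelihood): score each reconstructed mass against Gaussian templates centered on the Z and Higgs masses, with an assumed common mass resolution.

```python
import math

def gaussian_likelihood(m, mean, width):
    """Gaussian probability density of observing mass m under a hypothesis."""
    return math.exp(-0.5 * ((m - mean) / width) ** 2) / (width * math.sqrt(2.0 * math.pi))

def favors_higgs(m_tautau, z_mass=91.2, h_mass=125.0, resolution=15.0):
    """True if the reconstructed di-tau mass (GeV) is more compatible with
    the Higgs hypothesis than with the Z hypothesis.
    The 15 GeV resolution is an assumed toy value, not the CMS resolution."""
    return (gaussian_likelihood(m_tautau, h_mass, resolution) >
            gaussian_likelihood(m_tautau, z_mass, resolution))

# Toy reconstructed masses in GeV: Z-like values are rejected, Higgs-like kept.
masses = [88.0, 95.0, 120.0, 130.0]
selected = [m for m in masses if favors_higgs(m)]
```

With equal widths, the comparison reduces to asking which template mean the event lies closer to; the crossover sits halfway between the Z and Higgs masses.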
The Higgs boson typically has more transverse momentum, or boost, when produced in conjunction with one or more jets than when it is produced alone. One such event, collected by the CMS detector in 2018 and shown in Figure 1, could correspond to a boosted Higgs boson decaying to two tau leptons which, in turn, decay hadronically. However, several less interesting processes could also produce such an event and pose as backgrounds. These contributions have been measured mostly from the data itself by carefully studying the properties of the jets. Figure 2 shows the good agreement in the m𝝉𝝉 distribution between the prediction and the data collected by the CMS experiment for events with the transverse momentum of the Higgs boson below 45 GeV. The contribution from the Higgs boson process is hardly noticeable against the overwhelming background. On the other hand, Figure 3 presents the m𝝉𝝉 distribution for events with a highly boosted Higgs boson, with transverse momentum above 450 GeV. Selecting only events with high boost greatly reduces the total number of available events, but the fraction of signal events in the collected sample is significantly improved. The data agree with the sum of the predicted contributions from the Higgs boson and all the standard model background processes.
This CMS result presents the first-ever measurement of the differential cross sections for Higgs boson production in decays to a pair of tau leptons. Run 2 data are allowing us to scrutinize the Higgs boson in the tau lepton decay channel, which was observed only a few years ago. Future comparisons and combinations of all Higgs boson decay modes will offer better insight into the interactions of the Higgs boson with different standard model particles. But the story does not end here! Run 3 of the LHC is just around the corner, and looking further ahead, high-luminosity operation (the HL-LHC) will offer a huge increase in data volume. That could provide hints as to whether the discovered Higgs boson is the one predicted by the standard model, or whether new interactions involving other fundamental particles contribute to such measurements. That would indeed point to new physics!
A biography of Professor Sau Lan Wu was the cover story of a recent issue of the AIP History Newsletter (see pages 14-15). The story covers her early life in Hong Kong, her arrival in the U.S. to attend Vassar College on a full scholarship, and her graduate studies and subsequent research programs at MIT, UW–Madison, and CERN. There is also a list of references and further reading. Check out the article to learn more about Prof. Wu’s illustrious career so far!
The Department of Physics is pleased to announce that Prof. Yang Bai has been promoted to the rank of full professor.
“It is my pleasure and honor as Dean to approve Prof. Yang Bai’s promotion to Full Professor. His creativity and impressive breadth in particle physics research make him a leader not only on dark matter, but also more generally on Beyond-the-Standard-Model Physics,” says Eric Wilcots, Dean of the College of Letters & Science. “He is also a valued teacher, appreciated by students especially at the graduate level. Graduate students and junior researchers in Madison are in good hands.”
Bai joined the department in 2012, and was promoted to associate professor in 2017. In addition to his robust and well-funded research program, he has trained several successful graduate students, taught all levels of departmental courses, and served on several departmental and university committees.
“Professor Yang Bai is widely recognized as one of the leading theoretical particle physicists of his generation with a broad and vigorous research program, covering both the collider-related frontiers and the cosmic frontier. His work includes significant contributions in essentially every area related to dark matter,” says Sridhara Dasu, professor and department chair. “The Physics Department very strongly endorses the promotion of Yang Bai to Full Professor.”
Congrats, Prof. Bai, on this well-earned recognition!
Neutrinos mix and transform from one flavor to another. So do quarks. However, the electron and its heavier cousins, the muon and the tau, seem to conserve their flavor identity. This accidental conservation of charged lepton flavor must have a profound reason; otherwise, low levels of violation of that conservation principle should occur at high energy scales. So far, however, evidence for any charged lepton flavor violation remains elusive.
The CMS group recently published a new study on lepton flavor in Higgs boson decays. At UW–Madison, the effort was led by Sridhara Dasu and postdoctoral researcher Varun Sharma, building on work done by former postdoctoral researcher Maria Cepeda and former graduate student Aaron Levine.
The international CMS collaboration recently published a news story about this new study. Please read the full story here.
For decades, researchers assumed the cosmic rays that regularly bombard Earth from the far reaches of the galaxy are born when stars go supernova — when they grow too massive to support the fusion occurring at their cores and explode.
Those gigantic explosions do indeed propel atomic particles at nearly the speed of light across great distances. However, new research suggests even supernovae — capable of devouring entire solar systems — are not strong enough to imbue particles with the sustained energies needed to reach petaelectronvolts (PeVs), the amount of kinetic energy attained by very high-energy cosmic rays.
And yet cosmic rays have been observed striking Earth’s atmosphere with exactly those energies, their passage marked, for example, by the detection tanks at the High-Altitude Water Cherenkov (HAWC) observatory near Puebla, Mexico. Instead of supernovae, the researchers — including UW–Madison’s Ke Fang — posit that star clusters like the Cygnus Cocoon serve as PeVatrons — PeV accelerators — capable of moving particles across the galaxy at such high energies.
For the full news story, please visit https://www.mtu.edu/news/stories/2021/march/not-so-fast-supernova-highestenergy-cosmic-rays-detected-in-star-clusters.html.
New UW–Madison assistant professor of physics Lu Lu’s research program combines the past with the future. Her research looks for sources of ultrahigh energy particles, which is done by analyzing data that has already been collected. As she says, “Maybe data is already talking to us, we just haven’t looked.” But she is also working toward improving future data collection, which will require more technologically advanced detectors. “My teachers, my great masters, have taught me that the current young generation has the responsibility to look into new techniques to go to the future for younger generations to proceed forward,” she says about her work in sensor R&D.
On January 1, Professor Lu joined the Department of Physics and IceCube. Most recently, she was a postdoctoral fellow at the International Center for Hadron Astrophysics at Chiba University in Japan. To welcome her, we sat down for a (virtual) interview.
What are your research interests?
My prime interest is astroparticle physics, and my ultimate goal is to find the sources of the highest energy particles in the universe. These particles carry energies of about 10²⁰ electronvolts. This is higher energy than anything we can achieve with the Large Hadron Collider and other human technologies. The real attractiveness here is we don’t know how nature accelerates these particles. And once we identify the sources, we can test new theories beyond the Standard Model using sources created by nature.
What are one or two main projects you focus your research on?
I’m involved in two experiments. One is IceCube, the other is Pierre Auger Observatory. I was doing cosmic ray analysis, but cosmic rays are usually charged particles and they are deflected in the magnetic field of the galaxy; they would not travel in a straight line. IceCube studies neutrinos which are neutral particles, they travel directly from the source. Pierre Auger detects ultrahigh energy photons, which are also neutral particles. One thing I want to do immediately after I join Madison is to combine these two experiments to do a joint analysis. We have photon candidates but we haven’t really tried to connect them in the multimessenger regime. By combining Pierre Auger photons with IceCube neutrinos, we could possibly find a transient source, a source that doesn’t constantly emit ultrahigh energy photons or neutrinos but all of a sudden there’s a flare. This type of analysis has never been done, but we have data on disks.
The second thing I’m interested in is using new sensor technologies. In IceCube, we have Gen2 being planned right now. Instead of using a single photon sensor, we’d use a more sensitive design now in R&D. UW–Madison is taking the lead in designing this future detector. There’s also radio technology. To detect the highest energy neutrinos we need to build a large instrumented volume. With an optical array, it is really hard to scale up because one has to drill holes deep into the South Pole ice, which is really expensive. But radio technology doesn’t have to go so deep, so the detectors can be buried near the surface, and radiowaves travel further in ice than optical photons do. For optical you have to make the detectors very dense, but for radio you can place the antennas further apart, which means you can cover a larger area and detect more events easily. I think radio is the way to go for the future.
You said you have a lot of data collected already and just need to analyze it. How do you analyze the data from these detectors?
We would have to search for photon candidates in the Auger data, and identify where each one comes from and when the event happened. Correspondingly, do we see neutrinos from IceCube coming from the same direction at the same time? Because you can never be sure it’s a photon. It could be a proton. We then want to build a statistical framework to combine the different messengers in real time.
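The direction-and-time matching Lu describes can be sketched as a minimal coincidence test. Everything here is illustrative: the field names, the 100-second time window, and the 1-degree match radius are assumptions for the sketch, not the actual Auger/IceCube analysis cuts.

```python
import math

def angular_separation_deg(ra1, dec1, ra2, dec2):
    """Great-circle separation between two sky directions, in degrees."""
    r = math.radians
    cos_sep = (math.sin(r(dec1)) * math.sin(r(dec2)) +
               math.cos(r(dec1)) * math.cos(r(dec2)) * math.cos(r(ra1) - r(ra2)))
    # Clamp against floating-point overshoot before taking the arccosine.
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_sep))))

def is_coincident(photon, neutrino, max_dt_s=100.0, max_sep_deg=1.0):
    """True if a photon candidate and a neutrino arrive close enough in
    both time and direction to count as a candidate coincidence."""
    close_in_time = abs(photon["t"] - neutrino["t"]) <= max_dt_s
    close_in_sky = angular_separation_deg(photon["ra"], photon["dec"],
                                          neutrino["ra"], neutrino["dec"]) <= max_sep_deg
    return close_in_time and close_in_sky

# A hypothetical photon candidate and a neutrino 50 s later, 0.5 deg away.
photon = {"t": 0.0, "ra": 10.0, "dec": -30.0}
neutrino = {"t": 50.0, "ra": 10.5, "dec": -30.0}
```

A real search would then quantify how often such pairings occur by chance, which is where the statistical framework Lu mentions comes in.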
What does it mean if you find a photon in coincidence with a neutrino?
Cosmic rays were first detected more than 100 years ago, and there’s a rich history of studying where they come from. The mystery of their origins remains today because of our poor knowledge of the galactic and extragalactic magnetic fields and of the mass composition of cosmic rays. In my opinion, the most probable way to solve this puzzle is to use neutral particles. If we can identify ultrahigh energy photons in coincidence with neutrinos, that is a smoking gun that we are actually looking at a source, and we can finally pin down what in the universe is accelerating high energy particles. And then we can study particle physics, maybe beyond the Standard Model. It’s just like a lab created by the universe to test particle physics.
What is your favorite element and/or elementary particle?
My favorite elementary particle is the electron anti-neutrino. I like muons, too. My favorite element is hydrogen.
What hobbies and interests do you have?
I’m afraid I’ll disappoint you because my hobby is related to my research: Augmented reality. When I heard about something called Microsoft Hololens, I thought, I could make IceCube a hologram. I bought these special glasses, and then made a program on it and used it for some outreach events. But the glasses are very expensive, so people said, “Okay we can’t buy hologram glasses.” So I moved it to mobile phones so that everyone could look at it for fun. It’s called IceCubeAR (note: download it for iPhones or Android phones). I made it with a group of friends in Tokyo.
The Daya Bay Reactor Neutrino Experiment collaboration – which made a precise measurement of an important neutrino property eight years ago, setting the stage for a new round of experiments and discoveries about these hard-to-study particles – has finished taking data. Though the experiment is formally shutting down, the collaboration will continue to analyze its complete dataset to improve upon the precision of findings based on earlier measurements.
The detectors for the Daya Bay experiment were built at UW–Madison by the Physical Sciences Laboratory, and detailed in a 2012 news release.
Says PSL’s Jeff Cherwinka, U.S. chief project engineer for Daya Bay:
The University of Wisconsin Physics Department and the Physical Sciences Lab were very involved in the design, fabrication and installation of the anti-neutrino detectors for the Daya Bay Experiment. It was a great opportunity for faculty, staff, and students to participate in an important scientific measurement, while learning about another country and culture. There were many trips and man years of effort in China by UW physicists, engineers and technicians to construct the experiment and many more for operations and data taking. This international collaboration took a lot of effort, and in the end produced great results.
The chief experimentalist at UW–Madison was Karsten Heeger, who has since left for Yale. At present, Prof. Baha Balantekin is the only remaining UW–Madison member of the Daya Bay Collaboration.
A completion ceremony will be held Friday, December 11 from 7:30-8:30pm CST. Video stream options and the full story can be found at Berkeley Lab’s website.
Three UW–Madison physics professors and their colleagues have been awarded a U.S. Department of Energy (DOE) High Energy Physics Quantum Information Science award for an interdisciplinary collaboration between theoretical and experimental physicists and experts on quantum algorithms.
The grant, entitled “Detection of dark matter and neutrinos enhanced through quantum information,” will bring a total of $2.3 million directly to UW–Madison. Physics faculty on the grant include principal investigator Baha Balantekin, Mark Saffman, and Sue Coppersmith. Collaborators include Kim Palladino at the University of Oxford, Peter Love at Tufts University, and Calvin Johnson at San Diego State University.
With the funding, the researchers plan to use a quantum simulator to calculate the detector response to dark matter particles and neutrinos. The simulator is an array of 121 neutral atom qubits currently being developed by Saffman’s group. Much of the research plan is to understand and mitigate imperfections in the behavior of the neutral atom array so that high-accuracy, high-precision calculations can be performed. The primary goal of the project is to apply lessons from quantum information theory to high energy physics, while a secondary goal is to contribute to the development of quantum information theory itself.