Research, teaching and outreach in Physics at UW–Madison
Mark Saffman awarded 2026 APS Ramsey Prize
Mark Saffman, the Johannes Rydberg Professor of Physics and director of the Wisconsin Quantum Institute, won the American Physical Society’s 2026 Norman F. Ramsey Prize in Atomic, Molecular, and Optical Physics, and in Precision Tests of Fundamental Laws and Symmetries.
The Ramsey prize recognizes outstanding accomplishments in the two fields of Norman Ramsey: atomic, molecular, and optical (AMO) physics; and precision tests of fundamental laws and symmetries. Saffman won “for seminal developments of quantum information processing with neutral atoms that allow the investigation of many-body problems that are intractable by classical computing.” He shares the prize with Antoine Browaeys at the Institut d’Optique in France.
Mark Saffman
Saffman joined the UW–Madison physics faculty in 1999 with ideas for his research program but struggled to acquire enough funding. Then, he started reading theory papers about the relatively new field of quantum computing and how to develop qubits, or quantum bits.
“This was in an era when people were proposing all these different ideas for qubits,” Saffman says. “I read this paper about using Rydberg gates to entangle atomic qubits and thought, ‘This looks interesting, let’s do that.’ That was the smartest decision I ever made in my career.”
An atom enters a Rydberg state when a strong laser excites one of its outermost electrons into a very high energy level. The atom becomes effectively much larger than usual, which gives rise to interesting quantum properties. Relatively inexperienced in experimental atomic physics, Saffman approached Thad Walker, a professor in the department and an expert on how to laser cool atoms, about collaborating. A decade later, they had their major success: a Rydberg blockade.
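The textbook scalings behind this effect: the orbital radius of a Rydberg atom grows with the square of the principal quantum number n, while the van der Waals interaction between two Rydberg atoms grows much faster still, which is what makes a strong blockade possible at micron-scale separations.

```latex
\langle r \rangle \sim a_0\, n^2, \qquad
V_{\mathrm{vdW}}(R) = \frac{C_6}{R^6}, \qquad C_6 \propto n^{11}
```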
“The basic interaction is that you excite one atom to a Rydberg state and then you cannot excite a second one close by,” Saffman says. “That blockade interaction lies behind the ability to do a logic gate — a CNOT gate — and entangle two qubits.”
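The entangling action of a CNOT gate can be sketched with a few lines of linear algebra, independent of the Rydberg implementation. This is a generic illustration, not a model of the Saffman–Walker apparatus: a Hadamard puts the control qubit in superposition, and the CNOT then turns that superposition into an entangled Bell state.

```python
import numpy as np

# Generic two-qubit illustration (not the Rydberg-specific physics).
# Basis ordering: |00>, |01>, |10>, |11>.

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                # flips the target iff the control is 1

psi = np.kron([1, 0], [1, 0])                  # start in |00>
psi = np.kron(H, I) @ psi                      # control qubit -> (|0> + |1>)/sqrt(2)
psi = CNOT @ psi                               # entangle the pair

print(np.round(psi, 3))  # amplitude 0.707 on |00> and |11>: a maximally entangled Bell state
```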
A year later, Saffman and Walker demonstrated the first CNOT gate for atomic qubits. These qubits, also called neutral atom qubits, have quickly become one of the leading platforms for achieving fault-tolerant quantum computing.
Over the next decade, Saffman came to realize that building a fully functional quantum computer was not just a scientific effort but a major engineering effort, one likely outside the scope of an academic research group.
“It became clear to me that to compete at the forefront, I needed more resources. I wanted to go faster,” Saffman says. “So, I ended up joining forces with ColdQuanta (now Infleqtion), an existing small cold atom sensing and components company.”
The glow of red and green lasers and an array of supporting electronics fill the Saffman lab | Jacob Scott, PhD’25
Saffman brought his quantum computing ideas to the company as Chief Scientist for Quantum Information at Colorado-based Infleqtion in 2018, and the company now has a satellite office in Madison.
The partnership with Infleqtion did, in fact, accelerate Saffman’s research. In 2022, his group, including long-time scientist and group member Trent Graham, co-authored a paper with engineers at Infleqtion in which they demonstrated the first quantum algorithm to be run on an atomic quantum computer. It was a huge proof of principle and a significant step forward for the field.
Quantum information research has emerged as a major topic within the AMO physics community. At UW–Madison, Saffman has been a key player in that shift. In 2019, he helped develop the Wisconsin Quantum Institute, an interdisciplinary effort uniting quantum information science and engineering researchers across campus. That same year, he was named the institute’s director.
“UW–Madison was one of the first places to have multiple serious efforts in qubits: Thad and I pioneered neutral atoms, (physics professor) Mark Eriksson pioneered silicon spin qubits, (physics professor) Robert McDermott has superconducting qubits,” Saffman says. “Now, a huge fraction of new faculty coming out of academia and starting their own groups are working in quantum information-related science and engineering, including many of our new faculty. The state of quantum computing at UW–Madison is very strong.”
Interdisciplinary physicist Mariel Pettee uses techniques grounded in machine learning to study a range of topics that span high energy physics and astrophysics, with an ultimate goal of developing a better understanding of the fundamental physical building blocks of our Universe.
Originally from Dallas, TX, Pettee was a physics and mathematics undergraduate at Harvard University, a master’s student in physics at the University of Cambridge, and a PhD student in physics at Yale University. While pursuing a postdoc at Lawrence Berkeley National Lab, she also joined the Flatiron Institute in New York City as a guest researcher. She then joined the UW–Madison physics faculty as part of the RISE-AI initiative in August 2025.
Please give an overview of your research.
My background is in high energy physics, and that training has fundamentally shaped the way I approach my work. But over the past several years, I have become more of what you might call a “data physicist” — someone with physics expertise who works at the intersection of physics and data science. In particular, I’m interested in how machine learning can help us do interdisciplinary physics research and make discoveries using massive experimental datasets that would otherwise be out of our reach.
On a broad scale, my research touches on high energy particle physics and astrophysics through the lens of machine learning. Some of my work applies recent machine learning techniques to domain-specific problems such as anomaly detection, object reconstruction, and unfolding. Another part of my work explores core questions in machine learning in areas such as self-supervised learning and likelihood-free inference in a physics-driven way. I’m also interested in developing large-scale foundation models for broader scientific use.
What are one or two main projects you’ll have your group focus on first?
The field of scientific foundation models has been rapidly taking shape over the last couple of years, but there are still a lot of open questions to explore. I think there is significant potential to understand our data more deeply by researching what makes training foundation models on fundamental physics data distinct from training on more common industry-standard data.
I’m interested in simultaneously incorporating information from multiple heterogeneous layers of a detector, e.g. time series, images, and point clouds, as well as across detectors. Early projects in this direction will develop a variety of self-supervised learning strategies on multimodal HEP and astrophysics data to understand how models can simultaneously incorporate many different types of measurements of the same physics objects.
I’m also interested in studying stellar streams, which are remnants of ancient galaxies or globular clusters being absorbed into the Milky Way and serve as interesting tracers of local dark matter. The first step is to simply detect more of them using unsupervised or weakly supervised anomaly detection: trying to learn with no labels or with imperfect or missing labels. We can use machine learning models to automatically detect resonant anomalies in data, and stellar streams emerge as resonant anomalies in velocity space due to their constituents’ shared origin.
I’m optimistic that we will also eventually be able to use aggregate stream information to better map local dark matter substructure. Beyond their immediate physics use cases, streams can also serve as a nice testbed for understanding the limits of domain transfer for foundation models due to their resonant properties: perhaps particle physics data, with its 3D point cloud structure and “bump”-like anomalies, has more shared information with streams from the perspective of a foundation model than one might initially expect.
What attracted you to Madison and the university?
I felt a strong fit with Madison and the university from my first visit. I think that’s a combination of the general spirit of the department, how warm and open it felt, and how much I admired the researchers that I met when I was here. Also, the nature of the position that I was offered gave me the kind of flexibility that I dreamed of — to work and move between these spaces of high energy physics, astrophysics, and machine learning with a lot of freedom.
What is your favorite element and/or elementary particle?
Well, I have to pick a particle! I got into physics because of the Higgs boson. I started my physics career as an undergraduate at CERN on July 1st, 2012, and then the discovery of the Higgs boson was announced three days later. So I think I have the Higgs to thank for really getting me energized about this field. Waking up so early that morning, witnessing those presentations, seeing hundreds of people buzzing with excitement, scribbling on chalkboards, popping champagne corks — it made me feel like I was in the center of the universe.
What hobbies and interests do you have?
I love the performing arts of all kinds—contemporary dance, theater, music. I’m a dancer, choreographer, and occasional actor and director. I’m also an amateur birdwatcher.
Welcome, Prof. Josiah Sinclair!
Josiah Sinclair
When he was younger, UW–Madison assistant professor of physics Josiah Sinclair wanted to be a scientist-inventor when he grew up. In high school, he would ask questions in biology and chemistry classes that his teachers said were really physics questions. So, when he began his undergrad at Calvin University, he majored in physics, believing that experimental physics would be at the intersection of his interests. In the end, it was quantum physics that really fascinated him, motivating him to complete a PhD in experimental quantum optics and atomic physics at the University of Toronto. He says, “The ethos of my PhD group was this idea that with modern technology, maybe we can invent an apparatus that can reproduce the essential elements of this or that classic thought experiment and learn something new.” After completing a postdoc at MIT, Sinclair joined the UW–Madison physics department as an assistant professor in August, where he will tinker in the lab as an experimental quantum physicist, and just maybe invent a new kind of neutral atom quantum computer.
Please give an overview of your research.
There’s a global race underway to build a quantum computer—a machine that operates according to the laws of quantum mechanics and uses an entirely different, more powerful kind of logic to solve certain problems exponentially faster than any classical computer can. Quantum computers won’t solve all problems, but there’s strong confidence they’ll solve some very important ones. Moreover, as we build them, we’re likely to discover new applications we can’t yet imagine.
The approach my group focuses on uses arrays of single neutral atoms as qubits. Right now, the central challenge in practical quantum computing is how to scale up quantum processors without compromising their quality. Today’s atom-array quantum computers are remarkable, hand-built systems that have reached hundreds or even thousands of qubits in recent years — a truly impressive feat, possible in part thanks to pioneering work done right here in Madison. However, as these systems grow larger, we’re hitting fundamental size limits that call for new strategies.
My lab is working to develop modular interconnects for neutral-atom quantum computers. Instead of trying to build a single massive machine, we aim to link multiple smaller systems together using single photons traveling through optical fibers. The challenge is that single photons are easily misplaced, so to make this work, we need to develop the most efficient atom–photon interfaces ever built—pushing the limits of our ability to control the interaction between one atom and one photon.
Once we get these quantum links working, we’ll have realized the essential building block for a truly scalable quantum computer and maybe someday the quantum internet. Beyond computing, these technologies could also enable new kinds of distributed quantum sensors, where multiple quantum systems work together to detect extremely faint signals spread across a large area, like photons arriving from distant planets.
What are the one or two main projects your new group will work on?
Our main focus will be to build two neutral atom quantum processors in adjacent rooms and link them together with an optical fiber. This project will teach us how to integrate highly efficient photonic interfaces—such as optical cavities—with atom arrays, and how to precisely control the interactions between atoms and photons. Step by step, we aim to demonstrate atom-photon entanglement and eventually send quantum information back and forth through the fiber.
We’re collaborating with a new company called CavilinQ, a Harvard spin-out supported by Argonne National Lab, to integrate a new cavity design with the geometry we want to explore for atom-photon coupling. Because we intend to iterate rapidly on the cavity design, our setup will be built on a precision translation stage, allowing us to easily slide the system in and out and swap out cavity components.
Another project in the lab will focus on developing a new kind of cold-atom quantum sensor. Most current sensors rely on magneto-optical traps, which require bulky electromagnets and impose constraints that limit performance. We plan to explore magnetic-field-free trapping techniques that could lead to simpler, more compact, and ultimately higher-performance quantum sensors.
What attracted you to Madison and the university?
Well, for me professionally, Madison’s a powerhouse in atomic physics and quantum computing. There are groups here that have been highly influential since the beginning in developing neutral atoms as a platform for quantum information science. So there’s a strong atomic physics community here that has incredible overlap with my research interests, and a thriving broader quantum information community as well. Some people work best in isolation, but that is not who I am, so the prospect of joining this vibrant, collaborative environment was very appealing to me.
I also really enjoyed all my interactions with the members of the search committee and other faculty here, both during my interview and subsequent visits. On the personal side, my wife’s family is all in the Chicago area, so the prospect of being so close to one side of the family was very appealing. We have an 18-month-old daughter, and when we visited, we just had such a positive impression of Madison as a place to raise a family.
What is your favorite element and/or elementary particle?
It’s rubidium. I worked with it in my PhD, I worked with it in my postdoc, and I will work with it again. It’s simple. It has one electron in the outer valence shell, which makes it easy to work with. It was one of the first atoms to be laser cooled and one of the first to be Bose condensed, but I think it still has some tricks for us up its sleeve. I believe the first quantum computers are going to be built out of rubidium atoms. Some people (and companies) think we will need a more complicated atom, like strontium or ytterbium, but I think we already have the atom we need—we just need to figure out how to make it work.
What hobbies and interests do you have?
In the last year: spending time with my eighteen-month-old daughter. It’s been a special time. I also enjoy photography. I do some photography of research labs, but mostly I do adventure photography. I don’t think of myself as a particularly talented photographer; my specialty is more a willingness to lug a heavy camera up a mountain. I also really enjoy cycling, rock climbing, reading, and traveling.
Nuclear physicist Paul Quin has passed away
Paul Quin
Emerit professor of physics Paul Quin passed away on October 9, 2025. He was 84.
Born in Brooklyn, NY in 1941, Quin received his doctorate in physics from the University of Notre Dame, where his thesis work centered on the spectroscopy of the sd-shell nuclei. He joined the nuclear physics group at UW–Madison as a postdoc in 1969, playing a central role in the construction and installation of the new Lamb-shift polarized ion source. He was also one of three survivors of the 1970 Sterling Hall bombing.
Quin joined the faculty in 1971. His research focused on the use of polarized beams as a tool for nuclear spectroscopy and his group made numerous important contributions in this field. In addition, Quin was an important player in the many instrumentation development projects that took place in the nuclear physics lab during the 1970s and the early 80s. In particular, he was the leader of the first experiment to test storage-cell technology for targets of polarized hydrogen atoms, a technology which has gone on to become important for polarization experiments at storage ring machines throughout the world.
Around 1980, Quin began expanding his research focus, moving into the field of weak interactions. In the years that followed, he carried out a variety of interesting and important experiments on β decay of polarized nuclei. These experiments typically involved tests of the conserved-vector-current hypothesis or searches for right-handed currents. In 1986, he and T. Girard published an important paper which described a new and potentially very sensitive technique for detecting right-handed currents in β decay. This new concept, which involves measuring the polarized-nucleus beta asymmetry correlation, became the basis for a number of experiments performed over the subsequent decade in both the U.S. and Europe, with Quin playing a central role in many cases.
Later in his career, Quin continued to work in the area of weak interactions, helping to define the role of various nuclear physics experiments that place constraints on extensions of the standard model. Quin retired in 2001.
Quin also made many contributions to the teaching mission of the department. His great enthusiasm for teaching was always evident, and he frequently introduced new and innovative ideas in the classes he taught. In the ‘80s, he took responsibility for developing new experiments for the Physics 321 lab and upgraded a number of the existing experiments. Towards the end of his teaching career, he was a tireless instructor in the large introductory courses, contributing in a number of important ways to the implementation of computer-based laboratories. In addition, Quin was a staunch supporter of the department’s then-new Peer Mentor Tutor Program. He also supervised nine students who received doctorates under his guidance.
This post was derived from department archives.
“Rival” neutrino experiments NOvA and T2K publish first joint analysis
The combined results add to physicists’ understanding and validate the impressive collaborative effort between two competing — yet complementary — experiments.
This story was published by Fermilab
When the universe began, physicists expect there should have been equal amounts of matter and antimatter. But if that were so, the matter and antimatter should have perfectly canceled each other out, resulting in total annihilation.
And yet, here we are. Somehow, matter won out over antimatter — but we still don’t know how or why.
Physicists suspect the answer may lie in the mysterious behavior of abundant yet elusive particles called neutrinos. Specifically, learning more about a phenomenon called neutrino oscillation — in which neutrinos change types, or flavors, as they travel — could bring us closer to an answer.
The international collaborations representing two neutrino experiments, NOvA in the United States and T2K in Japan, recently combined forces to produce their first joint results, published October 22 in the journal Nature. This initial joint analysis provides some of the most precise neutrino-oscillation measurements in the field. The NOvA collaboration, centered at Fermilab, includes University of Wisconsin–Madison physicists in Brian Rebel’s group.
“These results are an outcome of a cooperation and mutual understanding of two unique collaborations, both involving many experts in neutrino physics, detection technologies and analysis techniques, working in very different environments, using different methods and tools,” says T2K collaborator Tomáš Nosek.
Caption: T2K in Japan and NOvA in the United States are both long-baseline experiments: they each shoot an intense beam of neutrinos that passes through both a near detector close to the neutrino source and a far detector hundreds of kilometers away. Both experiments compare data recorded in each detector to learn about neutrinos’ behavior and properties. | Credit: Fermilab
Different experiments, common goals
Despite their ubiquity, neutrinos are very difficult to detect and study. Even though they were first seen in the 1950s, the ghostly particles remain deeply enigmatic. Filling in gaps in our knowledge about neutrinos and their properties may reveal fundamental truths about the universe.
T2K and NOvA are both long-baseline experiments: they each shoot an intense beam of neutrinos that passes through both a near detector close to the neutrino source and a far detector hundreds of miles away. Both experiments compare data recorded in each detector to learn about neutrinos’ behavior and properties.
NOvA, the NuMI Off-axis νe Appearance experiment, sends a beam of neutrinos 810 kilometers from its source at the U.S. Department of Energy’s Fermi National Accelerator Laboratory near Chicago, Illinois, to a 14,000-ton liquid-scintillator detector in Ash River, Minnesota.
The T2K experiment’s neutrino beam travels 295 kilometers from Tokai to Kamioka — hence the name T2K. Tokai is home to the Japan Proton Accelerator Research Complex (J-PARC) and Kamioka hosts the Super-Kamiokande neutrino detector, an enormous tank of ultrapure water located a kilometer underground.
Since the experiments have similar science goals but different baselines and different neutrino energies, physicists can learn more by combining their data.
“By making a joint analysis, you can get a more precise measurement than each experiment can produce alone,” says NOvA collaborator Liudmila Kolupaeva. “As a rule, experiments in high-energy physics have different designs even if they have the same science goal. Joint analyses allow us to use complementary features of these designs.”
As long-baseline experiments, NOvA and T2K are ideal for studying neutrino oscillations, a phenomenon that can provide insight into open questions like charge-parity violation and the neutrino mass ordering. Two experiments with different baselines and energies have a better chance of disentangling the two effects than one experiment alone.
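The complementarity of the two baselines can be sketched with the standard two-flavor oscillation formula. The parameter values below are illustrative assumptions (a mass-squared splitting near 2.5 × 10⁻³ eV² and maximal mixing); the real experiments fit a full three-flavor model. The point of the sketch is that NOvA and T2K, despite very different baselines L and beam energies E, are both tuned so that L/E sits near the first oscillation maximum.

```python
import numpy as np

def oscillated_fraction(dm2_ev2, L_km, E_GeV, sin2_2theta=1.0):
    """Two-flavor probability that a muon neutrino has changed flavor:
    P = sin^2(2*theta) * sin^2(1.267 * dm2 * L / E), L in km, E in GeV."""
    phase = 1.267 * dm2_ev2 * L_km / E_GeV
    return sin2_2theta * np.sin(phase) ** 2

dm2 = 2.5e-3  # eV^2, illustrative atmospheric mass splitting

p_nova = oscillated_fraction(dm2, L_km=810, E_GeV=2.0)  # NOvA-like L and E
p_t2k = oscillated_fraction(dm2, L_km=295, E_GeV=0.6)   # T2K-like L and E

# Both come out close to 1: each experiment sits near an oscillation maximum
# even though their baselines and energies differ by roughly a factor of three.
print(p_nova, p_t2k)
```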
Interrogating neutrino oscillations
The mystery of neutrino mass ordering is the question of which neutrino is the lightest. But it isn’t as simple as placing particles on a scale. Neutrinos have minuscule masses that are made up of combinations of mass states. There are three neutrino mass states, but, confusingly, they don’t map to the three neutrino flavors. In fact, each flavor is made of a mix of the three mass states, and each mass state has a different probability of acting like each flavor of neutrino.
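In standard notation, each flavor state is a quantum superposition of the three mass states, with the mix set by the PMNS mixing matrix U:

```latex
\lvert \nu_\alpha \rangle \;=\; \sum_{i=1}^{3} U_{\alpha i}^{*}\, \lvert \nu_i \rangle ,
\qquad \alpha \in \{ e, \mu, \tau \}
```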
There are two possible mass orderings, called normal or inverted. Under the normal ordering, two of the mass states are relatively light and one is heavy, while the inverted ordering has two heavier mass states and one light.
In the normal ordering, there is an enhanced probability that muon neutrinos will oscillate to electron neutrinos but a lower probability that muon antineutrinos will oscillate to electron antineutrinos. In the inverted ordering, the opposite happens. However, an asymmetry in the neutrinos’ and antineutrinos’ oscillations could also be explained if neutrinos violate CP symmetry — in other words, if neutrinos don’t behave the same as their antimatter counterparts.
The combined results of NOvA and T2K do not favor either mass ordering. If future results show the neutrino mass ordering is normal, NOvA’s and T2K’s results are less clear on CP symmetry, requiring additional data to clarify. However, if the neutrino mass ordering is found to be inverted, the results published today provide evidence that neutrinos violate CP symmetry, potentially explaining why the universe is dominated by matter instead of antimatter.
“Neutrino physics is a strange field. It is very challenging to isolate effects,” says Kendall Mahn, co-spokesperson for T2K. “Combining analyses allows us to isolate one of these effects, and that’s progress.”
The combined analysis does provide one of the most precise values of the difference between the squares of the neutrino masses, a quantity called Δm²₃₂. With an uncertainty below 2%, the new value will enable physicists to make precision comparisons with other neutrino experiments to test whether the neutrino oscillation theory is complete.
What’s next
These first joint results do not definitively solve any mysteries of neutrinos, but they do add to physicists’ knowledge about the particles. Plus, they validate the impressive collaborative effort between two competing — yet complementary — experiments.
The NOvA collaboration consists of more than 250 scientists and engineers from 49 institutions in eight countries. The T2K collaboration has more than 560 members from 75 institutions in 15 countries. The two collaborations began active work on this joint analysis in 2019; it combines six years of data from NOvA, which began collecting data in 2014, and a decade of data from T2K, which started up in 2010. Both experiments continue to take data, and efforts are already underway to update the joint analysis with the new data.
“The joint analysis work has benefited both collaborations,” says Patricia Vahle, co-spokesperson for NOvA. “We have a much better mutual understanding of the strengths and challenges of the different experimental setups and analysis techniques.”
NOvA and T2K are the only currently operating long-baseline neutrino experiments. Their initial combined results lay a foundation for forthcoming neutrino experiments that will answer the questions around neutrinos unambiguously.
The Fermilab-led Deep Underground Neutrino Experiment is under construction in Illinois and South Dakota in the U.S. With its longer baseline of 1,800 kilometers, DUNE will be more sensitive to neutrino mass ordering and could give physicists a conclusive answer shortly after it turns on in the early years of the next decade.
In Japan, Hyper-Kamiokande, a sequel to Super-Kamiokande located beneath a mountain in Hida City, will be more sensitive to CP violation. And a medium-baseline reactor neutrino experiment in China called JUNO recently began additional studies of antineutrinos and their behavior. Two experiments that use neutrinos generated in the atmosphere to study oscillations, KM3NeT/ORCA and IceCube, also continue to take data.
Many physicists hope these next-generation neutrino experiments can come together — as NOvA and T2K have already done — to make progress on their shared scientific goals to learn more about neutrinos and their unusual properties.
“As shown in this very analysis, there are no truly ‘rivaling’ experiments because they all share a common goal of scientific study of a phenomenon,” says Nosek. “Collaborating is naturally important for the transfer of knowledge, know-how and experience, and for sharing resources, ideas and tools. The T2K-NOvA collaboration is not merely a sum of T2K and NOvA collaborations. It is much, much more.”
UW fostering closer research ties with federal defense, cybersecurity agencies
UW–Madison leaders seek to expand partnership with federal agencies to boost dual-use research funding.
Exploring Decades of Semiconductor Collaboration between Argonne National Lab & UW–Madison
UW–Madison and Argonne National Laboratory have built a portfolio of shared research for decades. Read how semiconductor researchers from all interest areas have benefited from this affiliation.