New 3D integrated semiconductor qubit saves space without sacrificing performance

Small but mighty, semiconducting qubits are a promising area of research on the road to a fully functional quantum computer. At less than one square micron each, thousands of these qubits could fit into the space taken up by one of the current industry-leading superconducting qubit platforms, such as IBM’s or Google’s.

For a quantum computer on the order of tens or hundreds of qubits, that size difference is insignificant. But to get to the millions or billions of qubits needed to use these computers to model quantum physical processes or fold a protein in a matter of minutes, the tiny size of the semiconducting qubits could become a huge advantage.

Except, says Nathan Holman, who graduated from UW–Madison physics professor Mark Eriksson’s group with a PhD in 2020 and is now a scientist with HRL Laboratories, “All those qubits need to be wired up. But the qubits are so small, so how do we get the lines in there?”

In a new study published in npj Quantum Information on September 9, Holman and colleagues applied flip chip bonding to 3D-integrate superconducting resonators with semiconducting qubits for the first time, freeing up space for the control wires in the process. They then showed that the new chip performs as well as non-integrated ones, meaning that they solved one problem without introducing another.

If quantum computers are to have any chance of outperforming their classical counterparts, their individual qubit units need to be scalable so that millions of qubits can work together. They also need an error correction scheme; the current best-proposed scheme, the surface code, requires a 2D grid of qubits.

Proposed approach: the 3D integrated device consists of a superconducting die (top layer) and a semiconducting qubit die (middle layer) brought together through a technique known as flip chip integration. The bottom layer, proposed but not studied experimentally in this work, will serve to enable wiring and readout electronics. This study is the first time that semiconducting qubits (middle layer) and superconducting resonators (top layer) have been integrated in this way, and it frees up space for the wiring needed to control the qubits. | Credit: Holman et al., in npj Quantum Information

With current semiconducting devices, any attempt at a 2D tiled structure quickly reaches the point where 100% of the available surface area is covered by wires — and at that point, it is physically impossible to expand the device’s capacity by adding more qubits.

To alleviate the space issue, the researchers applied a 3D integration method developed by their colleagues at MIT. Essentially, the process takes two silicon dies, places pillars of the soft metal indium onto one of them, aligns the two dies, and then presses them together. The result is that the wires come in from the top instead of from the side.

“The 3D integration helps you get some of the wiring in in a denser way than you could with the traditional method,” Holman says. “This particular approach has never been done with semiconductor qubits, and I think the big reason why it hadn’t is that it’s just a huge fabrication challenge.”

Mark Eriksson
Nathan Holman

In the second part of their study, the researchers needed to confirm that their new design was functional — and that it didn’t add disadvantages that would negate the spacing success.

The device has a cavity with a well-defined resonant frequency: when the researchers probe it with microwave photons at that frequency, the photons transmit through the cavity and are registered by a detector. The qubit is coupled to the cavity, which lets the researchers determine whether it is functioning: a working qubit shifts the resonant frequency, so the number of photons detected at the probe frequency goes down.
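That readout logic can be captured in a minimal numerical sketch, assuming an idealized Lorentzian cavity response; the resonance frequency, linewidth, and qubit-induced shift below are made-up illustrative values, not parameters from the paper.

```python
def cavity_transmission(f_probe, f_res, kappa):
    """Idealized Lorentzian transmission of a cavity with linewidth kappa."""
    return 1.0 / (1.0 + ((f_probe - f_res) / (kappa / 2.0)) ** 2)

f_bare = 6.0e9   # illustrative bare cavity resonance (Hz)
kappa = 1.0e6    # illustrative cavity linewidth (Hz)
chi = 2.0e6      # illustrative qubit-induced frequency shift (Hz)

# Probe at the bare resonance. With no qubit-induced shift the photons
# transmit strongly; a functioning, coupled qubit pulls the resonance by chi,
# so far fewer photons reach the detector at the probe frequency.
print(cavity_transmission(f_bare, f_bare, kappa))        # ~1.0
print(cavity_transmission(f_bare, f_bare + chi, kappa))  # ~0.06
```

A sharp drop of that kind at the probe frequency is the signature the researchers looked for to confirm a qubit was operating.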

They probed their 3D integrated devices with microwave photons and, whenever the qubits were supposed to be working, saw the expected signal. In other words, the new design did not negatively affect device performance.

“Even though there’s all this added complexity, the devices didn’t perform any worse than devices that are easier to make,” Holman says. “I think this work makes it conceivable to go to the next step with this technology, whereas before it was very tricky to imagine past a certain number of qubits.”

Holman emphasizes that this work does not solve all the design and functionality issues currently hampering the success of fully functional quantum computers.

“Even with all the resources and large industry teams working on this problem, it is non-trivial,” Holman says. “It’s exciting, but it’s a long-haul excitement. This work is one more piece of the puzzle.”

The article reports that this work was sponsored in part by the Army Research Office (ARO) under Grant Number W911NF-17-1-0274 (at UW–Madison) and by the Assistant Secretary of Defense for Research & Engineering under Air Force Contract No. FA8721-05-C-0002 (at MIT Lincoln Laboratory).

 

Magnetic fields implicated in the mysterious midlife crisis of stars

Artist’s impression of the spinning interior of a star, generating the stellar magnetic field. This image combines a dynamo simulation of the Sun’s interior with observations of the Sun’s outer atmosphere, where storms and plasma winds are generated. | Credit: CESSI / IISER Kolkata / NASA-SVS / ESA / SOHO-LASCO

This post was originally published by the Royal Astronomical Society. UW–Madison physics graduate student Bindesh Tripathi is the lead author of the scientific publication.

Middle-aged stars can experience their own kind of midlife crisis, undergoing dramatic breaks in their activity and rotation rates at about the same age as our Sun, according to new research published today in Monthly Notices of the Royal Astronomical Society: Letters. The study provides a new theoretical underpinning for the unexplained breakdown of established techniques for measuring the ages of stars past middle age, and for the transition of solar-like stars to a magnetically inactive future.

Astronomers have long known that stars experience a process known as ‘magnetic braking’: a steady stream of charged particles, known as the solar wind, escapes from the star over time, carrying away small amounts of the star’s angular momentum. This slow drain causes stars like our Sun to gradually slow down their rotation over billions of years.

In turn, the slower rotation leads to altered magnetic fields and less stellar activity – the numbers of sunspots, flares, outbursts, and similar phenomena in the atmospheres of stars, which are intrinsically linked to the strengths of their magnetic fields.

Bindesh Tripathi

This decrease in activity and rotation rate over time is expected to be smooth and predictable because of the gradual loss of angular momentum. The idea gave birth to the tool known as ‘stellar gyrochronology’, which has been widely used over the past two decades to estimate the age of a star from its rotation period.
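As an illustration only, a simple Skumanich-type scaling, in which the rotation period grows roughly as the square root of age, conveys how gyrochronology turns a measured rotation period into an age estimate. The toy relation and its solar calibration below are assumptions made for this sketch, not the dynamo model used in the new study.

```python
def gyro_age_gyr(period_days, p_sun_days=25.4, age_sun_gyr=4.6):
    """Toy Skumanich-type gyrochronology: if the rotation period P grows like
    sqrt(age), then age scales as (P / P_sun)**2 times the Sun's age."""
    return age_sun_gyr * (period_days / p_sun_days) ** 2

# A Sun-like star spinning once every 18 days would be dated to roughly 2.3 Gyr.
print(f"{gyro_age_gyr(18.0):.1f} Gyr")
```

Real calibrations are more sophisticated, but the basic logic is the same: measure the spin, read off the age.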

However, recent observations indicate that this intimate relationship breaks down around middle age. The new work, carried out by Bindesh Tripathi at UW–Madison and the Indian Institute of Science Education and Research (IISER) Kolkata, India, provides a novel explanation for this mysterious ailment. Prof. Dibyendu Nandy and Prof. Soumitro Banerjee of IISER are co-authors.

Using dynamo models of magnetic field generation in stars, the team shows that at about the age of the Sun, the magnetic field generation mechanism suddenly becomes sub-critical, or less efficient. This allows stars to exist in two distinct activity states – a low-activity mode and an active mode. A middle-aged star like the Sun can often switch to the low-activity mode, resulting in drastically reduced angular momentum losses through magnetized stellar winds.

Prof. Nandy comments: “This hypothesis of sub-critical magnetic dynamos of solar-like stars provides a self-consistent, unifying physical basis for a diversity of solar-stellar phenomena, such as why stars beyond their midlife do not spin down as fast as in their youth, the breakdown of stellar gyrochronology relations, and recent findings suggesting that the Sun may be transitioning to a magnetically inactive future.”

The new work provides key insights into the existence of low activity episodes in the recent history of the Sun known as grand minima – when hardly any sunspots are seen. The best known of these is perhaps the Maunder Minimum around 1645 to 1715, when very few sunspots were observed.

The team hopes the work will also shed light on recent observations indicating that the Sun is comparatively inactive, with crucial implications for the potential long-term future of our own stellar neighbor.

Francis Halzen named Vilas Research Professor

Francis Halzen

UW–Madison physics professor Francis Halzen has been named a Vilas Research Professor. Created “for the advancement of learning,” Vilas Research Professorships are granted to faculty with proven research ability and unusual qualifications and promise. The recipients of the award have contributed significantly to the research mission of the university and are recognized both nationally and internationally.

Halzen, the Gregory Breit and Hilldale Professor of Physics, joined the UW–Madison faculty in 1972. He has made pioneering contributions to particle physics and neutrino astrophysics, and he continues to be the driving force of the international IceCube Collaboration.

Early in his career, Halzen cofounded the internationally recognized phenomenology research institute in the UW–Madison Department of Physics to promote research at the interface of theory and experiment in particle physics. This institute is recognized for this research and for its leadership in the training of postdocs and graduate students in particle physics phenomenology.

The IceCube Neutrino Observatory is the culmination of an idea first conceived in the 1960s, and Halzen has played an integral role in its design, implementation, and data acquisition and analysis for the past three decades. After initial experiments confirmed that the Antarctic ice was ultratransparent and established the observation of atmospheric neutrinos, IceCube was ready to become a reality. From 2004 to 2011, the South Pole observatory was constructed — the largest project ever assigned to a university and one led by Halzen.

After two years of taking data with the full detector, the IceCube Neutrino Observatory opened a new window onto the universe with its discovery of highly energetic neutrinos of extragalactic origin. This discovery heralded the beginning of the exploration of the universe with neutrino telescopes. The IceCube observation of cosmic neutrinos was named the 2013 Physics World Breakthrough of the Year.

Nationally and internationally renowned for this work, Halzen was awarded a 2014 American Ingenuity Award, a 2015 Balzan Prize, a 2018 Bruno Pontecorvo Prize, a 2019 Yodh Prize, and a 2021 Bruno Rossi Prize.

With the Vilas Research Professorship, Halzen is also recognized for his commitment to education and service in the department, university, and international science communities. He has taught everything from physics for nonscience majors to advanced particle physics and special topics courses at UW–Madison. He has actively participated on several departmental and university committees as well as advisory, review, and funding panels. His input is highly sought by committees and agencies that assess future priorities of particle and astroparticle physics research.

“Francis Halzen has had a prolific, internationally recognized research career, has shown excellence as an educator who is able to effectively communicate cutting-edge science on all levels, and has made tireless and valued contributions in service of the department,” says Sridhara Dasu, Physics Department chair. “He is one of the most creative and influential physicists of the last half century and worthy of the prestigious Vilas Research Professorship.”

Vilas awards are supported by the estate of professor, U.S. senator and UW Regent William F. Vilas (1840-1908). The Vilas Research Professorship provides five years of flexible funding — two-thirds of which is provided by the Office of the Provost through the generosity of the Vilas trustees and one-third provided by the school or college whose dean nominated the winner.

Halzen joins department colleagues Profs. Vernon Barger and Sau Lan Wu as recipients of this prestigious UW–Madison professorship.

Physics projects funded in first round of UW’s Research Forward initiative

In its inaugural round of funding, the Office of the Vice Chancellor for Research and Graduate Education’s (OVCRGE) Research Forward initiative selected 11 projects, including two with physics department faculty involvement.

OVCRGE hosts Research Forward to stimulate and support highly innovative and groundbreaking research at the University of Wisconsin–Madison. The initiative is supported by the Wisconsin Alumni Research Foundation (WARF) and will provide funding for 1–2 years, depending on the needs and scope of the project.

The two projects from the department are:

Research Forward seeks to support collaborative, multidisciplinary, multi-investigator research projects that are high-risk, high-impact, and transformative. It seeks to fund research projects that have the potential to fundamentally transform a field of study as well as projects that require significant development prior to the submission of applications for external funding. Collaborative research proposals are welcome from within any of the four divisions (Arts & Humanities, Biological Sciences, Physical Sciences, Social Sciences), as are cross-divisional collaborations.

Correlated errors in quantum computers emphasize need for design changes

Quantum computers could outperform classical computers at many tasks, but only if the errors that are an inevitable part of computational tasks are isolated rather than widespread events.

Now, researchers at the University of Wisconsin–Madison have found evidence that errors are correlated across an entire superconducting quantum computing chip — highlighting a problem that must be acknowledged and addressed in the quest for fault-tolerant quantum computers.

The researchers report their findings in a study published June 16 in the journal Nature. Importantly, their work also points to mitigation strategies.

“I think people have been approaching the problem of error correction in an overly optimistic way, blindly making the assumption that errors are not correlated,” says UW–Madison physics Professor Robert McDermott, senior author of the study. “Our experiments show absolutely that errors are correlated, but as we identify problems and develop a deep physical understanding, we’re going to find ways to work around them.”

Read the full story at https://news.wisc.edu/correlated-errors-in-quantum-computers-emphasize-need-for-design-changes/

In this artistic rendering, a high-energy cosmic ray hits the qubit chip, freeing up charge in the chip substrate that disrupts the state of neighboring qubits.

CHIME telescope detects more than 500 mysterious fast radio bursts in its first year of operation

This post has been modified from the original, published by MIT News.

To catch sight of a fast radio burst is to be extremely lucky in where and when you point your radio dish. Fast radio bursts, or FRBs, are oddly bright flashes of light, registering in the radio band of the electromagnetic spectrum, that blaze for a few milliseconds before vanishing without a trace.

These brief and mysterious beacons have been spotted in various and distant parts of the universe, as well as in our own galaxy. Their origins are unknown, and their appearance is unpredictable. Since the first was discovered in 2007, radio astronomers have only caught sight of around 140 bursts in their scopes.

Now, a large stationary radio telescope in British Columbia has nearly quadrupled the number of fast radio bursts discovered to date. The telescope, known as CHIME, for the Canadian Hydrogen Intensity Mapping Experiment, has detected 535 new fast radio bursts during its first year of operation, between 2018 and 2019.

Moritz Münchmeyer

Scientists with the CHIME Collaboration, including researchers at the University of Wisconsin–Madison, have assembled the new signals in the telescope’s first FRB catalog, which they will present this week at the American Astronomical Society Meeting.

UW–Madison physics professor Moritz Münchmeyer is a member of CHIME-FRB and contributed to the statistical analysis of the new FRB catalog. He joined UW–Madison this spring, and part of his new group is continuing this work, with the goal of using FRBs as a novel probe of the physics of the universe.

“This is only the beginning of FRB research. For the first time we now have enough FRBs to study their statistical distribution. It turns out that FRBs come from all over the universe, from relatively nearby to halfway back to the Big Bang,” Münchmeyer says. “They are also quite frequent, about 800 per day if we were to see them all. They are extremely powerful light sources at cosmological distances and thus provide a new window into the physics of the universe.”

For the full story, please visit https://news.mit.edu/2021/chime-telescope-fast-radio-bursts-0609

The large radio telescope CHIME, pictured here, has detected more than 500 mysterious fast radio bursts in its first year of operation, MIT researchers report. | Image Courtesy of CHIME

Dark Energy Survey releases most precise look at the universe’s evolution

This news piece has been slightly modified from the original news story, first published by Fermilab.

The Dark Energy Survey collaboration has created the largest ever maps of the distribution and shapes of galaxies, tracing both ordinary and dark matter in the universe out to a distance of more than 7 billion light years. The analysis, which includes the first three years of data from the survey, is consistent with predictions from the current best model of the universe, the standard cosmological model. Nevertheless, there remain hints from DES and other experiments that matter in the current universe is a few percent less clumpy than predicted.

New results from the Dark Energy Survey — a large international team that includes researchers from the University of Wisconsin–Madison — use the largest ever sample of galaxies over an enormous piece of the sky to produce the most precise measurements of the universe’s composition and growth to date. Scientists measured that the way matter is distributed throughout the universe is consistent with predictions in the standard cosmological model, the best current model of the universe.

Over the course of six years, DES surveyed 5,000 square degrees — almost one-eighth of the entire sky — in 758 nights of observation, cataloguing hundreds of millions of objects. The results, announced May 27, draw on data from the first three years — 226 million galaxies observed over 345 nights — to create the largest and most precise maps yet of the distribution of galaxies in the universe at relatively recent epochs.

Since DES studied nearby galaxies as well as those billions of light-years away, its maps provide both a snapshot of the current large-scale structure of the universe and a movie of how that structure has evolved over the course of the past 7 billion years.

Keith Bechtol

“This is a stringent test of the current standard cosmological paradigm, a model proposing that 95% of the universe is dark matter and dark energy that we do not yet understand,” explains UW–Madison physics professor Keith Bechtol. “By measuring the apparent positions and shapes of hundreds of millions of galaxies in our survey, we test whether the cosmic structures that have formed in the universe today match the predictions based on structures observed in the early universe.”

To test cosmologists’ current model of the universe, DES scientists compared their results with measurements from the European Space Agency’s orbiting Planck observatory. Planck used light signals known as the cosmic microwave background to peer back to the early universe, just 400,000 years after the Big Bang. The Planck data give a precise view of the universe 13 billion years ago, and the standard cosmological model predicts how the dark matter should evolve to the present. If DES’s observations don’t match this prediction, there is possibly an undiscovered aspect to the universe. While there have been persistent hints from DES and several previous galaxy surveys that the current universe is a few percent less clumpy than predicted—an intriguing find worthy of further investigation—the recently released results are consistent with the prediction.

“In the area of constraining what we know about the distribution and structure of matter on large scales as driven by dark matter and dark energy, DES has obtained limits that rival and complement those from the cosmic microwave background,” said Brian Yanny, a Fermilab scientist who coordinated DES data processing and management. “It’s exciting to have precise measurements of what’s out there and a better understanding of how the universe has changed from its infancy through to today.”

Ten areas in the sky were selected as “deep fields” that the Dark Energy Camera imaged multiple times during the survey, providing a glimpse of distant galaxies and helping determine their 3-D distribution in the cosmos. Photo: Dark Energy

Ordinary matter makes up only about 5% of the universe. Dark energy, which cosmologists hypothesize drives the accelerating expansion of the universe by counteracting the force of gravity, accounts for about 70%. The last 25% is dark matter, whose gravitational influence binds galaxies together. Both dark matter and dark energy remain invisible and mysterious, but DES seeks to illuminate their natures by studying how the competition between them shapes the large-scale structure of the universe over cosmic time.

DES photographed the night sky using the 570-megapixel Dark Energy Camera on the Victor M. Blanco 4-meter Telescope at the Cerro Tololo Inter-American Observatory in Chile, a Program of the National Science Foundation’s NOIRLab. One of the most powerful digital cameras in the world, the Dark Energy Camera was designed specifically for DES and built and tested at Fermilab. The DES data were processed at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign.

“These analyses are truly state-of-the-art, requiring artificial intelligence and high-performance computing super-charged by the smartest young scientists around,” said Scott Dodelson, a physicist at Carnegie Mellon University who co-leads the DES Science Committee with Elisabeth Krause of the University of Arizona. “What an honor to be part of this team.”

To quantify the distribution of dark matter and the effect of dark energy, DES relied on two main phenomena. First, on large scales, galaxies are not distributed randomly throughout space but rather form a weblike structure due to the gravity of dark matter. DES measured how this cosmic web has evolved over the history of the universe. The galaxy clustering that forms the cosmic web, in turn, revealed regions with a higher density of dark matter.

The Dark Energy Survey photographed the night sky using the 570-megapixel Dark Energy Camera on the 4-meter Blanco telescope at the Cerro Tololo Inter-American Observatory in Chile, a Program of the National Science Foundation’s NOIRLab. Photo: Reidar Hahn, Fermilab

Second, DES detected the signature of dark matter through weak gravitational lensing. As light from a distant galaxy travels through space, the gravity of both ordinary and dark matter can bend it, resulting in a distorted image of the galaxy as seen from Earth. By studying how the apparent shapes of distant galaxies are aligned with each other and with the positions of nearby galaxies along the line of sight, DES scientists inferred the spatial distribution (or clumpiness) of the dark matter in the universe.
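The statistical idea behind weak lensing can be sketched with simulated numbers: each galaxy’s observed shape is its random intrinsic shape plus a tiny coherent distortion, and averaging over many galaxies beats the randomness down. The shear value, shape noise, and galaxy count below are illustrative assumptions, not DES measurements.

```python
import numpy as np

rng = np.random.default_rng(0)

true_shear = 0.01     # assumed coherent lensing distortion (about 1%)
shape_noise = 0.3     # assumed scatter of intrinsic galaxy ellipticities
n_galaxies = 200_000  # illustrative sample size

# Observed ellipticity = intrinsic (random) ellipticity + lensing shear.
observed = rng.normal(0.0, shape_noise, n_galaxies) + true_shear

estimate = observed.mean()
uncertainty = observed.std() / np.sqrt(n_galaxies)
print(f"recovered shear: {estimate:.4f} +/- {uncertainty:.4f}")
```

Any single galaxy is dominated by its intrinsic shape, which is why surveys like DES need hundreds of millions of them to map the lensing signal.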

Analyzing the massive amounts of data collected by DES was a formidable undertaking. The team began by analyzing just the first year of data, which was released in 2017. That process prepared the researchers to use more sophisticated techniques for analyzing the larger data set, which includes the largest sample of galaxies ever used to study weak gravitational lensing.

For example, calculating the redshift of a galaxy — the change in light’s wavelength due to the expansion of the universe — is a key step toward measuring how both galaxy clustering and weak gravitational lensing change over cosmic history. The redshift of a galaxy is related to its distance, which allows the clustering to be characterized in both space and time.
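The definition of redshift itself is a one-line calculation; the H-alpha wavelengths below are a standard textbook example, and in practice an imaging survey like DES relies mostly on photometric redshifts estimated from broadband colors rather than individual spectral lines.

```python
def redshift(lambda_observed_nm, lambda_emitted_nm):
    """Redshift z: the fractional stretch of a known spectral line's wavelength."""
    return (lambda_observed_nm - lambda_emitted_nm) / lambda_emitted_nm

# Hydrogen's H-alpha line is emitted at 656.3 nm; if a galaxy's H-alpha arrives
# at 820.4 nm, every wavelength has been stretched by a factor 1 + z = 1.25.
print(f"z = {redshift(820.4, 656.3):.2f}")
```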

“Redshift calibration is one topic where we significantly improved upon our year-1 data analysis,” said Ross Cawthon, a UW–Madison physics postdoc who led the redshift calibration efforts for two of the main galaxy samples. “We developed new methods and refined old ones. It has been a huge effort by DES members from all over the world.”

Ten regions of the sky were chosen as “deep fields” that the Dark Energy Camera imaged repeatedly throughout the survey. Stacking those images together allowed the scientists to glimpse more distant galaxies. The team then used the redshift information from the deep fields to calibrate measurements of redshift in the rest of the survey region. This and other advancements in measurements and modeling, coupled with a threefold increase in data compared to the first year, enabled the team to pin down the density and clumpiness of the universe with unprecedented precision.

Along with the analysis of the weak-lensing signals, DES also precisely measures other probes that constrain the cosmological model in independent ways: galaxy clustering on larger scales (baryon acoustic oscillations), the frequency of massive clusters of galaxies, and high-precision measurements of the brightnesses and redshifts of Type Ia supernovae. These additional measurements will be combined with the current weak-lensing analysis to yield even more stringent constraints on the standard model.

“DES has delivered cost-effective, leading-edge science results directly related to Fermilab’s mission of pursuing the fundamental nature of matter, energy, space and time,” said Fermilab Director Nigel Lockyer. “A dedicated team of scientists, engineers and technicians from institutions around the world brought DES to fruition.”

The DES collaboration consists of over 400 scientists from 25 institutions in seven countries.

“The collaboration is remarkably young. It’s tilted strongly in the direction of postdocs and graduate students who are doing a huge amount of this work,” said DES Director and spokesperson Rich Kron, who is a Fermilab and University of Chicago scientist. “That’s really gratifying. A new generation of cosmologists are being trained using the Dark Energy Survey.”

UW–Madison physics graduate student Megan Tabbutt was one of the many significant contributors to this work, developing new methods that contributed to an independent validation of the galaxy clustering analysis.

DES concluded observations of the night sky in 2019. With the experience of analyzing the first half of the data, the team is now prepared to handle the complete data set. The final DES analysis is expected to paint an even more precise picture of the dark matter and dark energy in the universe. And the methods developed by the team have paved the way for future sky surveys to probe the mysteries of the cosmos.

“This work represents a ‘big statement’ from the Dark Energy Survey. DES data combined with other observations provide world-leading constraints on the nature of dark energy,” Bechtol says. “At the same time, we are training a new generation of cosmologists, and pioneering advanced methodologies that will be essential to realize the full potential of upcoming galaxy surveys, including the Vera C. Rubin Observatory Legacy Survey of Space and Time.”

The recent DES results were presented in a scientific seminar on May 27. Twenty-nine papers are available on the arXiv online repository.

Video: Exploring 7 billion light years of space with the Dark Energy Survey

For more information about the survey, please visit the experiment’s website.

Flexible, easy-to-scale nanoribbons move graphene toward use in tech applications

Greyscale scanning electron micrograph of graphene nanoribbons, resembling an intricate fingerprint; the pattern has also been described as a "zen garden."

From radio to television to the internet, telecommunications transmissions are simply information carried on light waves and converted to electrical signals.

Joel Siegel

Silicon-based fiber optics are currently the best structures for high-speed, long distance transmissions, but graphene — an all-carbon, ultra-thin and adaptable material — could improve performance even more.

In a study published April 16 in ACS Photonics, University of Wisconsin–Madison researchers fabricated graphene into the smallest ribbon structures to date using a method that makes scaling-up simple. In tests with these tiny ribbons, the scientists discovered they were closing in on the properties they needed to move graphene toward usefulness in telecommunications equipment.

“Previous research suggested that to be viable for telecommunication technologies, graphene would need to be structured prohibitively small over large areas, (which is) a fabrication nightmare,” says Joel Siegel, a UW–Madison graduate student in physics professor Victor Brar’s group and co-lead author of the study. “In our study, we created a scalable fabrication technique to make the smallest graphene ribbon structures yet and found that with modest further reductions in ribbon width, we can start getting to telecommunications range.”

For the full story, please visit: https://news.wisc.edu/flexible-easy-to-scale-nanoribbons-move-graphene-toward-use-in-tech-applications/

New nondestructive optical technique reveals the structure of mother-of-pearl

Most people know mother-of-pearl, an iridescent biomineral also called nacre, from buttons, jewelry, instrument inlays and other decorative flourishes. Scientists, too, have admired and marveled at nacre for decades, not only for its beauty and optical properties but because of its exceptional toughness.

“It’s one of the most-studied natural biominerals,” says Pupa Gilbert, a University of Wisconsin–Madison physics professor who has studied nacre for more than a decade. “It may not look like much — just a shiny, decorative material. But it can be 3,000 times more resistant to fracture than aragonite, the mineral from which it’s made. It has piqued the interest of materials scientists because making materials better than the sum of their parts is extremely attractive.”

Now, a new, nondestructive optical technique will unlock even more knowledge about nacre, and in the process could lead to a new understanding of climate history. Gilbert, UW–Madison electrical engineering professor Mikhail Kats — who is also an affiliate professor of physics — their students, and collaborators described the technique, called hyperspectral interference tomography, today in the journal Proceedings of the National Academy of Sciences.

Read the Full News Story | PNAS study

Highest-energy Cosmic Rays Detected in Star Clusters

For decades, researchers assumed the cosmic rays that regularly bombard Earth from the far reaches of the galaxy are born when stars go supernova — when they can no longer support the fusion occurring at their cores and explode.

Those gigantic explosions do indeed propel atomic particles over great distances at nearly the speed of light. However, new research suggests even supernovae — capable of devouring entire solar systems — are not strong enough to imbue particles with the sustained energies needed to reach petaelectronvolts (PeVs), the amount of kinetic energy attained by very high-energy cosmic rays.

And yet cosmic rays have been observed striking Earth’s atmosphere at exactly those energies, their passage marked, for example, by the detection tanks at the High-Altitude Water Cherenkov (HAWC) observatory near Puebla, Mexico. Instead of supernovae, the researchers — including UW–Madison’s Ke Fang — posit that star clusters like the Cygnus Cocoon serve as PeVatrons — PeV accelerators — capable of propelling particles across the galaxy at such high energies.

Their paradigm-shifting research provides compelling evidence that star-forming regions are PeVatrons and is published in two recent papers in Nature Astronomy and the Astrophysical Journal Letters.

For the full news story, please visit https://www.mtu.edu/news/stories/2021/march/not-so-fast-supernova-highestenergy-cosmic-rays-detected-in-star-clusters.html.