Quantum Computers Are Starting to Simulate the World of Subatomic Particles

There is a heated race to make quantum computers deliver practical results. But this race isn't just about making better technology—usually defined in terms of having fewer errors and more qubits, which are the basic building blocks that store quantum information. At least for now, the quantum computing race requires grappling with the complex realities of both quantum technologies and difficult problems. To develop quantum computing applications, researchers need to understand a particular quantum technology and a particular challenging problem and then adapt the strengths of the technology to address the intricacies of the problem.

Assistant Professor Zohreh Davoudi, a member of the Maryland Center for Fundamental Physics, has been working with multiple colleagues at UMD to ensure that the problems that she cares about are among those benefiting from early advances in quantum computing. The best modern computers have often proven inadequate at simulating the details that nuclear physicists need to understand our universe at the deepest levels.

Davoudi and JQI Fellow Norbert Linke are collaborating to push the frontier of both the theories and technologies of quantum simulation through research that uses current quantum computers. Their research is intended to illuminate a path toward simulations that can cut through the current blockade of fiendishly complex calculations and deliver new theoretical predictions. For example, quantum simulations might be the perfect tool for producing new predictions based on theories that combine Einstein’s theory of special relativity and quantum mechanics to describe the basic building blocks of nature—the subatomic particles and the forces among them—in terms of “quantum fields.” Such predictions are likely to reveal new details about the outcomes of high-energy collisions in particle accelerators and other lingering physics questions.

Current quantum computers, utilizing technologies like the trapped ion device on the left, are beginning to tackle problems theoretical physicists care about, like simulating particle physics models. More than 60 years ago, the physicist Julian Schwinger laid the foundation for describing the relativistic and quantum mechanical behaviors of subatomic particles and the forces among them, and now his namesake model is serving as an early challenge for quantum computers. (Credit: Z. Davoudi/UMD with elements adopted from Emily Edwards/JQI (trapped ion device), Dizzo via Getty Images (abstract photon rays), and CERN (Schwinger photo))

The team’s current efforts might help nuclear physicists, including Davoudi, to take advantage of the early benefits of quantum computing instead of needing to rush to catch up when quantum computers hit their stride. For Linke, who is also an assistant professor of physics at UMD, the problems faced by nuclear physicists provide a challenging practical target to take aim at during these early days of quantum computing.

In a new paper in PRX Quantum, Davoudi, Linke and their colleagues have combined theory and experiment to push the boundaries of quantum simulations—testing the limits of both the ion-based quantum computer in Linke’s lab and proposals for simulating quantum fields. Both Davoudi and Linke are also part of the NSF Quantum Leap Challenge Institute for Robust Quantum Simulation that is focused on exploring the rich opportunities presented by quantum simulations.

The new project wasn’t about adding more qubits to the computer or stamping out every source of error. Rather, it was about understanding how current technology can be tested against quantum simulations that are relevant to nuclear physicists so that both the theoretical proposals and the technology can progress in practical directions. The result was both a better quantum computer and improved quantum simulations of a basic model of subatomic particles.

“I think for the current small and noisy devices, it is important to have a collaboration of theorists and experimentalists so that we can implement useful quantum simulations,” says JQI graduate student Nhung Nguyen, who was the first author of the paper. “There are many things we could try to improve on the experimental side, but knowing which one leaves the greatest impact on the result helps guide us in the right direction. And what makes the biggest impact depends a lot on what you try to simulate.”

The team knew the biggest and most rewarding challenges in nuclear physics are beyond the reach of current hardware, so they started with something a little simpler than reality: the Schwinger model. Instead of looking at particles in reality’s three dimensions evolving over time, this model pares things down to particles existing in just one dimension over time. The researchers also further simplified things by using a version of the model that breaks continuous space into discrete sites. So in their simulations, space only exists as one line of distinct sites, like a single column cut from a chessboard, and the particles are like pieces that must always reside in one square or another along that column.

Despite the model being stripped of so much of reality’s complexity, interesting physics can still play out in it. The physicist Julian Schwinger developed this simplified model of quantum fields to mimic parts of physics that are integral to the formation of both the nuclei at the centers of atoms and the elementary particles that make them up.

“The Schwinger model kind of hits the sweet spot between something that we can simulate and something that is interesting,” says Minh Tran, an MIT postdoctoral researcher and former JQI graduate student who is a coauthor on the paper. “There are definitely more complicated and more interesting models, but they're also more difficult to realize in the current experiments.”

In this project, the team looked at simulations of electrons and positrons—the antiparticles of electrons—appearing and disappearing over time in the Schwinger model. For convenience, the team started the simulation with an empty space—a vacuum. The creation and annihilation of a particle and its antiparticle out of vacuum is one of the significant predictions of quantum field theory. Schwinger’s work establishing this description of nature earned him, alongside Richard Feynman and Sin-Itiro Tomonaga, the Nobel Prize in physics in 1965. Simulating the details of such fundamental physics from first principles is a promising and challenging goal for quantum computers.

Nguyen led the experiment that simulated Schwinger’s pair production on the Linke Lab quantum computer, which uses ions—charged atoms—as the qubits.

“We have a quantum computer, and we want to push the limits,” Nguyen says. “We want to see if we optimize everything, how long can we go with it and is there something we can learn from doing the experimental simulation.”

The researchers simulated the model using up to six qubits and a preexisting language of computing actions called quantum gates. This approach is an example of digital simulation. In their computer, the ions stored information about whether particles or antiparticles exist at each site in the model, and interactions were described using a series of gates that can change the ions and let them influence each other.

In the experiments, the gates only manipulated one or two ions at a time, so the simulation couldn’t include everything in the model interacting and changing simultaneously. The reality of digital simulations demands the model be chopped into multiple pieces that each evolve over small steps in time. The team had to figure out the best sequence of their individual quantum gates to approximate the model changing continuously over time.
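
To make that chopping concrete, here is a minimal classical sketch in Python (not the team’s actual gate sequence) of a first-order Trotter decomposition for a spin-encoded lattice Schwinger model. The system size N and the couplings w, m and J are illustrative choices; the Hamiltonian follows the standard Kogut-Susskind and Jordan-Wigner construction with the gauge field eliminated through Gauss’s law, and splitting the hopping into XX and YY pieces loosely mirrors the two-qubit entangling gates native to trapped-ion hardware.

```python
import numpy as np
from scipy.linalg import expm

# Pauli matrices
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1.0, -1.0])

def op(single, site, n):
    """Embed a single-qubit operator at position `site` of an n-qubit register."""
    out = np.array([[1.0]])
    for k in range(n):
        out = np.kron(out, single if k == site else I2)
    return out

N = 4                    # staggered lattice sites = qubits (illustrative size)
w, m, J = 1.0, 0.5, 1.0  # hopping, fermion mass, gauge coupling (illustrative)
dim = 2 ** N

# Spin-encoded lattice Schwinger model: Kogut-Susskind staggered fermions,
# Jordan-Wigner mapped, with the gauge field eliminated via Gauss's law.
H_xx = sum(0.5 * w * op(X, n, N) @ op(X, n + 1, N) for n in range(N - 1))
H_yy = sum(0.5 * w * op(Y, n, N) @ op(Y, n + 1, N) for n in range(N - 1))
H_m = sum(0.5 * m * (-1) ** n * op(Z, n, N) for n in range(N))

# The electric field on each link is fixed by the staggered charges to its left.
charge = [(op(Z, k, N) + (-1) ** k * np.eye(dim)) / 2 for k in range(N)]
H_el = J * sum(sum(charge[:n + 1]) @ sum(charge[:n + 1]) for n in range(N - 1))
H_diag = H_m + H_el
H = H_xx + H_yy + H_diag

# First-order Trotter step: evolve under each piece for a short time dt;
# repeating the step r times approximates exp(-iHt) up to ordering errors.
t, r = 1.0, 10
dt = t / r
U_step = expm(-1j * H_xx * dt) @ expm(-1j * H_yy * dt) @ expm(-1j * H_diag * dt)
U_trot = np.linalg.matrix_power(U_step, r)
U_exact = expm(-1j * H * t)
print("Trotter error (spectral norm):", np.linalg.norm(U_trot - U_exact, 2))
```

Increasing the step count r shrinks the ordering error at the cost of more gates, which is precisely the trade-off the team had to negotiate on real, noisy hardware.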

“You're just approximately applying parts of what you want to do bit by bit,” Linke says. “And so that's an approximation, but all the orderings—which one you apply first, and which one second, etc.—will approximate the same actual evolution. But the errors that come up are different from different orderings. So there's a lot of choices here.”

Many things go into making those choices, and one important factor is the model’s symmetries. In physics, a symmetry describes a change that leaves the equations of a model unchanged. For instance, in our universe, rotating only changes your perspective and not the equations describing gravity, electricity or magnetism. However, the equations that describe specific situations often have more restrictive symmetries. So if an electron is alone in space, it will see the same physics in every direction. But if that electron is between the atoms in a metal, then the direction matters a lot: Only specific directions look equivalent. Physicists often benefit from considering symmetries that are more abstract than moving around in space, like symmetries about reversing the direction of time.

The Schwinger model makes a good starting point for the team’s line of research because of how it mimics aspects of complex nuclear dynamics and yet has simple symmetries.

“Once we aim to simulate the interactions that are in play in nuclear physics, the expression of the relevant symmetries is way more complicated and we need to be careful about how to encode them and how to take advantage of them,” Davoudi says. “In this experiment, putting things on a one-dimensional grid is only one of the simplifications. By adopting the Schwinger model, we have also greatly simplified the notion of symmetries, which end up becoming a simple electric charge conservation. In our three-dimensional reality though, those more complicated symmetries are the reason we have bound atomic nuclei and hence everything else!”

The Schwinger model’s electric charge conservation symmetry keeps the total amount of electric charge the same. That means that if the simulation of the model starts from the empty state, then an electron should always be accompanied by a positron when it pops into or out of existence. So by choosing a sequence of quantum gates that always maintains this rule, the researchers knew that any result that violated it must be an error from experimental imperfections. They could then throw out the obviously bad data—a process called post-selection. This helped them avoid corrupted data but required more runs than if the errors could have been prevented.
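
As a sketch of how that filtering might look in software (the shot counts and the staggered-charge convention here are hypothetical, chosen only for illustration), each measured bitstring is assigned a total charge, and any shot whose charge differs from the vacuum's is discarded:

```python
def total_charge(bits):
    """Total charge of a measured bitstring under an illustrative
    staggered-fermion convention: site n carries q_n = (z_n + (-1)**n) / 2,
    where z_n = +1/-1 corresponds to the measured bit 0/1."""
    z = [1 - 2 * b for b in bits]
    return sum((z[n] + (-1) ** n) / 2 for n in range(len(z)))

# Hypothetical shot counts from a run that starts in the charge-zero vacuum:
counts = {(0, 1, 0, 1): 812, (1, 0, 0, 1): 95, (0, 1, 1, 0): 32, (1, 1, 0, 1): 61}
kept = {b: c for b, c in counts.items() if total_charge(b) == 0}
discarded = sum(counts.values()) - sum(kept.values())
print(f"kept {sum(kept.values())} shots; discarded {discarded} charge-violating shots")
```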

The team also explored a separate way to use the Schwinger model’s symmetries. Some orderings of the simulation steps might prove advantageous even though they don’t obey the model’s symmetry rules, so suppressing the symmetry-violating errors those orderings introduce could prove useful. Earlier this year, Tran and colleagues at JQI showed there is a way to cause certain errors, including ones from a symmetry-defying order of steps, to interfere with each other and cancel out.
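
Schematically, and continuing the classical Python sketch above (it reuses op, U_step, U_exact, dim, N and r from that block, and illustrates only the idea of the proposal, not the exact protocol run in the lab): conjugating each Trotter step by a rotation generated by the conserved total charge leaves the ideal evolution unchanged but gives charge-violating error terms step-dependent phases that can interfere destructively.

```python
# Symmetry protection, sketched classically. Q is the conserved total charge;
# since [H, Q] = 0, conjugating each Trotter step by C_k = exp(-i * theta_k * Q)
# leaves the ideal evolution untouched while rotating charge-violating errors
# by step-dependent phases so they can cancel across steps.
Q = sum((op(Z, n, N) + (-1) ** n * np.eye(dim)) / 2 for n in range(N))
U_prot = np.eye(dim)
for k in range(r):
    C = expm(-1j * (2 * np.pi * k / r) * Q)    # uniformly spaced angles: one simple choice
    U_prot = C.conj().T @ U_step @ C @ U_prot  # apply C_k† (Trotter step) C_k
print("error with symmetry protection:", np.linalg.norm(U_prot - U_exact, 2))
```

Only the charge-violating part of the error is targeted; charge-conserving Trotter error, and generic hardware noise, is untouched, consistent with what the team observed.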

The researchers applied the proposed procedure in an experiment for the first time. They found that it did decrease errors that violated the symmetry rules. However, due to other errors in the experiment, the process didn’t generally improve the results and overall was not better than resorting to post-selection. The fact that this method didn’t work well for this experiment provided the team with insights into the errors occurring during their simulations.

All the tweaking and trial and error paid off. Thanks to the improvements the researchers made, including upgrading the hardware and implementing strategies like post-selection, they increased how much information they could get from the simulation before it was overwhelmed by errors. The experiment simulated the Schwinger model evolving for about three times longer than previous quantum simulations. This progress meant that instead of just seeing part of a cycle of particle creation and annihilation in the Schwinger model, they were able to observe multiple complete cycles.

“What is exciting about this experiment for me is how much it has pushed our quantum computer forward,” says Linke. “A computer is a generic machine—you can do basically anything on it. And this is true for a quantum computer; there are all these various applications. But this problem was so challenging, that it inspired us to do the best we can and upgrade our system and go in new directions. And this will help us in the future to do more.”

There is still a long road before the quantum computing race ends, and Davoudi isn’t betting on just digital simulations to deliver the quantum computing prize for nuclear physicists. She is also interested in analog simulations and hybrid simulations that combine digital and analog approaches. In analog simulations, researchers directly map parts of their model onto those of an experimental simulation. Analog quantum simulations generally require fewer computing resources than their digital counterparts. But implementing analog simulations often requires experimentalists to invest more effort in specialized preparation since they aren’t taking advantage of a set of standardized building blocks that has been preestablished for their quantum computer.

Moving forward, Davoudi and Linke are interested in further research on more efficient mappings onto the quantum computer and possibly testing simulations using a hybrid approach they have proposed. In this approach, they would replace a particularly challenging part of the digital mapping by using the phonons—quantum particles of sound—in Linke Lab’s computer as direct stand-ins for the photons—quantum particles of light—in the Schwinger model and other similar models in nuclear physics.

“Being able to see that the kind of theories and calculations that we do on paper are now being implemented in reality on a quantum computer is just so exciting,” says Davoudi. “I feel like I'm in a position that in a few decades, I can tell the next generations that I was so lucky to be able to do my calculations on the first generations of quantum computers. Five years ago, I could not have imagined this day.”

Original story by Bailey Bedford: https://jqi.umd.edu/news/quantum-computers-are-starting-simulate-world-subatomic-particles


Bennewitz Receives DOE Fellowship

Elizabeth Bennewitz, a first-year physics graduate student, has received a Department of Energy Computational Science Graduate Fellowship. Bennewitz is one of 33 recipients in 2022—the largest number of students this program has ever selected in a year.

The fellowships provide financial support, including tuition and a stipend, to each fellow for up to four years of their education. Additionally, Bennewitz and the other recipients will gain practical experience working in a DOE laboratory for three months.

“I am very honored to receive this fellowship and am grateful for the freedom it gives me to explore my interests in quantum information and computing,” Bennewitz says. “I'm also very thankful for all the support and guidance I received from my professors and peers along the way at Bowdoin College, Perimeter and here at Maryland.”

Elizabeth Bennewitz (credit: Dan Spencer)

The fellowship is funded by the DOE's Office of Science and the National Nuclear Security Administration's Office of Defense Programs in order to train future leaders in the field of computational science.

“[The] Office of Science is proud to support the training of a diverse and accomplished group of students to become leaders among the next generation of computational scientists,” says Barbara Helland, DOE Associate Director of Science for Advanced Scientific Computing Research, in a press release. “As evidenced by the success of the current CSGF alumni, the new fellows’ research will advance efforts in a wide range of science and engineering topics that benefit Administration priorities and the American people.”

Bennewitz is working with Joint Quantum Institute and Joint Center for Quantum Information and Computer Science (QuICS) Fellow Alexey Gorshkov. She has chosen to research large collections of interacting quantum particles—called many-body quantum systems. The physics of quantum interactions is an area of cutting-edge research and is important to quantum computer technologies. Many-body quantum interactions can also be used to develop simulations to explore challenging problems in materials science and chemistry.

“In my research, I look forward to using high-performance computing techniques to further our understanding of quantum systems as well as studying the high-performance computing capabilities of quantum systems themselves,” Bennewitz says. 

In her first year as a graduate student, Bennewitz has started exploring ways that quantum simulators might help researchers understand the interactions that are responsible for holding particles together to form the nuclei that are the cores of atoms.

“Echoing my thoughts from when Elizabeth was named a finalist for the Hertz fellowship, I'm again very happy for Elizabeth, and I'm again excited and honored that she chose to work with my group,” Gorshkov says.

Original story by Bailey Bedford: https://jqi.umd.edu/news/jqi-graduate-student-receives-doe-fellowship

Ott Elected to National Academy of Sciences

Distinguished University Professor Edward Ott has been elected to the 2022 class of the National Academy of Sciences, one of 120 members and 30 international members recognized for their exceptional and continuing achievements in original research. Richard Walker from the Department of Geology was also chosen.  

“I am thrilled that Dr. Ott and Dr. Walker have been elected to the National Academy of Sciences,” said Amitabh Varshney, Dean of the College of Computer, Mathematical, and Natural Sciences (CMNS). “They are world-renowned scholars and leaders in their fields. This honor is richly deserved, and we are proud to have them as colleagues here at Maryland.”

Their election brings the number of CMNS faculty members in the National Academy of Sciences to 18.

Ott holds appointments in physics, the Department of Electrical and Computer Engineering, and the Institute for Research in Electronics and Applied Physics. He has spent his career conducting research in areas including the basic theory and applications of nonlinear dynamics, wave chaos, control of chaos, fractal basin boundaries, dynamics of large interconnected networks, chaotic dynamics of fluids, models of brain dynamics and learning, and weather prediction.  

Of his NAS election, he said, "I feel greatly honored by this recognition of my work, and also regard this as a recognition of the important role that the general field in which I have mostly worked—nonlinear dynamics and chaos—is now playing in science and technology research.”

Ott was nominated as a foreign member of the Academia Europaea in 2020 and is a fellow of the IEEE, American Physical Society, Society for Industrial and Applied Mathematics and World Innovation Foundation. He received the A. James Clark School of Engineering Outstanding Faculty Research Award in 2005.


Bilayer Graphene Inspires Two-Universe Cosmological Model

Physicists sometimes come up with crazy stories that sound like science fiction. Some turn out to be true, like how the curvature of space and time described by Einstein was eventually borne out by astronomical measurements. Others linger on as mere possibilities or mathematical curiosities.

In a new paper in Physical Review Research, Victor Galitski and graduate student Alireza Parhizkar have explored the imaginative possibility that our reality is only one half of a pair of interacting worlds. Their mathematical model may provide a new perspective for looking at fundamental features of reality—including why our universe expands the way it does and how that relates to the most minuscule lengths allowed in quantum mechanics. These topics are crucial to understanding our universe and are part of one of the great mysteries of modern physics.

The pair of scientists stumbled upon this new perspective when they were looking into research on sheets of graphene—single atomic layers of carbon in a repeating hexagonal pattern. They realized that experiments on the electrical properties of stacked sheets of graphene produced results that looked like little universes and that the underlying phenomenon might generalize to other areas of physics. In stacks of graphene, new electrical behaviors arise from interactions between the individual sheets, so maybe unique physics could similarly emerge from interacting layers elsewhere—perhaps in cosmological theories about the entire universe.

A curved and stretched sheet of graphene lying over another curved sheet creates a new pattern that impacts how electricity moves through the sheets. A new model suggests that similar physics might emerge if two adjacent universes are able to interact. (Credit: Alireza Parhizkar, JQI)

“We think this is an exciting and ambitious idea,” says Galitski, who is also a Fellow of the Joint Quantum Institute (JQI). “In a sense, it's almost suspicious that it works so well by naturally ‘predicting’ fundamental features of our universe such as inflation and the Higgs particle as we described in a follow-up preprint.”

Stacked graphene’s exceptional electrical properties and possible connection to our reality having a twin come from the special physics produced by patterns called moiré patterns. Moiré patterns form when two repeating patterns—anything from the hexagons of atoms in graphene sheets to the grids of window screens—overlap and one of the layers is twisted, offset, or stretched.

The patterns that emerge can repeat over lengths that are vast compared to the underlying patterns. In graphene stacks, the new patterns change the physics that plays out in the sheets, notably the electrons' behaviors. In the special case called “magic angle graphene,” the moiré pattern repeats over a length that is about 52 times longer than the pattern length of the individual sheets, and the energy level that governs the behaviors of the electrons drops precipitously, allowing new behaviors, including superconductivity.
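
For small twist angles, that stretch of scales follows from simple geometry. With graphene's lattice constant a ≈ 0.246 nm and the magic angle θ ≈ 1.1°, the standard formula for the moiré period reproduces the factor of roughly 52 quoted above:

```latex
\lambda_{\text{moir\'e}} \;=\; \frac{a}{2\sin(\theta/2)}
\;\approx\; \frac{0.246\ \text{nm}}{2\sin(0.55^{\circ})}
\;\approx\; 12.8\ \text{nm}
\;\approx\; 52\,a .
```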

Galitski and Parhizkar realized that the physics in two sheets of graphene could be reinterpreted as the physics of two two-dimensional universes where electrons occasionally hop between universes. This inspired the pair to generalize the math to apply to universes made of any number of dimensions, including our own four-dimensional one, and to explore if similar phenomena resulting from moiré patterns might pop up in other areas of physics. This started a line of inquiry that brought them face to face with one of the major problems in cosmology.

“We discussed if we can observe moiré physics when two real universes coalesce into one,” Parhizkar says. “What do you want to look for when you're asking this question? First you have to know the length scale of each universe.”

A length scale—or a scale of a physical value generally—describes what level of accuracy is relevant to whatever you are looking at. If you’re approximating the size of an atom, then a ten-billionth of a meter matters, but that precision is useless if you’re measuring a football field. Physics theories put fundamental limits on some of the smallest and largest scales that make sense in our equations.

The scale of the universe that concerned Galitski and Parhizkar is called the Planck length, and it defines the smallest length that is consistent with quantum physics. The Planck length is directly related to a constant—called the cosmological constant—that is included in Einstein’s field equations of general relativity. In the equations, the constant influences whether the universe—outside of gravitational influences—tends to expand or contract.
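
For reference, the Planck length is built from three fundamental constants: the reduced Planck constant, Newton's gravitational constant, and the speed of light.

```latex
\ell_{P} \;=\; \sqrt{\frac{\hbar G}{c^{3}}} \;\approx\; 1.6 \times 10^{-35}\ \text{m} .
```

That is some twenty orders of magnitude smaller than a proton, far beyond the reach of any direct measurement.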

This constant is fundamental to our universe. So to determine its value, scientists, in theory, just need to look at the universe, measure several details, like how fast galaxies are moving away from each other, plug everything into the equations and calculate what the constant must be.

This straightforward plan hits a problem because our universe contains both relativistic and quantum effects. The effect of quantum fluctuations across the vast vacuum of space should influence behaviors even at cosmological scales. But when scientists try to combine the relativistic understanding of the universe given to us by Einstein with theories about the quantum vacuum, they run into problems.

One of those problems is that whenever researchers attempt to use observations to approximate the cosmological constant, the value they calculate is much smaller than they would expect based on other parts of the theory. More importantly, the value jumps around dramatically depending on how much detail they include in the approximation instead of homing in on a consistent value. This lingering challenge is known as the cosmological constant problem, or sometimes the “vacuum catastrophe.”

“This is the largest—by far the largest—inconsistency between measurement and what we can predict by theory,” Parhizkar says. “It means that something is wrong.”

Since moiré patterns can produce dramatic differences in scales, moiré effects seemed like a natural lens to view the problem through. Galitski and Parhizkar created a mathematical model (which they call moiré gravity) by taking two copies of Einstein’s theory of how the universe changes over time and introducing extra terms in the math that let the two copies interact. Instead of looking at the scales of energy and length in graphene, they were looking at the cosmological constants and lengths in universes.

Galitski says that this idea arose spontaneously when they were working on a seemingly unrelated project that is funded by the John Templeton Foundation and is focused on studying hydrodynamic flows in graphene and other materials to simulate astrophysical phenomena.

Playing with their model, they showed that two interacting worlds with large cosmological constants could override the expected behavior from the individual cosmological constants. The interactions produce behaviors governed by a shared effective cosmological constant that is much smaller than the individual constants. The calculation for the effective cosmological constant circumvents the problem researchers have with the value of their approximations jumping around because over time the influences from the two universes in the model cancel each other out.
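
As a toy caricature of that cancellation (emphatically not the paper's actual derivation, which couples two copies of Einstein's equations), suppose the two worlds' vacuum-energy contributions enter the shared, effective sector with opposite signs. A tiny residual can then survive even when each individual constant is enormous:

```latex
\Lambda_{\text{eff}} \;\sim\; \Lambda_{1} - \Lambda_{2},
\qquad
\Lambda_{1},\,\Lambda_{2} \sim 10^{120}\,\Lambda_{\text{obs}},
\qquad
\Lambda_{1} - \Lambda_{2} \sim \Lambda_{\text{obs}} .
```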

“We don't claim—ever—that this solves the cosmological constant problem,” Parhizkar says. “That's a very arrogant claim, to be honest. This is just a nice insight that if you have two universes with huge cosmological constants—like 120 orders of magnitude larger than what we observe—and if you combine them, there is still a chance that you can get a very small effective cosmological constant out of them.”

In preliminary follow-up work, Galitski and Parhizkar have started to build upon this new perspective by diving into a more detailed model of a pair of interacting worlds—which they dub “bi-worlds.” Each of these worlds is a complete world on its own by our normal standards, and each is filled with matching sets of all matter and fields. Since the math allowed it, they also included fields that simultaneously lived in both worlds, which they dubbed “amphibian fields.”

The new model produced additional results the researchers find intriguing. As they put together the math, they found that part of the model looked like important fields that are part of reality. The more detailed model still suggests that two worlds could explain a small cosmological constant and provides details about how such a bi-world might imprint a distinct signature on the cosmic background radiation—the light that lingers from the earliest times in the universe.

This signature could possibly be seen—or definitively not be seen—in real world measurements. So future experiments could determine if this unique perspective inspired by graphene deserves more attention or is merely an interesting novelty in the physicists’ toy bin.

“We haven't explored all the effects—that's a hard thing to do, but the theory is falsifiable experimentally, which is a good thing,” Parhizkar says. “If it's not falsified, then it's very interesting because it solves the cosmological constant problem while describing many other important parts of physics. I personally don't have my hopes up for that— I think it is actually too big to be true.”

The research was supported by the Templeton Foundation and the Simons Foundation.

Original story by Bailey Bedford: https://jqi.umd.edu/news/bilayer-graphene-inspires-two-universe-cosmological-model