The Big Bang, supernovae, collisions of nuclei at breakneck speeds—our universe is filled with extreme phenomena, both natural and human-made. But the surprising thing is that all of these seemingly distinct processes are governed by the same underlying physics: a combination of quantum mechanics and Einstein’s theory of special relativity known as quantum field theory.
Theoretical nuclear and particle physicists wield quantum field theory in their efforts to understand interactions between many particles or the behavior of particles with extremely large energies. This is no easy feat: At least theoretically, quantum field theory plays out in an infinite universe with particles constantly popping in and out of existence. Even the world’s biggest supercomputer would never be able to model it exactly. Fortunately, there are many computational tricks that can make the problem more tractable—like cutting up the infinite universe into a finite grid and taking judicious statistical samples instead of tracking every parameter of every particle—but they can only help so much.
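The grid-and-sampling tricks mentioned above can be illustrated with a toy sketch. This is our own example, not the methods used in this research: a Metropolis Monte Carlo estimate of the average energy of a tiny one-dimensional lattice of spins, which draws judicious statistical samples instead of summing over all 2^N configurations.

```python
import math
import random

def energy(spins):
    # Nearest-neighbor coupling on a periodic 1D lattice of +1/-1 spins.
    n = len(spins)
    return -sum(spins[i] * spins[(i + 1) % n] for i in range(n))

def metropolis_estimate(n_sites=16, beta=0.5, n_steps=20000, seed=1):
    # Estimate the thermal average energy by sampling configurations
    # with Metropolis updates, rather than enumerating all 2**n_sites.
    rng = random.Random(seed)
    spins = [rng.choice((-1, 1)) for _ in range(n_sites)]
    e = energy(spins)
    total = 0.0
    for _ in range(n_steps):
        i = rng.randrange(n_sites)
        # Energy change from flipping spin i (negative index wraps around).
        dE = 2 * spins[i] * (spins[i - 1] + spins[(i + 1) % n_sites])
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spins[i] *= -1
            e += dE
        total += e
    return total / n_steps
```

Real lattice field theory calculations work on four-dimensional spacetime grids with far richer degrees of freedom, but the core idea is the same: a well-chosen random sample stands in for an astronomically large sum.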
Over the past few years, a growing group of scientists has become wise to the potential of quantum computers to approach these calculations in a completely new way. With a fully functioning quantum computer, a lot of the approximations could be avoided, and the quantum nature of the universe could be modeled with true quantum hardware. However, quantum computers are not yet big and reliable enough to really tackle these problems, and the algorithms nuclear and particle physicists would need to run on them are not yet fully developed.
“Even if we have large-scale, fully capable quantum computers tomorrow,” said Zohreh Davoudi, associate professor of physics at UMD, “we don’t actually have all the theoretical tools and techniques to use them to solve our grand-challenge problems.”
Classical computers require exponential resources to simulate quantum physics: to simulate one extra tick of the clock or include one extra particle, the amount of computing power must grow by a multiplicative factor. So classical methods resort to approximations, and those approximations fall short because they leave out details and close off certain kinds of questions. They can’t keep up with the real-time quantum evolution of the early universe. They can’t track what happens during collisions of heavy nuclei. And they are forced to ignore the quantum interactions between the myriad particles in high-energy settings, like those emitted by an exploding star. A quantum computer, however, could tackle these problems on their own quantum turf, without needing as many resources or resorting to as many approximations.
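A back-of-the-envelope sketch makes that exponential scaling concrete. The numbers below are our own illustration, not figures from the researchers: storing the full quantum state of n two-level particles requires 2^n complex amplitudes, so the memory needed doubles with every particle added.

```python
def state_vector_bytes(n_particles, bytes_per_amplitude=16):
    # One complex double-precision amplitude = two 8-byte floats.
    # A system of n two-level particles has 2**n amplitudes.
    return (2 ** n_particles) * bytes_per_amplitude

# Memory needed to hold the exact state, in GiB:
for n in (30, 40, 50):
    gib = state_vector_bytes(n) / 2**30
    print(f"{n} particles -> {gib:,.0f} GiB")
```

Thirty particles fit in a laptop's memory, forty demand a supercomputer, and fifty are already out of reach for exact storage, which is why classical simulations lean so heavily on approximations.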
Now, researchers want to make sure the nascent effort to use quantum computers to simulate the extreme events of the universe continues to thrive. Davoudi, along with JQI Adjunct Fellow and College Park Professor of Physics Chris Monroe and other researchers, penned a whitepaper laying out the case for funding quantum simulation research in particle physics, published in the journal PRX Quantum in May 2023. Davoudi also co-authored a similar whitepaper in the field of nuclear physics, available on the arXiv preprint server.
“It's a responsibility of researchers to also think at a larger scale,” said Davoudi, who is also a Fellow of the Joint Center for Quantum Information and Computer Science (QuICS) and the associate director of education at the National Science Foundation Quantum Leap Challenge Institute for Robust Quantum Simulation (RQS). “If we think this field is intellectually promising, interesting, and worth investing in as a scientist, we have to make sure that it stays healthy and lively for generations to come.”
Some sub-fields of physics, including nuclear and particle physics, engage in long-term planning for their futures. Nuclear physicists in the U.S. plan seven years ahead, and particle physicists plan a full decade ahead. Researchers from many universities and national laboratories come together in meetings, seminars, and panel discussions over the course of a year to decide what the highest priorities in the field should be. Funding agencies in the U.S. and worldwide have historically taken these conclusions seriously. The whitepapers developed by Davoudi and her co-authors are a part of those efforts. In them, they argue for the importance of studying quantum simulation for nuclear and particle physics and make specific recommendations for further development.
“These new research directions in both nuclear physics and high-energy physics were not part of the last U.S. long-range planning processes, because the idea had simply not been introduced at the time,” Davoudi said.
Indeed, the ideas weren’t even on Davoudi’s radar six years ago when she came to UMD to join the physics faculty as a theoretical nuclear physicist. While she was busy searching for an apartment, Davoudi saw an announcement for a workshop hosted by QuICS exploring the intersection of her field with quantum computing. Instead of looking for a place to live, she spent several days at the workshop, talking to theorists and experimentalists alike.
Davoudi was enticed by the promise of quantum simulations to solve the kinds of problems she was unable to address with classical computational tools, and it changed the course of her career. In the years since, she has developed new theoretical techniques and collaborated with experimentalists to push the boundaries of what quantum simulators can do to help uncover the basic physics of the universe.
Davoudi wants to ensure that this burgeoning field continues to thrive into the future. In the whitepapers, she and her co-authors identified specific problems where quantum computing holds the most promise. Then, they made three main recommendations to ensure the success of the field for the next seven to 10 years.
First, they recommended funding for theoretical efforts to develop algorithms that run on quantum hardware. Even though the potential of quantum computing is clear, detailed algorithms for simulating quantum field theory on a quantum computer are still in their infancy. Developing these will require a dedicated effort by the nuclear and particle physics communities.
Second, they advocated for greater interdisciplinary communication between the nuclear, particle and quantum physics communities. Different quantum computer architectures will have different quirks and advantages, and the field theory folks will need to have access to them to figure out how to make the best use of each one. Hardware developers may, in turn, be motivated to engineer specific capabilities for the kinds of problems nuclear and particle physicists want to study. This can only be accomplished through close interdisciplinary collaboration, the authors claim.
“As a community, we cannot isolate ourselves from the quantum information and quantum technology communities,” Davoudi said.
Third, Davoudi and her co-authors believe it is key to bring in junior researchers, train them with a diverse set of skills, and give them opportunities to contribute to this growing effort. As with the QuICS workshop that inspired Davoudi, the community should invest in education and training for the relevant skills through partnerships between universities, national labs and the private sector.
“This is a new field, and you have to build the workforce,” Davoudi said. “I think it's important for our field to bring in diverse talent that would allow the field to continue to intellectually grow, and be able to solve the problems that we would like to eventually solve.”
Written by Dina Genkina