LHC Scientists Finally Detect Most Favored Higgs Decay

Candidate event showing the associated production of a Higgs boson and a Z boson, with the subsequent decay of the Higgs boson to a bottom quark and its antiparticle.

Scientists now know the fate of the vast majority of all Higgs bosons produced in the LHC.

Today at CERN, the Large Hadron Collider experiments ATLAS and CMS jointly announced the observation of the Higgs boson decaying into a pair of bottom quarks. This is predicted to be the most common way for Higgs bosons to decay, yet it was a difficult signal to isolate because it closely mimics ordinary background processes. The new observation is a big step forward in the quest to understand how the Higgs enables fundamental particles to acquire mass.

After several years of refining their techniques and gradually incorporating more data, both experiments finally saw evidence of the Higgs decaying to bottom quarks that exceeds the 5-sigma threshold of statistical significance typically required to claim a discovery. Both teams found their results were consistent with predictions based on the Standard Model. UMD professors Alberto Belloni, Drew Baden, Sarah Eno, Nick Hadley and Andris Skuja are members of the CMS collaboration.

Higgs bosons are produced in only roughly one out of a billion LHC collisions and live for just one-septillionth of a second before their energy is converted into a cascade of other particles. Because it is impossible to see Higgs bosons directly, scientists use these secondary particles to study the Higgs’ properties. Between its discovery in 2012 and this result, scientists had identified only about thirty percent of all predicted Higgs boson decays; the decay to bottom quarks, which should occur about sixty percent of the time, had not yet been observed.

“The Higgs boson is an integral component of our universe and theorized to give all fundamental particles their mass,” said Alberto Belloni of the University of Maryland. “But previously we had only directly seen the Higgs couplings to the tau lepton, and the W and Z bosons. Now we have seen the decay of the Higgs to a quark-antiquark pair. This measurement shows for the first time that the Higgs gives mass to a quark.”

The Higgs field is theorized to interact with all massive particles in the Standard Model, the best theory scientists have to explain the behavior of subatomic particles. But many scientists suspect that the Higgs could also interact with massive particles outside the Standard Model, such as dark matter. By finding and mapping the Higgs bosons’ interactions with known particles, scientists can simultaneously probe for new phenomena.

The next step is to increase the precision of these measurements so that scientists can study this decay mode with a much greater resolution and explore what secrets the Higgs boson might be hiding.

Further information:

ATLAS: https://atlas.cern/updates/press-statement/observation-higgs-boson-decay-pair-bottom-quarks

CMS: http://cms.cern/higgs-observed-decaying-b-quarks-submitted

JQI Scientists Monroe and Gorshkov are Part of a New, $15 Million NSF Quantum Computing Project

A fabricated trap that researchers use to capture and control atomic ion qubits (quantum bits). (Credit: K. Hudek/IonQ and E. Edwards/JQI)

NSF has announced a $15 million award to a collaboration of seven institutions including the University of Maryland. The goal: Build the world’s first practical quantum computer.

"Quantum computers will change everything about the technology we use and how we use it, and we are still taking the initial steps toward realizing this goal," said NSF Director France Córdova. "Developing the first practical quantum computer would be a major milestone. By bringing together experts who have outlined a path to a practical quantum computer and supporting its development, NSF is working to take the quantum revolution from theory to reality."

Dubbed the Software-Tailored Architecture for Quantum co-design (STAQ) project, the new effort seeks to demonstrate a quantum advantage over traditional computers within five years using ion trap technology.

The project is the result of a National Science Foundation Ideas Lab—a week-long, free-form exchange among researchers from a wide range of fields that aims to spawn creative, collaborative proposals to address a given research challenge. The result of each Ideas Lab is interdisciplinary research that is high-risk, high-reward, cutting-edge and unlikely to be funded through traditional grant mechanisms.

JQI Fellow Christopher Monroe will lead the team developing the hardware. JQI Fellow Alexey Gorshkov will be involved in the theory side of the collaboration.

Text for this news item was adapted from the Duke University and NSF press releases on the award.

RESEARCH CONTACT
Christopher Monroe
MEDIA CONTACT
Emily Edwards

Complexity Test Offers New Perspective on Small Quantum Computers

Simulating the behavior of quantum particles hopping around on a grid may be one of the first problems tackled by early quantum computers. (Credit: E. Edwards/JQI)

State-of-the-art quantum devices are not yet large enough to be called full-scale computers. The biggest comprise just a few dozen qubits—a meager count compared to the billions of bits in an ordinary computer’s memory. But steady progress means that these machines now routinely string together 10 or 20 qubits and may soon hold sway over 100 or more.

In the meantime, researchers are busy dreaming up uses for small quantum computers and mapping out the landscape of problems they’ll be suited to solving. A paper by researchers from the Joint Quantum Institute (JQI) and the Joint Center for Quantum Information and Computer Science (QuICS), published recently in Physical Review Letters, argues that a novel non-quantum perspective may help sketch the boundaries of this landscape and potentially even reveal new physics in future experiments.

The new perspective involves a mathematical tool—a standard measure of computational difficulty known as sampling complexity—that gauges how easy or hard it is for an ordinary computer to simulate the outcome of a quantum experiment. Because the predictions of quantum physics are probabilistic, a single experiment could never verify that these predictions are accurate. You would need to perform many experiments, just like you would need to flip a coin many times to convince yourself that you’re holding an everyday, unbiased nickel.

If an ordinary computer takes a reasonable amount of time to mimic one run of a quantum experiment—by producing samples with approximately the same probabilities as the real thing—the sampling complexity is low; if it takes a long time, the sampling complexity is high.
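The coin-flip analogy can be made concrete with a short sketch (a hypothetical illustration, not code from the paper): estimating a probability from repeated samples, where the estimate sharpens only as the number of samples grows.

```python
import random

# Illustration of why verifying a probabilistic prediction takes many
# samples: flip a simulated coin and compare the empirical fraction of
# heads with the predicted probability.

def empirical_heads_fraction(num_flips, p_heads=0.5, seed=0):
    """Estimate the probability of heads from num_flips samples."""
    rng = random.Random(seed)  # fixed seed keeps the sketch reproducible
    heads = sum(1 for _ in range(num_flips) if rng.random() < p_heads)
    return heads / num_flips

# A handful of flips gives a noisy estimate; many flips converge on 0.5.
rough = empirical_heads_fraction(10)
refined = empirical_heads_fraction(100_000)
```

The same logic applies to a quantum experiment: a classical simulator is judged on whether it can produce samples with approximately the right probabilities, not on reproducing any single outcome.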

Few expect that quantum computers wielding lots of qubits will have low sampling complexity—after all, quantum computers are expected to be more powerful than ordinary computers, so simulating them on your laptop should be hard. But while the power of quantum computers remains unproven, exploring the crossover from low complexity to high complexity could offer fresh insights about the capabilities of early quantum devices, says Alexey Gorshkov, a JQI and QuICS Fellow who is a co-author of the new paper.

“Sampling complexity has remained an underappreciated tool,” Gorshkov says, largely because small quantum devices have only recently become reliable. “These devices are now essentially doing quantum sampling, and simulating this is at the heart of our entire field.”

To demonstrate the utility of this approach, Gorshkov and several collaborators proved that sampling complexity tracks the easy-to-hard transition of a task that small- and medium-sized quantum computers are expected to perform faster than ordinary computers: boson sampling.

Bosons are one of the two families of fundamental particles (the other being fermions). In general, two bosons can interact with one another, but that’s not the case for the boson sampling problem. “Even though they are non-interacting in this problem, bosons are sort of just interesting enough to make boson sampling worth studying,” says Abhinav Deshpande, a graduate student at JQI and QuICS and the lead author of the paper.

In the boson sampling problem, a fixed number of identical particles are allowed to hop around on a grid, spreading out into quantum superpositions over many grid sites. Solving the problem means sampling from this smeared-out quantum probability cloud, something a quantum computer would have no trouble doing.
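For a handful of particles this setup can be simulated directly. The sketch below is an illustrative toy, not the authors' code, and all function names are invented; it uses the standard fact that for non-interacting bosons the transition amplitude between site configurations is the permanent of a submatrix of the single-particle evolution matrix.

```python
import itertools
import numpy as np

def permanent(M):
    # Brute-force permanent over all permutations; fine for tiny matrices,
    # but the cost grows factorially with the number of particles.
    n = M.shape[0]
    return sum(
        np.prod([M[i, p[i]] for i in range(n)])
        for p in itertools.permutations(range(n))
    )

def hopping_unitary(num_sites, t):
    # Single-particle evolution for nearest-neighbour hopping on a 1D chain
    # (open ends), built by diagonalizing the hopping Hamiltonian.
    H = np.zeros((num_sites, num_sites))
    for i in range(num_sites - 1):
        H[i, i + 1] = H[i + 1, i] = 1.0
    evals, evecs = np.linalg.eigh(H)
    return evecs @ np.diag(np.exp(-1j * evals * t)) @ evecs.conj().T

def output_probability(U, inputs, outputs):
    # Probability that bosons starting on `inputs` are detected on
    # `outputs` (distinct sites, so no occupation-number factors needed):
    # the squared magnitude of the permanent of the relevant submatrix.
    sub = U[np.ix_(outputs, inputs)]
    return abs(permanent(sub)) ** 2

U = hopping_unitary(6, t=0.3)
p = output_probability(U, inputs=[0, 5], outputs=[0, 5])
```

The permanent is exactly what makes classical simulation expensive: unlike the determinant, no efficient general algorithm for computing it is known, which is part of why boson sampling is believed to be hard for ordinary computers at longer times.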

Deshpande, Gorshkov and their colleagues proved that there is a sharp transition between how easy and hard it is to simulate boson sampling on an ordinary computer. If you start with a few well-separated bosons and only let them hop around briefly, the sampling complexity remains low and the problem is easy to simulate. But if you wait longer, an ordinary computer has no chance of capturing the quantum behavior, and the problem becomes hard to simulate.

The result is intuitive, Deshpande says, since at short times the bosons are still relatively close to their starting positions and not much of their “quantumness” has emerged. For longer times, though, there’s an explosion of possibilities for where any given boson can end up. And because it’s impossible to tell two identical bosons apart from one another, the longer you let them hop around, the more likely they are to quietly swap places and further complicate the quantum probabilities. In this way, the dramatic shift in the sampling complexity is related to a change in the physics: Things don’t get too hard until bosons hop far enough to switch places.

Gorshkov says that looking for changes like this in sampling complexity may help uncover physical transitions in other quantum tasks or experiments. Conversely, a lack of ramping up in complexity may rule out a quantum advantage for devices that are too error-prone. Either way, Gorshkov says, future results arising from this perspective shift should be interesting. “A deeper look into the use of sampling complexity theory from computer science to study quantum many-body physics is bound to teach us something new and exciting about both fields,” he says.

Story by Chris Cesare

Reference Publication
"Dynamical Phase Transitions in Sampling Complexity," Abhinav Deshpande, Bill Fefferman, Minh C. Tran, Michael Foss-Feig, Alexey V. Gorshkov, Phys. Rev. Lett., 121, 030501 (2018)
Research Contact: Abhinav Deshpande; Alexey Gorshkov

Original story: https://jqi.umd.edu/news/complexity-test-offers-new-perspective-on-small-quantum-computers

Chris Monroe Co-authors Piece on National Quantum Initiative - The Washington Times

Quantum technology harnesses the radical power of quantum systems — such as isolated atoms, photons and electrons — to transform how we process and communicate information. But that potential can be realized only if our nation’s resources are focused in a way that helps bring quantum research from the laboratory to the marketplace.

Read More

IceCube Neutrinos Point to Long-Sought Cosmic Ray Accelerator

An international team of scientists, with key contributions from researchers at the University of Maryland, has found the first evidence of a source of high-energy cosmic neutrinos—ghostly subatomic particles that travel to Earth unhindered for billions of light years from the most extreme environments in the universe.

Read More