Study on the Smallest Particles in the Universe

Since the 1930s, scientists have used particle accelerators to better understand the structure of matter and the laws of physics. These powerful machines accelerate particles to nearly the speed of light and then collide them, allowing physicists to study the resulting interactions and the particles that are formed.

Most particle accelerators are built to provide a better understanding of hadrons: subatomic particles, such as protons and neutrons, made up of two or more smaller particles called quarks. Quarks are among the smallest particles in the universe, and they carry only fractional electric charges. Scientists understand something of how quarks make up hadrons, but they have little understanding of the properties of individual quarks, because quarks cannot be studied outside their respective hadrons.

Using the Summit supercomputer at the Department of Energy’s Oak Ridge National Laboratory, a team of nuclear physicists led by Kostas Orginos at the Thomas Jefferson National Accelerator Facility and William & Mary developed a promising technique for measuring quark interactions in hadrons and applied it to simulations using quarks with close-to-physical masses. To complete the simulations, they used a powerful computational method called lattice quantum chromodynamics, or lattice QCD, along with the computing power of Summit, the fastest supercomputer in the nation. The results were published in ‘Physical Review Letters’.

‘Typically, scientists have only known a fraction of the energy and momentum of quarks when they’re in a proton’, said Joe Karpie, postdoctoral research scientist at Columbia University and lead author on the paper. ‘That doesn’t tell them the probability that a quark could turn into a different kind of quark or particle. Whereas past calculations relied on artificially large masses to help speed up the calculations, we now have been able to simulate these at very close to physical mass, and we can apply this theoretical knowledge to experimental data to make better predictions about subatomic matter’.

The team’s calculations will complement experiments to be performed at DOE’s upcoming Electron-Ion Collider, or EIC, a particle collider to be built at Brookhaven National Laboratory, or BNL, that will provide detailed 3D maps of how subatomic particles are distributed inside the proton in both space and momentum.

A thorough understanding of the properties of individual quarks could help scientists predict what happens when quarks interact with the Higgs boson, an elementary particle associated with the Higgs field, the field in particle physics theory that gives mass to the matter that interacts with it. The technique could also help scientists understand phenomena governed by the weak force, which is responsible for radioactive decay.

To pin down how quarks operate, scientists must average the properties of quarks inside their respective protons. Using results from collider experiments such as those at the Relativistic Heavy Ion Collider at BNL, the Large Hadron Collider at CERN or DOE’s upcoming EIC, they can extract a fraction of a quark’s energy and momentum.

However, predicting how strongly quarks interact with particles such as the Higgs boson and computing the full distribution of quark energies and momenta have long been challenges in particle physics.

Bálint Joó, a recent addition to the staff of the lab’s Oak Ridge Leadership Computing Facility, a DOE Office of Science user facility, tackled this challenge using the Chroma software suite for lattice QCD and NVIDIA’s QUDA library. Lattice QCD lets scientists study quarks and gluons— the elementary glue-like particles that hold quarks together— on a computer by representing space-time as a grid, or lattice, on which the quark and gluon fields are formulated. Using Chroma and QUDA (for QCD on CUDA), Joó generated snapshots of the strong-force field in a cube of space-time, weighting the snapshots to describe what the quarks were doing in the vacuum. The other team members then took these snapshots and simulated what would happen as quarks moved through the strong-force field.

‘If you drop a quark into this field, it will propagate similarly to how dropping an electric charge into an electric field causes electricity to propagate through the field’, said Joó.
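Joó’s analogy can be made concrete with a toy model. The sketch below is not Chroma or QUDA and is far simpler than the gauged Dirac operator used in the actual work: it takes a free scalar field on a small 2D periodic lattice, places a point source on one site, and solves the discretized field equation for the resulting propagator, which decays with distance from the source.

```python
import numpy as np

# Toy illustration (not lattice QCD proper): a free scalar propagator on a
# small 2D periodic lattice. The structure mirrors the real calculation:
# drop a point source onto one lattice site and solve a discretized field
# equation, (-laplacian + m^2) prop = source, for the propagator.
L, m = 16, 0.5                       # lattice size and (toy) mass

def idx(x, y):                       # map a 2D site to a linear index
    return (x % L) * L + (y % L)

N = L * L
A = np.zeros((N, N))                 # the lattice operator -laplacian + m^2
for x in range(L):
    for y in range(L):
        i = idx(x, y)
        A[i, i] = 4.0 + m**2
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            A[i, idx(x + dx, y + dy)] -= 1.0   # periodic neighbours

src = np.zeros(N)
src[idx(0, 0)] = 1.0                 # point source at the origin
prop = np.linalg.solve(A, src)       # propagator from the origin

# The propagator falls off with distance from the source.
print(prop[idx(0, 0)], prop[idx(4, 0)], prop[idx(8, 0)])
```

In the real computation the operator is the Dirac operator in a background gluon field rather than a scalar Laplacian, and the linear systems are vastly larger, but the "source in, propagator out" structure is the same.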

With a grant of computational time from DOE’s Innovative and Novel Computational Impact on Theory and Experiment program, as well as support from the Scientific Discovery through Advanced Computing program and the Exascale Computing Project, the team took the propagator calculations and combined them using Summit to generate final particles that they could then use to extract results.

‘We set what are known as the bare quark masses and the quark-gluon coupling in our simulations’, said Joó. ‘The actual quark masses, which arise from these bare values, need to be computed from the simulations— for example, by comparing the values of some computed particles to their real-world counterparts, which are experimentally known.’
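The tuning step Joó describes can be sketched with a toy model. In the illustration below, a made-up relation m_pi^2 = B * m_bare stands in for running a full simulation; the constant B and the bracketing interval are invented, and the point is only the workflow: adjust the bare input mass until a computed particle mass matches its experimentally known value.

```python
import numpy as np

# Toy sketch of fixing a bare parameter by matching a computed particle
# mass to experiment. The relation below (m_pi^2 = B * m_bare, with an
# arbitrary constant B) is a stand-in for an actual lattice simulation.
B = 5000.0                                   # hypothetical constant (MeV)

def simulated_pion_mass(m_bare):
    # "Run the simulation" for a given bare quark mass.
    return np.sqrt(B * m_bare)

target = 140.0                               # physical pion mass (MeV)
lo, hi = 0.0, 100.0                          # bracket for the bare mass
for _ in range(60):                          # bisection on the bare mass
    mid = 0.5 * (lo + hi)
    if simulated_pion_mass(mid) < target:
        lo = mid
    else:
        hi = mid

print(f"tuned bare mass: {mid:.3f} MeV")     # converges to 140^2 / B = 3.92
```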

From physical experiments, the team knew that the lightest physical particles they were simulating— pi mesons, or pions— should have a mass of around 140 megaelectronvolts, or MeV. The team’s calculations ranged from 358 MeV down to 172 MeV, close to the experimental mass of the pion.

The simulations required the power of Summit because of the number of vacuum snapshots the team had to generate and the number of quark propagators that had to be calculated on them. To estimate results at the physical quark mass, calculations had to be carried out at three different quark masses and extrapolated to the physical one. In total, the team used more than 1,000 snapshots across the three quark masses, in cubes with lattices ranging from 32³ to 64³ points in space.
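The extrapolation step can be sketched as follows. Chiral extrapolations are commonly taken to be linear in the squared pion mass at leading order; the observable values below are invented for illustration only, and the real analysis involves far more elaborate fits and error budgets.

```python
import numpy as np

# Hypothetical example: an observable measured at three unphysically heavy
# pion masses, extrapolated to the physical pion mass of ~140 MeV.
# The fit is linear in m_pi^2 (a common leading-order form); the middle
# mass and all observable values are invented for illustration.
m_pi = np.array([358.0, 278.0, 172.0])   # simulated pion masses (MeV)
obs = np.array([0.480, 0.455, 0.432])    # hypothetical observable values

# Fit O(m_pi^2) = a + b * m_pi^2 and evaluate at the physical point.
b, a = np.polyfit(m_pi**2, obs, 1)
m_phys = 140.0
obs_phys = a + b * m_phys**2
print(f"extrapolated value at m_pi = {m_phys} MeV: {obs_phys:.4f}")
```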

‘The closer the masses of the quarks in the simulation are to reality, the more difficult the simulation’, said Karpie. ‘The lighter the quarks are, the more iterations are required in our solvers, so getting to the physical quark masses has been a major challenge in QCD’.
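Karpie’s point about solver iterations can be illustrated with the same kind of toy operator (a free scalar operator on a small 2D lattice, not lattice QCD): as the mass term shrinks, the operator becomes worse conditioned, and a plain conjugate gradient solver needs more iterations to reach the same tolerance.

```python
import numpy as np

# Toy demonstration: iterative solvers need more iterations as the mass
# shrinks, because (-laplacian + m^2) becomes worse conditioned as m -> 0.
L = 16
N = L * L

def build_op(m):
    # Assemble -laplacian + m^2 on an L x L periodic lattice.
    A = np.zeros((N, N))
    for x in range(L):
        for y in range(L):
            i = (x % L) * L + (y % L)
            A[i, i] = 4.0 + m**2
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                j = ((x + dx) % L) * L + ((y + dy) % L)
                A[i, j] -= 1.0
    return A

def cg_iterations(A, b, tol=1e-8):
    # Plain conjugate gradient; return the iteration count at convergence.
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for k in range(1, 10000):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            return k
        p = r + (rs_new / rs) * p
        rs = rs_new
    return 10000

b = np.zeros(N)
b[0] = 1.0                          # point source
heavy = cg_iterations(build_op(1.0), b)    # heavier toy mass
light = cg_iterations(build_op(0.05), b)   # lighter toy mass
print(heavy, light)                 # the lighter mass needs more iterations
```

Multigrid solvers, mentioned below, attack exactly this ill-conditioning by correcting the slowly converging long-wavelength modes on coarser lattices.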

Joó, who has been using the Chroma code on OLCF systems since 2007, said that advances in algorithms have contributed to the ability to run simulations at the physical mass.

‘Algorithmic improvements like multigrid solvers and their implementations in efficient software libraries such as QUDA, combined with hardware that can execute them, have made these kinds of simulations possible’.

While Chroma remains his go-to code, Joó said continued improvements in code development will keep opening opportunities to solve new problems in particle physics.

‘Despite having worked with this same code all these years, new things still happen under the hood’, he said. ‘There will always be new challenges because there will always be new machines, new GPUs and new methods that we will be able to take advantage of’.

In the future, the team plans to explore gluons and acquire a full 3D image of the proton with its various components.

By Marvellous Iwendi.

Source: Oak Ridge