How Supercomputers Help Us Understand Galaxy Formation
An international team has compared the leading galaxy-formation simulation codes and identified their biases. The effort helps refine the simulations and address open questions such as the early appearance of well-developed galaxies and the missing-satellites problem.
Héctor Manuel Velázquez, a researcher at the Institute of Astronomy (IA) of UNAM, is part of an international team of scientists that implemented a strategy to reduce the sources of error in simulations of galaxy formation.
The international collaboration AGORA (Assembling Galaxies Of Resolved Anatomy) aims to compare the most important codes used to follow the formation of galaxies within the large-scale structure of the universe, explained Velázquez, the only Mexican scientist participating in the project, who carried out several of the simulation runs.
“The collaboration has helped us improve the numerical codes by finding and correcting errors, and by better understanding how the parameters in each code control astrophysical processes, including star formation. This is the starting point for increasingly precise simulations of galaxy formation, which help us interpret observations reliably,” he stated.
To make the simulations more realistic, the results of the most widely used codes in the world (AREPO, GADGET, ART, RAMSES, CHANGA, GIZMO, among others) were compared starting from the same initial conditions and the same star-formation model, while leaving other aspects, both physical and numerical, free.
Meanwhile, Octavio Valenzuela Tijerino, coordinator of the UNAM Models and Data Laboratory, pointed out that this made it possible to quantify and assess under what conditions the groups obtain similar, convergent results. “This effort allows us to separate out the biases that each computational code introduces into the interpretation of galaxy observations.”
The comparisons made are based on computer codes, which each community has developed to follow the formation and evolution of galaxies:
We have programs with different strategies to represent physical quantities: for example, meshes, like a grid on which the physics of the problem is solved, that can automatically focus on certain regions and increase the resolution there to follow the evolution in more detail. In some cases these are not a regular grid but a map of triangular tiles that easily fits the shape of the simulated objects.
Additionally, there are particle-based programs, and others that use both meshes and particles. Each one has its strengths and limitations, commented Velázquez.
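The adaptive refinement idea described above can be sketched in a few lines. The following toy example, a hypothetical one-dimensional mesh with an invented density profile and threshold (not taken from any AGORA code), shows how a grid can automatically concentrate resolution where the physics varies fastest:

```python
# Illustrative sketch only: a toy 1D adaptive mesh that refines cells
# where the gas density varies sharply. This is the basic idea behind
# adaptive mesh refinement codes, not actual code from any of them;
# the density profile and threshold here are invented for illustration.

def density(x):
    """Toy density profile with a sharp peak at x = 0.5."""
    return 1.0 / (0.001 + (x - 0.5) ** 2)

def refine(left, right, max_depth, threshold, depth=0):
    """Split a cell in two whenever the density contrast across it
    exceeds `threshold`, up to `max_depth` levels of refinement."""
    mid = 0.5 * (left + right)
    contrast = max(abs(density(mid) - density(left)),
                   abs(density(right) - density(mid)))
    if depth < max_depth and contrast > threshold:
        return (refine(left, mid, max_depth, threshold, depth + 1)
                + refine(mid, right, max_depth, threshold, depth + 1))
    return [(left, right)]  # a leaf cell of the final mesh

cells = refine(0.0, 1.0, max_depth=6, threshold=50.0)
# Cells near the density peak end up much smaller than cells far from it.
```

The mesh ends up finest exactly where the density changes fastest, which is how these codes spend their resolution where the interesting physics happens.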
Computer codes
The properties of galaxies are the result of complex processes between gas, dust, radiation, stars, and the properties of the universe as a whole. Interpreting their observations in detail requires considering such complexity, usually using supercomputers.
Recently, significant progress has been made in these simulations, but several challenges remain. Observations from the James Webb Space Telescope show possible evidence of well-developed galaxies early in the evolution of the universe, in apparent conflict with theoretical calculations.
Another challenge, identified in the late 1990s, is the apparent lack of bright satellite galaxies around the Milky Way and similar galaxies; some simulations have already offered possible explanations.
It is known that an important fraction of ordinary matter (baryons) resides in the halo that surrounds a galaxy's disk; detailed predictions of the properties of this gas have remained a pending task.
One of the difficulties astrophysicists face in explaining how galaxies formed, from the Big Bang to the present day, is the uncertainty introduced by the computer codes with which star formation, stellar explosions, and the motion of gas, dark matter, and the stars themselves are modeled.
What are the limitations?
They arise from resolution, numerical precision, and the amount of memory or processor time the codes require, as well as from hypotheses about physical factors such as the evolution and explosion of stars, gas flows in galaxies, and the motions of stars and dark matter.
Because of this, the hundreds of simulations have demanded considerable computational effort, involving millions of hours of computing time and access to large storage resources: “The most complicated thing was obtaining the supercomputing resources,” Velázquez stressed.
Furthermore, the human element is vital to achieving these results, which is why the collaboration involves so many scientists. However, few researchers and students in our country contribute to these studies:
“Mexico's contribution to the realization and calibration of these simulations that help consolidate the understanding of phenomena in galaxies would be greater if there were more people, especially young people, working on this type of projects,” said Valenzuela Tijerino.
Both experts acknowledged that although it might seem difficult to participate in this type of effort from our country, given the large computing resources and large teams required, “there are niches that can be taken advantage of if efforts are joined to carry out these collaborations.”
Furthermore, astronomy has a transdisciplinary nature and astrophysicists, researchers and students from other areas — such as physics, chemistry, engineering or computing — can also contribute to the development and improvement of tools to analyze the results of these studies.
According to Héctor Manuel Velázquez, a supercomputer can be imagined as a cluster of many computers like the one you have at home, all working together for many hours over a connection faster than any home internet.
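The idea of many machines cooperating on one large computation can be illustrated with a small sketch. The worker pool below stands in for the nodes of a cluster; the slice-and-combine pattern is only a conceptual analogy, not how any production astrophysics code is organized:

```python
# Conceptual sketch: divide a big computation into slices, let several
# workers (standing in for supercomputer nodes) process them in
# parallel, then combine the partial results into one answer.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds):
    """Work assigned to one 'node': sum of squares over its slice."""
    start, end = bounds
    return sum(i * i for i in range(start, end))

n = 1_000_000
# Split the range [0, n) into four equal slices, one per worker.
slices = [(i * n // 4, (i + 1) * n // 4) for i in range(4)]
with ThreadPoolExecutor(max_workers=4) as pool:
    total = sum(pool.map(partial_sum, slices))
# `total` equals the serial sum of squares, computed cooperatively.
```

Real simulation codes follow the same divide-and-combine logic, but each "slice" is a region of the simulated universe and the nodes must constantly exchange boundary information.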
Using these tools accelerates discoveries that would otherwise take longer than a human lifetime. “In the future, realism will increase and much more precise interpretations will be possible,” he added.
The results
The results of this research appear in three new papers from the AGORA collaboration describing the so-called CosmoRun simulations, which follow the formation of a galaxy with the mass of the Milky Way.
The simulations share the same astrophysical assumptions about the ultraviolet background radiation, the physics of gas cooling and heating, and star formation, but differ in the code architecture and the physics of stellar feedback. With the help of the new results, it was determined that disk galaxies, such as the Milky Way, could have begun to form early in the history of the universe, as revealed in recent observations by the James Webb Space Telescope.
Likewise, they found that the number of bright satellite galaxies, those that orbit larger galaxies, matches observations regardless of the simulation strategy used. This eases the old “missing satellites” problem: dark-matter simulations that include neither gas nor stars predict a large population of small bright satellite galaxies that is absent from observations.
Héctor Manuel Velázquez and Octavio Valenzuela Tijerino encouraged young people interested in the project to join it: the doors are open, and with more human talent it will be possible to obtain results faster and continue improving observation techniques and our understanding of the universe.
The study was led by Ji-hoon Kim of Seoul National University, Korea; Joel Primack of the University of California, Santa Cruz; and Santi Roca-Fàbrega of Lund University, Sweden.
Supercomputers in several countries were used, including Miztli at DGTIC-UNAM; Atocatl and Tochtli at LAMOD-UNAM; Perlmutter at NERSC, along with HIPAC and XSEDE, in the US; and CfCA and Oakforest-PACS in Japan.
Around 160 researchers from 60 universities around the world participate in the AGORA collaboration. Part of the study has been published in The Astrophysical Journal.