The largest-ever supercomputer simulation has been created to investigate the Universe's evolution – from the Big Bang to the present day.
Astronomers say the "Flamingo" simulations calculate the evolution of all the components of the Universe – ordinary matter, such as stars and planets, dark matter and dark energy – based on the laws of physics.
As the simulations progress, virtual galaxies and galaxy clusters emerge in precise detail.
Flamingo stands for Full-hydro Large-scale structure simulations with All-sky Mapping for the Interpretation of Next Generation Observations.
Scientists from Durham University and Liverpool John Moores University, along with colleagues at Leiden University in the Netherlands, created the simulation.
They hope it will allow researchers to compare the virtual Universe with observations of the real thing being captured by new high-powered telescopes, such as the James Webb Space Telescope.
It could help scientists understand if the standard model of cosmology – used to explain the evolution of the Universe – provides a good description of reality.
Previous simulations, which have been compared to observations of the Universe, have focused on cold dark matter – believed to be a key component of the structure of the cosmos.
However, astronomers now say that the effects of ordinary matter, which makes up only 16 percent of all matter in the Universe, and of neutrinos, tiny particles that rarely interact with normal matter, also need to be taken into account when trying to understand the Universe's evolution.
The first research papers from Flamingo were published in the journal Monthly Notices of the Royal Astronomical Society.
Principal Investigator Professor Joop Schaye, of Leiden University, said: “Although the dark matter dominates gravity, the contribution of ordinary matter can no longer be neglected since that contribution could be similar to the deviations between the models and the observations.”
The Flamingo simulations tracked the formation of the Universe's structure in dark matter, ordinary matter and neutrinos, following the standard model of cosmology.
The researchers ran the simulations on a powerful supercomputer in Durham over the past two years at different resolutions, also varying other factors such as the strength of galactic winds and the mass of the neutrinos.
The first results showed that the inclusion of ordinary matter and neutrinos in the simulations is “essential” for making accurate predictions.
(Image caption: Evolution of the most massive galaxy cluster in the high-resolution simulation. The main frame shows the gas density (brightness) and temperature (hue), with white indicating the hottest and densest gas in the intra-cluster medium. The zoom inset shows the evolution of dark matter density in the centre of one cluster. The cluster grows both through continuous accretion of matter along filaments and through occasional dramatic mergers that shock-heat the gas.)
New telescopes – such as the European Space Agency’s Euclid space telescope – are collecting massive amounts of data about galaxies, quasars and stars.
Scientists say the observations are posing questions about the theories behind current understanding of the evolution of the Universe.
They believe simulations such as Flamingo will play a “key role” in understanding the data by comparing theoretical predictions with observational data.
The simulations took more than 50 million processor hours on the Cosmology Machine (COSMA 8) supercomputer.
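For a rough sense of scale, the reported figure of 50 million processor hours can be turned into wall-clock time. This is back-of-envelope arithmetic, not a figure from the researchers, and it assumes all 65,000 CPUs mentioned later in the article were busy the whole time:

```python
# Back-of-envelope: what 50 million processor-hours means in wall-clock time,
# assuming (hypothetically) that 65,000 CPUs run flat out for the whole job.
CPU_HOURS = 50_000_000   # total compute reported for the simulations
CPUS = 65_000            # peak CPU count mentioned for the SWIFT code

wall_clock_hours = CPU_HOURS / CPUS
wall_clock_days = wall_clock_hours / 24

print(f"{wall_clock_hours:,.0f} hours ≈ {wall_clock_days:.0f} days of continuous running")
```

Even under that idealised assumption, the largest runs would occupy the machine for roughly a month of continuous computing.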
(Image caption: The background image shows the present-day distribution of matter in a slice through the largest Flamingo simulation, a cubic volume of 2.8 Gpc (9.1 billion light years) on a side. The luminosity of the background image gives the present-day distribution of dark matter, while the colour encodes the distribution of neutrinos. The insets show three consecutive zooms centred on the most massive cluster of galaxies; in order, these show the gas temperature, the dark matter density, and a virtual X-ray observation.)
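The caption's two ways of stating the box size can be checked against each other, since one gigaparsec is about 3.26 billion light years (a standard conversion, not a number from the article):

```python
# Check the caption's unit conversion: 2.8 Gpc expressed in light years.
LY_PER_PARSEC = 3.2616   # light years per parsec (standard astronomical value)
side_gpc = 2.8           # simulation box side length in gigaparsecs

side_billion_ly = side_gpc * LY_PER_PARSEC  # Gpc -> billions of light years
print(f"{side_gpc} Gpc ≈ {side_billion_ly:.1f} billion light years")
```

The result, about 9.1 billion light years, matches the figure quoted in the caption.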
To make the Flamingo simulations possible, the researchers developed a new code, called SWIFT, which efficiently distributes the computational work over thousands of Central Processing Units (CPUs), sometimes as many as 65,000.
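SWIFT's actual task-based parallelism is far more sophisticated than anything shown here, but the basic idea of spreading a simulation's work over many processors can be illustrated with a toy spatial decomposition. The worker count, box size and particle positions below are all invented for illustration:

```python
# Toy illustration of distributing simulation work across processors:
# split a cubic volume into slabs and assign each slab's particles to
# one worker. (SWIFT's real scheduler is far more sophisticated.)
import random

N_WORKERS = 8    # hypothetical worker count (real runs used up to 65,000 CPUs)
BOX_SIZE = 1.0   # simulation box side, arbitrary units

# Generate some random "particles" inside the box.
random.seed(42)
particles = [(random.random(), random.random(), random.random())
             for _ in range(10_000)]

def owner(particle):
    """Slab decomposition: slice the box along x into equal-width slabs."""
    x, _, _ = particle
    return min(int(x / BOX_SIZE * N_WORKERS), N_WORKERS - 1)

domains = {w: [] for w in range(N_WORKERS)}
for p in particles:
    domains[owner(p)].append(p)

for w, ps in domains.items():
    print(f"worker {w}: {len(ps)} particles")
```

Because matter clusters unevenly, a real cosmological code must also rebalance work as structure forms – one reason a purpose-built code like SWIFT was needed.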
Flamingo research collaborator Professor Carlos Frenk, of Durham University, added: “Cosmology is at a crossroads.
“We have amazing new data from powerful telescopes some of which do not, at first sight, conform to our theoretical expectations. Either the standard model of cosmology is flawed or there are subtle biases in the observational data.
“Our super precise simulations of the Universe should be able to tell us the answer.”
Produced in association with SWNS Talker