The large-scale structure of the Universe (LSS) was first discussed at IAU Symposium No. 79, held in Tallinn in 1977. The symposium's title, "The Large Scale Structure of the Universe", marked the first official use of the term (J. Einasto). Since then it has been of major interest in cosmology. This large-scale pattern, which emerged from the primordial density fluctuations of dark matter under the effect of gravity, is composed of nodes, filaments and walls surrounding large voids: a vast foam-like structure, known as the "cosmic web", which provides constraints on the content of the Universe and the nature of its components. Continue reading Drifting through the Cosmic Web
Star formation is a vibrant field in contemporary astrophysics, combining a rich diversity of physics and chemistry, sophisticated supercomputer simulations, and cutting-edge observational facilities on the ground and in space to tackle both fundamental and complex problems related to the origin of stars and galaxies in our relative neighbourhood, the Milky Way galaxy, and in the distant Universe. Continue reading Star formation in galaxies: from small to large scales
A team of researchers led by Mark Vogelsberger of the Harvard-Smithsonian Center for Astrophysics, in collaboration with the Heidelberg Institute for Theoretical Studies in Germany, has produced the first realistic virtual map of the Universe using a numerical simulation called "Illustris". The model recreates a cubic volume of space 350 million light-years on a side over a time span of roughly 13 billion years, at unprecedented resolution.
Have you ever wondered what happens when a massive star undergoes gravitational collapse? One of the "end products" is a supernova. Observations of these spectacular events give us a wealth of information about the stellar surface at the moment of explosion, but it is far harder to determine the mechanisms at work in the deepest, central regions of the star. To study these high-energy processes in detail, astrophysicists run a variety of numerical simulations covering different stellar types and the fundamental properties of mass-energy interactions.
Swinburne University of Technology has developed a virtual astronomy programme that will let scientists build complex, customised visualisations of the Universe, all from a home computer.
The Theoretical Astrophysical Observatory (TAO), funded by the Australian Government's $48 million NeCTAR project, draws on the power of Swinburne's gSTAR GPU supercomputer to let astronomers simulate the Universe and see how it would look through a wide range of telescopes.

"TAO lets researchers take the data from massive cosmological simulations and map it onto an observer's viewpoint, to test theories of how galaxies and stars form and evolve", said TAO project scientist, Swinburne Associate Professor Darren Croton. "TAO makes it easy and efficient for any astronomer to create these virtual universes. It's the culmination of years of effort that is now at the fingertips of scientists around the world. Using TAO it might take a few minutes to create a mock catalogue of galaxies, versus months or even years of development previously".

Swinburne worked with eResearch company Intersect Australia Ltd, which designed the web interface with simplicity and user-friendliness in mind. Associate Professor Croton said that "it was important to create a service that could be used by any astronomer regardless of their area of expertise, because that accelerates the pace of science and boosts the chance of breakthroughs". As new survey telescopes and instruments become available, they can be modelled within TAO to maintain an up-to-date set of observatories.

"TAO could be especially useful for comparing theoretical predictions against observations coming from next-generation survey telescopes, like the Australian Square Kilometre Array Pathfinder (ASKAP) in Western Australia and the SkyMapper Telescope run by the Australian National University (ANU). These will cover large chunks of the sky, peer back into the early stages of the Universe, and are tasked with answering some of the most fundamental questions known to humankind".
Swinburne University: Creating virtual universes with Swinburne's Theoretical Astrophysical Observatory
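The "map it onto an observer's viewpoint" step mentioned above can be illustrated with a toy calculation. This sketch is not the TAO pipeline: it simply places an observer at the origin of a simulation box and converts a galaxy's Cartesian position into the sky coordinates and approximate redshift the observer would record. The Hubble constant value and the linear Hubble-law redshift are simplifying assumptions, valid only at low redshift.

```python
import math

# Assumed values for illustration only.
H0 = 70.0        # Hubble constant, km/s/Mpc
C = 299792.458   # speed of light, km/s

def to_observer_frame(x, y, z):
    """Return (ra_deg, dec_deg, redshift) for a point at (x, y, z) in Mpc,
    with the observer at the origin of the box."""
    r = math.sqrt(x * x + y * y + z * z)          # comoving distance
    ra = math.degrees(math.atan2(y, x)) % 360.0   # right ascension
    dec = math.degrees(math.asin(z / r))          # declination
    zred = H0 * r / C                             # Hubble-law redshift (low-z only)
    return ra, dec, zred

# A galaxy 150 Mpc away lands at RA 45 deg, Dec ~19.5 deg, z ~0.035.
ra, dec, zred = to_observer_frame(100.0, 100.0, 50.0)
```

A real mock catalogue would additionally apply the telescope's field of view, magnitude limits and selection function, which is where the TAO observatory models come in.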
Dark energy and dark matter make up 95% of the Universe. Understanding the physics of this "dark sector" is the central challenge of modern cosmology. Today, thanks to sophisticated numerical simulations, we can begin to understand the evolution of the early Universe and the formation of cosmic structures.
The primary lens through which scientists look at the night sky is no longer only a telescope; it is also a supercomputer. The new and coming generations of supercomputers will finally be capable of modeling the Universe in the detail and volume required by the astronomical sky surveys that are now underway or soon will be. Scientists use large cosmological simulations to test theories about the structure of the Universe and the evolution of the distribution of galaxies and galaxy clusters. State-of-the-art supercomputers let cosmologists make predictions and test them against data from powerful telescopes and space probes. Two decades of surveying the sky have culminated in the celebrated Cosmological Standard Model. Yet two of the model's key pillars, dark matter and dark energy, together accounting for 95% of the Universe, remain mysterious. A research team led by Argonne is tackling this mystery, aided by some of the world's fastest supercomputers. To model the distribution of matter in the Universe, the researchers are running some of the largest, most complex simulations of large-scale structure ever undertaken. The Argonne team has run a 1.1-trillion-particle simulation on half a million processor cores of Mira, Argonne's new Blue Gene/Q supercomputer. The team was among a few science teams from across the country to gain early access to the system, which is now online.
“In a very real sense, we only understand 4% of the Universe. To basic scientists like us, that’s a crime—that’s not allowed”, says Argonne physicist Steve Kuhlmann.
The power and speed of supercomputers and simulation codes have advanced significantly over the past decade. Mira enables cosmology runs with greater resolution and accuracy on much larger simulation volumes, giving researchers the ability to confront theory with observational data from wide-area cosmological surveys. Exploring the cosmic structure of the dark Universe is an enormously complex problem. As the Universe expands, gravitational attraction causes matter to coalesce and form structures: first sheets, then filaments where the sheets intersect, and then clumps where the filaments meet. As time progresses, the basic structure of an enormous web of voids, filaments, and clumps emerges ever more clearly. Simulations at Argonne have calculated this web-like structure, the so-called cosmic web, in a cube of simulated space more than 13 trillion light-years across. "Because these trillions of particles are meant to trace matter in the entire Universe, they are extremely massive, something in the range of a billion suns", said Argonne computational physicist Salman Habib, the project's director. "We know the gravitational dynamics of how these tracer particles interact, and so we evolve them forward to see what kind of densities and structure they produce, as a result of both gravity and the expansion of the Universe. That's essentially what the simulation does: it takes an initial condition and moves it forward to the present to see if our ideas about structure formation in the Universe are correct".
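The "take an initial condition and move it forward" idea Habib describes can be sketched in miniature. This is emphatically not the Argonne production code, which uses trillions of particles, tree/mesh force solvers and an expanding background; it is a minimal direct-summation N-body integrator with a kick-drift-kick leapfrog, with arbitrary units, masses and softening length chosen purely for illustration.

```python
# Toy N-body sketch: evolve tracer particles under mutual Newtonian
# gravity with a symplectic leapfrog integrator.
G = 1.0          # gravitational constant in code units (assumption)
SOFTENING = 0.1  # softening length to avoid singular forces (assumption)

def accelerations(positions, masses):
    """Pairwise softened Newtonian accelerations (direct summation)."""
    n = len(positions)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = [positions[j][k] - positions[i][k] for k in range(3)]
            r2 = sum(d * d for d in dx) + SOFTENING ** 2
            inv_r3 = r2 ** -1.5
            for k in range(3):
                acc[i][k] += G * masses[j] * dx[k] * inv_r3
    return acc

def leapfrog(positions, velocities, masses, dt, steps):
    """Kick-drift-kick leapfrog: time-reversible, good energy behaviour."""
    acc = accelerations(positions, masses)
    for _ in range(steps):
        for i in range(len(positions)):
            for k in range(3):
                velocities[i][k] += 0.5 * dt * acc[i][k]  # half kick
                positions[i][k] += dt * velocities[i][k]  # drift
        acc = accelerations(positions, masses)
        for i in range(len(positions)):
            for k in range(3):
                velocities[i][k] += 0.5 * dt * acc[i][k]  # half kick
    return positions, velocities
```

The production simulations replace the O(N²) force loop with fast approximate solvers and add the cosmological expansion, but the structure — forces, kick, drift, repeat — is the same.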
Next-generation sky surveys will map billions of galaxies to explore the physics of the “dark universe”. Science requirements for these surveys demand simulations at extreme scales in order to resolve galaxy-scale mass concentrations over the observational volumes of sky surveys. A key aspect of the Argonne project involves developing a major simulation suite covering approximately 100 different cosmological scenarios and combining them in a framework that can generate predictions for any scenario within the range covered by the original runs.
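The framework described above, generating predictions for any scenario inside the range of the original runs, is what cosmologists call an emulator. The following is a deliberately minimal sketch of that idea under invented assumptions: the parameter name, the quadratic stand-in "simulation" and the single-number summary statistic are all placeholders, and real emulators interpolate full power spectra, typically with Gaussian processes rather than piecewise-linear interpolation.

```python
def run_simulation(sigma8):
    """Stand-in for an expensive simulation run: returns a fake
    clustering amplitude (invented quadratic relation)."""
    return sigma8 ** 2

# Training runs on a parameter grid (the "~100 scenarios" in miniature).
grid = [0.6, 0.7, 0.8, 0.9]
table = [(p, run_simulation(p)) for p in grid]

def emulate(sigma8):
    """Piecewise-linear prediction; valid only inside the trained range."""
    if not (grid[0] <= sigma8 <= grid[-1]):
        raise ValueError("outside trained parameter range")
    for (p0, y0), (p1, y1) in zip(table, table[1:]):
        if p0 <= sigma8 <= p1:
            t = (sigma8 - p0) / (p1 - p0)
            return y0 + t * (y1 - y0)

prediction = emulate(0.75)  # answered in milliseconds, no new simulation
```

The payoff is the same as in the full-scale framework: the expensive runs are done once, and predictions for intermediate scenarios come essentially for free.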
ANL: Exploring the dark universe at the speed of petaflops
ANL: Dark energy: Q&A with Steve Kuhlmann
CosmicConference – Sunday 1 December, 6 PM
96% of the Universe
The workshop will last 5 weeks in total, including two one-week conferences on SNe and GRBs, respectively. More than 100 people are expected to participate in each conference (the conference hall at YITP holds 120 people at most). The remaining 3 weeks are devoted to workshop sessions, where participants can hear seminars in the morning and enjoy free discussion in the afternoon. The capacity of the visitor facilities at YITP during the 3-week workshop is 50 at most. The main topics of the 3-week workshop are Nuclear Physics in CC-SNe and GRBs (Oct. 14-18), CC-SNe (Oct. 21-25), and GRBs (Nov. 4-8). Participants can choose their preferred dates to stay in Kyoto during the workshop. We can offer some financial support, although the budget is limited; the registration form (opening in April 2013) will include a field for financial-support requests.
- Explosion Mechanism of Core-Collapse Supernovae
- Equation of State for High-Density Matter
- Structure of Neutron Stars as Remnants of CC-SNe
- Collapsars and Magnetars as Central Engine of Long Gamma-Ray Bursts
- Merging Compact Binaries as Central Engine of Short Gamma-Ray Bursts
- Neutrinos and Gravitational Waves as Signals of Death of Massive Stars
- Progenitors of CC-SNe and GRBs
- Multi-Wavelength Observations of SNe, GRBs, and their Remnants
- Plasma Physics, Particle Acceleration, and Radiation Process in Shocks of SNe, GRBs, and their Remnants
- UHECRs and VHE-Neutrinos & Gamma-Rays from GRBs
- Explosive Nucleosynthesis in SNe and GRBs
A series of numerical simulations shows for the first time that instabilities in the cores of neutron stars can generate gigantic magnetic fields, which in turn may trigger some of the most violent and dramatic stellar explosions ever observed in the Universe.
An ultra-dense ("hypermassive") neutron star is formed when two neutron stars in a binary system finally merge. Its short life ends with the catastrophic collapse to a black hole, possibly powering a short gamma-ray burst, one of the brightest explosions observed in the Universe. Short gamma-ray bursts, as observed with satellites like XMM-Newton, Fermi or Swift, release within a second the same amount of energy as our Galaxy emits in one year. It has long been speculated that enormous magnetic field strengths, possibly higher than those observed in any known astrophysical system, are a key ingredient in explaining such emission.
Scientists at the Max Planck Institute for Gravitational Physics (Albert Einstein Institute/AEI) have now succeeded in simulating a mechanism which could produce such strong magnetic fields prior to the collapse to a black hole.
How can such ultra-high magnetic fields, ten to a hundred million billion times stronger than the Earth's magnetic field, be generated from the much weaker initial neutron star fields? The explanation lies in a phenomenon that can be triggered in a differentially rotating plasma in the presence of magnetic fields: neighbouring plasma layers, rotating at different speeds, "rub against each other" and eventually set the plasma into turbulent motion. In this process, called the magnetorotational instability, magnetic fields can be strongly amplified. The mechanism is known to play an important role in many astrophysical systems, such as accretion disks and core-collapse supernovae. It had long been speculated that magnetohydrodynamic instabilities in the interior of hypermassive neutron stars could bring about the necessary magnetic field amplification; the actual demonstration that this is possible has only now been achieved with the present numerical simulations. The scientists of the Gravitational Wave Modelling Group at the AEI simulated a hypermassive neutron star with an initially ordered ("poloidal") magnetic field, whose structure is subsequently made more complex by the star's rotation. Since the star is dynamically unstable, it eventually collapses to a black hole surrounded by a cloud of matter, which is finally swallowed by the black hole.
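A back-of-the-envelope calculation shows why the magnetorotational instability amplifies fields so quickly. For an ideal-MHD flow with angular velocity Omega(r) proportional to r**(-q), the flow is MRI-unstable when Omega decreases outward (q > 0), and the fastest-growing mode grows at gamma_max = (q / 2) * Omega (the classic Balbus-Hawley result). The rotation profile and numbers below are assumptions for illustration, not a hypermassive neutron star model.

```python
import math

def mri_growth_rate(omega, q):
    """Max MRI growth rate for Omega ~ r**-q; None if stable (q <= 0)."""
    if q <= 0:
        return None  # angular velocity not decreasing outward: no MRI
    return 0.5 * q * omega

# Keplerian rotation (q = 3/2), one rotation per unit time.
omega = 2 * math.pi
gamma = mri_growth_rate(omega, 1.5)            # = 0.75 * Omega

# Exponential growth over one orbital period (T = 2*pi/Omega = 1 here):
amplification_per_orbit = math.exp(gamma * 1.0)
```

The field amplitude e-folds every 1/gamma, i.e. it grows by a factor of roughly a hundred per orbit for Keplerian shear, which is why, once triggered, the instability can bridge the enormous gap between the initial and final field strengths within the short lifetime of the hypermassive star.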
These simulations have unambiguously shown the presence of an exponentially rapid amplification mechanism in the stellar interior, the magnetorotational instability. This mechanism has so far remained essentially unexplored under the extreme conditions of ultra-strong gravity as found in the interior of hypermassive neutron stars.
This is because the physical conditions in the interior of these stars are extremely challenging. The discovery is interesting for at least two reasons. First, it shows for the first time unambiguously the development of the magnetorotational instability in the framework of Einstein's theory of general relativity, in which no analytical criteria exist to date to predict the instability. Second, this discovery can have a profound astrophysical impact, supporting the idea that ultra-strong magnetic fields can be the key ingredient in explaining the huge amount of energy released by short gamma-ray bursts.
The rationale of the conference is to bring together the scientific communities working on theory and numerical simulations at small scales (star formation, MHD turbulence, radiative and accretion processes) and researchers working on large-scale cosmological simulations. The aim is to share theoretical and numerical expertise, to foster interaction between these communities, and to discuss the modelling of feedback processes on different scales, in order to:
- pin down the most crucial physical processes at each scale for different systems
- find novel ways to incorporate these processes by learning from small and large scale simulations both in terms of numerical techniques and of more realistic physical modelling (e.g. improved effective/sub-grid models for the scales beyond the dynamical range)