SUPERCOMPUTING TECHNIQUES IN ASTROPHYSICS

International school and workshop
Campus San Joaquín, U. Católica, Santiago, Chile
19-23 April 2010

 

This meeting has been funded by:

 

Pontificia Universidad Católica

Centro de Astrofísica Fondap
Centro de Astrofísica y Tecnologías Afines
European Southern Observatory
Servicio Alemán de Intercambio Académico
Foreign Ministry of Germany
Durham University
Santander Universities

 

Astronomy is becoming an increasingly computationally intensive field, driven by the ever larger datasets delivered by observational efforts to map greater volumes of the Universe and to resolve finer details of galaxies and their stellar, gas, and dust content.  As a result, two computationally demanding, complementary approaches are needed to uncover new findings and interpret them within a cosmological context:

The processing of observational data so that it can be used to answer particular problems in cosmology and galaxy formation.  This includes applying computationally intensive statistical tools to datasets of hundreds of terabytes or more.  Measuring the power spectrum of density fluctuations traced by galaxies in the largest survey to date, the Sloan Digital Sky Survey, already requires parallel computing.  In the near future, several new surveys will deliver orders of magnitude more data, and will therefore demand a corresponding increase in data-processing capabilities.
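As a rough illustration of this kind of measurement, here is a minimal sketch of how a power spectrum of density fluctuations can be estimated from positions in a periodic box, using nearest-grid-point density assignment and an FFT.  All names and parameters are illustrative assumptions; a real survey analysis would also need mask and selection-function corrections, a higher-order mass-assignment scheme, and distributed, parallel FFTs for datasets of the scale discussed above.

```python
import numpy as np

def density_power_spectrum(positions, box_size, n_grid=128, n_bins=20):
    """Toy estimate of the 3D power spectrum P(k) of density fluctuations
    from object positions (shape (N, 3), in [0, box_size)) in a periodic
    box.  Function and variable names are illustrative, not from any code."""
    # Nearest-grid-point assignment of objects to a density grid.
    cell = box_size / n_grid
    idx = np.floor(positions / cell).astype(int) % n_grid
    grid = np.zeros((n_grid, n_grid, n_grid))
    np.add.at(grid, (idx[:, 0], idx[:, 1], idx[:, 2]), 1.0)

    # Density contrast delta = n / n_mean - 1, and its Fourier transform.
    delta = grid / grid.mean() - 1.0
    delta_k = np.fft.rfftn(delta)
    power = np.abs(delta_k) ** 2 * box_size ** 3 / n_grid ** 6

    # Wavenumber magnitude of every Fourier mode.
    kx = 2 * np.pi * np.fft.fftfreq(n_grid, d=cell)
    kz = 2 * np.pi * np.fft.rfftfreq(n_grid, d=cell)
    kmag = np.sqrt(kx[:, None, None] ** 2
                   + kx[None, :, None] ** 2
                   + kz[None, None, :] ** 2)

    # Average |delta_k|^2 in spherical shells of k to get P(k).
    edges = np.linspace(kmag[kmag > 0].min(), kmag.max() * 1.001, n_bins + 1)
    shell = np.digitize(kmag.ravel(), edges)
    pk = np.array([power.ravel()[shell == i].mean()
                   if np.any(shell == i) else np.nan
                   for i in range(1, n_bins + 1)])
    return 0.5 * (edges[1:] + edges[:-1]), pk
```

Even this toy version hints at the cost: the grid alone is O(n_grid^3) in memory, which is why survey-scale measurements move to parallel, distributed implementations.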

The construction and interpretation of theoretical models of galaxy formation within a cosmological framework.  These simulations involve two general steps.  The first is to include the physics necessary to produce reliable evolution and the required observables; the second is the analysis of the outputs, which need to be at least as large and detailed as the observational datasets, carrying information on the underlying properties and the complete evolution histories of galaxies and their components.
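To make the first step concrete, below is a minimal sketch of the kick-drift-kick leapfrog scheme commonly used to evolve self-gravitating matter in N-body codes.  It uses direct summation with Plummer softening purely for illustration; production cosmological codes use tree or particle-mesh gravity, comoving coordinates, and vastly more particles.  All names, units, and parameter values here are assumptions.

```python
import numpy as np

G = 1.0  # gravitational constant in arbitrary code units (assumption)

def accelerations(pos, mass, softening=0.05):
    """Direct-summation gravitational accelerations with Plummer softening."""
    dx = pos[None, :, :] - pos[:, None, :]        # pairwise separations
    inv_r3 = ((dx ** 2).sum(-1) + softening ** 2) ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                 # exclude self-force
    return G * (dx * (mass[None, :, None] * inv_r3[:, :, None])).sum(axis=1)

def leapfrog(pos, vel, mass, dt, n_steps):
    """Kick-drift-kick leapfrog time integration (toy direct-summation form)."""
    acc = accelerations(pos, mass)
    for _ in range(n_steps):
        vel += 0.5 * dt * acc                     # half kick
        pos += dt * vel                           # drift
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc                     # half kick
    return pos, vel

# Example run: 100 equal-mass particles from a random initial condition.
rng = np.random.default_rng(0)
pos = rng.standard_normal((100, 3))
vel = np.zeros((100, 3))
mass = np.full(100, 1.0 / 100)
pos, vel = leapfrog(pos, vel, mass, dt=0.01, n_steps=100)
```

The direct sum costs O(N^2) per step, which is exactly why large simulations, and the subsequent analysis of their outputs, require the supercomputing techniques that this school addresses.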