For most of the models, solutions are readily available in the literature, but for some it was necessary to solve the equations numerically, which I did using the Mathematica software package. In the second half of this year I started building the pipeline needed to perform an actual cosmological simulation. The pipeline consists of five steps: in the first step one calculates the linear matter power spectrum of the models using the CAMB code \cite{Lewis:2002ah}. This power spectrum is then used to generate the initial conditions for the simulation with the 2LPTic code \cite{Crocce_2006}. The actual simulation is then ready to run, and the predicted dark matter field is analyzed to build halo catalogs and merger trees, which are in turn used to produce galaxy catalogs with the SAGE semi-analytical code for galaxy formation.\\
While I familiarized myself with the pipeline during the first six months of my PhD, it was necessary to re-assess every step of it, to understand which assumptions are made at each stage and to modify the code accordingly. The CAMB code comes with a module implementing the Parametrized Post-Friedmann framework (http://camb.info/ppf/) \cite{Hu_2008}, which makes it possible to compute the power spectrum for variable equation-of-state and quintessence dark energy models.\\
The initial conditions code, on the other hand, relies heavily on the assumption of an underlying $\Lambda$CDM cosmology; in order to account for a different expansion history I modified it to read an input table and added routines to interpolate the tabulated values of the Hubble function and the growth function. The same assumptions were hard-coded into the cosmological simulation code GADGET \cite{Springel_2005}. I made my modifications first to version 2 and then to version 3 of this code, so that it reads the aforementioned tables and also accounts for a variable Newton's gravitational constant. This is the fastest way to include the possibility of simulating a large class of different models whose characteristics can be recast in a form where the dark energy density varies with time alongside the gravitational constant, such as the ``Running FLRW'' model \cite{Grande_2011}. A further adjustment in the main gravitational routine of the code allows me to simulate a different gravitational coupling between different species (e.g.\ dark matter self-coupling, or a dark matter--baryon coupling), as in the coupled dark energy scenarios \cite{Baldi_2012}.\\
Having access to version 3 of the GADGET code allowed me to analyze the simulations on the fly using the built-in halo finder. This code automatically benefits from the modified routines: it processes the raw simulation data to identify gravitationally bound objects and computes their properties, producing a catalog of dark matter halos and of the substructures contained within them. Doing this at every time-step also allows the creation of the merger trees, by keeping track of the progenitor/descendant relationships between halos at different redshifts, in order to study their formation and evolution.\\
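To give a concrete sense of the numerical solutions and of the tables consumed by the modified codes, the following sketch (in Python rather than the Mathematica notebooks actually used) integrates the linear growth equation for a stand-in constant-$w$ background and writes out an expansion-history table. The cosmological parameters, the file name \texttt{expansion\_table.txt} and the three-column layout are all illustrative assumptions, not the exact format of the real pipeline; the actual models, such as the Running FLRW cosmology, have their own expressions for the dark energy density.
\begin{verbatim}
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical stand-in cosmology: flat wCDM with constant w.
Omega_m, w, H0 = 0.3, -0.9, 70.0   # H0 in km/s/Mpc

def E(a):
    # Dimensionless Hubble rate H(a)/H0.
    return np.sqrt(Omega_m * a**-3 + (1.0 - Omega_m) * a**(-3.0 * (1.0 + w)))

def growth_rhs(lna, y):
    # Linear growth equation in ln(a):
    #   D'' + (2 + dlnE/dlna) D' - (3/2) Omega_m(a) D = 0,  y = [D, D'].
    a = np.exp(lna)
    dlnE = (np.log(E(a * 1.001)) - np.log(E(a * 0.999))) / 0.002
    Om_a = Omega_m * a**-3 / E(a)**2
    D, Dp = y
    return [Dp, -(2.0 + dlnE) * Dp + 1.5 * Om_a * D]

# Start deep in matter domination, where D is proportional to a.
lna = np.linspace(np.log(1e-3), 0.0, 500)
sol = solve_ivp(growth_rhs, (lna[0], 0.0), [1e-3, 1e-3], t_eval=lna, rtol=1e-8)
a, D = np.exp(lna), sol.y[0] / sol.y[0, -1]   # normalize D(a=1) = 1

# Write the table read by the modified 2LPTic/GADGET; the column
# layout (a, H(a), D(a)) is an illustrative choice.
np.savetxt("expansion_table.txt", np.column_stack([a, H0 * E(a), D]))
\end{verbatim}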
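The sketch below illustrates, again in Python for readability, the logic of the two modifications described above: replacing the hard-coded $\Lambda$CDM Hubble rate with a lookup into the tabulated expansion history, and rescaling the Newtonian coupling per pair of particle species. The coupling-matrix values, species labels and function names are hypothetical; the real changes live in GADGET's C gravity routines.
\begin{verbatim}
import numpy as np
from scipy.interpolate import interp1d

# Load the (hypothetical) table produced by the background solver.
a_tab, H_tab, D_tab = np.loadtxt("expansion_table.txt", unpack=True)
hubble = interp1d(a_tab, H_tab)   # stands in for GADGET's analytic H(a)

G_N = 4.30091e-9   # Newton's constant in Mpc (km/s)^2 / Msun

# Hypothetical coupling matrix: beta[i][j] rescales G_N between species
# i and j (here 0 = dark matter, 1 = baryons), mimicking coupled dark
# energy scenarios; beta = 1 everywhere recovers standard gravity.
beta = np.array([[1.2, 1.0],
                 [1.0, 1.0]])

def pairwise_accel(x_i, x_j, m_j, s_i, s_j):
    # Acceleration of particle i due to particle j with an effective
    # coupling G_eff = beta[s_i, s_j] * G_N (softening omitted).
    dx = x_j - x_i
    r2 = dx @ dx
    return beta[s_i, s_j] * G_N * m_j * dx / r2**1.5
\end{verbatim}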
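Putting the five steps together, a minimal driver for the pipeline might look as follows. The executable names, parameter files and MPI task count are placeholders only, since each code (CAMB, 2LPTic, GADGET, SAGE) is steered through its own configuration format.
\begin{verbatim}
import subprocess

# Chain the five pipeline steps; names below are placeholders.
steps = [
    ["./camb", "params_ppf.ini"],                       # 1. linear P(k) (PPF module)
    ["./2LPTic", "ics.param"],                          # 2. initial conditions from P(k)
    ["mpirun", "-np", "64", "./Gadget3", "run.param"],  # 3-4. run + on-the-fly halos/trees
    ["./sage", "model.par"],                            # 5. semi-analytic galaxy catalogs
]
for cmd in steps:
    subprocess.run(cmd, check=True)   # stop the chain if any step fails
\end{verbatim}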
The last step of the pipeline involves running the semi-analytical model on the merger trees generated from the simulations. This code produces a galaxy catalog using physically grounded analytic approximations to describe the physical processes acting on the baryonic gas, such as gas cooling, star formation and feedback mechanisms. While those prescriptions are only weakly dependent on the cosmology, there is obviously a dependence on the expansion history whenever it is necessary to convert between cosmological and galactic time-scales (illustrated in the sketch at the end of this section), as well as a dependence on the strength of the gravitational coupling.\\
During my review talk I will show some examples of the results obtained using this pipeline for test simulations of the ``Running FLRW'' model, which was chosen because it is compelling from a theoretical point of view and provides a good test of the modifications performed on the pipeline.\\
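As an illustration of the time-scale conversion mentioned above, the sketch below computes the mapping between scale factor and cosmic time by direct integration of the tabulated expansion history, $t(a) = \int_0^a \mathrm{d}a' / \left[a' H(a')\right]$; this is the conversion through which a modified $H(a)$ reaches the semi-analytic prescriptions. The file name and column layout follow the hypothetical table of the earlier sketches.
\begin{verbatim}
import numpy as np
from scipy.integrate import cumulative_trapezoid

# Load the (hypothetical) tabulated expansion history.
a_tab, H_tab, _ = np.loadtxt("expansion_table.txt", unpack=True)

KM_PER_MPC = 3.0857e19   # km in one Mpc
SEC_PER_GYR = 3.156e16   # seconds in one Gyr

# dt/da = 1 / (a H(a)); H is in km/s/Mpc, so rescale to seconds.
integrand = KM_PER_MPC / (a_tab * H_tab)
t = cumulative_trapezoid(integrand, a_tab, initial=0.0)
t_gyr = t / SEC_PER_GYR   # cosmic time since a_tab[0], in Gyr
print("Age at a=1 (from a=%.0e): %.2f Gyr" % (a_tab[0], t_gyr[-1]))
\end{verbatim}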