As light travels to us from a distant source, its path is deflected by the gravitational forces of intervening matter. The most dramatic manifestation of this effect occurs in strong lensing, when light rays from a single source can take several paths to reach the observer, causing the appearance of multiple images of the same source. These images are also magnified in size, and thus in total brightness (because surface brightness is conserved in gravitational lensing). When the source is time-varying, the images are observed to vary with delays between them, due both to the differing path lengths taken by the light and to the gravitational potential it passes through. A common example of such a source in lensing is a quasar, an extremely luminous active galactic nucleus at cosmological distance. From observations of the image positions, magnifications, and time delays between the multiple images, we can measure the mass structure of the lens galaxy itself (on scales $\geq M_{\odot}$), as well as a characteristic distance between the source, lens, and observer. This ``time delay distance'' encodes the cosmic expansion rate, which in turn depends on the energy densities of the various components of the universe, described collectively by the cosmological parameters.

The time delays themselves have been proposed as tools to study massive substructures within lens galaxies \citep{KeetonAndMoustakas2009}, and for measuring cosmological parameters, primarily the Hubble constant, $H_0$ \citep[see, e.g.,][]{SuyuEtal2013}, a method first proposed by \citet{Refsdal1964}. In the future, we aspire to measure further cosmological parameters, such as the properties of the dark energy driving the accelerated cosmic expansion, by combining large samples of measured time delay distances \citep[e.g.,][]{Linder2012}. It is clearly of great interest to develop time delay lens analysis into a mature probe of the dark universe.

New wide-area imaging surveys that repeatedly scan the sky to gather time-domain information on variable sources are now coming online. This pursuit will reach a new height when the \emph{Large Synoptic Survey Telescope} (LSST) enables the first long-baseline, multi-epoch observational campaign on $\sim$1000 lensed quasars \citep{LSSTSciBook}. However, using the measured LSST light curves to extract time delays for accurate cosmology will require a detailed understanding of how, and how well, time delays can be reconstructed from data with real-world properties: noise, gaps, and additional systematic variations. For example, to what accuracy can time delays between the multiple image intensity patterns be measured from individual doubly- or quadruply-imaged systems whose sampling rate and campaign length are set by LSST? For the time delay errors to be small compared to the errors arising from the gravitational potential, the precision of the time delay measured for an individual system will need to be better than 3\%, and those estimates will need to be robust to systematic error. Simple techniques such as the ``dispersion'' method \citep{PeltEtal1994,PeltEtal1996} or spline interpolation through the sparsely sampled data \citep[e.g.,][]{TewesEtal2013a} yield time delays that \emph{may} be insufficiently accurate for a Stage IV dark energy experiment; more complex algorithms, such as Gaussian Process modeling \citep[e.g.,][]{TewesEtal2013a,Hojjati+Linder2013}, may hold more promise. None of these methods, however, has been tested on large-scale data sets.
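The stringency of the few-percent precision requirement above follows directly from the way a measured delay maps onto cosmological quantities. In notation that follows, for example, \citet{SuyuEtal2013},
\begin{equation}
  \Delta t_{ij} = \frac{D_{\Delta t}}{c}\,\Delta\phi_{ij},
  \qquad
  D_{\Delta t} \equiv (1+z_{\rm d})\,\frac{D_{\rm d} D_{\rm s}}{D_{\rm ds}} \propto \frac{1}{H_0},
\end{equation}
where $\Delta\phi_{ij}$ is the difference in Fermat potential between images $i$ and $j$, $z_{\rm d}$ is the deflector redshift, and $D_{\rm d}$, $D_{\rm s}$, and $D_{\rm ds}$ are the angular diameter distances to the deflector, to the source, and from deflector to source, respectively. For a fixed lens model, a fractional error in a measured time delay therefore propagates into a comparable fractional error in the inferred $D_{\Delta t}$, and hence in $H_0$.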
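To make the light curve measurement problem concrete, the listing below is a minimal, purely illustrative sketch (in Python, using only NumPy) of a dispersion-style estimator in the spirit of \citet{PeltEtal1994}: shift one light curve by a trial delay, merge it with the other, and adopt the delay that minimizes the scatter between temporally adjacent points of the merged curve. It is not the published implementation, and it neglects the magnitude offsets between images, microlensing variability, and measurement errors that any realistic analysis must model.
\begin{verbatim}
import numpy as np

def dispersion_delay(t_a, f_a, t_b, f_b, trial_delays):
    """Toy dispersion-style estimate: return the trial delay that
    minimizes the scatter between temporally adjacent points of the
    merged (A + shifted B) light curve."""
    scatter = []
    for tau in trial_delays:
        t = np.concatenate([t_a, t_b - tau])      # shift B onto A's time axis
        f = np.concatenate([f_a, f_b])[np.argsort(t)]
        scatter.append(np.mean(np.diff(f) ** 2))  # adjacent-point dispersion
    return trial_delays[np.argmin(scatter)]

# Purely synthetic check: two noisy, irregularly sampled copies of the
# same smooth variation with a true delay of 30 days; the estimate
# should land close to 30.
rng = np.random.default_rng(0)
t_a = np.sort(rng.uniform(0.0, 1000.0, 200))
t_b = np.sort(rng.uniform(0.0, 1000.0, 200))
signal = lambda t: np.sin(t / 40.0) + 0.5 * np.sin(t / 97.0)
f_a = signal(t_a) + 0.05 * rng.normal(size=t_a.size)
f_b = signal(t_b - 30.0) + 0.05 * rng.normal(size=t_b.size)
print(dispersion_delay(t_a, f_a, t_b, f_b, np.arange(-100.0, 100.5, 0.5)))
\end{verbatim}
The brute-force search over a grid of trial delays is chosen purely for transparency; even this toy version suggests how sparse sampling and season gaps degrade the estimate, since the adjacent-point scatter is only informative where the merged curve is densely sampled.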