Dan Gifford edited Introduction.tex  about 10 years ago

Commit id: 40623e1f0bc9104821dafc6a7e8d6ee160a017c6

\begin{equation}
P(\hat{\sigma} \,|\, \sigma) = \frac{1}{\sqrt{2\pi}\,S_{\ln\hat{\sigma} | \ln\sigma}} e^{-\frac{(\ln\hat{\sigma} - \ln\sigma)^{2}}{2 S_{\ln\hat{\sigma} | \ln\sigma}^{2}}}
\end{equation}
But we are binning! That means we have a distribution of masses in our bin that we must integrate over. This integral takes the form:
\begin{equation}
\langle \hat{\sigma} \rangle = \int_{\min(\mathrm{bin})}^{\max(\mathrm{bin})} dM\, \frac{d \langle n \rangle}{dM}\, P(\hat{\sigma} \,|\, M)
\end{equation}
This is what we want to compare the measured observable with: a scatter-convolved, mass-function-weighted expectation value.

That said, this is not ultimately what we are interested in. Really, we want the average mass per bin. If we are binning on mass, this is easy: it is simply the mass-function-weighted mean within the bin,
\begin{equation}
\langle M \rangle = \frac{\int_{\min(\mathrm{bin})}^{\max(\mathrm{bin})} M\, \frac{d \langle n \rangle}{dM}\, dM}{\int_{\min(\mathrm{bin})}^{\max(\mathrm{bin})} \frac{d \langle n \rangle}{dM}\, dM}
\end{equation}
Unfortunately, we are not binning on mass in the real universe. Instead, we are binning on some observable $\hat{\theta}$ and asking for the probability of observing $M$. This switches some things around in the integrals, so let's take a look.

Before, when we needed $\langle M \rangle$, we simply took a mass-function-weighted integral within the bin. We no longer have this luxury. Instead, let's start with what we have and what we want. We have $\hat{\theta}$ and want $P(M | \hat{\theta})$. If we know the scatter between these two quantities, we can make some basic assumptions; for example, that $P(M | \hat{\theta})$ is log-normal with some scatter about a power-law relationship. However, to compute a value for the bin we need $\frac{d\langle n \rangle}{d \hat{\theta}}$. This requires turning $\frac{d\langle n \rangle}{d M}$ into $\frac{d\langle n \rangle}{d \hat{\theta}}$, which can be achieved by knowing the average power-law relationship between $M$ and $\hat{\theta}$. Then
\begin{equation}
\frac{d\langle n \rangle}{d \hat{\theta}} = \frac{d\langle n \rangle}{d M}\, \frac{d M}{d \hat{\theta}} .
\end{equation}
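As a concrete sketch of this change of variables (the normalization $A$, pivot mass $M_{p}$, and slope $\alpha$ below are illustrative placeholders, not quantities defined above): if the mean relation is $\hat{\theta} = A\,(M/M_{p})^{\alpha}$, then $M = M_{p}\,(\hat{\theta}/A)^{1/\alpha}$ and
\begin{equation}
\frac{dM}{d\hat{\theta}} = \frac{M_{p}}{\alpha A}\left(\frac{\hat{\theta}}{A}\right)^{1/\alpha - 1} = \frac{1}{\alpha}\,\frac{M}{\hat{\theta}},
\end{equation}
so the observable function follows directly from the mass function as $\frac{d\langle n \rangle}{d\hat{\theta}} = \frac{d\langle n \rangle}{dM}\,\frac{M}{\alpha \hat{\theta}}$.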
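For concreteness, the scatter-convolved, mass-function-weighted expectation above can also be written out under one simple set of assumptions; the median relation $\sigma(M)$ and the shorthand $S \equiv S_{\ln\hat{\sigma}|\ln\sigma}$ are notation introduced here for illustration, not definitions from above. If $P(\hat{\sigma}\,|\,M)$ is log-normal about a median $\sigma(M)$ with scatter $S$, then its mean at fixed mass is $\sigma(M)\,e^{S^{2}/2}$, and the normalized expectation for the bin becomes
\begin{equation}
\langle \hat{\sigma} \rangle = \frac{\int_{\min(\mathrm{bin})}^{\max(\mathrm{bin})} dM\, \frac{d\langle n \rangle}{dM}\, \sigma(M)\, e^{S^{2}/2}}{\int_{\min(\mathrm{bin})}^{\max(\mathrm{bin})} dM\, \frac{d\langle n \rangle}{dM}},
\end{equation}
which makes explicit how the scatter enters the comparison with the measured observable.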