probability \(1-p_i\). Therefore, \(\mathbf{E}\exp\left(-\lambda X_i\right)=e^{-\lambda}p_i+\left(1-p_i\right)=1+\left(e^{-\lambda}-1\right)p_i\le\exp\left(\left(e^{-\lambda}-1\right)p_i\right)\).
Since \(\mu=\sum_{i=1}^Np_i\), independence gives \(\mathbf{E}\exp\left(-\lambda S_N\right)=\prod_{i=1}^N\mathbf{E}\exp\left(-\lambda X_i\right)\le\exp\left(\left(e^{-\lambda}-1\right)\mu\right)\).  Therefore, by Markov's inequality applied to \(e^{-\lambda S_N}\), \(\mathbf{P}\left\{S_N\le t\right\}\le e^{\lambda t}\exp\left(\left(e^{-\lambda}-1\right)\mu\right)\).  Set \(\lambda=-\ln\left(\frac{t}{\mu}\right)=\ln\left(\frac{\mu}{t}\right)\).  Since \(t<\mu\), we have \(\lambda>0\), so this is a valid choice for \(\lambda\).  By the definition of \(\lambda\), \(e^{\lambda t}=\left(\frac{\mu}{t}\right)^t\), \(e^{-\lambda}=\frac{t}{\mu}\), and \(\left(e^{-\lambda}-1\right)\mu=t-\mu\).  Putting all this together, we obtain  \(\mathbf{P}\left\{S_N\le t\right\}\le\left(\frac{\mu}{t}\right)^te^{t-\mu}=e^{-\mu}\left(\frac{e\mu}{t}\right)^t\).
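As a quick sanity check of this lower-tail bound (not part of the proof), here is a minimal Monte Carlo sketch in Python; the choice of \(p_i\), the sample count, and \(t\) are arbitrary illustrative values, not from the text:

```python
import numpy as np

# Monte Carlo sanity check of the lower-tail bound
# P{S_N <= t} <= e^{-mu} * (e*mu/t)^t  for t < mu.
# The p_i, sample count, and t below are illustrative choices only.
rng = np.random.default_rng(0)
p = np.array([0.9, 0.7, 0.5, 0.8, 0.6, 0.75, 0.85, 0.65])
mu = p.sum()                                   # mu = sum_i p_i
t = 3.0                                        # any t < mu

samples = (rng.random((200_000, p.size)) < p).sum(axis=1)  # draws of S_N
empirical = (samples <= t).mean()
bound = np.exp(-mu) * (np.e * mu / t) ** t
print(f"P(S_N <= {t}) ~ {empirical:.4f}  vs  bound {bound:.4f}")
```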
Exercise 2.3.3. Let \(X_{N,i},\ 1\le i\le N\), be independent Bernoulli random variables with mean \(p_{N,i}=\lambda/N\), so that \(\mathbf{E}S_N=\sum_{i=1}^Np_{N,i}=\lambda\) for every \(N\).  By the Poisson limit theorem, Theorem 1.3.4, p. 10, \(S_N\rightarrow X\sim\mathrm{Pois}\left(\lambda\right)\) in distribution as \(N\rightarrow\infty\), so \(\mathbf{P}\left\{S_N\ge t\right\}\) is arbitrarily close to \(\mathbf{P}\left\{X\ge t\right\}\) for sufficiently large \(N\).  Further, since \(t>\lambda=\mathbf{E}S_N\), the hypotheses of Chernoff's inequality, Theorem 2.3.1, are satisfied, and we can conclude that
 \(\mathbf{P}\left\{S_N\ge t\right\}\le e^{-\lambda}\left(\frac{e\lambda}{t}\right)^t\) for all \(N\).  Letting \(N\rightarrow\infty\) yields the same bound for \(\mathbf{P}\left\{X\ge t\right\}\).
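For a concrete feel for how tight this tail bound is, here is a short comparison of the exact Poisson tail (via SciPy's survival function) against the bound; \(\lambda\) and the values of \(t\) are arbitrary illustrative choices:

```python
import numpy as np
from scipy.stats import poisson

# Compare the exact Poisson tail with the Chernoff-type bound
# P{X >= t} <= e^{-lam} * (e*lam/t)^t  for t > lam.
# lam and the t values are illustrative choices only.
lam = 4.0
for t in [5, 8, 12, 20]:
    tail = poisson.sf(t - 1, lam)              # P{X >= t} for integer t
    bound = np.exp(-lam) * (np.e * lam / t) ** t
    print(f"t={t:2d}  P(X >= t) = {tail:.3e}  bound = {bound:.3e}")
```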
Exercise 2.3.5.  [Temporarily omitted because it is messy.]
Exercise 2.3.6.  Applying the two-sided Chernoff bound \(\mathbf{P}\left\{\left|S_N-\mu\right|\ge\delta\mu\right\}\le2e^{-c\mu\delta^2}\) with \(\delta=t/\lambda\), we get \(\mathbf{P}\left\{\left|X-\lambda\right|\ge t\right\}=\mathbf{P}\left\{\left|X-\lambda\right|\ge\lambda\cdot\frac{t}{\lambda}\right\}\le2e^{-c\lambda\frac{t^2}{\lambda^2}}=2\exp\left(\frac{-ct^2}{\lambda}\right).\)  The bound transfers from \(S_N\) to \(X\) by the
Poisson limit theorem: if \(X_{N,i}\sim\mathrm{Ber}\left(p_{N,i}\right)\), where \(\max_ip_{N,i}\rightarrow0\) and \(\sum_{i=1}^Np_{N,i}\rightarrow\lambda\), then \(S_N:=\sum_{i=1}^NX_{N,i}\rightarrow X\sim\mathrm{Pois}\left(\lambda\right)\) in distribution.
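The convergence invoked here can be illustrated numerically; the sketch below uses the equivalent \(\mathrm{Binomial}(N,\lambda/N)\) representation of \(S_N\) when \(p_{N,i}\equiv\lambda/N\), with \(\lambda\), the values of \(N\), and the trial count as arbitrary illustrative choices:

```python
import numpy as np
from scipy.stats import poisson

# Illustrate the Poisson limit theorem used above: with p_{N,i} = lam/N,
# S_N is Binomial(N, lam/N) and converges to Pois(lam) in distribution.
# lam, the values of N, and the trial count are illustrative choices.
rng = np.random.default_rng(1)
lam, trials = 3.0, 100_000
for N in [10, 100, 1000]:
    s = rng.binomial(N, lam / N, size=trials)      # draws of S_N
    pmf_hat = np.array([np.mean(s == k) for k in range(4)])
    pmf = poisson.pmf(np.arange(4), lam)
    print(f"N={N:4d}  empirical {np.round(pmf_hat, 4)}  Pois {np.round(pmf, 4)}")
```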
Exercise 2.3.8.  Claim: since the moment generating function of a \(\mathrm{Pois}\left(\lambda_i\right)\) random variable is \(e^{\lambda_i\left[e^{\theta}-1\right]}\) and since the mgf of a sum of independent random variables is the product of the mgfs, the mgf of \(\sum_iX_i\), for \(X_i\sim\mathrm{Pois}\left(\lambda_i\right)\) independent, is \(\exp\left(\sum_i\lambda_i\left[e^{\theta}-1\right]\right)\).  Thus, since the moment generating function determines the distribution uniquely, \(\sum_iX_i\sim\mathrm{Pois}\left(\sum_i\lambda_i\right)\).  We can therefore choose \(\lambda_i\equiv1\), so that \(X_i\sim\mathrm{Pois}\left(1\right)\) and \(S_N:=X_1+\cdots+X_N\sim\mathrm{Pois}\left(N\right)\).  The Lindeberg–Lévy central limit theorem (Theorem 1.3.2) then gives \(\frac{S_N-N}{\sqrt{N}}\rightarrow N\left(0,1\right)\) in distribution, which is the desired normal approximation to \(\mathrm{Pois}\left(N\right)\) as \(N\rightarrow\infty\).
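Both steps can be checked empirically; the sketch below verifies that a sum of independent \(\mathrm{Pois}(1)\) samples matches the \(\mathrm{Pois}(N)\) pmf, and that the standardized sum is close to \(N(0,1)\) (\(N\) and the trial count are arbitrary illustrative choices):

```python
import numpy as np
from scipy.stats import norm, poisson

# Check both steps empirically: (1) a sum of N independent Pois(1)
# variables matches the Pois(N) pmf; (2) (S_N - N)/sqrt(N) is close
# to N(0,1). N and the trial count are illustrative choices only.
rng = np.random.default_rng(2)
N, trials = 100, 100_000
s = rng.poisson(1.0, size=(trials, N)).sum(axis=1)   # draws of S_N

for k in [N - 10, N, N + 10]:                        # additivity check
    print(f"P(S_N = {k}) ~ {np.mean(s == k):.4f}  vs  "
          f"Pois(N): {poisson.pmf(k, N):.4f}")

z = (s - N) / np.sqrt(N)                             # CLT check
print(f"P(Z <= 1) ~ {np.mean(z <= 1):.4f}  vs  Phi(1) = {norm.cdf(1):.4f}")
```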