\section*{Appendix}

We derive the minimum order statistic for the independent random variables \(t_{1},t_{2},\dots,t_{N}\), where \(t_{i}\sim\text{Gamma}(K,x_{i})\).
We note the cumulative distribution function of the gamma distribution is
\begin{equation} F(t;K,x)=\int_{0}^{t}f(s;K,x)\,ds=\int_{0}^{t}\frac{s^{K-1}}{\Gamma(K)}x^{K}e^{-xs}\,ds\,.\nonumber \end{equation}
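When \(K\) is a positive integer, the integral above evaluates in closed form (the Erlang case), which is convenient for computation:

```latex
% Gamma CDF with integer shape K and rate x (Erlang distribution)
F(t;K,x) = 1 - e^{-xt}\sum_{n=0}^{K-1}\frac{(xt)^{n}}{n!}\,.
```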
Without loss of generality, suppose we wish to find the probability that \(i=1\) reaches the threshold of \(K\) Poisson events in the fastest time. Let \(T=\min(t_{2},t_{3},\dots,t_{N})\). Then, the probability that all other realized times exceed \(t_{1}\) is
\begin{equation} \label{mincdf}\begin{split}\mathbb{P}(T>t_{1})&=\mathbb{P}(t_{2}>t_{1},t_{3}>t_{1},\dots,t_{N}>t_{1})\\ &=\mathbb{P}(t_{2}>t_{1})\mathbb{P}(t_{3}>t_{1})\cdots\mathbb{P}(t_{N}>t_{1})\\ &=\left[1-F_{2}(t_{1})\right]\left[1-F_{3}(t_{1})\right]\cdots\left[1-F_{N}(t_{1})\right]\\ &=\prod_{i=2}^{N}\left[1-F_{i}(t_{1})\right]\,.\end{split}\end{equation}
This gives the probability that \(t_{1}\) is the fastest realized time among all \(t_{i}\).
The corresponding density of \(T\), evaluated at \(t_{1}\), is
\begin{equation} \label{minpdf}\begin{split}\mathbb{P}(T=t_{1})&=\frac{d}{dt_{1}}\left[1-\mathbb{P}(T>t_{1})\right]\\ &=\sum_{i=2}^{N}f_{i}(t_{1})\prod_{\begin{subarray}{c}j=2\\ j\neq i\end{subarray}}^{N}\left[1-F_{j}(t_{1})\right]\,.\end{split}\nonumber\end{equation}
Now, suppose that the random variables \(t_{i}\) are i.i.d. \(\text{Gamma}(K,x)\). Then (\ref{mincdf}) becomes
\begin{equation} \label{eqmcdf}\mathbb{P}(T>t_{1})=\left[1-F(t_{1})\right]^{N-1}\,.\end{equation}
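As a sanity check, the i.i.d. reduction above can be verified by simulation. The following sketch uses hypothetical parameter values \(K=3\), \(x=2\), \(N=5\), \(t_{1}=0.8\) and the closed-form (Erlang) CDF available when \(K\) is an integer:

```python
import math
import numpy as np

# Hypothetical parameters: shape K, rate x, N competitors, evaluation point t1.
K, x, N, t1 = 3, 2.0, 5, 0.8

rng = np.random.default_rng(0)
# NumPy parameterizes the gamma by shape and SCALE, so scale = 1/x for rate x.
samples = rng.gamma(shape=K, scale=1.0 / x, size=(200_000, N - 1))
# Empirical P(T > t1), where T = min of the N-1 i.i.d. draws.
empirical = (samples.min(axis=1) > t1).mean()

# Erlang closed form of the gamma CDF for integer K.
F_t1 = 1.0 - math.exp(-x * t1) * sum((x * t1) ** n / math.factorial(n) for n in range(K))
analytic = (1.0 - F_t1) ** (N - 1)

print(empirical, analytic)  # the two should agree closely
```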
Taking the first derivative of (\ref{eqmcdf}) with respect to the realized time \(t_{1}\) gives
\begin{equation} \label{dpidt}\frac{\partial\mathbb{P}(T>t_{1})}{\partial t_{1}}=-(N-1)f(t_{1})[1-F(t_{1})]^{N-2}<0\,,\end{equation}
for all \(t_{1}>0\). The probability that \(i=1\) realizes the fastest time for all \(i\) is decreasing in the actual time realized.
The second derivative with respect to \(t_{1}\) is
\begin{equation} \begin{split}\frac{\partial^{2}\mathbb{P}(T>t_{1})}{\partial t_{1}^{2}}=-(N-1)\Big[&f^{\prime}(t_{1})[1-F(t_{1})]^{N-2}\\ &-(N-2)f(t_{1})^{2}[1-F(t_{1})]^{N-3}\Big]\,.\end{split}\nonumber\end{equation}
We note that \(f^{\prime}(t_{1})>0\) for \(t_{1}<\frac{K-1}{x}\) and \(f^{\prime}(t_{1})<0\) for \(t_{1}>\frac{K-1}{x}\), where \(\frac{K-1}{x}\) is the mode of \(f(t_{1};K,x)\). Furthermore, \(f\) is concave in a neighborhood of its mode, so \(f^{\prime\prime}(t_{1})<0\) there. All other terms are positive for all \(t_{1}\).
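The sign of \(f^{\prime}\) can be seen by differentiating the gamma density directly:

```latex
% Derivative of the gamma density f(t;K,x) = x^K t^{K-1} e^{-xt} / \Gamma(K)
f^{\prime}(t_{1};K,x)
  = \frac{x^{K}}{\Gamma(K)}\left[(K-1)t_{1}^{K-2}-x\,t_{1}^{K-1}\right]e^{-xt_{1}}
  = f(t_{1};K,x)\left(\frac{K-1}{t_{1}}-x\right),
```

which is positive exactly when \(t_{1}<\frac{K-1}{x}\) and negative when \(t_{1}>\frac{K-1}{x}\).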
The sign of the second derivative on \(t_{1}<\frac{K-1}{x}\) depends on the relative size of the terms in the square brackets. If
\begin{equation} \lvert f^{\prime}(t_{1})[1-F(t_{1})]^{N-2}\rvert>\lvert(N-2)f(t_{1})^{2}[1-F(t_{1})]^{N-3}\rvert\,,\nonumber \end{equation}
then the second derivative is negative for \(t_{1}<\hat{t}_{1}\), for some \(\hat{t}_{1}\leq\frac{K-1}{x}\). If not, then the second derivative is positive on \(t_{1}<\frac{K-1}{x}\).
For \(t_{1}>\frac{K-1}{x}\), the term in square brackets is always negative, so the sign of the second derivative is always positive in this region.
In summary, \(\mathbb{P}(T>t_{1})\) is concave when \(t_{1}<\hat{t}_{1}\) and convex when \(t_{1}>\hat{t}_{1}\), where \(\hat{t}_{1}\geq 0\):
\begin{equation} \label{pisoc}\begin{cases}\frac{\partial^{2}\mathbb{P}(T>t_{1})}{\partial t_{1}^{2}}<0\,,\quad t_{1}<\hat{t}_{1}\\ \frac{\partial^{2}\mathbb{P}(T>t_{1})}{\partial t_{1}^{2}}=0\,,\quad t_{1}=\hat{t}_{1}\\ \frac{\partial^{2}\mathbb{P}(T>t_{1})}{\partial t_{1}^{2}}>0\,,\quad t_{1}>\hat{t}_{1}\,.\end{cases}\end{equation}
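The sign pattern in (\ref{pisoc}) can also be illustrated numerically. This sketch (again with hypothetical values \(K=3\), \(x=2\), \(N=5\)) approximates the second derivative of \(\mathbb{P}(T>t_{1})\) by central finite differences:

```python
import math

# Hypothetical parameters: shape K, rate x, N competitors.
K, x, N = 3, 2.0, 5

def F(t):
    # Erlang closed form of the gamma CDF for integer K.
    return 1.0 - math.exp(-x * t) * sum((x * t) ** n / math.factorial(n) for n in range(K))

def G(t):
    # P(T > t): survival probability of the minimum of N-1 i.i.d. draws.
    return (1.0 - F(t)) ** (N - 1)

def G2(t, h=1e-4):
    # Central finite-difference approximation of the second derivative.
    return (G(t + h) - 2.0 * G(t) + G(t - h)) / h**2

print(G2(0.2), G2(3.0))  # concave for small t1, convex for large t1
```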