On to the real problem

Having seen how we can solve the relaxed toy problem, let’s now turn our attention to the full problem at hand. To make it easier to work with later on, let’s assume that we start off at some arbitrary point \((n,1)\) and that we are heading down towards the origin. First, let \(P^{0}_n = \bar\gamma^n\) be the probability that a particle at column \(n\) on the \(x\)-axis will be able to random-walk to the origin. Next, let \(P^{1}_n\) be the probability that a particle on the next row up, at column \(n\), will be able to random-walk to the origin \((0,0)\). Our problem reduces to finding an expression for \(P^1_n\).
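
Before pressing on, here is a quick Monte Carlo sanity check of \(P^0_n = \bar\gamma^n\). It is only a sketch: it assumes (as the recurrences below suggest) that each step goes left, right, or down with probability \(\frac13\), that the walk succeeds once it visits column \(0\) while still on the axis, and that \(\bar\gamma\) is the smaller root of \(x^2 - 3x + 1 = 0\); the helper names are just for this snippet.

```python
import random

# Smaller root of x^2 - 3x + 1 = 0, i.e. the decaying rate gamma-bar.
GAMMA_BAR = (3 - 5 ** 0.5) / 2


def reaches_origin_on_axis(n: int) -> bool:
    """One relaxed walk: start at column n on the x-axis, step left, right,
    or down with probability 1/3 each, and succeed on visiting column 0."""
    x = n
    while True:
        if x == 0:
            return True
        r = random.random()
        if r < 1 / 3:
            x -= 1
        elif r < 2 / 3:
            x += 1
        else:
            return False  # stepped down before reaching the origin


def estimate_p0(n: int, trials: int = 200_000) -> float:
    return sum(reaches_origin_on_axis(n) for _ in range(trials)) / trials


for n in range(1, 5):
    print(n, round(estimate_p0(n), 4), round(GAMMA_BAR ** n, 4))
```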

Now, let \(\hat k\) be the event that a particle random-walks horizontally until it sits at column \(k\) and only then takes its vertical step, and let \(s\) be the event that we successfully reach the target point. Starting from \(n\), we then know that \[P^1_n[s] = \sum_{k \in \mathbb{Z}} P^0[s \mid \hat k] \times P_n[\hat k]\] where \(P^1_n[s] = P^1_n\) is the likelihood of succeeding (reaching the origin) starting at \((n,1)\); \(P^0[s \mid \hat k]\) is the likelihood of succeeding given that we climbed down to the point \((k,0)\) at some point; and \(P_n[\hat k]\) is the probability that a particle starting at column \(n\) climbs down from the second row at column \(k\). It follows that \[P^0[s \mid \hat k] = P^0_{|k|} = \bar\gamma^{|k|},\] but what of \(P_n[\hat k]\)?
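
We can already see this decomposition in action numerically, even without a closed form for \(P_n[\hat k]\): sample the column \(K\) at which the walk climbs down, weight each sample by \(P^0_{|K|} = \bar\gamma^{|K|}\), and the average estimates the sum above. This is again just a sketch under the left/right/down, probability-\(\frac13\)-each assumption, with names of my own.

```python
import random

GAMMA_BAR = (3 - 5 ** 0.5) / 2  # smaller root of x^2 - 3x + 1 = 0


def descent_column(n: int) -> int:
    """Walk horizontally from column n (left/right/down, each 1/3) and
    return the column at which the down step finally happens."""
    x = n
    while True:
        r = random.random()
        if r < 1 / 3:
            x -= 1
        elif r < 2 / 3:
            x += 1
        else:
            return x


def estimate_p1(n: int, trials: int = 200_000) -> float:
    # Estimates E[gamma_bar^|K|] = sum_k P_n[k-hat] * gamma_bar^|k|.
    return sum(GAMMA_BAR ** abs(descent_column(n)) for _ in range(trials)) / trials


print(estimate_p1(2))
```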

In order to describe the behavior of the descent, let us again break it down into a one-dimensional scenario. Suppose I start off at column \(k\), and I want to know the probability \(\hat P_k\) that I will random-walk to the origin and climb down there. As per the relaxed problem, we still have \(\hat P_k = 3 \hat P_{k-1} - \hat P_{k-2}\) when \(k \ge 2\). However, if we’re at the origin, then we can either go down right away with probability \(\frac13\), or we can step to the left or to the right and try again from there. Notice that the probabilities are symmetric about the origin, so this gives us (for \(k \ge 0\), with \(\hat P_{-k} = \hat P_k\) by symmetry): \[\hat P_k = \begin{cases} \frac{1 + 2\hat P_1}{3} & \text{when }k = 0 \\ 3 \hat P_{k-1} - \hat P_{k-2} & \text{when }k \ge 2 \end{cases} ~~~~ \text{and} ~~~~ \lim_{k\to\infty} \hat P_k = 0\] This again suggests the ansatz \(\hat P_k = \nu \bar\gamma^k\), and substituting it into the boundary condition at the origin gives \[\hat P_0 = \nu = \frac13 + \frac23\hat P_1 = \frac13 + \frac23\underbrace{\nu \bar\gamma}_{\text{ansatz}}\] so that \(\nu = \frac1{\sqrt{5}}\) and \(\hat P_k = \frac{\bar\gamma^{|k|}}{\sqrt{5}}\).
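
As a cross-check on \(\nu = \frac1{\sqrt5}\), the sketch below solves the same system numerically: it pins \(\hat P_N = 0\) at a large cutoff \(N\) in place of the decay condition, writes the recurrence in its balance form \(\hat P_k = \frac{\hat P_{k-1} + \hat P_{k+1}}{3}\) for \(1 \le k < N\) together with the \(k = 0\) boundary equation, and sweeps to a fixed point. The cutoff, sweep count, and names are assumptions of this sketch, not anything fixed above.

```python
import math

GAMMA_BAR = (3 - math.sqrt(5)) / 2
NU = 1 / math.sqrt(5)

N = 60                 # truncation column standing in for the decay condition
P = [0.0] * (N + 1)    # P[k] approximates P-hat_k for 0 <= k <= N

# Sweep the balance equations to a fixed point:
#   P[0] = 1/3 + 2/3 * P[1]          (descend now, or step aside and retry)
#   P[k] = (P[k-1] + P[k+1]) / 3     for 1 <= k < N, with P[N] pinned to 0
for _ in range(500):
    P[0] = 1 / 3 + 2 / 3 * P[1]
    for k in range(1, N):
        P[k] = (P[k - 1] + P[k + 1]) / 3

for k in range(5):
    print(k, round(P[k], 6), round(NU * GAMMA_BAR ** k, 6))
```

The first column of numbers should line up with \(\nu \bar\gamma^k\), in particular \(\hat P_0 \approx 0.4472 \approx \frac1{\sqrt5}\).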

Finally, this means that \[\begin{aligned} P^1_n &= \sum_{k\in\mathbb{Z}} \hat P_{|n-k|} P^0_{|k|} \\ &=\sum_{k\in\mathbb{Z}} \frac{\bar\gamma^{|n-k|}\bar\gamma^{|k|}}{\sqrt{5}} \\ &= \frac{1}{\sqrt{5}}\left((n+1)\,\bar\gamma^{n} + 2\sum_{j=1}^{\infty}\bar\gamma^{\,n+2j}\right) \\ &= \boxed{\frac{\bar\gamma^n}{\sqrt{5}}\left(n + \frac{1 + \bar\gamma^2}{1 - \bar\gamma^2}\right)}\end{aligned}\] where the third line splits the sum into the \(n+1\) middle terms with \(0 \le k \le n\), each contributing \(\bar\gamma^{n}\), and the two geometric tails \(k > n\) and \(k < 0\).
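
To close the loop, here is a small simulation of the whole two-row problem against the boxed expression. As before, this is only a sketch under the assumption that each step goes left, right, or down with probability \(\frac13\) and that success means visiting \((0,0)\); the helper names and trial counts are my own.

```python
import math
import random

GAMMA_BAR = (3 - math.sqrt(5)) / 2


def closed_form(n: int) -> float:
    """The boxed expression for P^1_n."""
    return GAMMA_BAR ** n / math.sqrt(5) * (
        n + (1 + GAMMA_BAR ** 2) / (1 - GAMMA_BAR ** 2)
    )


def walk_hits_origin(n: int) -> bool:
    """Start at (n, 1); step left, right, or down with probability 1/3 each.
    Success means visiting (0, 0) before dropping below the x-axis."""
    x, y = n, 1
    while y >= 0:
        if x == 0 and y == 0:
            return True
        r = random.random()
        if r < 1 / 3:
            x -= 1
        elif r < 2 / 3:
            x += 1
        else:
            y -= 1
    return False


def simulate_p1(n: int, trials: int = 200_000) -> float:
    return sum(walk_hits_origin(n) for _ in range(trials)) / trials


for n in range(1, 4):
    print(n, round(simulate_p1(n), 4), round(closed_form(n), 4))
```

For \(n = 1\) the closed form collapses to exactly \(\frac25\), which the simulation should reproduce to a couple of decimal places.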