Brownian Motion

Since we have been building Brownian motion as a limit of the random walk, today we look at the first passage time distribution. In simple words, the first passage time of a process \(X(t)\) to a value \(m\) is the time at which the process takes the value \(m\) for the first time, i.e. \[\tau_m = \min\{t \geq 0: X(t) = m\}\]

Quite a while ago we went through the random walk and the scaled symmetric random walk, and from those we constructed Brownian motion. We will calculate the first passage time distribution following the same route: first for the symmetric random walk.

Let us recap the notation once again. Let \(\omega_j\) be the outcome of the \(j\)-th toss, with \(P(\omega_j = H) = p\). \[\begin{equation} X_j = \begin{cases} +1 & \text{if $\omega_j$ = H} \\ -1 & \text{if $\omega_j$ = T} \\ \end{cases} \end{equation}\] The random walk is \[\begin{aligned} &M_0 = 0 \\ &M_k = \sum_{j=1}^{k}X_j\qquad \text{for } k = 1,2,3,\ldots \end{aligned}\] and \(M_k\) is the symmetric random walk when \(p = \frac{1}{2}\).
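To make the setup concrete, here is a minimal Python sketch (numpy assumed; the helper name `simulate_walk` is my own, not from the original) that simulates one sample path of this random walk.

```python
import numpy as np

def simulate_walk(n_steps, p=0.5, seed=0):
    """Simulate M_0, M_1, ..., M_{n_steps} for a random walk with P(H) = p."""
    rng = np.random.default_rng(seed)
    # X_j = +1 with probability p (heads), -1 with probability 1 - p (tails)
    steps = rng.choice([1, -1], size=n_steps, p=[p, 1 - p])
    # prepend M_0 = 0 and take partial sums M_k = X_1 + ... + X_k
    return np.concatenate(([0], np.cumsum(steps)))

print(simulate_walk(10))   # one sample path of the symmetric random walk
```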

We want to find the distribution of \(\tau_m\) for the symmetric random walk.

Lemma: Let \(M_n\) be a symmetric random walk. Fix a number \(\sigma\) and define a process \[S_n = e^{\sigma M_n}(\frac{2}{e^\sigma+e^{-\sigma}})^n\] Then \(S_n, n = 0,1,2, \cdots\) is a martingale.

Proof: We have \[S_n = e^{\sigma M_n}(\frac{2}{e^\sigma+e^{-\sigma}})^n,\] hence \[S_{n+1} =e^{\sigma M_{n+1}}(\frac{2}{e^\sigma+e^{-\sigma}})^{n+1} = S_n(\frac{2}{e^\sigma+e^{-\sigma}})e^{\sigma X_{n+1}}.\] Now evaluate the expectation of \(S_{n+1}\) given the information up to time \(n\), i.e. \(E_n(S_{n+1})\): \[\begin{aligned} E_n(S_{n+1}) &= E_n[S_n(\frac{2}{e^\sigma+e^{-\sigma}})e^{\sigma X_{n+1}}] \\ &= S_n.(\frac{2}{e^\sigma+e^{-\sigma}}).E_n[e^{\sigma X_{n+1}}] \qquad \text{since $S_n$ is known given the first $n$ tosses and $X_{n+1}$ is independent of them} \\ &= S_n.(\frac{2}{e^\sigma+e^{-\sigma}}).(\frac{1}{2}e^{\sigma}+\frac{1}{2}e^{-\sigma}) \\ &= S_n \end{aligned}\] So \(S_n\) is a martingale.
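As a quick numerical sanity check (my own addition, not part of the proof; the helper name `estimate_E_Sn` and the parameter choices are assumptions), the sketch below estimates \(E(S_n)\) by Monte Carlo for a few values of \(n\); the estimates should hover around \(S_0 = 1\), as the martingale property predicts.

```python
import numpy as np

def estimate_E_Sn(n, sigma=0.5, n_paths=200_000, seed=1):
    """Monte Carlo estimate of E[S_n], where S_n = exp(sigma*M_n) * (2/(e^sigma + e^-sigma))^n."""
    rng = np.random.default_rng(seed)
    steps = rng.choice([1, -1], size=(n_paths, n))      # symmetric walk: +1/-1 with prob 1/2 each
    M_n = steps.sum(axis=1)
    factor = 2.0 / (np.exp(sigma) + np.exp(-sigma))
    return np.mean(np.exp(sigma * M_n) * factor**n)

for n in (1, 5, 20):
    print(n, estimate_E_Sn(n))   # each estimate should be close to S_0 = 1
```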

Now, the Optional Sampling Theorem states that a martingale stopped at a stopping time is again a martingale.

Write \(n \land \tau_m = \min(n,\tau_m)\). By the Optional Sampling Theorem, the stopped process \(S_{n \land \tau_m}\) is a martingale, and hence it has constant expectation. So we can write-

\[\begin{aligned} 1 = S_0 &= E(S_{n \land \tau_m}) \\ &= E[e^{\sigma M_{n \land \tau_m}}.(\frac{2}{e^\sigma+e^{-\sigma}})^{n \land \tau_m}] \qquad \text{ for all $n \geq 0$} \end{aligned}\]

We would like to see what happens to the above expression when \(n \rightarrow \infty\); for that we need to see what happens to the expression inside the expectation, i.e. \[\lim_{n \rightarrow \infty}e^{\sigma M_{n \land \tau_m}}.(\frac{2}{e^\sigma+e^{-\sigma}})^{n \land \tau_m}\] We treat these two factors separately.

First, note that \(\frac{e^\sigma+e^{-\sigma}}{2} = \cosh(\sigma)\) attains its minimum value at \(\sigma = 0\), with \(\cosh(0) = 1\), so \(\cosh(\sigma) > 1\) for \(\sigma > 0\). This implies \[0 < \frac{2}{e^\sigma+e^{-\sigma}} < 1 \qquad \text{for $\sigma > 0$}\]

Now fix \(\sigma > 0\); we can conclude that

\[\begin{equation} \lim_{n \rightarrow \infty} (\frac{2}{e^\sigma+e^{-\sigma}})^{n \land \tau_m} = \begin{cases} (\frac{2}{e^\sigma+e^{-\sigma}})^{\tau_m} & \text{if $\tau_m < \infty$} \\ 0 & \text{if $\tau_m = \infty$} \\ \end{cases} = \mathbb{I_{(\tau_m < \infty)}}(\frac{2}{e^\sigma+e^{-\sigma}})^{\tau_m} \qquad \text{[combining both the cases]} \end{equation}\]

Now consider the other factor, \(e^{\sigma M_{n \land \tau_m}}\). Assume \(m > 0\); then we must have \(M_{n \land \tau_m} \leq m\), because we stop the walk as soon as it reaches level \(m\). Hence, whether \(\tau_m\) is finite or not, we must have \[0 \leq e^{\sigma M_{n \land \tau_m}} \leq e^{\sigma.m}\]

So we can write \[\lim_{n \rightarrow \infty}e^{\sigma M_{n \land \tau_m}} = e^{\sigma M_{\tau_m}} = e^{\sigma. m} \qquad \text{when $\tau_m < \infty$}\] Combining the two factors in the case \(\tau_m < \infty\), we get \[\lim_{n \rightarrow \infty}e^{\sigma M_{n \land \tau_m}}.(\frac{2}{e^\sigma+e^{-\sigma}})^{n \land \tau_m} = e^{\sigma. m}.(\frac{2}{e^\sigma+e^{-\sigma}})^{\tau_m} \qquad \text{if $\tau_m < \infty$}\]

Now we don’t know \(\lim_{n \rightarrow \infty} e^{\sigma. M_{n \land \tau_m}}\) for the paths on which \(\tau_m = \infty\). But it does not matter, because \(e^{\sigma M_{n \land \tau_m}}\) stays bounded by \(e^{\sigma.m}\) while \(\lim_{n \rightarrow \infty}(\frac{2}{e^\sigma+e^{-\sigma}})^{n \land \tau_m} = 0\). Hence, even if \(\tau_m = \infty\), \[\lim_{n \rightarrow \infty}e^{\sigma M_{n \land \tau_m}}.(\frac{2}{e^\sigma+e^{-\sigma}})^{n \land \tau_m} = 0 \qquad \text{when $\tau_m = \infty$}\]

Hence we can write- \[\begin{aligned} \lim_{n \rightarrow \infty}e^{\sigma M_{n \land \tau_m}}.(\frac{2}{e^\sigma+e^{-\sigma}})^{n \land \tau_m} = \mathbb{I_{(\tau_m < \infty)}}e^{\sigma.m}.(\frac{2}{e^\sigma+e^{-\sigma}})^{\tau_m} \end{aligned}\]

We showed above that, for \(\sigma > 0\), \(e^{\sigma M_{n \land \tau_m}}.(\frac{2}{e^\sigma+e^{-\sigma}})^{n \land \tau_m}\) is a martingale with constant expectation. Since it is bounded between 0 and \(e^{\sigma.m}\), the dominated convergence theorem lets us pass the limit \(n \rightarrow \infty\) inside the expectation: \[\begin{aligned} 1 = S_0 &= E(S_{n \land \tau_m}) \\ &= E[e^{\sigma M_{n \land \tau_m}}.(\frac{2}{e^\sigma+e^{-\sigma}})^{n \land \tau_m}] \\ &= E[\mathbb{I_{(\tau_m < \infty)}}e^{\sigma.m}.(\frac{2}{e^\sigma+e^{-\sigma}})^{\tau_m}] \end{aligned}\]

We have derived all of this for \(\sigma > 0\). To see what happens as \(\sigma \downarrow 0\), take the limit on both sides of the above equation:

\[\begin{aligned} &\lim_{\sigma \downarrow 0}E[\mathbb{I_{(\tau_m < \infty)}}e^{\sigma.m}.(\frac{2}{e^\sigma+e^{-\sigma}})^{\tau_m}] = \lim_{\sigma \downarrow 0}1 = 1 \\ \implies &E[\mathbb{I_{(\tau_m < \infty)}}] = 1 \qquad \text{[taking the limit inside the expectation, again justified by dominated convergence]} \\ \implies &\Pr[\tau_m < \infty] = 1 \end{aligned}\]

The significance of this result is that the symmetric random walk reaches the level \(m\) with probability 1.
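A rough simulation makes this plausible. The sketch below (my own; the helper name `fraction_reached` and the truncation horizons are assumptions) estimates the fraction of simulated paths that hit level \(m\) within a finite horizon; the fraction climbs toward 1 as the horizon grows.

```python
import numpy as np

def fraction_reached(m=3, horizons=(100, 1_000, 5_000), n_paths=2_000, seed=2):
    """Fraction of simulated symmetric walks that hit level m within each horizon."""
    rng = np.random.default_rng(seed)
    steps = rng.choice([1, -1], size=(n_paths, max(horizons)))
    paths = np.cumsum(steps, axis=1)                 # M_1, ..., M_N for each path
    reached = paths >= m
    hit_time = np.where(reached.any(axis=1), reached.argmax(axis=1) + 1, np.inf)
    return {N: float(np.mean(hit_time <= N)) for N in horizons}

print(fraction_reached())   # the fractions climb toward 1 as the horizon grows
```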

This does not mean that every path reaches level \(m\) (for \(m > 0\)): there are paths on which the walk never does. For example, the path \(\{T,H,TT,H,TTT,H,TTTT,H,\cdots \}\), in which each single head is followed by an ever longer run of tails, never takes a positive value and hence never reaches level \(m\). There are uncountably many such paths altogether, but this set of paths has probability zero.

Theorem: Let \(m\) be an arbitrary non-zero integer. The symmetric random walk reaches the level \(m\) almost surely, i.e. the first passage time to level \(m\), \(\tau_m\), is finite almost surely.

Proof: For positive \(m\), this is exactly the conclusion \(\Pr[\tau_m < \infty] = 1\) derived above; for negative \(m\) it follows by the symmetry of the walk. Let us now go further and determine the distribution of the first passage time \(\tau_m\).

Now, one of the best tricks for finding the distribution of a random variable is to find its moment generating function. In this case the moment generating function is \[\phi_{\tau_m}(u) = E(e^{u.\tau_m})\] For \(x \geq 0\) we can write \[e^x = 1+x + \frac{x^2}{2!}+ \frac{x^3}{3!}+ \cdots \geq x\]

Hence for any positive value of \(u\), we must have- \[\begin{aligned} e^{u.\tau_m} &\geq u.\tau_m \\ \implies \phi_{\tau_m}(u) &= E(e^{u.\tau_m}) \geq u.E(\tau_m) \end{aligned}\]

Now, if \(E(\tau_m) = \infty\) (which we will verify below), that implies \[\begin{equation} \phi_{\tau_m}(u) = \begin{cases} \infty & \text{if $u > 0$} \\ 1 & \text{if $u = 0$} \\ \end{cases} \end{equation}\] Therefore the moment generating function \(\phi_{\tau_m}(u)\) is only useful for \(u < 0\). For \(u < 0\), set \(\alpha = e^{u}\); then \(0 < \alpha <1\) and \(\phi_{\tau_m}(u) = E(\alpha^{\tau_m})\).

Theorem: Let \(m\) be a non-zero integer. The first passage time \(\tau_m\) for the symmetric random walk satisfies \[E(\alpha^{\tau_m}) = (\frac{1-\sqrt{1-\alpha^2}}{\alpha})^{|m|} \qquad \text{for all $\alpha \in (0,1)$}\] Proof: For the symmetric random walk, \(\tau_m\) and \(\tau_{-m}\) have the same distribution, so it is enough to prove the theorem for a positive integer \(m\). We take \(m\) to be a positive integer, and since \(\Pr[\tau_m < \infty] = 1\), we may simplify \(E(\mathbb{I_{(\tau_m < \infty)}}e^{\sigma.m}(\frac{2}{e^\sigma+e^{-\sigma}})^{\tau_m}) = 1\) to \[E[e^{\sigma.m}(\frac{2}{e^\sigma+e^{-\sigma}})^{\tau_m}] = 1\] This holds for all strictly positive \(\sigma\).

Now substitute \[\begin{aligned} &\alpha = \frac{2}{e^{\sigma}+e^{-\sigma}} \\ \implies &\alpha e^{\sigma} + \alpha e^{-\sigma} - 2 = 0 \\ \implies &\alpha(e^{-\sigma})^2 - 2e^{-\sigma} + \alpha = 0 \qquad \text{[multiplying through by $e^{-\sigma}$]}\\ \end{aligned}\]

Applying the Sridharacharya method (the quadratic formula) to the above equation, we get \[e^{-\sigma} = \frac{2 \pm \sqrt{4-4\alpha^2}}{2\alpha} = \frac{1 \pm \sqrt{1-\alpha^2}}{\alpha}\] Since \(\sigma > 0\), we need \(e^{-\sigma}\) to be less than 1, so we take the root with the minus sign, i.e. \(e^{-\sigma} =\frac{1- \sqrt{1-\alpha^2}}{\alpha}\). Since we have chosen \(\alpha\) such that \(0 < \alpha < 1\), we can write \[\begin{aligned} 0 < &(1-\alpha)^2 < 1- \alpha < 1- \alpha^2 \\ \implies &(1-\alpha) < \sqrt{1- \alpha^2} \\ \implies &1-\sqrt{1- \alpha^2} < \alpha \\ \implies &\frac{1-\sqrt{1- \alpha^2}}{\alpha} < 1 \\ \end{aligned}\] so this root is indeed less than 1, which corresponds to a strictly positive \(\sigma\). Substituting \(e^{\sigma} = \frac{\alpha}{1-\sqrt{1- \alpha^2}}\) and \(\frac{2}{e^\sigma+e^{-\sigma}} = \alpha\) into the equation above, we can write \[\begin{aligned} &E[(\frac{\alpha}{1-\sqrt{1- \alpha^2}})^m.\alpha^{\tau_m}] = 1 \\ \implies &(\frac{\alpha}{1-\sqrt{1- \alpha^2}})^m.E[\alpha^{\tau_m}] = 1 \\ \implies &E[\alpha^{\tau_m}] = (\frac{1-\sqrt{1- \alpha^2}}{\alpha})^m \end{aligned}\]
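As a sanity check of this closed form (my own sketch, not part of the proof; the truncation at a finite horizon is an approximation of \(\tau_m\), and all helper names are assumptions), one can compare a Monte Carlo estimate of \(E(\alpha^{\tau_m})\) with \((\frac{1-\sqrt{1-\alpha^2}}{\alpha})^{|m|}\).

```python
import numpy as np

def mgf_check(alpha=0.8, m=2, n_paths=5_000, horizon=2_000, seed=3):
    """Compare a Monte Carlo estimate of E[alpha^tau_m] with the closed form."""
    rng = np.random.default_rng(seed)
    steps = rng.choice([1, -1], size=(n_paths, horizon))
    paths = np.cumsum(steps, axis=1)
    reached = paths >= m
    hit = reached.any(axis=1)
    tau = reached.argmax(axis=1) + 1.0               # first hitting step (only valid where hit)
    # paths that have not hit m within the horizon would contribute alpha^tau ~ 0 anyway
    mc_estimate = np.mean(np.where(hit, alpha**tau, 0.0))
    closed_form = ((1 - np.sqrt(1 - alpha**2)) / alpha) ** abs(m)
    return mc_estimate, closed_form

print(mgf_check())   # the two values should agree to a couple of decimal places
```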

Now let us show that \(E(\tau_m) = \infty\) for every non-zero \(m\). It is enough to prove \[E(\tau_1) = \infty\] because, for \(m \geq 1\), \(\tau_m \geq \tau_1\) and hence \(E(\tau_m) \geq E(\tau_1)\), and negative \(m\) will follow by symmetry.

Proof: Differentiate both sides of \(E(\alpha^{\tau_1}) = \frac{1-\sqrt{1- \alpha^2}}{\alpha}\) with respect to \(\alpha\); we get \[\begin{aligned} E(\tau_1.\alpha^{\tau_1-1}) &= \frac{\partial E(\alpha^{\tau_1})}{\partial \alpha} \\ &= \frac{\partial}{\partial \alpha}\frac{1-\sqrt{1- \alpha^2}}{\alpha} \\ &= \frac{-\frac{1}{2}(1-\alpha^2)^{-\frac{1}{2}}(-2\alpha)\alpha- (1-(1-\alpha^2)^{\frac{1}{2}})}{\alpha^2} \\ &= \frac{\alpha^2(1-\alpha^2)^{-\frac{1}{2}}-1+(1-\alpha^2)^{\frac{1}{2}}}{\alpha^2} \\ &= \frac{\alpha^2-\sqrt{1-\alpha^2}+1-\alpha^2}{\alpha^2\sqrt{1-\alpha^2}} \qquad \text{[multiplying numerator and denominator by $\sqrt{1-\alpha^2}$]}\\ &= \frac{1-\sqrt{1-\alpha^2}}{\alpha^2\sqrt{1-\alpha^2}} \\ \end{aligned}\]

This is valid for \(\alpha \in (0,1)\); we cannot simply put \(\alpha = 1\), since the formula was derived only for \(\alpha \in (0,1)\). But we can let \(\alpha \uparrow 1\) on both sides (using monotone convergence on the left): the right-hand side blows up, since its numerator tends to 1 while its denominator tends to 0. That gives us \[E(\tau_1) = \infty\] For \(m \geq 1\), \(\tau_m \geq \tau_1\), hence \(E(\tau_m) \geq E(\tau_1) = \infty\). For strictly negative integers \(m\), the symmetry of the random walk now implies \(E(\tau_m) = \infty\) as well.
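To see the divergence numerically (a quick illustration of my own, not a proof), evaluate the right-hand side for \(\alpha\) close to 1 and watch it grow:

```python
import numpy as np

# Evaluate E[tau_1 * alpha^(tau_1 - 1)] = (1 - sqrt(1 - a^2)) / (a^2 * sqrt(1 - a^2))
# for alpha increasing toward 1: it grows without bound, consistent with E[tau_1] = infinity.
for a in (0.9, 0.99, 0.999, 0.9999):
    print(a, (1 - np.sqrt(1 - a**2)) / (a**2 * np.sqrt(1 - a**2)))
```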

In particular, setting \(m = 1\) in the theorem, we can write \[E(\alpha^{\tau_1}) = \frac{1-\sqrt{1-\alpha^2}}{\alpha} \qquad \text{for all $\alpha \in (0,1)$}\]

Now, if you think about it, level 1 can only be reached in an odd number of steps, so \(\tau_1\) only takes odd values; the table below shows a few examples.

| First passage time to reach level \(m = 1\) | Example of such paths |
|---|---|
| \(\tau_1 = 1\) | {H} |
| \(\tau_1 = 3\) | {T,H,H} |
| \(\tau_1 = 5\) | {T,T,H,H,H}, {T,H,T,H,H} |

Then we can rewrite \(E(\alpha^{\tau_1})\) as follows-

\[E(\alpha^{\tau_1}) = \sum_{j = 1}^{\infty}\alpha^{2j-1} \Pr(\tau_1 = 2j-1)\] We are going to evaluate \(\Pr(\tau_1 = 2j-1)\) with the help of the Maclaurin expansion of the function \(f(x) = 1-\sqrt{1-x}\). A Maclaurin series is a Taylor series expansion about zero: \[f(x) = f(0) + f'(0)\frac{x}{1!}+f''(0)\frac{x^2}{2!}+f^{(3)}(0)\frac{x^3}{3!}+ \cdots + f^{(n)}(0)\frac{x^n}{n!}+ \cdots \] For \(f(x) = 1- \sqrt{1-x}\), we can write \[\begin{aligned} f'(x) &= \frac{1}{2}(1-x)^{-\frac{1}{2}} \\ f''(x) &= \frac{1}{4}(1-x)^{-\frac{3}{2}} \\ f'''(x) &= \frac{3}{8}(1-x)^{-\frac{5}{2}} \\ \end{aligned}\] In general, the \(j\)-th order derivative of \(f\) is \[f^{(j)}(x) = \frac{1.3\cdots(2j-3)}{2^j}(1-x)^{-\frac{2j-1}{2}} \qquad \text{j = 1,2,3,...}\] (for \(j = 1\) the product \(1.3\cdots(2j-3)\) is empty and equals 1).

Taking \(x=0\), we get- \(f'(0) = \frac{1}{2}, \ f''(0) = \frac{1}{4}, \ f'''(0) = \frac{3}{8}\) and in general,

\[\begin{aligned} f^{(j)}(0) &= \frac{1.3\cdots(2j-3)}{2^j} \\ &= \frac{1.3\cdots(2j-3)}{2^j}. \frac{2.4\cdots(2j-2)}{2^{j-1}.(j-1)!} \qquad \text{[since $2.4\cdots(2j-2) = 2^{j-1}(j-1)!$]} \\ &= (\frac{1}{2})^{2j-1}.\frac{(2j-2)!}{(j-1)!} \qquad \text{for } j = 1,2,3,\ldots \end{aligned}\]

So the Maclaurin series expansion of \(f(x)\) is (the \(j=0\) term vanishes since \(f(0) = 0\)) \[\begin{aligned} f(x) &= 1-\sqrt{1-x} \\ &= \sum_{j=1}^{\infty}\frac{1}{j!}.f^{(j)}(0).x^j \\ &= \sum_{j=1}^{\infty}(\frac{1}{2})^{2j-1}.\frac{(2j-2)!}{j!(j-1)!}.x^j \\ \end{aligned}\] Therefore, \[\begin{aligned} \frac{1-\sqrt{1-\alpha^2}}{\alpha} &= \frac{f(\alpha^2)}{\alpha} \\ &= \sum_{j=1}^{\infty} (\frac{\alpha}{2})^{2j-1}.\frac{(2j-2)!}{j!(j-1)!} \\ \end{aligned}\]
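The coefficients can be double-checked symbolically. The following sketch (my addition, assuming sympy is available) expands \(1-\sqrt{1-x}\) about 0 and compares each coefficient with \((\frac{1}{2})^{2j-1}\frac{(2j-2)!}{j!(j-1)!}\).

```python
import sympy as sp

x = sp.symbols('x')
# Maclaurin expansion of f(x) = 1 - sqrt(1 - x), keeping terms up to x^5
maclaurin = sp.series(1 - sp.sqrt(1 - x), x, 0, 6).removeO()

for j in range(1, 6):
    from_series = maclaurin.coeff(x, j)
    closed_form = (sp.Rational(1, 2)**(2*j - 1)
                   * sp.factorial(2*j - 2) / (sp.factorial(j) * sp.factorial(j - 1)))
    print(j, from_series, closed_form, from_series == closed_form)
```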

Comparing \(E(\alpha^{\tau_1}) = \sum_{j = 1}^{\infty}\alpha^{2j-1} \Pr(\tau_1 = 2j-1)\) with the above Maclaurin series expansion we see-

\[\begin{aligned} E(\alpha^{\tau_1}) = &\sum_{j = 1}^{\infty}\alpha^{2j-1} \Pr(\tau_1 = 2j-1) = \sum_{j=1}^{\infty} (\frac{\alpha}{2})^{2j-1}.\frac{(2j-2)!}{j!(j-1)!} \qquad \text{for all $\alpha \in (0,1)$} \end{aligned}\] Using the uniqueness of power series and equating the coefficients of \(\alpha^{2j-1}\) in the two series term by term, we can write \[\Pr(\tau_1 = 2j-1) = (\frac{1}{2})^{2j-1}.\frac{(2j-2)!}{j!(j-1)!} \qquad \text{j = 1,2,3, ...}\] Let us check the formula with a few examples.

For \(j=1\), \[\Pr[\tau_1 = 1] = \frac{1}{2}. \frac{0!}{0!.1!} = \frac{1}{2}\] The only way \(\tau_1\) can equal 1 is if the first toss results in H. Hence the probability that a symmetric random walk has \(\tau_1 = 1\) is \(\frac{1}{2}\).

For \(j=2\), \[\Pr[\tau_1 = 3] = (\frac{1}{2})^{3}. \frac{2!}{2!.1!} = (\frac{1}{2})^{3}\] The only way \(\tau_1\) can equal 3 is if the first three tosses result in THH. Hence the probability that a symmetric random walk has \(\tau_1 = 3\) is \((\frac{1}{2})^3\).

For \(j=3\), \[\Pr[\tau_1 = 5] = (\frac{1}{2})^{5}. \frac{4!}{3!.2!} = 2.(\frac{1}{2})^{5} = (\frac{1}{2})^{4}\]

As the table above shows, there are two ways to have \(\tau_1 = 5\): the first five tosses result in either THTHH or TTHHH, and each of these outcomes has probability \((\frac{1}{2})^5\).
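The same counting can be automated. The brute-force sketch below (my own; the helper name `first_passage_prob` is an assumption) enumerates every coin-toss sequence of length \(2j-1\) and confirms the formula for small \(j\).

```python
from itertools import product
from math import factorial

def first_passage_prob(j, p=0.5):
    """Brute-force Pr(tau_1 = 2j-1) by enumerating every length-(2j-1) toss sequence."""
    n = 2 * j - 1
    q = 1 - p
    total = 0.0
    for tosses in product((1, -1), repeat=n):        # +1 = H, -1 = T
        level, hit_at = 0, None
        for k, step in enumerate(tosses, start=1):
            level += step
            if level == 1:                           # walk reaches level 1 for the first time
                hit_at = k
                break
        if hit_at == n:                              # first passage exactly at the last step
            heads = tosses.count(1)
            total += p**heads * q**(n - heads)       # probability of this particular sequence
    return total

for j in (1, 2, 3):
    formula = 0.5**(2*j - 1) * factorial(2*j - 2) / (factorial(j) * factorial(j - 1))
    print(j, first_passage_prob(j), formula)        # the two columns agree
```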

If you generalize the idea a little and move to a random walk with \(P[H] = p \neq \frac{1}{2}\) and \(q = 1 - p\), then we can write \[\Pr(\tau_1 = 2j-1) = \frac{(2j-2)!}{j!(j-1)!}p^j.q^{j-1} \qquad \text{j = 1,2,3, ...}\]
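A quick check of this generalized formula (again my own sketch; `pr_tau1` is a hypothetical helper) for, say, \(p = 0.6\):

```python
from math import factorial

def pr_tau1(j, p):
    """Pr(tau_1 = 2j-1) for a random walk with P(H) = p, using the generalized formula."""
    q = 1 - p
    return factorial(2*j - 2) / (factorial(j) * factorial(j - 1)) * p**j * q**(j - 1)

# This matches (up to floating point) the brute-force enumeration above,
# e.g. first_passage_prob(2, p=0.6).
print([pr_tau1(j, 0.6) for j in (1, 2, 3)])   # approximately [0.6, 0.144, 0.06912]
```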

In the next blog, we will use all these tools and tricks to find the first passage time distribution for Brownian motion.