Random Processes
Course Summary
Instructor: Dr. Pham Hai Ha
1 Random Process
Consider a probability space \((\Omega,\mathcal{F},\mathbb{P}).\)
- A random process (or stochastic process) is a collection of random variables \(\left\{X_t\right\}_{t\in I};\)
- \(I\) is called the index set (or parameter set) of the (random) process, and each \(t\in I\) is called a time;
- For each \(\omega\in\Omega,X_t(\omega)\) is a function of \(t,\) called a sample path;
- The set \(S=\left\{X_t(\omega):t\in I,\omega\in\Omega\right\}\) is called the state space of the process.
A filtration is an increasing collection of \(\sigma-\)algebras \(\left\{\mathcal{F}_t\right\}_{t\geq0}\) satisfying \[\mathcal{F}_t\subset\mathcal{F},\forall t\geq0\textrm{ and }\mathcal{F}_s\subset\mathcal{F}_t,\forall 0\leq s\leq t.\] The quadruple \((\Omega,\mathcal{F},\left\{\mathcal{F}_t\right\},\mathbb{P})\) is called a filtered probability space;
Given a process \(\left\{X_t\right\}_{t\in I},\) the collection \[\sigma(\left\{X_t\right\})=\sigma\left(\left\{X_t^{-1}(A):t\in I,A\in\mathcal{B}(\mathbb{R})\right\}\right)\] is a \(\sigma-\)algebra on \(\Omega,\) called the \(\sigma-\)algebra generated by \(\left\{X_t\right\}.\)
2 Martingale
2.1 Discrete Case
Consider a probability space \((\Omega,\mathcal{F},\mathbb{P})\) and a process \(\left\{X_n\right\}_{n\in\mathbb{N}}.\)
- A filtration is an increasing collection of \(\sigma-\)algebras \(\left\{\mathcal{F}_n\right\}_{n\in\mathbb{N}}\) satisfying \[\mathcal{F}_n\subset\mathcal{F}_{n+1}\subset\mathcal{F},\forall n\in\mathbb{N}.\] The quadruple \((\Omega,\mathcal{F},\left\{\mathcal{F}_n\right\},\mathbb{P})\) is called a filtered probability space;
- We say that \(\left\{X_n\right\}\) is adapted to \(\left\{\mathcal{F}_n\right\}\) if \(X_n\) is \(\mathcal{F}_n-\)measurable, \(\forall n\in\mathbb{N};\)
- \(\left\{X_n\right\}\) is called a martingale if it is adapted, each \(X_n\) is integrable, and \[\mathbb{E}(X_{n+1}|\mathcal{F}_n)=X_n,\forall n\in\mathbb{N}.\]
2.2 Continuous Case
Consider a filtered probability space \((\Omega,\mathcal{F},\left\{\mathcal{F}_t\right\}_{t\geq0},\mathbb{P})\) and a process \(\left\{X_t\right\}_{t\geq0}.\)
- We say that \(\left\{X_t\right\}\) is adapted to \(\left\{\mathcal{F}_t\right\}\) if \(X_t\) is \(\mathcal{F}_t-\)measurable, \(\forall t\geq0;\)
- \(\left\{X_t\right\}\) is called a martingale if it is adapted, each \(X_t\) is integrable, and \[\mathbb{E}(X_t|\mathcal{F}_s)=X_s,\forall 0\leq s\leq t.\]
3 Poisson Process
A Poisson process with intensity (or rate) \(\lambda\) is a process \(\left\{N_t\right\}_{t\geq0}\) satisfying:
- \(N_0=0;\)
- \(N_t\sim\textrm{Pois}(\lambda t),\forall t\geq0;\)
- Stationary Increments: \(N_{s+t}-N_s\sim N_t,\forall s,t>0;\)
- Independent Increments: if \(s,t,u,v>0\) and \(s+t<u,\) then \(N_{s+t}-N_s\) and \(N_{u+v}-N_u\) are independent.
\(N_t\) is called the arrival count up to time \(t.\)
3.1 Arrival Time & Inter-Arrival Time
Consider a Poisson process \(\left\{N_t\right\}_{t\geq0}.\) Suppose \(S_n\) is the time of the \(n^{th}\) arrival, \(\forall n\in\mathbb{N},\) with the convention \(S_0=0.\)
- \(N_t=\max\left\{n:S_n\leq t\right\},\forall t\geq0;\)
- \(S_n\sim\textrm{Gamma}(n,\lambda),\forall n\in\mathbb{N};\)
- The inter-arrival time is defined by \[X_n=S_n-S_{n-1},\forall\hspace{0.25mm}n\in\mathbb{N}.\] Then the \(X_n\) are i.i.d. and \(X_n\sim\textrm{Exp}(\lambda),\forall n\in\mathbb{N}\) (simulated in the sketch after this list);
- Given \(n\) random variables \(X_1,...,X_n,\) we define the order statistics \(\left\{X_{(i)}\right\}_{i=1}^n\) as \[X_{(k)}=\min\left(\left\{X_i\right\}_{i=1}^n\setminus\left\{X_{(j)}\right\}_{j=1}^{k-1}\right),\forall\hspace{0.25mm}k=\overline{1,n}.\] If \(N_t=m\) and \(U_1,...,U_m\) are independent, identically distributed random variables with \[U_i\sim U([0,t]),\forall i=\overline{1,m}\] then \[[(S_1,...,S_m)|N_t=m]\sim(U_{(1)},...,U_{(m)}).\]
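A minimal simulation sketch of this construction, assuming the illustrative values \(\lambda=2\) and \(t=5\) (the names `lam`, `t` and all sample sizes below are arbitrary choices, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, t, n_paths = 2.0, 5.0, 20_000  # illustrative rate, horizon, sample size

counts = np.empty(n_paths)
for k in range(n_paths):
    s, n = 0.0, 0
    while True:
        s += rng.exponential(1.0 / lam)  # inter-arrival X_n ~ Exp(lam), E(X_n) = 1/lam
        if s > t:                        # the arrival time S_n has passed the horizon
            break
        n += 1
    counts[k] = n                        # N_t = max{n : S_n <= t}

# N_t ~ Pois(lam * t): mean and variance should both be close to lam * t = 10.
print(counts.mean(), counts.var())
```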
3.2 Compound Poisson Process
Let \(\left\{W_i\right\}_{i\in\mathbb{N}}\) be independent, identically distributed random variables with common distribution \(F,\) independent of a Poisson process \(\left\{N_t\right\}_{t\geq0}\) with rate \(\lambda>0.\)
- The process \(\left\{R_t\right\}_{t\geq0}\) defined by \[R_t=\sum_{i=1}^{N_t}W_i,\forall t\geq0\] is called a compound Poisson process;
- \(\mathbb{E}(R_t)=\lambda t\cdot\mathbb{E}(W_i)\) and \(\textrm{Var}(R_t)=\lambda t\cdot\mathbb{E}(W_i^2),\forall t\geq0.\)
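A Monte Carlo sketch of these two moment formulas, assuming for concreteness that the jumps are \(W_i\sim\textrm{Exp}(1)\) (so \(\mathbb{E}(W_i)=1\) and \(\mathbb{E}(W_i^2)=2\)); all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
lam, t, n_paths = 3.0, 2.0, 100_000       # illustrative rate, horizon, sample size

N = rng.poisson(lam * t, size=n_paths)    # N_t ~ Pois(lam * t)
R = np.array([rng.exponential(1.0, n).sum() for n in N])  # R_t = W_1 + ... + W_{N_t}

# E(R_t) = lam * t * E(W) = 6 and Var(R_t) = lam * t * E(W^2) = 12.
print(R.mean(), R.var())
```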
4 Markov Chain
- A process \(\left\{X_n\right\}_{n\in\mathbb{N}}\) is called a Markov chain if it satisfies the Markov property: \[\mathbb{P}(X_{n+1}=j|X_0,...,X_n)=\mathbb{P}(X_{n+1}=j|X_n),\forall n\in\mathbb{N}\textrm{ and all states }j.\]
- Associated with a Markov chain \(\left\{X_n\right\}_{n\in\mathbb{N}}\) is the transition probability \[p_{ij}=\mathbb{P}(X_{n+1}=j|X_n=i),\forall n\in\mathbb{N}\] (assumed independent of \(n,\) i.e. the chain is time-homogeneous) and the transition matrix \[P=\begin{pmatrix}p_{11} & p_{12} & ...\\p_{21} & p_{22} & ...\\... & ... & ...\end{pmatrix}.\]
- For each \(n\in\mathbb{N},\) define the transition probability after \(n\) steps as \[r_{ij}(n)=\mathbb{P}(X_{n+k}=j|X_k=i),\forall k\in\mathbb{N}.\] Then \[P^n=\begin{pmatrix}p_{11} & p_{12} & ...\\p_{21} & p_{22} & ...\\... & ... & ...\end{pmatrix}^n=\begin{pmatrix}r_{11}(n) & r_{12}(n) & ...\\r_{21}(n) & r_{22}(n) & ...\\... & ... & ...\end{pmatrix}.\]
- For each \(n\in\mathbb{N},\) define the unconditional distribution of \(X_n\) as \[\pi^{(n)}=(\pi_1^{(n)},\pi_2^{(n)},...)\textrm{ where }\pi_i^{(n)}=\mathbb{P}(X_n=i)\] then \(\pi^{(n)}=\pi^{(0)}\cdot P^n.\) In the long term (for an irreducible, aperiodic chain), \(\pi^{(n)}\) approaches the stationary distribution \(\pi=(\pi_1,\pi_2,...)\) given by \[\pi\cdot P=\pi\textrm{ and }\sum_i\pi_i=1.\] Furthermore, \(r_{ij}(n)\) approaches \(\pi_j\) as \(n\rightarrow\infty\) (see the sketch below).
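A numerical sketch with an illustrative 3-state transition matrix (not from the notes), showing \(\pi^{(n)}=\pi^{(0)}P^n\) settling at the stationary \(\pi\) and the rows of \(P^n\) approaching \(\pi:\)

```python
import numpy as np

# Illustrative 3-state transition matrix; rows sum to 1.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.4, 0.5]])

pi = np.array([1.0, 0.0, 0.0])  # any initial distribution pi^(0)
for _ in range(200):
    pi = pi @ P                 # pi^(n) = pi^(n-1) P

print(pi)                                  # approximates pi with pi P = pi, sum(pi) = 1
print(np.linalg.matrix_power(P, 200)[0])   # each row of P^n also approaches pi
```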
5 Random Walk
Consider a sequence \(\left\{X_n\right\}_{n\geq1}\) of independent, identically distributed random variables with distribution \[\mathbb{P}(X_n=-1)=\mathbb{P}(X_n=1)=\frac{1}{2}.\] Then the process \(\left\{M_n\right\}_{n\in\mathbb{N}}\) defined by \[M_0=0\textrm{ and }M_n=\sum_{i=1}^nX_i,\forall n\geq1\] is called a symmetric random walk.
Properties of \(\left\{M_n\right\}:\)
- It is a martingale and \(\mathbb{E}(M_n)=0,\textrm{Var}(M_n)=n;\)
- The first passage time \(\tau_1=\inf\left\{n:M_n=1\right\}\) has distribution \[\mathbb{P}(\tau_1=2j-1)=\frac{1}{2^{2j-1}}\cdot\frac{(2j-2)!}{j!\cdot(j-1)!};\]
- Stationary Increments: \(M_{s+t}-M_s\sim M_t,\forall t,s\in\mathbb{N};\)
- Independent Increments: if \(s,t,u,v\in\mathbb{N}\) and \(s+t<u,\) then \(M_{s+t}-M_s\) and \(M_{u+v}-M_u\) are independent.
- Quadratic Variation: \[\left<M,M\right>_k=\sum_{j=1}^k(M_j-M_{j-1})^2=k,\forall k\in\mathbb{N}.\]
- The process \(\left\{W_t^{(n)}\right\}\) defined by \[W_t^{(n)}=\frac{M_{nt}}{\sqrt{n}}\textrm{ whenever }nt\textrm{ is an integer}\] is called a scaled symmetric random walk. It preserves all properties of \(\left\{M_n\right\}:\) martingale property, stationary & independent increments, quadratic variation.
- By the Central Limit Theorem, the process \(\left\{W_t^{(n)}\right\}\) converges in distribution to a process \(\left\{B_t\right\}\) called the Brownian motion as \(n\rightarrow\infty.\)
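A sketch of this convergence at \(t=1:\) simulate \(W_1^{(n)}=M_n/\sqrt{n}\) for a large illustrative \(n\) and check that the sample mean and variance match \(\mathcal{N}(0,1):\)

```python
import numpy as np

rng = np.random.default_rng(2)
n, n_paths = 1_000, 10_000                       # illustrative scale and sample size

steps = rng.choice([-1, 1], size=(n_paths, n))   # X_i = +/-1 with probability 1/2
W = steps.sum(axis=1) / np.sqrt(n)               # W_1^(n) = M_n / sqrt(n)

# CLT: W_1^(n) is approximately N(0, 1), so mean ~ 0 and variance ~ 1.
print(W.mean(), W.var())
```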
6 Brownian Motion
- A process \(\left\{B_t\right\}_{t\geq0}\) is called a Brownian motion if it has the following properties:
- For each \(\omega\in\Omega,B_t(\omega)\) is a continuous function of \(t;\)
- \(B_0=0\) and \(B_t\sim\mathcal{N}(0,t),\forall t>0;\)
- Stationary Increments: \(B_{s+t}-B_s\sim B_t,\forall t,s>0;\)
- Independent Increments: if \(s,t,u,v>0\) and \(s+t<u,\) then \(B_{s+t}-B_s\) and \(B_{u+v}-B_u\) are independent.
- Properties of \(\left\{B_t\right\}:\)
- \(\textrm{cov}(B_{t+s},B_s)=s,\forall s,t\geq0;\)
- It is a martingale;
- The first passage time \(\tau_m=\inf\left\{t:B_t=m\right\}\) has distribution \[\mathbb{P}(\tau_m\leq t)=2\cdot\mathbb{P}(B_t\geq|m|)\] and density \[f_{\tau_m}(t)=\frac{|m|}{t\sqrt{2\pi t}}\cdot e^{-\frac{m^2}{2t}},t>0.\] Also, \(\mathbb{P}(\tau_m<\infty)=1\) and \(\mathbb{E}(\tau_m)=\infty;\)
- The maximum to date \(\left\{M_t\right\}_{t\geq0}\) where \(M_t=\max\left\{B_s:0\leq s\leq t\right\}\) has distribution \[\mathbb{P}(M_t\geq x)=2\cdot\mathbb{P}(B_t\geq x),\forall x\geq0\] (see the sketch after this list);
- Quadratic Variation: \(\left<B\right>(T)=T\) almost surely, i.e. \[(dB_t)^2=dt,\hspace{3mm}dB_tdt=dtdt=0.\]
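A simulation sketch of the reflection identity \(\mathbb{P}(M_t\geq x)=2\cdot\mathbb{P}(B_t\geq x)\) on a discrete grid; the grid size, \(t\) and \(x\) are illustrative, and discretization slightly underestimates the running maximum:

```python
import math
import numpy as np

rng = np.random.default_rng(3)
t, n_steps, n_paths, x = 1.0, 1_000, 10_000, 1.0
dt = t / n_steps

increments = rng.normal(0.0, math.sqrt(dt), size=(n_paths, n_steps))
B = increments.cumsum(axis=1)           # Brownian path on the grid (B_0 = 0 omitted)
M = B.max(axis=1)                       # running maximum up to time t

print((M >= x).mean())                  # empirical P(M_t >= x)
print(math.erfc(x / math.sqrt(2 * t)))  # 2 P(B_t >= x) = erfc(x / sqrt(2t))
```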
7 Ito Integral
- An Ito integral has the form \[I_t=\int_0^t\delta_sdB_s\] where the integrator \(\left\{B_s\right\}\) is the Brownian motion and the integrand \(\left\{\delta_s\right\}\) is a square-integrable, adapted process.
- Properties of \(I_t:\)
- Riemann-sum representation (a numerical sketch follows this list): \[\int_0^Tf(s)dB_s=\lim_{n\rightarrow\infty}\sum_{i=0}^{n-1}f\left(\frac{iT}{n}\right)\left[B_{(i+1)T/n}-B_{iT/n}\right];\]
- It is a continuous function of \(t;\)
- Linearity: \[\int_0^t(\alpha\cdot\delta_s\pm\beta\cdot\gamma_s)dB_s=\alpha\int_0^t\delta_sdB_s\pm\beta\int_0^t\gamma_sdB_s;\]
- Quadratic Variation: \[\left<I,I\right>(t)=\int_0^t\delta_s^2ds;\]
- Isometry: \(\mathbb{E}(I_t^2)=\mathbb{E}(\left<I,I\right>(t)).\)
- Integration by Parts: if \(f\) is continuous and deterministic then \[\int_0^tf(s)dB_s=f(t)\cdot B_t-\int_0^tB_sdf(s);\]
- If \(\delta_s\) is deterministic then \(I_t\sim\mathcal{N}(0,\left<I,I\right>(t)).\)
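A sketch of the Riemann-sum representation and the isometry for the deterministic integrand \(f(s)=s\) (an illustrative choice), for which \(I_T\sim\mathcal{N}\left(0,\int_0^Ts^2ds\right)=\mathcal{N}(0,T^3/3):\)

```python
import numpy as np

rng = np.random.default_rng(4)
T, n, n_paths = 1.0, 500, 20_000
s = np.linspace(0.0, T, n + 1)[:-1]    # left endpoints iT/n

dB = rng.normal(0.0, np.sqrt(T / n), size=(n_paths, n))  # B_{(i+1)T/n} - B_{iT/n}
I = (s * dB).sum(axis=1)               # sum of f(iT/n) * increment, with f(s) = s

# Deterministic integrand: I_T ~ N(0, T^3/3), so mean ~ 0 and variance ~ 1/3.
print(I.mean(), I.var())
```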
8 Ito-Doeblin Formula
Let \(f(t,x)\) be twice continuously differentiable and \(\left\{B_t\right\}\) be the Brownian motion.
- The simple case: \[df(t,B_t)=f_t(t,B_t)dt+f_x(t,B_t)dB_t+\frac{1}{2}f_{xx}(t,B_t)dt.\]
- Consider an Ito process \(\left\{X_t\right\}_{t\geq0}\) satisfying \[dX_t=\mu_tdt+\sigma_tdB_t\] where the drift term \(\mu_t\) and the diffusion term \(\sigma_t\) are adapted processes, then \[df(t,X_t)=f_t(t,X_t)dt+f_x(t,X_t)dX_t+\frac{1}{2}f_{xx}(t,X_t)\sigma_t^2dt.\]
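A numerical check of the simple case for \(f(t,x)=x^2\) (an illustrative choice), where the formula gives \(d(B_t^2)=2B_tdB_t+dt,\) i.e. \(B_T^2=2\int_0^TB_sdB_s+T:\)

```python
import numpy as np

rng = np.random.default_rng(5)
T, n = 1.0, 100_000                         # illustrative horizon and grid size

dB = rng.normal(0.0, np.sqrt(T / n), size=n)
B = np.concatenate(([0.0], np.cumsum(dB)))  # Brownian path on the grid, B_0 = 0

lhs = B[-1] ** 2                            # B_T^2
rhs = 2 * (B[:-1] * dB).sum() + T           # Ito sum evaluates B at the left endpoint
print(lhs, rhs)                             # agree up to discretization error
```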
9 Stochastic Differential Equation (SDE)
All SDEs considered in this section are of the form \[dX_t=\mu(t,X_t)dt+\sigma(t,X_t)dB_t\] with \(X_0=x_0.\)
9.1 Arithmetic Brownian Motion
Consider the SDE \[dX_t=\mu(t)dt+\sigma(t)dB_t\] where \(\mu\) and \(\sigma\) depend on \(t\) only. Integrating both sides gives \[X_T=X_0+\int_0^T\mu(t)dt+\int_0^T\sigma(t)dB_t.\] Since the integrand \(\sigma\) is deterministic, \[X_T\sim\mathcal{N}\left(X_0+\int_0^T\mu(t)dt,\int_0^T\sigma^2(t)dt\right).\]
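A sketch of this Gaussian law for the illustrative choices \(\mu(t)=2t,\sigma(t)=t,X_0=0,T=1,\) for which \(X_T\sim\mathcal{N}(1,1/3):\)

```python
import numpy as np

rng = np.random.default_rng(6)
T, n, n_paths, X0 = 1.0, 500, 20_000, 0.0
t = np.linspace(0.0, T, n + 1)[:-1]          # left endpoints of the grid

dB = rng.normal(0.0, np.sqrt(T / n), size=(n_paths, n))
drift = (2 * t * (T / n)).sum()              # int_0^T mu(t) dt, close to 1
XT = X0 + drift + (t * dB).sum(axis=1)       # X_T = X_0 + int mu dt + int sigma dB

# X_T ~ N(X_0 + 1, 1/3): mean ~ 1 and variance ~ 1/3.
print(XT.mean(), XT.var())
```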
9.2 Geometric Brownian Motion
Consider the SDE \[dX_t=\mu X_tdt+\sigma X_tdB_t.\] Dividing both sides by \(X_t\) gives \[\frac{dX_t}{X_t}=\mu dt+\sigma dB_t.\] It is natural to look for a process \(Y_t=f(t,X_t)\) satisfying \[dY_t=\mu dt+\sigma dB_t=\frac{dX_t}{X_t},\] since the distribution of such a \(Y_t\) is known from the arithmetic Brownian motion SDE: \[Y_t=Y_0+\mu t+\sigma B_t.\] By the Ito-Doeblin formula, \[dY_t=f_t(t,X_t)dt+f_x(t,X_t)dX_t+\frac{1}{2}f_{xx}(t,X_t)\sigma^2X_t^2dt,\] implying \[\left\{\begin{matrix}f_x(t,x)=1/x\\f_t(t,x)+f_{xx}(t,x)\sigma^2x^2/2=0\end{matrix}\right.\Rightarrow f(t,x)=\ln x+\frac{\sigma^2t}{2},\] hence \(Y_t=\ln X_t+\sigma^2t/2.\) Therefore, \[X_t=X_0\exp\left(\left(\mu-\frac{\sigma^2}{2}\right)t+\sigma B_t\right).\] The procedure above also applies to SDEs of the form \[dX_t=\mu(t)X_tdt+\sigma(t)X_tdB_t.\]
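A sketch comparing an Euler-Maruyama discretization (a standard scheme, not derived above) with the closed form on the same noise; \(\mu,\sigma,X_0\) and the grid size are illustrative:

```python
import numpy as np

rng = np.random.default_rng(7)
mu, sigma, X0, T, n = 0.1, 0.3, 1.0, 1.0, 10_000

dB = rng.normal(0.0, np.sqrt(T / n), size=n)

X = X0
for db in dB:
    X += mu * X * (T / n) + sigma * X * db   # Euler step for dX = mu X dt + sigma X dB

exact = X0 * np.exp((mu - sigma ** 2 / 2) * T + sigma * dB.sum())  # B_T = sum of dB
print(X, exact)                              # close for a fine grid
```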
9.3 Vasicek Interest Rate Model
Consider the SDE \[dX_t=(\alpha-\beta X_t)dt+\sigma dB_t,\] or equivalently \[dX_t+\beta X_tdt=\alpha dt+\sigma dB_t.\] Attempting to find a process \(Y_t=f(t,X_t)\) satisfying \[dY_t=\alpha dt+\sigma dB_t=dX_t+\beta X_tdt\] gives \[\left\{\begin{matrix}f_x=1\Rightarrow f_{xt}=0\\f_t=\beta x\Rightarrow f_{tx}=\beta\end{matrix}\right.\] so no such \(Y_t\) exists (for \(\beta\neq0\)). In order for the system to be solvable, we allow \(f_x\) to be more complex, i.e. \[f_x=g(t,X_t)\] for some function \(g.\) Then \[dY_t=g(t,X_t)\cdot dX_t+A\cdot\beta X_tdt\] for some unknown quantity \(A.\) To leverage the equality \(\alpha dt+\sigma dB_t=dX_t+\beta X_tdt\) when solving for \(Y_t,\) we expect \(A=g(t,X_t),\) i.e. \[dY_t=g(t,X_t)(\alpha dt+\sigma dB_t),\] and hence \(Y_t\) is solvable if \(g\) is independent of \(X_t,\) i.e. \(g(t,X_t)=g(t).\) Now with \[dY_t=g(t)(dX_t+\beta X_tdt)\] we have \[\left\{\begin{matrix}f_x=g(t)\Rightarrow f_{xt}=g'(t)\\f_t=g(t)\beta x\Rightarrow f_{tx}=g(t)\beta\end{matrix}\right.\Rightarrow g'(t)=g(t)\beta,\] so \(g(t)=e^{\beta t}\) and \(Y_t=X_te^{\beta t}.\) Hence \[Y_T=Y_0+\frac{\alpha(e^{\beta T}-1)}{\beta}+\int_0^T\sigma e^{\beta t}dB_t,\] implying \[X_T=e^{-\beta T}\left(X_0+\frac{\alpha(e^{\beta T}-1)}{\beta}+\int_0^T\sigma e^{\beta t}dB_t\right).\]
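A sketch comparing an Euler-Maruyama path with the closed form above on the same noise; all parameter values are illustrative, and \(\int_0^Te^{\beta t}dB_t\) is approximated by its left-endpoint sum:

```python
import numpy as np

rng = np.random.default_rng(8)
alpha, beta, sigma, X0, T, n = 1.0, 2.0, 0.5, 0.0, 1.0, 100_000
dt = T / n
t = np.linspace(0.0, T, n + 1)[:-1]          # left endpoints of the grid

dB = rng.normal(0.0, np.sqrt(dt), size=n)

X = X0
for i in range(n):
    X += (alpha - beta * X) * dt + sigma * dB[i]  # Euler step for the Vasicek SDE

stoch = (np.exp(beta * t) * dB).sum()        # int_0^T e^{beta t} dB_t
exact = np.exp(-beta * T) * (X0 + alpha * (np.exp(beta * T) - 1) / beta + sigma * stoch)
print(X, exact)                              # close for a fine grid
```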