What we are going to cover in this article is the log-normal property of stock prices.
In other words, this article is for anyone who has a hard time wrapping their head around the formula below:
\[ \ln S_T \sim \mathrm{N}\!\left(\ln S_0 + (\mu - \sigma^2/2)T,\ \sigma^2 T\right) \\ \Rightarrow \\ E(S_T) = S_0 e^{\mu T} \\ \mathrm{Var}(S_T) = S_0^2 e^{2\mu T}\left(e^{\sigma^2 T} - 1\right) \]
Related information can be found in formulas 14.4 and 14.5 of the 8th edition of Hull's Derivatives textbook.
In fact, the author shows how to derive them in one of his technical notes:
Technical note: Properties of Lognormal Distribution
However, someone who lacks a mathematical background may find it a little hard to parse.
Before we start, I will assume you already know the following:
\[ \ln S_T \sim \mathrm{N}\!\left(\ln S_0 + (\mu - \sigma^2/2)T,\ \sigma^2 T\right) \]
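One concrete way to read this assumption is the geometric-Brownian-motion form below, where Z is a standard normal random variable (this rewriting is standard, but it is my addition rather than something quoted from the textbook):
\[
S_T = S_0\exp\!\left[(\mu - \sigma^2/2)T + \sigma\sqrt{T}\,Z\right], \quad Z \sim \mathrm{N}(0, 1)
\]
Taking the log of both sides recovers exactly the normal distribution stated above.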
Now let’s get started.
There is a random variable X that follows a normal distribution, and V follows a log-normal distribution.
→ that is, log(V) (i.e. X) follows a normal distribution. \[
X = \ln(V), \quad X \sim \mathrm{N}(m, s^2)
\]
The probability density function (pdf) of the normal distribution is defined as below:
\[
f(x): \text{probability density function of } X \\
f(x) = \frac{1}{\sqrt{2\pi}\,s}\exp\!\left[-\frac{(x-m)^2}{2s^2}\right]
\]
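If you want to double-check this formula numerically, here is a minimal sketch comparing it against scipy.stats.norm.pdf (the values m = 0.3 and s = 0.5 are illustrative, not taken from the article):

```python
# Sanity check of the normal pdf formula against scipy's implementation.
import numpy as np
from scipy.stats import norm

m, s = 0.3, 0.5          # illustrative parameters
x = np.linspace(-2.0, 2.5, 7)

f_manual = 1.0 / (np.sqrt(2.0 * np.pi) * s) * np.exp(-(x - m) ** 2 / (2.0 * s ** 2))
f_scipy = norm.pdf(x, loc=m, scale=s)

print(np.max(np.abs(f_manual - f_scipy)))   # ~1e-16, i.e. the two agree
```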
3.1 In general, the pdf of a transformed variable can be obtained from the change-of-variables formula:
\[ v = u(x) \\ x = w(v) \\ (u, w: \text{one-to-one, mutually inverse functions}) \\ \Rightarrow \\ h(v) = f[w(v)] \cdot |J| \\ \text{s.t.} \\ f(x): \text{pdf of } X \\ h(v): \text{pdf of } V \\ J: \text{the Jacobian (here just a one-dimensional derivative)} \\ \text{where } J = \frac{d}{dv}w(v) \]
3.2 By applying it to our case, h(v) (i.e. V's pdf) can be expressed as below:
\[ v = u(x) = e^x \\ x = w(v) = \ln v \\ \Rightarrow \ h(v) = f[w(v)] \cdot |J| = \frac{1}{\sqrt{2\pi}\,s\,v}\exp\!\left[-\frac{(\ln v - m)^2}{2s^2}\right] \\ \left(J = |J| = \frac{1}{v}\right) \]
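As a quick numerical sanity check, this h(v) should match scipy's built-in log-normal pdf, which uses shape = s and scale = exp(m) for V = exp(X) with X ~ N(m, s^2). A minimal sketch with illustrative parameters:

```python
# Check that h(v) = f(ln v) * (1/v) matches scipy's log-normal pdf.
import numpy as np
from scipy.stats import norm, lognorm

m, s = 0.3, 0.5                      # illustrative parameters
v = np.linspace(0.1, 5.0, 9)

h_manual = norm.pdf(np.log(v), loc=m, scale=s) / v    # f[w(v)] * |J| with J = 1/v
h_scipy = lognorm.pdf(v, s, scale=np.exp(m))

print(np.max(np.abs(h_manual - h_scipy)))   # ~1e-16
```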
Next, we use the moment generating function (mgf) of V:
\[ <moment\ generating\ function> \\ M_V(t) = E(e^{tV}) \]
4.3 Also, by the definition of the mgf, the nth moment can be computed as below:
\[ <n^{th}\ moment> \\ \left.\frac{d^n}{dt^n}M_V(t)\right|_{t=0} = E(V^n) \]
4.4 That can be rewritten as the integral of v^n against V's pdf:
\[ E(V^n) = \int_0^\infty v^n h(v)\,dv \]
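To convince yourself that this integral really is the moment, here is a minimal sketch (illustrative parameters again) comparing numerical integration of v^n h(v) with a direct Monte Carlo average of V^n:

```python
# E(V^n) as an integral of v^n * h(v) over (0, inf), versus a Monte Carlo average.
import numpy as np
from scipy.stats import lognorm
from scipy.integrate import quad

m, s, n = 0.3, 0.5, 2                # illustrative parameters
dist = lognorm(s, scale=np.exp(m))   # V = exp(X), X ~ N(m, s^2)

integral, _ = quad(lambda v: v ** n * dist.pdf(v), 0.0, np.inf)

rng = np.random.default_rng(0)
mc = np.mean(dist.rvs(size=1_000_000, random_state=rng) ** n)

print(integral, mc)                  # both are close to exp(n*m + n^2*s^2/2)
```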
4.5 Since v = e^x, we change the integration variable from v to x. Be aware of the change in the range of integration (x runs over all real numbers, while v runs over the positive reals only).
Also be careful when substituting dv: since dv = e^x dx = v dx, the 1/v factor in the pdf cancels.
\[ \int_0^\infty v^n\,\frac{1}{\sqrt{2\pi}\,s\,v}\exp\!\left[-\frac{(\ln v - m)^2}{2s^2}\right]dv = \int_{-\infty}^\infty e^{nx}\,\frac{1}{\sqrt{2\pi}\,s}\exp\!\left[-\frac{(x-m)^2}{2s^2}\right]dx \]
4.6 You have to grind away a bit here by completing the square in the exponent (a sketch is given right after the result). At the end, you can obtain this:
\[ \exp\!\left(nm + \frac{n^2s^2}{2}\right)\int_{-\infty}^\infty \frac{1}{\sqrt{2\pi}\,s}\exp\!\left[-\frac{(x-m-ns^2)^2}{2s^2}\right]dx \]
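For readers who want the intermediate step, the "grinding" is just completing the square in the exponent of the integrand:
\[
nx - \frac{(x-m)^2}{2s^2} = \frac{-(x-m)^2 + 2ns^2x}{2s^2} = \frac{-\left[x-(m+ns^2)\right]^2 + (m+ns^2)^2 - m^2}{2s^2} = nm + \frac{n^2s^2}{2} - \frac{(x-m-ns^2)^2}{2s^2}
\]
Exponentiating and pulling the constant factor exp(nm + n^2 s^2 / 2) out of the integral gives exactly the expression above.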
4.7 Now look at the remaining integral. It is just 1, because it is the integral of a pdf over its entire support:
\[ \int_{-\infty}^\infty \frac{1}{\sqrt{2\pi}\,s}\exp\!\left[-\frac{(x-m-ns^2)^2}{2s^2}\right]dx \\ \Rightarrow \text{integral of the pdf of } \mathrm{N}(m+ns^2,\ s^2) \Rightarrow 1 \]
4.8 Therefore, 4.6 can be rewritten as below. This is V’s nth moment.
\[ \Rightarrow E(V^n) = \exp\!\left(nm + \frac{n^2s^2}{2}\right) \]
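As a quick check of this closed form, here is a minimal sketch comparing it with scipy's log-normal moments (illustrative parameters):

```python
# Check E(V^n) = exp(n*m + n^2*s^2/2) against scipy's log-normal moments.
import numpy as np
from scipy.stats import lognorm

m, s = 0.3, 0.5                                  # illustrative parameters
dist = lognorm(s, scale=np.exp(m))               # V = exp(X), X ~ N(m, s^2)

for n in (1, 2, 3):
    closed_form = np.exp(n * m + n ** 2 * s ** 2 / 2)
    print(n, closed_form, dist.moment(n))        # the two columns agree
```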
5.1 1st moment (the mean): plug n = 1 into the formula from 4.8. \[ E(V) = \exp\!\left(m + \frac{s^2}{2}\right) \]
5.2 2nd moment (mean of the square): plug n = 2 into the same formula. \[ E(V^2) = \exp(2m + 2s^2) \]
5.3 Variance \[ \mathrm{Var}(V) = E(V^2) - [E(V)]^2 \]
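Writing this out with the two moments we just computed:
\[
\mathrm{Var}(V) = \exp(2m + 2s^2) - \left[\exp\!\left(m + \frac{s^2}{2}\right)\right]^2 = e^{2m+s^2}\left(e^{s^2} - 1\right)
\]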
Now we go back to stock prices. Recall our assumption:
\[
\ln S_T \sim \mathrm{N}\!\left(\ln S_0 + (\mu - \sigma^2/2)T,\ \sigma^2 T\right)
\]
Then, matching this with X = ln(V) ~ N(m, s^2), i.e. taking V = S_T, we have
\[
m = \ln S_0 + (\mu - \sigma^2/2)T \\
\text{and} \\
s = \sigma\sqrt{T}
\]
6.1 By plugging the m and s above into the E(V) and Var(V) we calculated, and noting that m + s^2/2 = ln(S_0) + μT and 2m + s^2 = 2·ln(S_0) + 2μT, we get the outcome below. (Conclusion)
\[ E(V) = E(S_T) = S_0 e^{\mu T} \\ \mathrm{Var}(V) = \mathrm{Var}(S_T) = S_0^2 e^{2\mu T}\left(e^{\sigma^2 T} - 1\right) \]
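Finally, a minimal Monte Carlo sketch to sanity-check the conclusion; the inputs S0 = 100, μ = 0.10, σ = 0.25, T = 2 are illustrative choices of mine, not values from the textbook:

```python
# Monte Carlo sanity check of E(S_T) and Var(S_T) for log-normal S_T.
import numpy as np

S0, mu, sigma, T = 100.0, 0.10, 0.25, 2.0        # illustrative inputs
rng = np.random.default_rng(42)

Z = rng.standard_normal(2_000_000)
ST = S0 * np.exp((mu - sigma ** 2 / 2) * T + sigma * np.sqrt(T) * Z)

print(ST.mean(), S0 * np.exp(mu * T))                                          # E(S_T)
print(ST.var(), S0 ** 2 * np.exp(2 * mu * T) * (np.exp(sigma ** 2 * T) - 1))   # Var(S_T)
```

Both printed pairs should agree to within Monte Carlo error, which is exactly what the derivation above predicts.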