pg. 363 Ex. 11

The price of one share of stock in the Pilsdorff Beer Company (see Exercise 8.2.12) is given by \(Y_n\) on the \(n^{th}\) day of the year. Finn observes that the differences \(X_n = Y_{n+1} - Y_n\) appear to be independent random variables with a common distribution having mean \(\mu = 0\) and variance \(\sigma^2 = \frac{1}{4}\). If \(Y_1 = 100\), estimate the probability that \(Y_{365}\) is

  1. \(\geq 100\)
  2. \(\geq 110\)
  3. \(\geq 120\)

Note that \(E(X) = \mu = 0\), \(\sigma = \frac{1}{2}\), \(n = 364\).

This can be modeled as a Gaussian random walk of 364 steps. The probability distribution is given by the normal curve \(N(100, n\sigma^2) = N(100, 91)\), since the starting point of the price is 100 USD.

We can use pnorm to calculate the probabilities.

Solution:

Let the price of one share of Pilsdorff Beer Company stock on day one be \(Y_1 = 100\); over the course of 365 days the prices are \(\{100, Y_2, Y_3, \ldots, Y_{365}\}\).

Let \(X_n = Y_{n+1} - Y_n\), so the 364 differences are \(\{(Y_2 - 100), (Y_3 - Y_2), (Y_4 - Y_3), \ldots, (Y_{365} - Y_{364})\}\).

Given mean \(\mu = 0\), variance \(\sigma^2 = \frac{1}{4}\), and \(n = 364\) differences.

The sum of the price differences is \(S_{364} = X_1 + X_2 + X_3 + \cdots + X_{364}\)

\(S_{364} = (Y_2 - 100) + (Y_3 - Y_2) + (Y_4 - Y_3) + \cdots + (Y_{365} - Y_{364})\)

The sum telescopes, leaving

\(S_{364} = Y_{365} - 100\)

The mean of \(S_{364}\) is the sum of the 364 means:

\(\mu_{364} = \mu + \mu + \mu + \cdots + \mu = 0 + 0 + 0 + \cdots + 0 = 0\)

By independence, the variance of \(S_{364}\) is the sum of the 364 variances:

\(\sigma_{364}^2 = \frac{1}{4} + \frac{1}{4} + \frac{1}{4} + \cdots + \frac{1}{4} = 364 \cdot \frac{1}{4} = 91\)

The standard deviation of \(S_{364}\) is \(\sigma_{364} = \sqrt{91} = 9.539392\)

Hence,

\(S_{364} = Y_{365} - 100\)

\(\mu_{364} = 0\)

\(\sigma_{364} = 9.539392\)

(a) ≥100

\(P(Y_{365} \ge 100)\)

\(= P(Y_{365} - 100 \ge 100 - 100)\)

\(= P(S_{364} \ge 0)\)

Since \(S_{364}\) is normally distributed with mean \(\mu_{364} = 0\), exactly half of the probability mass lies at or above 0.

Therefore \(P(S_{364} \ge 0) = 0.5\)

Using the R function pnorm:

mean = 0
sd = sqrt(364 / 4)  # standard deviation of S_364, sqrt(91)
y = 100 - 100       # threshold expressed as a difference from Y_1 = 100

pnorm(y, mean = mean, sd = sd, lower.tail = FALSE)
## [1] 0.5

(b) ≥110

\(P(Y_{365} \ge 110) = P(Y_{365} - 100 \ge 10) = P(S_{364} \ge 10)\)

Standardizing with \(z = \frac{x - \mu}{\sigma}\), where \(x = 110 - 100 = 10\), \(\mu = 0\), and \(\sigma = 9.539392\):

\(P(S_{364} \ge 10) = P\left(Z \ge \frac{110 - 100}{9.539392}\right) = P\left(Z \ge \frac{10}{9.539392}\right) = P(Z \ge 1.048285)\)

From the normal distribution table, the upper-tail probability for \(z = 1.048285\) is 0.1473.

Therefore, the probability that \(Y_{365}\) is greater than or equal to 110 is about 14.73%.

Using the R function pnorm:

mean = 0
sd = sqrt(364 / 4)
y = 110 - 100

pnorm(y, mean = mean, sd = sd, lower.tail = FALSE)
## [1] 0.1472537

(c) ≥120

\(P(Y_{365} \ge 120) = P(Y_{365} - 100 \ge 20) = P(S_{364} \ge 20)\)

Standardizing with \(z = \frac{x - \mu}{\sigma}\), where \(x = 120 - 100 = 20\), \(\mu = 0\), and \(\sigma = 9.539392\):

\(P(S_{364} \ge 20) = P\left(Z \ge \frac{120 - 100}{9.539392}\right) = P\left(Z \ge \frac{20}{9.539392}\right) = P(Z \ge 2.096570)\)

From the normal distribution table, the upper-tail probability for \(z = 2.096570\) is 0.0180. Therefore, the probability that \(Y_{365}\) is greater than or equal to 120 is about 1.80%.

Using the R function pnorm:

mean = 0
sd = sqrt(364 / 4)
y = 120 - 100

pnorm(y, mean = mean, sd = sd, lower.tail = FALSE)
## [1] 0.01801584
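
As a sanity check on the normal approximation, here is a minimal Monte Carlo sketch. The step distribution below (\(\pm\frac{1}{2}\) with equal probability) is an assumption that merely matches the stated mean 0 and variance \(\frac{1}{4}\); the seed and trial count are arbitrary.

set.seed(1)                   # arbitrary seed for reproducibility
trials <- 10000               # number of simulated years
# 364 daily steps of +/- 1/2: mean 0, variance 1/4, as stated
steps <- matrix(sample(c(-0.5, 0.5), trials * 364, replace = TRUE), nrow = trials)
y365 <- 100 + rowSums(steps)  # price on day 365 in each trial
mean(y365 >= 100)             # should be near 0.5
mean(y365 >= 110)             # should be near 0.147
mean(y365 >= 120)             # should be near 0.018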

2. Calculate the expected value and variance of the binomial distribution using the moment generating function

Solution:

Binomial formula. Suppose a binomial experiment consists of \(n\) trials and results in \(x\) successes. If the probability of success on an individual trial is \(p\), then the binomial probability \(b(x,n,p)\) is given by

\(b(x,n,p) = {{n}\choose{x}}p^{x}q^{n-x}\), where \(q = 1 - p\)
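
For illustration, this pmf is what R's dbinom computes; the values \(n = 10\), \(p = 0.5\), \(x = 3\) below are arbitrarily chosen:

choose(10, 3) * 0.5^3 * 0.5^7     # direct formula: 0.1171875
dbinom(3, size = 10, prob = 0.5)  # built-in pmf, same value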

The expected value and variance are obtained from the first two moments of the distribution.

The expected value is also known as the mean: \(\langle X \rangle = \sum p_ix_i\), the sum of all \(x\) values weighted by their probabilities.

For the binomial distribution, where each trial is a success (probability \(p\)) or a failure (probability \(q\)), the mean is \(\langle X \rangle = E(X) = np\); let's prove it.

\(P(X = x) = {{n}\choose{x}}p^{x}q^{n-x}\), where \(x = \{0,1,2,...,n\}\)

\(E(X) = \sum_{x=0}^{n}xp(x) = \sum_{x=0}^{n}x{{n}\choose{x}}p^{x}q^{n-x}\)

Expanding the sum term by term, starting from \(x = 0\):

\((x =0) = 0*{{n}\choose{0}}p^{0}q^{n-0} = 0\)

\((x=1) = 1*{{n}\choose{1}}p^{1}q^{n-1} = 1*\frac{n!}{(n-1)! * 1!}p^1q^{(n-1)} = npq^{(n-1)}\)

\((x=2) = 2*{{n}\choose{2}}p^{2}q^{n-2} = 2*\frac{n!}{(n-2)! * 2!}p^2q^{(n-2)} = n(n-1)p^2q^{(n-2)}\)

\((x=3) = 3*{{n}\choose{3}}p^{3}q^{n-3} = 3*\frac{n!}{(n-3)! * 3!}p^3q^{(n-3)} = \frac{n(n-1)(n-2)p^3q^{(n-3)}}{2}\)

Continuing up to \(x = n\):

\((x=n) = n*{{n}\choose{n}}p^{n}q^{n-n} = n*\frac{n!}{(n-n)! * n!}p^nq^{n-n} = np^n\)

In this expansion, \({{n}\choose{x}}\) is called the binomial coefficient.

Adding all the terms:

\(E(X) = 0 + npq^{n-1} + n(n-1)p^2q^{n-2} + \frac{n(n-1)(n-2)p^3q^{n-3}}{2} + \cdots + np^n\)

\(E(X) = np\left(q^{n-1} + (n-1)pq^{n-2} + \frac{(n-1)(n-2)p^2q^{n-3}}{2} + \cdots + p^{n-1}\right)\)

By the binomial theorem, \(q^{n-1} + (n-1)pq^{n-2} + \frac{(n-1)(n-2)p^2q^{n-3}}{2} + \cdots + p^{n-1} = (p+q)^{n-1}\).

Hence \(E(X) = np(p+q)^{n-1}\). For the binomial distribution \(p + q = 1\), so substituting:

\(E(X) = np(1)^{n-1} = np\)

Therefore the expected value is \(E(X) = np\). This is also the first moment of the binomial distribution.
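
A quick numeric spot-check of \(E(X) = np\) in R, with assumed values \(n = 10\), \(p = 0.3\):

n <- 10; p <- 0.3
x <- 0:n
sum(x * dbinom(x, n, p))  # weighted sum over the pmf: 3
n * p                     # closed form: 3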

For the second moment, \(E(X^2) = \sum_{x=0}^{n}x^2p(x)\), where \(x^2 = x(x-1) + x\) and \(p(x) = {{n}\choose{x}}p^{x}q^{n-x}\).

Substituting the values

\(E(X^2) = \sum_{x=0}^{n}(x(x-1) + x){{n}\choose{x}}p^{x}q^{n-x}\)

\(E(X^2) = \sum_{x=0}^{n}x(x-1){{n}\choose{x}}p^{x}q^{n-x} + \sum_{x=0}^{n}x{{n}\choose{x}}p^{x}q^{n-x}\)

Since \(\sum_{x=0}^{n}x{{n}\choose{x}}p^{x}q^{n-x} = np\)

\(E(X^2) = \sum_{x=0}^{n}x(x-1){{n}\choose{x}}p^{x}q^{n-x} + np\)

Expanding \(\sum_{x=0}^{n}x(x-1){{n}\choose{x}}p^{x}q^{n-x}\)

\((x=0) = 0(0-1){{n}\choose{0}}p^{0}q^{n-0} =0\)

\((x=1) = 1(1-1){{n}\choose{1}}p^{1}q^{n-1} =0\)

\((x=2) = 2(2-1){{n}\choose{2}}p^{2}q^{n-2} = n(n-1)p^2q^{n-2}\)

\((x=3) = 3(3-1){{n}\choose{3}}p^{3}q^{n-3} = n(n-1)(n-2)p^3q^{n-3}\)

Continuing up to \(x = n\):

\((x=n) = n(n-1){{n}\choose{n}}p^{n}q^{n-n} = n(n-1)p^n\)

\(E(X^2) = 0 + 0 + n(n-1)p^2q^{n-2} + n(n-1)(n-2)p^3q^{n-3} + \cdots + n(n-1)p^n + np\)

\(E(X^2) = n(n-1)p^2(q^{n-2} + (n-2)pq^{n-3} + ... + p^{n-2}) + np\)

By the binomial theorem, \(q^{n-2} + (n-2)pq^{n-3} + \cdots + p^{n-2} = (p+q)^{n-2}\), so

\(E(X^2) = n(n-1)p^2(p+q)^{n-2} + np\)

\(E(X^2) = n(n-1)p^2 + np = n^2p^2 - np^2 + np\)

The variance follows from the first two moments:

\(V(X) = E(X^2) - E(X)^2\), and since \(E(X) = np\), \(E(X)^2 = n^2p^2\)

\(V(X) = n^2p^2 - np^2 + np - n^2p^2 = np - np^2 = np(1-p)\)

Since \(q = 1 - p\),

\(V(X) = npq\)
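
Since the exercise asks for these moments via the moment generating function, here is the same result derived that way (a standard derivation, using the definition \(M(t) = E[e^{tX}]\) from Problem 3):

\(M(t) = E[e^{tX}] = \sum_{x=0}^{n}e^{tx}{{n}\choose{x}}p^{x}q^{n-x} = (pe^t + q)^n\)

\(M'(t) = npe^t(pe^t + q)^{n-1}\), so \(E(X) = M'(0) = np(p+q)^{n-1} = np\)

\(M''(t) = npe^t(pe^t + q)^{n-1} + n(n-1)p^2e^{2t}(pe^t + q)^{n-2}\), so \(E(X^2) = M''(0) = np + n(n-1)p^2\)

\(V(X) = M''(0) - M'(0)^2 = np + n(n-1)p^2 - n^2p^2 = np(1-p) = npq\)

And a numeric spot-check of the variance in R, with the same assumed \(n = 10\), \(p = 0.3\) (so \(npq = 2.1\)):

n <- 10; p <- 0.3; q <- 1 - p
x <- 0:n
sum(x^2 * dbinom(x, n, p)) - sum(x * dbinom(x, n, p))^2  # variance: 2.1
n * p * q                                                # closed form: 2.1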

3. Calculate the expected value and variance of the exponential distribution using the moment generating function.

The probability density function of an exponential distribution is

\(f(x;\lambda) = \lambda e^{{-\lambda}x}, \quad x \ge 0\)
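
In R this density is available as dexp; a quick check at an arbitrarily chosen point (\(\lambda = 2\), \(x = 1.5\)):

lambda <- 2
x <- 1.5
lambda * exp(-lambda * x)  # direct formula: 0.09957414
dexp(x, rate = lambda)     # built-in density, same value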

The moment generating function for \(X \sim \text{Exponential}(\lambda)\) is

\(M(t) = E[e^{tX}]\). Since \(X\) is a continuous random variable, the expectation is an integral rather than a sum:

\(M(t) = E[e^{tX}] = \int_0^\infty e^{tx}\lambda e^{{-\lambda}x}dx\)

\(= \lambda\int_0^\infty e^{(t-\lambda)x}dx\)

\(= \frac{\lambda}{(t-\lambda)}\bigg[e^{(t-\lambda)x} \bigg]_0^\infty\)

The integral converges only when \(t - \lambda < 0\), that is, \(t < \lambda\); otherwise it diverges. With \(t < \lambda\), \(e^{(t-\lambda)x} \to 0\) as \(x \to \infty\), and at the lower limit \(e^{0} = 1\).

Therefore \(M(t) = \frac{\lambda}{(t-\lambda)}(0-1) = \frac{\lambda}{(\lambda - t)}\), where \(t<\lambda\).
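
The closed form can be checked numerically with base R's integrate, under assumed values \(\lambda = 2\) and \(t = 0.5\) (which satisfy \(t < \lambda\)):

lambda <- 2
t <- 0.5
# numeric E[e^(tX)] over the exponential density
integrate(function(x) exp(t * x) * lambda * exp(-lambda * x), 0, Inf)$value
lambda / (lambda - t)  # closed form: 4/3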

The expected value \(E(X)\) is the first derivative of the moment generating function evaluated at \(t = 0\):

\(E(X) = M'(0)\)

Applying calculus,

\(M'(t) = \frac{d}{dt}\left[\frac{\lambda}{\lambda - t}\right] = \frac{\lambda}{(\lambda - t)^2}\)

For \(t = 0\),

\(M'(0) = \frac{\lambda}{\lambda^2} = \frac{1}{\lambda}\)

Hence the expected value is \(E(X) = \frac{1}{\lambda}\).

The second derivative evaluated at \(t = 0\) gives \(E(X^2)\):

\(E(X^2) = M''(0)\)

Applying calculus,

\(M''(t) = \frac{d}{dt}\left[\frac{\lambda}{(\lambda - t)^2}\right] = \frac{2\lambda}{(\lambda - t)^3}\)

For \(t = 0\),

\(E(X^2) = M''(0) = \frac{2\lambda}{\lambda^3} = \frac{2}{\lambda^2}\)

The variance is \(V(X) = E(X^2) - E(X)^2 = \frac{2}{\lambda^2} - \left(\frac{1}{\lambda}\right)^2\)

Therefore \(V(X) = \frac{1}{\lambda^2}\)
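
As a final numeric check with integrate, under an assumed rate \(\lambda = 2\) (so the expected value should be 0.5 and the variance 0.25):

lambda <- 2
EX  <- integrate(function(x) x   * dexp(x, rate = lambda), 0, Inf)$value
EX2 <- integrate(function(x) x^2 * dexp(x, rate = lambda), 0, Inf)$value
EX          # 1/lambda = 0.5
EX2 - EX^2  # 1/lambda^2 = 0.25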