Data 605 HW 9

library(knitr)
library(rmdformats)

## Global options
options(max.print = 85)
opts_chunk$set(cache=TRUE,
               prompt=FALSE,
               tidy=TRUE,
               comment=NA,
               message=FALSE,
               warning=FALSE)
opts_knit$set(width=31)

library(visualize)

1. (Exercise 11, page 363)

The price of one share of stock in the Pilsdorff Beer Company (see Exercise 8.2.12) is given by \(Y_n\) on the \(n\)th day of the year. Finn observes that the differences \(X_n = Y_{n+1} - Y_n\) appear to be independent random variables with a common distribution having mean \(\mu = 0\) and variance \(\sigma^2 = 1/4\). If \(Y_1 = 100\), estimate the probability that \(Y_{365}\) is

Ans:

Since \(X_n = Y_{n+1} - Y_n\), we get \(Y_{n+1} = Y_n + X_n\).

Applying this recursion repeatedly: \(Y_2 = Y_1 + X_1\), \(Y_3 = Y_1 + X_1 + X_2\), and so on, up to \(Y_{365} = Y_{364} + X_{364}\). Therefore

\[ Y_{365} = Y_1 + X_1 + X_2 + \cdots + X_{364} \]

If \(Y_1 = 100\) and \(\mu_{X_i} = 0\), then \(\mu_{Y_{365}} = E(Y_{365}) = E(Y_1) + E(X_1) + E(X_2) + \cdots + E(X_{364}) = 100 + 0 + 0 + \cdots + 0 = 100\).

\(Y_{365}\) is \(Y_1\) (a constant) plus the 364 independent increments \(X_1, \ldots, X_{364}\), each with variance \(\frac{1}{4}\), so

\(\sigma_{Y_{365}} = \sqrt{364 \times \frac{1}{4}} = \sqrt{91} \approx 9.54\)

By the Central Limit Theorem, \(Y_{365}\) is approximately normal with mean 100 and this standard deviation, so each probability below can be estimated with pnorm.

(a) ≥ 100.

s = sqrt(364/4)   # sd of Y_365: 364 independent daily increments, each with variance 1/4
mu = 100          # mean of Y_365
x = 100
pnorm(q = x, mean = mu, sd = s, lower.tail = FALSE)
[1] 0.5
visualize.norm(stat = x, mu = mu, sd = s, section = "upper")

(b) ≥ 110.

x = 110
pnorm(q = x, mean = mu, sd = s, lower.tail = FALSE)
[1] 0.1472537
visualize.norm(stat = x, mu = mu, sd = s, section = "upper")

(c) ≥ 120.

x = 120
pnorm(q = x, mean = mu, sd = s, lower.tail = FALSE)
[1] 0.01801585
visualize.norm(stat = x, mu = mu, sd = s, section = "upper")
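
As a sanity check on the normal approximation, here is a small simulation sketch. The common distribution of the daily increments is not specified in the problem, so normal increments with mean 0 and standard deviation 1/2 are assumed purely for illustration; the CLT argument above does not require this.

set.seed(605)                         # illustrative seed for reproducibility
sims = 10000                          # number of simulated years
# each simulated Y_365 is Y_1 = 100 plus 364 independent daily increments
y365 = 100 + replicate(sims, sum(rnorm(364, mean = 0, sd = 1/2)))
mean(y365 >= 100)                     # should be close to 0.5
mean(y365 >= 110)                     # should be close to 0.147
mean(y365 >= 120)                     # should be close to 0.018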

2.

Calculate the expected value and variance of the binomial distribution using the moment generating function. 

Ans: The probability mass function for the binomial distribution is

\(P(X = j) = \binom{n}{j}p^{j}q^{n-j}, \quad j = 0, 1, \ldots, n,\)

where \(q = 1 - p\).
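
As a quick R cross-check of this formula against the built-in binomial pmf (the values n = 12, p = 0.3, and j = 5 are arbitrary illustrative choices):

n = 12; p = 0.3; j = 5
choose(n, j) * p^j * (1 - p)^(n - j)   # pmf evaluated directly from the formula above
dbinom(j, size = n, prob = p)          # should return the same value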

Thus the moment generating function is

\[ g(t) = \sum_{j=0}^{n}e^{tj}\binom{n}{j}p^{j}q^{n-j} = \sum_{j=0}^{n}\binom{n}{j}(pe^{t})^{j}q^{n-j} = (pe^{t} + q)^{n}, \]

where the last equality is the binomial theorem.

The first moment comes from the first derivative of \(g\), evaluated at \(t = 0\):

\[ \begin{split} g'(t) &= n(pe^{t} + q)^{n-1}pe^{t} \\ E(X) = \mu_1 = g'(0) &= n(pe^{0} + q)^{n-1}pe^{0} \\ &= np(p + q)^{n-1} = np \end{split} \]

The second moment comes from the second derivative:

\[ \begin{split} g''(t) &= n(n-1)(pe^t+q)^{n-2}p^2e^{2t} + n(pe^t + q)^{n-1}pe^t \\ E(X^2) = \mu_2 = g''(0) &= n(n-1)(pe^0 + q)^{n-2}p^2e^{0} + n(pe^0 + q)^{n-1}pe^0 \\ &= n(n-1)p^2 + np \end{split} \]

Therefore the expected value is \(\mu = \mu_1 = np\), and the variance is \(\sigma^2 = \mu_2 - \mu_1^{2} = n(n-1)p^{2} + np - n^{2}p^{2} = np(1-p) = npq\), as expected.
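
As a quick numerical sketch of these results, the code below approximates \(g'(0)\) and \(g''(0)\) with small finite differences and compares them to \(np\) and \(np(1-p)\); the values n = 12 and p = 0.3 are arbitrary illustrative choices.

n = 12; p = 0.3; q = 1 - p             # arbitrary illustrative parameters
g = function(t) (p * exp(t) + q)^n     # binomial MGF derived above
h = 1e-5                               # finite-difference step size
m1 = (g(h) - g(-h)) / (2 * h)          # central difference for g'(0), i.e. E(X)
m2 = (g(h) - 2 * g(0) + g(-h)) / h^2   # second difference for g''(0), i.e. E(X^2)
c(mean = m1, variance = m2 - m1^2)     # should be close to np = 3.6 and npq = 2.52
c(n * p, n * p * q)                    # exact values from the formulas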

3.

Calculate the expected value and variance of the exponential distribution using the moment generating function.

Ans: The probability density function for the exponential distribution is

\(f(x) = \lambda e^{-\lambda x}\) for \(x \geq 0\)

Thus the moment generating function for the exponential distribution is

\[ g(t) = \int_{0}^{\infty} e^{tx}\,\lambda e^{-\lambda x}\,dx = \lambda \int_{0}^{\infty} e^{-(\lambda - t)x}\,dx = \frac{\lambda}{\lambda - t} \quad \text{for } t < \lambda \]

The first moment comes from the first derivative of \(g\), evaluated at \(t = 0\):

\[ \begin{split} g'(t) &= -\lambda(\lambda - t)^{-2}(-1) \\ &= \frac{\lambda}{(\lambda-t)^2} \\ E(X) = \mu_1 = g'(0) &= \frac{\lambda}{(\lambda - 0)^2} = \frac{1}{\lambda} \end{split} \]

The second moment comes from the second derivative:

\[ \begin{split} g''(t) &= -2\lambda(\lambda-t)^{-3}(-1) \\ &= \frac{2\lambda}{(\lambda-t)^3} \\ E(X^2) = \mu_2 = g''(0) &= \frac{2\lambda}{(\lambda-0)^3} = \frac{2}{\lambda^2} \end{split} \]

Therefore the expected value is \(\mu = \mu_1 = \frac{1}{\lambda}\) and the variance is \(\sigma^2 = \mu_2 - \mu_1^{2} = \frac{2}{\lambda^2} - \frac{1}{\lambda^2} = \frac{1}{\lambda^2}\), as expected.
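
As a final numerical sketch, the code below draws a large exponential sample and compares the sample moments (and the empirical MGF at one point) with the formulas above; the rate \(\lambda = 2\) and the evaluation point \(t_0 = 0.5\) are arbitrary illustrative choices.

set.seed(605)                                  # illustrative seed for reproducibility
lambda = 2                                     # arbitrary illustrative rate
x = rexp(200000, rate = lambda)                # large sample from Exp(lambda)
t0 = 0.5                                       # any t0 < lambda works
c(mean(exp(t0 * x)), lambda / (lambda - t0))   # empirical vs. theoretical MGF value
c(mean(x), var(x))                             # should be close to 1/lambda = 0.5 and 1/lambda^2 = 0.25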