Please refer to the Assignment 9 Document.

1 Problem Set 1 (Sec. 9.3, Ex. 11)

The price of one share of stock in the Pilsdorff Beer Company (see Exercise 8.2.12) is given by \(Y_{n}\) on the \(n\)th day of the year. Finn observes that the differences \(X_{n} = Y_{n+1} − Y_{n}\) appear to be independent random variables with a common distribution having mean \(μ = 0\) and variance \(\sigma^{2} = \frac{1}{4}\).

If \(Y_{1} = 100\), estimate the probability that \(Y_{365}\) is

(a.) \(\geq 100\)

(b.) \(\geq 110\)

(c.) \(\geq 120\)

1.1 PS1 Answer

From the given information, we have

  • \(\mu = 0, \; \sigma^{2} = \frac{1}{4}\)

  • \(X_{n} + Y_{n} = Y_{n+1}\)

  • \(Y_{1} = 100\)

Thus, \(Y_{n+1} = X_{n} + Y_{n} = X_{n} + X_{n-1} + Y_{n-1} = X_{n} + X_{n-1} + \cdots + X_{2} + X_{1} + Y_{1} = \sum_{i=1}^{n}X_{i} + 100\)

Therefore, \(Y_{365} = \sum_{i=1}^{364}X_{i} + 100\).

Since each \(X_{i}\) has \(\mu = 0\) and \(\sigma^{2} = \frac{1}{4}\), by the Central Limit Theorem we have, approximately,

\[\frac{1}{n} \sum_{i=1}^{n}X_{i} \sim N\!\left(0, \frac{\sigma^{2}}{n}\right) \Rightarrow \sum_{i=1}^{n}X_{i} \sim N(0, n\sigma^{2})\] \[\therefore Y_{365} \sim N\!\left(100, \frac{364}{4}\right) = N(100, 91)\]
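
The three probabilities reported in the parts below can be reproduced from this normal approximation with `pnorm`; a minimal R sketch, where the mean 100 and standard deviation \(\sqrt{364/4} = \sqrt{91}\) come directly from the derivation above:

```r
mu    <- 100            # mean of Y_365 under the normal approximation
sigma <- sqrt(364 / 4)  # sd of Y_365: sqrt(91)

pnorm(100, mean = mu, sd = sigma, lower.tail = FALSE)  # part (a): P(Y_365 >= 100)
pnorm(110, mean = mu, sd = sigma, lower.tail = FALSE)  # part (b): P(Y_365 >= 110)
pnorm(120, mean = mu, sd = sigma, lower.tail = FALSE)  # part (c): P(Y_365 >= 120)
```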

1.1.1 Part (a)

## [1] 0.5

\(P(Y_{365}\geq 100) = 0.5\)

1.1.2 Part (b)

## [1] 0.1472537

\(P(Y_{365}\geq 110) = 0.1472537\)

1.1.3 Part (c)

## [1] 0.01801584

\(P(Y_{365}\geq 120) = 0.01801584\)

2 Problem Set 2

Calculate the expected value and variance of the binomial distribution using the moment generating function.

2.1 PS2 Answer

The binomial distribution has probability mass function \(P(X = k) = \begin{pmatrix}n\\k\end{pmatrix} p^{k} (1-p)^{n-k}\) for \(k = 0, 1, \ldots, n\).

Its moment generating function is

\[g(t) = \sum_{k=0}^{n} e^{tk} \begin{pmatrix}n\\k\end{pmatrix} p^{k} (1-p)^{n-k}\] \[= \sum_{k=0}^{n} \begin{pmatrix}n\\k\end{pmatrix} (pe^{t})^{k} (1-p)^{n-k}\] \[=(pe^{t}+(1-p))^{n}\]
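
As a quick numerical sanity check (not part of the derivation), the closed form can be compared with the defining sum; a small R sketch, where \(n\), \(p\), and \(t\) are arbitrary illustrative values:

```r
# Spot-check the closed form (p*e^t + (1 - p))^n against the defining sum;
# n, p, and t below are arbitrary illustrative values
n <- 10; p <- 0.3; t <- 0.5
k <- 0:n
sum(exp(t * k) * dbinom(k, size = n, prob = p))  # sum_k e^{tk} P(X = k)
(p * exp(t) + (1 - p))^n                         # closed-form MGF
```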

The first derivative of \(g(t)\) is

\[g'(t) = n(pe^{t}+(1-p))^{n-1}(pe^{t})\]

Thus,

\[E[X] = \mu = g'(t=0) = n(pe^{0}+(1-p))^{n-1}(pe^{0}) = np\]

The second derivative of \(g(t)\) is

\[g''(t) = n(n-1)(pe^{t}+(1-p))^{n-2}(pe^{t})^{2}+n(pe^{t}+(1-p))^{n-1}(pe^{t})\]

Thus, \[E[X^{2}] = g''(t=0) = n(n-1)(pe^{0}+(1-p))^{n-2}(pe^{0})^{2}+n(pe^{0}+(1-p))^{n-1}(pe^{0})\]

\[= n(n-1)(p)^{2} + np = n^{2}p^{2}-np^{2}+np\] and,

\[\sigma^{2} = E[X^{2}]-E[X]^{2} = n^{2}p^{2}-np^{2}+np - (np)^{2} = -np^{2}+np = np(1-p)\]

Therefore, by the moment generating function, the expected value and the variance of the binomial distribution are \(np\) and \(np(1-p)\), respectively.
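
These two results can also be checked directly against moments computed from the binomial pmf; a minimal R sketch with arbitrary illustrative values of \(n\) and \(p\):

```r
# Compare the MGF-derived formulas np and np(1-p) with moments taken
# directly from the pmf; n and p are arbitrary illustrative values
n <- 10; p <- 0.3
k   <- 0:n
EX  <- sum(k   * dbinom(k, n, p))   # E[X] from the pmf
EX2 <- sum(k^2 * dbinom(k, n, p))   # E[X^2] from the pmf
c(EX, n * p)                        # expected value: both 3
c(EX2 - EX^2, n * p * (1 - p))      # variance: both 2.1
```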

3 Problem Set 3

Calculate the expected value and variance of the exponential distribution using the moment generating function.

3.1 PS3 Answer

The exponential distribution has density \(f(x) = \lambda e^{-\lambda x}\) for \(x \in [0,\infty)\).

Its moment generating function is

\[g(t) =\int_{x=0}^{\infty} e^{tx} \lambda e^{-\lambda x}\, dx\] \[= \lambda \int_{x=0}^{\infty} e^{(t-\lambda) x}\, dx\] \[= \left[ \frac{\lambda e^{(t-\lambda) x}}{t-\lambda} \right]^{\infty}_{x=0}\] \[= \frac{0-\lambda}{t-\lambda} \quad \text{for } t<\lambda\] \[= \lambda (\lambda-t)^{-1} \quad \text{for } t<\lambda\]
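
The closed form can likewise be spot-checked numerically; a small R sketch using `integrate`, with arbitrary illustrative values of \(\lambda\) and \(t < \lambda\):

```r
# Spot-check g(t) = lambda / (lambda - t) for t < lambda by numerical integration;
# lambda and t are arbitrary illustrative values
lambda <- 2; t <- 0.5
integrate(function(x) exp(t * x) * lambda * exp(-lambda * x), 0, Inf)$value
lambda / (lambda - t)   # closed form: 2 / 1.5
```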

The first derivative of \(g(t)\) is

\[g'(t) = -\lambda(\lambda-t)^{-2}(-1)=\lambda(\lambda-t)^{-2}\]

Thus, \(E[X] = \mu = g'(t=0) = \lambda^{-1}\)

The second derivative of \(g(t)\) is

\[g''(t) = (-2)\lambda(\lambda-t)^{-3}(-1) = 2\lambda(\lambda-t)^{-3}\]

Thus, \(E[X^{2}] = g''(t=0) = 2\lambda^{-2}\)

and,

\[\sigma^{2} = E[X^{2}]-E[X]^{2} = 2\lambda^{-2} - (\lambda^{-1})^{2} = \lambda^{-2}\]

Therefore, by the moment generating function, the expected value and the variance of the exponential distribution are \(\lambda^{-1}\) and \(\lambda^{-2}\), respectively.
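
As with the binomial case, these results can be verified numerically; a minimal R sketch with an arbitrary illustrative value of \(\lambda\):

```r
# Check E[X] = 1/lambda and Var(X) = 1/lambda^2 by numerical integration;
# lambda is an arbitrary illustrative value
lambda <- 2
EX  <- integrate(function(x) x   * lambda * exp(-lambda * x), 0, Inf)$value
EX2 <- integrate(function(x) x^2 * lambda * exp(-lambda * x), 0, Inf)$value
c(EX, 1 / lambda)              # expected value: both 0.5
c(EX2 - EX^2, 1 / lambda^2)    # variance: both 0.25
```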