p363, 11. The price of one share of stock in the Pilsdorff Beer Company (see Exercise 8.2.12) is given by \(Y_n\) on the nth day of the year. Finn observes that the differences \(X_n = Y_{n+1} - Y_n\) appear to be independent random variables with a common distribution having mean \(\mu = 0\) and variance \(\sigma^2 = 1/4\). If \(Y_1 = 100\), estimate the probability that \(Y_{365}\) is
- \(\geq 100\)
day_num <- 365
n <- day_num - 1                     # number of daily differences X_1, ..., X_364
mu_x <- 0                            # mean of each daily difference
y1 <- 100
mu <- y1 + n * mu_x                  # E(Y_365) = Y_1 + n * E(X_n) = 100
VarX <- 0.25 * n                     # Var(Y_365) = n * Var(X_n) = 91
sigma <- sqrt(VarX)
x <- 100
pnorm(x, mu, sigma, lower.tail = FALSE)
## [1] 0.5
- \(\geq 110\)
x <- 110
pnorm(x, mu, sigma, lower.tail = FALSE)
## [1] 0.1472537
- \(\geq 120\)
x <- 120
pnorm(x, mu, sigma, lower.tail = FALSE)
## [1] 0.01801584
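As an informal check on the normal approximation (not part of the textbook solution), the walk can be simulated. Purely for illustration, each daily change is taken to be +1/2 or -1/2 with equal probability, which matches the stated mean 0 and variance 1/4; the exercise itself does not specify the distribution of the differences.

# Monte Carlo sketch of Y_365 under an assumed +/- 1/2 step distribution
set.seed(123)
n_sims <- 10000
n_steps <- 364
y365 <- 100 + replicate(n_sims, sum(sample(c(-0.5, 0.5), n_steps, replace = TRUE)))
mean(y365 >= 100)   # compare with 0.5
mean(y365 >= 110)   # compare with 0.147
mean(y365 >= 120)   # compare with 0.018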
2. Calculate the expected value and variance of the binomial distribution using the moment generating function.
\[\begin{equation}
f(k) = {n \choose k} p^k (1 - p)^{n - k}, \quad k \in \{0, 1, 2, \ldots, n\}
\end{equation}\]
\[\begin{equation}
M_X(t) = E(e^{tX}) = \sum_{k = 0}^{n} e^{tk} {n \choose k} p^k (1 - p)^{n - k}
\end{equation}\]
\[\begin{equation}
M_X(t) = \sum_{k = 0}^{n} {n \choose k} (pe^t)^k (1 - p)^{n - k}
\end{equation}\]
Applying the binomial theorem to the sum:
\[\begin{equation}
M_X(t) = [pe^t + 1 - p]^n
\end{equation}\]
The first moment is the expected value:
\[\begin{equation}
M_X'(t) = n[pe^t + 1 - p]^{n-1} (pe^t), \qquad E(X) = M_X'(0) = n(p + 1 - p)^{n - 1} p = np
\end{equation}\]
The second moment:
\[\begin{equation}
M_X''(t) = n(n - 1)[pe^t + 1 - p]^{n-2} (pe^t)^2 + n[pe^t + 1 - p]^{n-1} (pe^t), \qquad E(X^2) = M_X''(0) = n(n-1)p^2 + np
\end{equation}\]
\[\begin{equation}
V(X) = E(X^2) - E(X)^2 = n(n-1)p^2+np - n^2p^2 = np(1 - p)
\end{equation}\]
reference: http://mathworld.wolfram.com/BinomialDistribution.html
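As a quick numerical sanity check of the derivation (a sketch only; the values of n and p below are arbitrary illustrative choices, not part of the problem), the moments of dbinom can be compared with \(np\) and \(np(1-p)\), and the MGF sum with its closed form.

# Numerical check of the binomial MGF results (illustrative n and p)
n_trials <- 20
p_succ <- 0.3
k <- 0:n_trials
EX  <- sum(k * dbinom(k, n_trials, p_succ))        # numerical E(X)
EX2 <- sum(k^2 * dbinom(k, n_trials, p_succ))      # numerical E(X^2)
c(EX, n_trials * p_succ)                           # compare with np
c(EX2 - EX^2, n_trials * p_succ * (1 - p_succ))    # compare with np(1 - p)
t0 <- 0.4
c(sum(exp(t0 * k) * dbinom(k, n_trials, p_succ)),  # MGF as a sum ...
  (p_succ * exp(t0) + 1 - p_succ)^n_trials)        # ... versus the closed form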
3. Calculate the expected value and variance of the exponential distribution using the moment generating function.
\[\begin{equation}
f(x) = \lambda e^{-\lambda x}, 0 < x < \infty
\end{equation}\]
\[\begin{equation}
M_X(t) = E(e^{tX}) = \int_{0}^{\infty} e^{tx} \lambda e^{-\lambda x} dx
\end{equation}\]
\[\begin{equation}
M_X(t) = \lambda \int_{0}^{\infty} e^{tx} e^{-\lambda x} dx
\end{equation}\]
\[\begin{equation}
M_X(t) = \lambda \int_{0}^{\infty} e^{(t -\lambda) x} dx
\end{equation}\]
\[\begin{equation}
M_X(t) = \lambda \left[ \frac{e^{(t - \lambda) x}}{t - \lambda} \right]_{0}^{\infty} = \frac{\lambda}{\lambda - t}, \quad t < \lambda
\end{equation}\]
The first moment is the expected value:
\[\begin{equation}
M_X'(t) = \frac{\lambda}{(\lambda - t)^2}, \qquad E(X) = M_X'(0) = \frac{\lambda}{\lambda^2} = \frac{1}{\lambda}
\end{equation}\]
The second moment:
\[\begin{equation}
M_X''(t) = \frac{2\lambda}{(\lambda - t)^3}, \qquad E(X^2) = M_X''(0) = \frac{2\lambda}{\lambda^3} = \frac{2}{\lambda^2}
\end{equation}\]
\[\begin{equation}
V(X) = E(X^2) - E(X)^2 = \frac{2}{\lambda^2} - \frac{1}{\lambda^2} = \frac{1}{\lambda^2}
\end{equation}\]
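A similar numerical check for the exponential case (a sketch; \(\lambda = 2\) is an arbitrary illustrative rate) uses integrate() to approximate the MGF and the first two moments.

# Numerical check of the exponential MGF results (illustrative lambda)
lambda <- 2
mgf <- function(t) integrate(function(x) exp(t * x) * dexp(x, rate = lambda),
                             0, Inf)$value         # MGF by numerical integration
c(mgf(0.5), lambda / (lambda - 0.5))               # compare with lambda / (lambda - t)
EX  <- integrate(function(x) x   * dexp(x, rate = lambda), 0, Inf)$value
EX2 <- integrate(function(x) x^2 * dexp(x, rate = lambda), 0, Inf)$value
c(EX, 1 / lambda)                                  # compare with 1 / lambda
c(EX2 - EX^2, 1 / lambda^2)                        # compare with 1 / lambda^2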