The price of one share of stock in the Pilsdorff Beer Company (see Exercise 8.2.12) is given by \(Y_n\) on the \(n\)th day of the year. Finn observes that the differences \(X_n = Y_{n+1} - Y_n\) appear to be independent random variables with a common distribution having mean \(\mu = 0\) and variance \(\sigma^2 = 1/4\). If \(Y_1 = 100\), estimate the probability that \(Y_{365}\) is (a) at least 100, (b) at least 110, and (c) at least 120.
# Y_365 = Y_1 + (X_1 + ... + X_364), so by the CLT the sum of the 364
# daily differences is approximately normal with mean 0 and variance 364 * (1/4).
n <- 365 - 1
mu <- 0
sigma2 <- 1 / 4            # named sigma2 to avoid masking base R's var()
sd_sum <- sqrt(n * sigma2) # standard deviation of the 364-day sum

# (a) P(Y_365 >= 100) = P(sum >= 0)
a <- 0
z_a <- (a - n * mu) / sd_sum
p_a <- round(pnorm(z_a, mean = 0, sd = 1, lower.tail = FALSE), 3)

# (b) P(Y_365 >= 110) = P(sum >= 10)
b <- 10
z_b <- (b - n * mu) / sd_sum
p_b <- round(pnorm(z_b, mean = 0, sd = 1, lower.tail = FALSE), 3)

# (c) P(Y_365 >= 120) = P(sum >= 20)
c <- 20
z_c <- (c - n * mu) / sd_sum
p_c <- round(pnorm(z_c, mean = 0, sd = 1, lower.tail = FALSE), 3)
(a) \(P(Y_{365} \ge 100) = 0.5\).
(b) \(P(Y_{365} \ge 110) = 0.147\).
(c) \(P(Y_{365} \ge 120) = 0.018\).
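As an optional cross-check, the same probabilities can be estimated by simulation. The sketch below is a minimal Monte Carlo version that assumes normally distributed daily differences (by the CLT the answers are nearly insensitive to this choice); the seed and the 10,000 replications are arbitrary illustrative settings.
# Monte Carlo sketch (assumed normal daily differences; illustrative only).
set.seed(1)
reps <- 10000
# Each replication builds Y_365 = 100 plus 364 simulated daily differences.
y_365 <- replicate(reps, 100 + sum(rnorm(364, mean = 0, sd = sqrt(1 / 4))))
mean(y_365 >= 100)  # should be near 0.5
mean(y_365 >= 110)  # should be near 0.147
mean(y_365 >= 120)  # should be near 0.018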
Calculate the expected value and variance of the binomial distribution using the moment generating function.
Here is the moment generating function:
\(M(t) = ((1 - p) + pe^t)^n\)
Take the first derivative:
\(M^\prime (t) = n(1 - p + pe^t)^{n-1}(pe^t)\)
Evaluate the first derivative at \(t = 0\):
\(M^\prime(0) = \mu = n(1 - p + pe^0)^{n-1}(pe^0)\)
\(M^\prime(0) = \mu = n(1 - p + p)^{n-1}(p)\)
\(M^\prime(0) = \mu = n(1)^{n-1}(p)\)
\(M^\prime(0) = \mu = n(1)(p)\)
\(M^\prime(0) = \mu = np\). So we now have the mean.
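As a quick numerical sanity check (not part of the derivation), we can differentiate the MGF at \(t = 0\) with a central difference and compare the result with \(np\). The values \(n = 10\) and \(p = 0.3\) below are arbitrary illustrative choices.
# Central-difference estimate of M'(0) for illustrative values n = 10, p = 0.3.
n <- 10
p <- 0.3
M <- function(t) ((1 - p) + p * exp(t))^n
h <- 1e-5
(M(h) - M(-h)) / (2 * h)  # numerical M'(0), approximately 3
n * p                     # closed-form mean np, exactly 3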
Take the second derivative:
\(M^{\prime\prime}(t) = n(1 - p + pe^t)^{n-1}(pe^t) + (pe^t)n(n - 1)(1 - p + pe^t)^{n-2}(pe^t)\)
Evaluate the second derivative at \(t = 0\):
\(M^{\prime\prime}(0) = n(1 - p + pe^0)^{n-1}(pe^0) + (pe^0)n(n - 1)(1 - p + pe^0)^{n-2}(pe^0)\)
\(M^{\prime\prime}(0) = n(1 - p + p)^{n-1}(p) + p(n)(n - 1)(1 - p + p)^{n-2}(p)\)
\(M^{\prime\prime}(0) = n(1)^{n-1}(p) + p(n)(n - 1)(1)^{n-2}(p)\)
\(M^{\prime\prime}(0) = n(1)(p) + p(n)(n - 1)(1)(p)\)
\(M^{\prime\prime}(0) = np + p(n)(n - 1)(p)\)
\(M^{\prime\prime}(0) = np + n^2p^2 - np^2\)
Subtract \((M^{\prime}(0))^2\) from \(M^{\prime\prime}(0)\) to get the variance:
\(\sigma^2 = M^{\prime\prime}(0) - (M^\prime(0))^2 = np + n^2p^2 - np^2 - (np)^2\)
\(\sigma^2 = np + n^2p^2 - np^2 - n^2p^2\)
\(\sigma^2 = np - np^2\)
\(\sigma^2 = np(1 - p)\). So we now have the variance.
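The variance result can be checked the same way: a second central difference approximates \(M^{\prime\prime}(0)\), and subtracting \((M^\prime(0))^2\) should come out near \(np(1 - p)\). Again, \(n = 10\) and \(p = 0.3\) are illustrative.
# Numerical check that M''(0) - (M'(0))^2 = np(1 - p) for n = 10, p = 0.3.
n <- 10
p <- 0.3
M <- function(t) ((1 - p) + p * exp(t))^n
h <- 1e-4
m1 <- (M(h) - M(-h)) / (2 * h)         # approximates M'(0)
m2 <- (M(h) - 2 * M(0) + M(-h)) / h^2  # approximates M''(0)
m2 - m1^2                              # approximately 2.1
n * p * (1 - p)                        # closed-form variance, exactly 2.1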
Calculate the expected value and variance of the exponential distribution using the moment generating function.
Here is the moment generating function:
\(M(t) = \frac{\lambda}{\lambda - t}\), valid for \(t < \lambda\).
Take the first derivative:
\(M^\prime(t) = \frac{\lambda}{(\lambda - t)^2}\)
Evaluate the first derivative at \(t = 0\):
\(M^\prime(0) = \mu = \frac{\lambda}{(\lambda - 0)^2}\)
\(M^\prime(0) = \mu = \frac{\lambda}{\lambda^2}\)
\(M^\prime(0) = \mu = \frac{1}{\lambda}\). So now we have the mean.
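The same kind of numerical spot check works here; the rate \(\lambda = 2\) below is an arbitrary illustrative choice.
# Central-difference estimate of M'(0) for an illustrative rate lambda = 2.
lambda <- 2
M <- function(t) lambda / (lambda - t)
h <- 1e-5
(M(h) - M(-h)) / (2 * h)  # numerical M'(0), approximately 0.5
1 / lambda                # closed-form mean 1/lambda, exactly 0.5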
Take the second derivative:
\(M^{\prime\prime}(t) = \frac{2\lambda}{(\lambda - t)^3}\)
Evaluate the second derivative at \(t = 0\):
\(M^{\prime\prime}(0) = \frac{2\lambda}{(\lambda - 0)^3}\)
\(M^{\prime\prime}(0) = \frac{2\lambda}{\lambda^3}\)
\(M^{\prime\prime}(0) = \frac{2}{\lambda^2}\)
Subtract \((M^{\prime}(0))^2\) from \(M^{\prime\prime}(0)\) to get the variance:
\(\sigma^2 = M^{\prime\prime}(0) - (M^\prime(0))^2 = \frac{2}{\lambda^2} - (\frac{1}{\lambda})^2\)
\(\sigma^2 = \frac{2}{\lambda^2} - \frac{1}{\lambda^2}\)
\(\sigma^2 = \frac{1}{\lambda^2}\). So now we have the variance.
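And a final numerical spot check of the variance, again with the illustrative rate \(\lambda = 2\).
# Numerical check that M''(0) - (M'(0))^2 = 1/lambda^2 for lambda = 2.
lambda <- 2
M <- function(t) lambda / (lambda - t)
h <- 1e-4
m1 <- (M(h) - M(-h)) / (2 * h)         # approximates M'(0)
m2 <- (M(h) - 2 * M(0) + M(-h)) / h^2  # approximates M''(0)
m2 - m1^2       # approximately 0.25
1 / lambda^2    # closed-form variance, exactly 0.25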