Question 1

Page 363 Grinstead Probability Question 11:

A:

\(P(Y_{365} \geq 100)\)

By the Central Limit Theorem, this probability can be approximated with a normal distribution: the daily changes \(X_n = Y_{n+1} - Y_n\) are independent with mean 0 and variance 1/4, so their sum \(Y_{365} - Y_0\) is approximately normal. For the probability that the result is greater than or equal to a value, we use pnorm() with lower.tail = FALSE.

\(z = \frac {(Y_{365} - Y_0) - n \mu}{\sigma \sqrt{n}}\)

The z-score value can be used in conjunction with the pnorm() function to determine the probability.

y <- 100                           # threshold of interest
x <- 100                           # starting price, Y_0
n <- 365                           # number of days
mu <- 0                            # mean of each daily change X_n
var <- 1/4                         # variance of each daily change
sig <- sqrt(var)                   # standard deviation of each daily change
z <- (y - x - n*mu)/(sig*sqrt(n))  # standardized value for Y_365 = y

y365 <- pnorm(y, mean = x, sd = sqrt(var*n), lower.tail = FALSE)  # P(Y_365 >= y)

round(y365, 3)
## [1] 0.5
round(1 - pnorm(z),3)
## [1] 0.5

B:

\(P(Y_{365} \geq 110)\)

y <- 110

y365b <- pnorm(y, mean = x, sd = sqrt(var*n), lower.tail = FALSE)
round(y365b, 3)
## [1] 0.148
z <- (y - x - n*mu)/(sig*sqrt(n))

round(1 - pnorm(z), 3)
## [1] 0.148

C:

\(P(Y_{365} \geq 120)\)

y <- 120

y365c <- pnorm(y, mean = x, sd = sqrt(var*n), lower.tail = FALSE)
round(y365c, 3)
## [1] 0.018
z <- (y - x - n*mu)/(sig*sqrt(n))

round(1 - pnorm(z), 3)
## [1] 0.018
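The three CLT approximations above can also be sanity-checked by simulating the random walk directly. The sketch below assumes, purely for illustration, normally distributed daily changes; only the mean 0 and variance 1/4 of \(X_n\) enter the approximation.

# Monte Carlo check of the three CLT approximations.
# Assumption for illustration: daily changes are normal with
# mean 0 and sd 1/2 (the problem only fixes the mean and variance).
set.seed(1)
sims <- replicate(10000, 100 + sum(rnorm(365, mean = 0, sd = 0.5)))
mean(sims >= 100)  # approximately 0.5
mean(sims >= 110)  # approximately 0.148
mean(sims >= 120)  # approximately 0.018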

Question 2

Expected Value and Variance of binomial distribution using MGF:

The MGF for a binomial distribution is \(M(t) = {(pe^t + q)}^n\), where \(p\) is the probability of success, \(q = 1 - p\) is the probability of failure, and \(n\) is the number of trials. The mean is given by the first moment of the MGF, found by evaluating the first derivative of the MGF, written \(M'(t)\) below, at \(t = 0\).

\(M(t) = {(pe^t + q)}^n\)

\(M'(t) = \frac d{dt} {(pe^t + q)}^n\)

\(M'(t) = pe^tn{(pe^t + q)}^{(n - 1)}\)

For \(t = 0\):

\(M'(0) = pn{(p + q)}^{(n - 1)}\)

Because q = 1 - p,

\(M'(0) = pn{(p + 1 - p)}^{(n - 1)}\)

\(M'(0) = pn \times 1\)

Expectation:

\(\therefore E[X] = pn\) for the binomial distribution.
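This result can be checked numerically with base R's D() function, which differentiates an expression symbolically; the values of \(n\) and \(p\) below are arbitrary and chosen only for illustration.

n <- 10; p <- 0.3; q <- 1 - p                  # arbitrary illustrative values
M1 <- D(expression((p * exp(t) + q)^n), "t")   # M'(t), computed symbolically
eval(M1, list(t = 0))                          # M'(0); should equal p*n = 3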

The variance is given by \(E[X^2] - {E[X]}^2\): the second moment, \(M''(0)\), minus the square of the first moment, \(M'(0)\).

\(M(t) = {(pe^t + q)}^n\)

\(M'(t) = pe^tn{(pe^t + q)}^{(n - 1)}\)

Recall the product rule, \(\frac d{dx} f(x)g(x) = f'(x)g(x) + g'(x)f(x)\)

\(M''(t) = ((n - 1)(pe^t))npe^t{(pe^t + q)}^{(n - 2)} + pe^tn{(pe^t + q)}^{(n - 1)}\)

\(M''(0) = (n - 1)(p)(n)(p)({1^{(n - 2)}}) + pn({1^{(n - 1)}})\)

\(M''(0) = p^2n(n - 1) + pn\)

\(Var(X) = M''(0) - {M'(0)}^2\)

\(Var(X) = p^2n(n - 1) + pn - {(pn)}^2\)

\(Var(X) = p^2n^2 - p^2n + pn - p^2n^2\)

\(Var(X) = pn - p^2n\)

Variance:

\(Var(X) = pn(1 - p)\)
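Using D() as above, the variance identity can be verified with the same arbitrary \(n\) and \(p\):

n <- 10; p <- 0.3; q <- 1 - p                  # arbitrary illustrative values
M1 <- D(expression((p * exp(t) + q)^n), "t")   # M'(t)
M2 <- D(M1, "t")                               # M''(t)
m1 <- eval(M1, list(t = 0))                    # E[X]   = p*n = 3
m2 <- eval(M2, list(t = 0))                    # E[X^2] = p^2*n*(n-1) + p*n = 11.1
m2 - m1^2                                      # Var(X) = p*n*(1-p) = 2.1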

Question 3

Expected Value and Variance of exponential distribution using MGF:

The workflow for this process is similar to that of question 2, even though the exponential distribution is continuous while the binomial distribution is discrete.

MGF: \(M(t) = \frac 1{1 - \theta t}\) for \(t < 1/\theta\), where \(\theta\) is the mean of the exponential distribution and \(t\) is the variable of differentiation.

\(M'(t) = \frac d{dt} \frac 1{1 - \theta t}\)

\(M'(t) = \frac \theta{({1 - \theta t})^2}\), using the chain rule with \(u = 1 - \theta t\) and \(\frac d{du} \frac 1u = - \frac {1}{{u}^2}\)

\(M'(0) = \frac {\theta}{{(1 - 0)}^2}\)

\(M'(0) = \theta\)

Expectation:

\(E[X] = \theta\)
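The same symbolic check applies here; \(\theta = 2\) is an arbitrary illustrative value.

theta <- 2                                     # arbitrary illustrative mean
M1 <- D(expression(1 / (1 - theta * t)), "t")  # M'(t), computed symbolically
eval(M1, list(t = 0))                          # M'(0); should equal theta = 2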

Recalling \(M'(t) = \frac \theta{({1 - \theta t})^2}\), the second moment can be found by differentiating a second time.

\(M''(t) = \frac d{dt} \frac \theta{({1 - \theta t})^2}\)

\(M''(t) = \frac {2 \theta^2}{({1 - \theta t})^3}\), by following the same pattern.

\(M''(0) = \frac {2 \theta^2}{1} = 2\theta^2\)

Bringing it all together, \(Var[X] = E[X^2] - {E[X]}^2\)

\(Var[X] = 2\theta^2 - {\theta}^2\)

Variance:

\(Var[X] = \theta^2\)
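As a final check, the mean and variance can also be estimated by sampling; note that R's rexp() is parameterized by rate, which is \(1/\theta\) under this mean parameterization.

set.seed(1)
x <- rexp(1e6, rate = 1/2)   # theta = 2, so rate = 1/theta
mean(x)                      # approximately theta   = 2
var(x)                       # approximately theta^2 = 4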