Define the partial sum \[ S_n = X_1 + X_2 + \cdots + X_n \] Then \[ Y_{365} = Y_1 + X_1 + X_2 + \cdots + X_{364} = Y_1 + S_{364} \] We are given that \[ Y_1 = 100 \]
Substituting this value:
\[ Y_{365} = 100 + S_{364} \]
So the probability we want is
\[ P(Y_{365} \ge 100) = P(100 + S_{364} \ge 100) = P(S_{364} \ge 0) \]
From this, \(n = 364\). We are given that each step has mean 0, so \(E[S_{364}] = 0\). The variance of the sum is \(\sigma^2 = n \times \frac{1}{4} = \frac{364}{4} = 91\), and the standard deviation is \(\sigma = \sqrt{91}\).
By the central limit theorem, \(S_{364}\) is approximately normal with mean 0 and standard deviation \(\sqrt{91}\), so we can use pnorm. The first argument is 0 because we are calculating the probability that \(Y_{365}\) deviates from \(Y_1\) by at least 0.
1 - pnorm(0, 0, sqrt(91))
## [1] 0.5
Same as part (a), except now the deviation is 10, so we change the first argument of pnorm accordingly.
1 - pnorm(10, 0, sqrt(91))
## [1] 0.1472537
Likewise for a deviation of 20:
1 - pnorm(20, 0, sqrt(91))
## [1] 0.01801584
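As a sanity check, we can simulate the walk directly. This is a minimal sketch that assumes each daily step \(X_i\) is \(+\frac{1}{2}\) or \(-\frac{1}{2}\) with equal probability (an assumption chosen to match the stated mean 0 and variance \(\frac{1}{4}\)); the empirical frequencies should land near the pnorm answers above, up to simulation noise and the discreteness of the walk.

set.seed(1)
n_sims <- 10000
# Each column is one 364-step path; steps of +1/2 or -1/2 are an assumption
steps <- matrix(sample(c(-0.5, 0.5), 364 * n_sims, replace = TRUE), nrow = 364)
S_364 <- colSums(steps)
mean(S_364 >= 0)   # near 0.5 (slightly above, since P(S_364 = 0) > 0)
mean(S_364 >= 10)  # should be near 0.147
mean(S_364 >= 20)  # should be near 0.018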
The moment generating function (MGF) of a random variable \(X\) is
\[ M_X(t) = E[e^{tX}] \]
For a binomial distribution, the probability mass function (PMF) is
\[ P(X = k) = \binom{n}{k} p^k (1-p)^{n-k} \]
You can use it to derive the MGF specific to the binomial distribution.
\[ M_X(t) = E[e^{tX}] \]
For a discrete random variable, this expectation is a sum over the PMF:
\[ M_X(t) = \sum_{k=0}^{n} e^{tk} P(X = k) \]
Substituting the binomial PMF:
\[ M_X(t) = \sum_{k=0}^{n} e^{tk} \binom{n}{k} p^k (1-p)^{n-k} \]
Grouping \(e^{tk} p^k = (pe^t)^k\) and applying the binomial theorem:
\[ M_X(t) = \sum_{k=0}^{n} \binom{n}{k} (pe^t)^k (1-p)^{n-k} = (1 - p + pe^t)^n \]
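We can check this closed form numerically in R by summing \(e^{tk} P(X = k)\) directly with dbinom; the values of n, p, and t below are hypothetical examples.

n <- 10; p <- 0.3; t <- 0.5  # hypothetical example values
sum(exp(t * (0:n)) * dbinom(0:n, size = n, prob = p))  # direct sum of e^(tk) P(X = k)
(1 - p + p * exp(t))^n  # closed form; the two should agree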
You can use this to derive the expected value for the binomial distribution.
Since \(M_X'(0) = E[X]\), we differentiate the MGF with respect to \(t\) and evaluate at \(t = 0\).
Differentiating with the chain rule gives: \[ M_X'(t) = n(1 - p + pe^t)^{n-1} \cdot p e^t \]
Evaluating at \(t = 0\), where \(1 - p + pe^0 = 1\) and \(e^0 = 1\): \[ \text{Expected Value} = n \cdot p \]
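As a quick numerical check, a central-difference estimate of \(M_X'(0)\) should land on \(np\); the parameter values are again hypothetical.

n <- 10; p <- 0.3; h <- 1e-5  # hypothetical example values
M <- function(t) (1 - p + p * exp(t))^n
(M(h) - M(-h)) / (2 * h)  # numerical M'(0); should be close to n * p = 3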
For the variance, we use \(\operatorname{Var}(X) = E[X^2] - E[X]^2 = M_X''(0) - M_X'(0)^2\): differentiate the MGF twice, evaluate at \(t = 0\), and subtract the square of the first derivative evaluated at \(t = 0\).
Differentiating \(M_X'(t)\) with the product rule: \[ M_X''(t) = n(n-1)(1 - p + pe^t)^{n-2} \cdot p^2 e^{2t} + n(1 - p + pe^t)^{n-1} \cdot p e^t \]
Evaluating at \(t = 0\) gives \(M_X''(0) = n(n-1)p^2 + np\), so: \[ \text{Variance} = n(n-1)p^2 + np - (np)^2 \]
\[ \text{Variance} = n^2p^2 - np^2 + np - n^2p^2 \]
\[ \text{Variance} = np - np^2 = np(1-p) \]
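The same numerical check works for the variance: estimate \(M_X''(0)\) with a second difference and subtract the square of the first derivative (hypothetical parameter values again).

n <- 10; p <- 0.3; h <- 1e-4  # hypothetical example values
M <- function(t) (1 - p + p * exp(t))^n
m1 <- (M(h) - M(-h)) / (2 * h)         # numerical M'(0)
m2 <- (M(h) - 2 * M(0) + M(-h)) / h^2  # numerical M''(0)
m2 - m1^2  # should be close to n * p * (1 - p) = 2.1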
This is the PDF of the exponential distribution: \[ f(x) = \lambda e^{-\lambda x}, \quad x \ge 0 \]
Just like for the binomial distribution, we can derive the MGF for the exponential distribution.
The MGF of a continuous distribution uses an integral over the PDF instead of a sum, but it's the same idea:
\[ M_X(t) = E[e^{tX}] = \int_{0}^{\infty} e^{tx} f(x) \, dx \]
Substituting the exponential PDF:
\[ M_X(t) = \int_{0}^{\infty} e^{tx} \lambda e^{-\lambda x} dx \]
\[ M_X(t) = \int_{0}^{\infty} \lambda e^{-(\lambda - t)x} dx \] \[ = \left[\frac{\lambda}{-(\lambda - t)} e^{-(\lambda - t)x}\right]_{0}^{\infty} \] \[ = 0 - \frac{\lambda}{-(\lambda - t)} \] \[ = \frac{\lambda}{\lambda - t} \] The integral converges for \(t < \lambda\), since then \(e^{-(\lambda - t)x} \to 0\) as \(x \to \infty\).
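We can verify this closed form with R's integrate, using a hypothetical \(\lambda\) and a \(t < \lambda\):

lambda <- 2; t <- 0.5  # hypothetical example values, with t < lambda
integrate(function(x) exp(t * x) * lambda * exp(-lambda * x), 0, Inf)$value
lambda / (lambda - t)  # closed form; both should be about 1.3333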
To find the expected value (mean), we differentiate the MGF with respect to \(t\) once and evaluate it at \(t = 0\):
\[ \text{Expected Value} = \frac{dM_X(t)}{dt} \bigg|_{t=0} \]
Differentiating \(M_X(t)\) with respect to \(t\): \[ \frac{dM_X(t)}{dt} = \frac{d}{dt} \left(\frac{\lambda}{\lambda - t}\right) \] \[ = \frac{\lambda}{(\lambda - t)^2} \]
Evaluating at \(t = 0\): \[ \text{Expected Value} = \frac{\lambda}{\lambda^2} = \frac{1}{\lambda} \]
Differentiating a second time with respect to \(t\): \[ \frac{d^2M_X(t)}{dt^2} = \frac{d}{dt} \left(\frac{\lambda}{(\lambda - t)^2}\right) \] \[ = \frac{d}{dt} \left(\lambda (\lambda - t)^{-2}\right) \] \[ = -2 \lambda (\lambda - t)^{-3} \cdot (-1) \] \[ = \frac{2\lambda}{(\lambda - t)^3} \]
Evaluating at \(t = 0\): \[ \frac{2\lambda}{\lambda^3} = \frac{2}{\lambda^2} \]
Subtracting the square of the mean:
\[ \text{Variance} = \frac{2}{\lambda^2} - \frac{1}{\lambda^2} = \frac{1}{\lambda^2}\]
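As a final check, we can simulate draws with rexp for a hypothetical \(\lambda\) and compare the sample mean and variance against \(\frac{1}{\lambda}\) and \(\frac{1}{\lambda^2}\):

set.seed(1)
lambda <- 2  # hypothetical example value
x <- rexp(1e6, rate = lambda)
mean(x)  # should be close to 1 / lambda = 0.5
var(x)   # should be close to 1 / lambda^2 = 0.25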