Problem 1

Let \(Y_{n}\) denote the value on day \(n\), with \(Y_{1}=100\), and let the daily changes \(X_{i}=Y_{i+1}-Y_{i}\) be i.i.d. random variables with mean \(\mu=0\) and standard deviation \(\sigma=1/2\). Estimate the probability that \(Y_{365}\) is:

a. \(\geq 100\).
b. \(\geq 110\).
c. \(\geq 120\).
To apply the Central Limit Theorem, we cannot work with \(Y_{n}\) directly; we need to express the probability for \(Y_{n}\) in terms of a sum \(S_{n}\) of i.i.d. random variables. Using the identity \(X_{i}=Y_{i+1}-Y_{i}\), we rearrange \(Y_{n}\) as a telescoping series:

\[
\begin{aligned}
Y_{n} &= Y_{1}+\left(Y_{2}-Y_{1}\right)+\left(Y_{3}-Y_{2}\right)+\cdots+\left(Y_{n}-Y_{n-1}\right) \\
      &= Y_{1}+X_{1}+X_{2}+\cdots+X_{n-1}, \\
Y_{n}-Y_{1} &= X_{1}+X_{2}+\cdots+X_{n-1} = S_{n-1}.
\end{aligned}
\]

By the CLT, for \(a<b\),

\[
\lim_{n \rightarrow \infty} P\left[a<\frac{S_{n}-n\mu}{\sqrt{n}\,\sigma}<b\right]=\Phi(b)-\Phi(a).
\]

We solve (a)-(c) with a single lemma:

Lemma. For any real \(C\),

\[
P\left[Y_{365}>100+C\right] \sim \Phi\left(-\frac{2C}{\sqrt{364}}\right).
\]
Proof of Lemma

We know that \(Y_{1}=100\), so

\[
\begin{aligned}
P\left[Y_{n}>100+C\right] &= P\left[Y_{n}>Y_{1}+C\right] \\
&= P\left[Y_{n}-Y_{1}>C\right] \\
&= P\left[S_{n-1}>C\right].
\end{aligned}
\]

Standardizing \(S_{n-1}\) and applying the CLT,

\[
P\left[S_{n-1}>C\right]=P\left[\frac{S_{n-1}-(n-1)\mu}{\sqrt{n-1}\,\sigma}>\frac{C-(n-1)\mu}{\sqrt{n-1}\,\sigma}\right] \sim 1-\Phi\left(\frac{C-(n-1)\mu}{\sqrt{n-1}\,\sigma}\right).
\]

Since \(\mu=0\), \(n=365\), and \(\sigma=1/2\), and using the symmetry \(1-\Phi(x)=\Phi(-x)\), we obtain

\[
P\left[Y_{365}>100+C\right] \sim \Phi\left(-\frac{C}{\frac{1}{2}\sqrt{364}}\right)=\Phi\left(-\frac{2C}{\sqrt{364}}\right),
\]

which proves the lemma. To solve (a), set \(C=0\); for (b), set \(C=10\); and for (c), set \(C=20\). We calculate the estimates in R:

```r
(prob_100 = pnorm(-2 * 0 / sqrt(364)))
## [1] 0.5
(prob_110 = pnorm(-2 * 10 / sqrt(364)))
## [1] 0.1472537
(prob_120 = pnorm(-2 * 20 / sqrt(364)))
## [1] 0.01801584
```

a. We estimate \(P\left[Y_{365}>100\right] \approx 50.000\%\).
b. We estimate \(P\left[Y_{365}>110\right] \approx 14.725\%\).
c. We estimate \(P\left[Y_{365}>120\right] \approx 1.802\%\).
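The lemma is an asymptotic statement, so a quick simulation is a useful sanity check. The sketch below is illustrative only: the problem fixes just \(\mu=0\) and \(\sigma=1/2\), so the uniform move distribution here is an assumption chosen to match those moments, not part of the original problem.

```r
# Monte Carlo check of the normal approximation. The daily moves X_i are
# drawn uniformly on [-sqrt(3)/2, sqrt(3)/2], an arbitrary choice with
# mean 0 and standard deviation 1/2 (the CLT limit depends only on these).
set.seed(1)
n_sims <- 20000
a <- sqrt(3) / 2                       # half-width giving sd(X_i) = 1/2
y365 <- replicate(n_sims, 100 + sum(runif(364, -a, a)))
mean(y365 > 100)   # lemma predicts about 0.500
mean(y365 > 110)   # lemma predicts about 0.147
mean(y365 > 120)   # lemma predicts about 0.018
```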
Problem 2
Calculate the expected value and variance of the binomial distribution using the moment generating function.
The moment generating function of the binomial distribution with parameters \(n\) and \(p\) is

\[
m(t)=\left(1-p+pe^{t}\right)^{n}.
\]

Using the identity

\[
E\left[X^{k}\right]=m^{(k)}(0),
\]

we compute the first and second derivatives with respect to \(t\):

\[
\begin{aligned}
m^{\prime}(t) &= n\left(1-p+pe^{t}\right)^{n-1}pe^{t}, \\
m^{\prime\prime}(t) &= n(n-1)\left(1-p+pe^{t}\right)^{n-2}\left(pe^{t}\right)^{2}+n\left(1-p+pe^{t}\right)^{n-1}pe^{t}.
\end{aligned}
\]

Noting that at \(t=0\) we have \(1-p+pe^{0}=1\) and \(pe^{0}=p\), the derivatives evaluate to

\[
\begin{aligned}
m^{\prime}(0) &= E[X] = np, \\
m^{\prime\prime}(0) &= E\left[X^{2}\right] = n(n-1)p^{2}+np.
\end{aligned}
\]

The mean of the binomial distribution is therefore

\[
E[X]=np.
\]

The variance follows from the identity \(V(X)=E\left[X^{2}\right]-E[X]^{2}\):

\[
\begin{aligned}
V(X) &= n(n-1)p^{2}+np-n^{2}p^{2} \\
&= n^{2}p^{2}-np^{2}+np-n^{2}p^{2} \\
&= np-np^{2} \\
&= np(1-p).
\end{aligned}
\]

Thus, the variance of the binomial distribution is \(V(X)=np(1-p)\).

Problem 3

Statement

Calculate the expected value and variance of the exponential distribution using the moment generating function.

Solution

The moment generating function of the exponential distribution with rate \(\lambda\) is

\[
m(t)=\frac{\lambda}{\lambda-t} \quad \text{for } t<\lambda.
\]

Calculating the first derivative of \(m(t)\) with respect to \(t\), we obtain

\[
m^{\prime}(t) = \lambda(-1)(\lambda-t)^{-2}(-1) = \lambda(\lambda-t)^{-2}.
\]

Setting \(t=0\), we get

\[
m^{\prime}(0)=E[X]=\lambda(\lambda-0)^{-2}=\frac{1}{\lambda}.
\]

Thus the mean of the exponential distribution is \(E[X]=\frac{1}{\lambda}\). Taking the second derivative of \(m(t)\) with respect to \(t\), we get

\[
m^{\prime\prime}(t) = \lambda(-2)(\lambda-t)^{-3}(-1) = \frac{2\lambda}{(\lambda-t)^{3}}.
\]

Plugging in \(t=0\) gives

\[
m^{\prime\prime}(0)=E\left[X^{2}\right]=\frac{2\lambda}{\lambda^{3}}=\frac{2}{\lambda^{2}}.
\]

Lastly, using the identity for the variance \(V(X)\):

\[
\begin{aligned}
V(X) &= E\left[X^{2}\right]-E[X]^{2} \\
&= \frac{2}{\lambda^{2}}-\left(\frac{1}{\lambda}\right)^{2} \\
&= \frac{1}{\lambda^{2}}.
\end{aligned}
\]

Thus, the variance of the exponential distribution is

\[
V(X)=\frac{1}{\lambda^{2}}.
\]
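As a sanity check on the binomial derivation, we can approximate \(m^{\prime}(0)\) and \(m^{\prime\prime}(0)\) with central finite differences and compare them to the closed forms \(np\) and \(n(n-1)p^{2}+np\). This is a minimal sketch; the values \(n=20\), \(p=0.3\), and the step size \(h\) are arbitrary choices for illustration.

```r
# Finite-difference check of the binomial MGF moments.
n <- 20; p <- 0.3
m <- function(t) (1 - p + p * exp(t))^n
h <- 1e-4
m1 <- (m(h) - m(-h)) / (2 * h)             # approximates E[X]
m2 <- (m(h) - 2 * m(0) + m(-h)) / h^2      # approximates E[X^2]
c(numeric_mean = m1, n_p = n * p)                    # both about 6
c(numeric_var = m2 - m1^2, n_p_q = n * p * (1 - p))  # both about 4.2
```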
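The same finite-difference check works for the exponential MGF; again, the rate \(\lambda=2\) and the step size \(h\) are illustrative assumptions.

```r
# Finite-difference check of the exponential MGF moments.
lambda <- 2
m_exp <- function(t) lambda / (lambda - t)   # valid for t < lambda
h <- 1e-4
m1 <- (m_exp(h) - m_exp(-h)) / (2 * h)             # approximates E[X] = 1/lambda
m2 <- (m_exp(h) - 2 * m_exp(0) + m_exp(-h)) / h^2  # approximates E[X^2] = 2/lambda^2
c(numeric_mean = m1, theory = 1 / lambda)          # both about 0.50
c(numeric_var = m2 - m1^2, theory = 1 / lambda^2)  # both about 0.25
```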