Assignment 9

Week 9, Central Limit Theorem & Generating Functions

Fundamentals of Computational Mathematics

CUNY MSDS DATA 605, Fall 2018

Rose Koh

10/27/2018

1. #11 page 363

The differences \(X_n\) are independent, identically distributed random variables with mean \(0\) and variance \(1/4\), so by the Central Limit Theorem their sum is approximately normally distributed.

Given that \(X_n = Y_{n+1} - Y_n\), we can also write \(Y_{n+1} = X_n + Y_n\).

We are also given \(\mu = 0\), \(\sigma^2 = 1/4\), and \(Y_1 = 100\).

For example, for day 1 and day 2, \(Y_2 = X_1 + Y_1\).

Thus, \(Y_{365} = Y_1 + X_1 + X_2 + X_3 + ... + X_{364}\)

Since \(Y_1 = 100\) and each of \(X_1, X_2, X_3, \dots, X_{364}\) has \(\mu = 0\), the expected value of \(Y_{365}\) is \(100 + 0 + 0 + \dots + 0 = 100\).

Because \(Y_1 = 100\) is a constant, all of the variance of \(Y_{365}\) comes from the 364 independent increments \(X_1, X_2, \dots, X_{364}\), each with variance \(\frac{1}{4}\). Then the variance of \(Y_{365}\) is \(364 \cdot \frac{1}{4} = 91\).

Therefore, given \(\mu = 0\) and \(\sigma^2 = 1/4\) for each increment, the standard deviation of \(Y_{365}\) is \(\sqrt{91} \approx 9.539\).

mean = 0          # mean of Y365 - Y1
sd = sqrt(91)     # standard deviation of Y365
a. ≥ 100.

Since \(Y_{365}\) is approximately normally distributed with mean \(100\), and the normal distribution is symmetric about its mean, half of the probability lies at or above \(100\).

pnorm(100 - 100, mean = mean, sd = sd, lower.tail = FALSE)
## [1] 0.5
b. ≥ 110.
pnorm(110 - 100, mean = mean, sd = sd, lower.tail = FALSE)
## [1] 0.1472537
c. ≥ 120.
pnorm(120 - 100, mean = mean, sd = sd, lower.tail = FALSE)
## [1] 0.01801584
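As a sanity check, here is a small simulation sketch. It assumes the daily changes are normal with mean 0 and standard deviation 1/2 (the problem only specifies the mean and variance), and the seed and number of simulated years are arbitrary; the empirical frequencies should land close to the three probabilities above.

# Monte Carlo sanity check (sketch): simulate 364 daily changes per year,
# assuming each change is N(0, 1/4), and estimate the same tail probabilities.
set.seed(605)
n_sim <- 10000
changes <- matrix(rnorm(364 * n_sim, mean = 0, sd = 1/2), nrow = 364)
y365 <- 100 + colSums(changes)   # one simulated Y_365 per column
mean(y365 >= 100)  # should be close to 0.5
mean(y365 >= 110)  # should be close to 0.147
mean(y365 >= 120)  # should be close to 0.018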

2. Calculate the expected value and variance of the binomial distribution using the moment generating function.

The binomial distribution has probability mass function \(P(X=k) = {n \choose k} p^k q^{n-k}\), where \(q=1-p\).

We can use this probability mass function to obtain the MGF of \(X\): \(M_X(t)=E(e^{tX})=\sum_{k=0}^{n}e^{tk}{n \choose k}p^k q^{n-k}=\sum_{k=0}^{n}{n \choose k}(pe^t)^k q^{n-k}=(q+pe^t)^n\), by the binomial theorem.
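As a quick numerical check of this closed form (a sketch; the values \(n=10\), \(p=0.3\), \(t=0.2\) are arbitrary), we can compare \(E(e^{tX})\) computed directly from the pmf with \((q+pe^t)^n\):

n <- 10; p <- 0.3; q <- 1 - p; t0 <- 0.2   # arbitrary example values; t0 plays the role of t
sum(exp(t0 * (0:n)) * dbinom(0:n, size = n, prob = p))  # E(e^{tX}) directly from the pmf
(q + p * exp(t0))^n                                     # closed-form MGF; the two should match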

Having the MGF of the binomial distribution, we differentiate it with respect to \(t\) using the chain rule to find the moments.

The first moment is: \(M'_X(t) = n(q+pe^t)^{n-1}pe^t\).

Setting \(t=0\) in the first derivative gives the expected value.

\[ \begin{split} E(X)=M'_X(0) &= n(q+pe^0)^{n-1}pe^0\\ &= n(q+p)^{n-1}p\\ &= np(1-p+p)^{n-1}\\ &= np \cdot 1^{n-1}\\ &=np \end{split} \]
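The same result can be checked numerically by differentiating the MGF with a central difference at \(t = 0\) (a sketch, again with the arbitrary values \(n = 10\), \(p = 0.3\)):

n <- 10; p <- 0.3; q <- 1 - p
mgf <- function(t) (q + p * exp(t))^n   # binomial MGF
h <- 1e-6
(mgf(h) - mgf(-h)) / (2 * h)            # numerical M'_X(0); should be close to n*p = 3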

To obtain the variance, we differentiate again and set \(t=0\).

The second moment is \(M''_X(t) = n(n-1)(q+pe^t)^{n-2}p^2 e^{2t}+n(q+pe^t)^{n-1}pe^t\).

\[ \begin{split} E(X^2)=M''_X(0) &= n(n-1)(q+pe^0)^{n-2}p^2 e^0+n(q+pe^0)^{n-1}pe^0\\ &= n(n-1)(1-p+p)^{n-2}p^2+n(1-p+p)^{n-1}p\\ &= n(n-1)p^2+np \end{split} \]

The variance is \(V(X)=E(X^2)-E(X)^2\):

\[ \begin{split} V(X) &= n(n-1)p^2+np-n^2p^2 \\ &= np((n-1)p+1-np) \\ &= np(np-p+1-np) \\ &= np(1-p) \\ &= npq \end{split} \]

\(E(X)=np\), \(V(X)=npq\).
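These two formulas can also be confirmed directly from the pmf (a sketch; \(n = 10\), \(p = 0.3\) are arbitrary example values):

n <- 10; p <- 0.3; q <- 1 - p
x <- 0:n
pmf <- dbinom(x, size = n, prob = p)
ex  <- sum(x * pmf)     # E(X); should equal n*p = 3
ex2 <- sum(x^2 * pmf)   # E(X^2)
ex
ex2 - ex^2              # Var(X); should equal n*p*q = 2.1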

3. Calculate the expected value and variance of the exponential distribution using the moment generating function.

The probability density function of the exponential distribution is \(f(x)=\lambda e^{-\lambda x}\) for \(x \ge 0\).

We can obtain the MGF of \(X\) from this PDF: \(M_X(t)=E(e^{tX})=\int_0^\infty e^{tx}\lambda e^{-\lambda x}\,dx=\frac{\lambda}{\lambda-t}\), for \(t<\lambda\).
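As a quick numerical check of this closed form (a sketch; \(\lambda = 2\) and \(t = 0.5\) are arbitrary example values with \(t < \lambda\)):

lambda <- 2; t0 <- 0.5   # arbitrary example values; t0 plays the role of t
integrate(function(x) exp(t0 * x) * dexp(x, rate = lambda), lower = 0, upper = Inf)$value
lambda / (lambda - t0)   # closed-form MGF; the two should match (= 4/3)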

Differentiating, the first derivative is \(M'_X(t) = \frac{\lambda}{(\lambda-t)^2}\) and the second derivative is \(M''_X(t) = \frac{2\lambda}{(\lambda-t)^3}\).

Setting \(t=0\) in the first derivative gives the expected value.

\[ \begin{split} E(X)=M'_X(0) &= \frac{\lambda}{(\lambda-0)^2} \\ &= \frac{\lambda}{\lambda^2}\\ &= \frac{1}{\lambda} \end{split} \]

To obtain the variance, we differentiate again and set \(t=0\).

\[ \begin{split} V(X) = E(X^2)-E(X)^2 &= M''_X(0)-M'_X(0)^2 \\ &=\frac{2\lambda}{(\lambda-0)^3} - \frac{1}{\lambda^2}\\ &=\frac{2\lambda}{\lambda^3} - \frac{1}{\lambda^2}\\ &=\frac{2}{\lambda^2} - \frac{1}{\lambda^2}\\ &=\frac{1}{\lambda^2} \end{split} \]

\(E(X)=\frac{1}{\lambda}\), \(V(X)=\frac{1}{\lambda^2}\).
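These results can be verified by numerical integration against the density (a sketch; \(\lambda = 2\) is an arbitrary example value):

lambda <- 2
ex  <- integrate(function(x) x   * dexp(x, rate = lambda), lower = 0, upper = Inf)$value  # E(X); should be 1/lambda = 0.5
ex2 <- integrate(function(x) x^2 * dexp(x, rate = lambda), lower = 0, upper = Inf)$value  # E(X^2); should be 2/lambda^2 = 0.5
ex
ex2 - ex^2   # Var(X); should be 1/lambda^2 = 0.25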