#11 pg.363

Since $X_n = Y_{n+1} - Y_n$, we can rearrange this as $Y_{n+1} = Y_n + X_n$.

Therefore, $Y_{365} = Y_1 + X_1 + X_2 + X_3 + \dots + X_{364}$.

Each of $Y_1, X_1, \dots, X_{364}$ is taken to have variance 1/4. Also, $Y_1 = 100$, and since $E(X_i) = 0$ for every $i$, we have $E(Y_{365}) = 100$.

Summing the variances of the 365 terms above gives $\mathrm{Var}(Y_{365}) = 365 \times 1/4 = 91.25$, so the standard deviation of $Y_{365}$ is $\sqrt{91.25} \approx 9.55$. Since the mean of $Y_{365}$ is 100, the centered variable $Y_{365} - 100$ has mean 0, which is how the `pnorm` calls below are set up.
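For example, part (b) below reduces to the standardization

\[ P(Y_{365} \ge 110) = P\!\left(Z \ge \frac{110 - 100}{\sqrt{91.25}}\right) \approx P(Z \ge 1.05) \approx 0.15, \]

which matches the value returned by the corresponding `pnorm` call.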

Let’s use these values to answer the questions below.

# a) P(Y365 >= 100)
pnorm(100 - 100, mean = 0, sd = sqrt(91.25), lower.tail = FALSE)
## [1] 0.5
# b) P(Y365 >= 110)
pnorm(110 - 100, mean = 0, sd = sqrt(91.25), lower.tail = FALSE)
## [1] 0.1475849
# c) P(Y365 >= 120)
pnorm(120 - 100, mean = 0, sd = sqrt(91.25), lower.tail = FALSE)
## [1] 0.01814355
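As a rough sanity check on the normal approximation, here is a small simulation sketch. The problem only fixes the mean (0) and variance (1/4) of each daily change, so drawing the $X_i$ as normal with standard deviation 1/2 is an assumption made purely for illustration.

```r
# Simulation sketch (illustrative only): the normal increments below are an
# assumption; the problem only specifies their mean (0) and variance (1/4).
set.seed(1)
n_sims <- 10000
# Each column is one simulated year: Y1 = 100 plus 364 daily changes.
changes <- matrix(rnorm(364 * n_sims, mean = 0, sd = 1/2), nrow = 364)
y365 <- 100 + colSums(changes)
mean(y365 >= 100)  # should be near the answer in (a), about 0.5
mean(y365 >= 110)  # should be near the answer in (b), about 0.148
mean(y365 >= 120)  # should be near the answer in (c), about 0.018
```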

2)

We know that \[ M(t)=E(e^{tX})=\sum_{x=0}^{n}e^{tx}{{n}\choose{x}}p^{x}(1-p)^{n-x}=(1-p+pe^{t})^{n} \]
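A quick numeric sanity check of this closed form (the values of n, p, and t below are arbitrary choices for illustration):

```r
# Compare the summed definition of the binomial MGF with the closed form
# (1 - p + p*e^t)^n; n, p, and t are arbitrary illustrative values.
n <- 10; p <- 0.3; t <- 0.2
sum(exp(t * (0:n)) * dbinom(0:n, size = n, prob = p))
(1 - p + p * exp(t))^n  # should match the sum above
```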

First derivative:

\[ M'(t) = n(1 - p + pe^{t})^{n-1}pe^{t} \]

At t = 0

\[ E(X) = M'(0) = n(1 - p + pe^{0})^{n-1}pe^{0} = np \]

Second derivative:

\[ M''(t) = n(n - 1)(1-p+pe^{t})^{n-2}p^2e^t + n(1-p+pe^{t})^{n-1}pe^t \]

At t = 0

\[ M''(0) = n(n - 1)p^2 +np \]

Since the variance is $E(X^2) - [E(X)]^2$, \[ \mathrm{Var}(X) = M''(0) - [M'(0)]^2 = n(n - 1)p^{2} + np - (np)^{2} = np(1 - p) \]
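As a small sanity check of these formulas, the mean and variance can also be computed directly from the binomial pmf; the values of n and p below are arbitrary illustrations.

```r
# Check E(X) = n*p and Var(X) = n*p*(1 - p) against the binomial pmf;
# n and p are arbitrary illustrative values.
n <- 10; p <- 0.3
x <- 0:n
probs <- dbinom(x, size = n, prob = p)
ex <- sum(x * probs)        # E(X); should equal n * p = 3
ex2 <- sum(x^2 * probs)     # E(X^2)
ex2 - ex^2                  # Var(X); should equal n * p * (1 - p) = 2.1
c(n * p, n * p * (1 - p))   # closed-form values for comparison
```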

3)

We know that \[ M(t)=E(e^{tX})=\int_{0}^{\infty}e^{tx}\lambda e^{-\lambda x}\,dx \]

Solving the integral (which converges for $t < \lambda$), we have \[ M(t) =\frac{\lambda}{\lambda - t} \]
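This closed form can be checked numerically with R's `integrate`; the values of lambda and t below are arbitrary, chosen so that t < lambda.

```r
# Check the exponential MGF against lambda / (lambda - t) by numerical
# integration; lambda and t are arbitrary illustrative values with t < lambda.
lambda <- 2; t <- 0.5
integrate(function(x) exp(t * x) * dexp(x, rate = lambda), lower = 0, upper = Inf)$value
lambda / (lambda - t)  # closed form; should match the integral above
```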

expected value:

\[ M'(t) = \frac{\lambda}{(\lambda - t)^2}\] \[M'(0) = \frac{\lambda}{\lambda^{2}} = \frac1 {\lambda}\] \[ \mu = E(X) = \frac1 {\lambda}\]

variance:

We know that \[ \mathrm{Var}(X) = E(X^2)-[E(X)]^2 = M''(0) - [M'(0)]^2 \] Since \[ M''(t) = \frac{2\lambda}{(\lambda - t)^{3}}, \qquad M''(0) = \frac{2}{\lambda^{2}}, \] we have \[ \mathrm{Var}(X) = \frac{2}{\lambda^{2}} - \frac{1}{\lambda^{2}} = \frac1 {\lambda^{2}} \]
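As a final check, a quick simulation comparison of the sample mean and variance with $1/\lambda$ and $1/\lambda^{2}$ (lambda = 2 is an arbitrary illustrative choice):

```r
# Simulate an exponential sample and compare its mean and variance with
# 1/lambda and 1/lambda^2; lambda = 2 is an arbitrary illustrative choice.
set.seed(1)
lambda <- 2
x <- rexp(1e6, rate = lambda)
mean(x)  # should be close to 1/lambda = 0.5
var(x)   # should be close to 1/lambda^2 = 0.25
```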