The price of one share of stock in the Pilsdorff Beer Company is given by \(Y_n\) on the \(n\)th day of the year. Finn observes that the differences \(X_n = Y_{n+1} - Y_n\) appear to be independent random variables with a common distribution having mean \(\mu = 0\) and variance \(\sigma^2 = 1/4\). If \(Y_1 = 100\), estimate the probability that \(Y_{365}\) is \(\ge 100\), \(\ge 110\), and \(\ge 120\).
Answer
Since the \(X_n\) are independent and identically distributed, their sum \(S_n\) is approximately normally distributed for large \(n\) by the Central Limit Theorem.
\[ \begin{split} S_n &= X_1+X_2+X_3+\cdots+X_n \\ &=(Y_2-Y_1)+(Y_3-Y_2)+(Y_4-Y_3)+\cdots+(Y_{n+1}-Y_n)\\ &=Y_2-Y_1+Y_3-Y_2+Y_4-Y_3+\cdots+Y_{n+1}-Y_n\\ &=Y_{n+1}-Y_1\\ &=Y_{n+1}-100 \end{split} \]
We first use the given information to find the expected values and variances.
\[ X_n=Y_{n+1}-Y_n\\ Y_2=X_1+Y_1\\ E(Y_2)=E(X_1)+E(Y_1)\\ =0+100\\ =100 \]
\[ V(Y_2)=V(X_1)+V(Y_1)\\ =1/4+0\\ =1/4 \] Applying the same formula repeatedly gives the expected value and variance of \(Y_{365}\).
\[ X_n=Y_{n+1}-Y_n\\ Y_{n+1}=X_n+Y_n\\ Y_{365}=X_{364}+\cdots+X_1+Y_1\\ E(Y_{365})=E(X_{364})+\cdots+E(X_1)+E(Y_1)\\ =0+\cdots+0+100\\ =100 \] \[ V(Y_{365})=V(X_{364})+\cdots+V(X_1)+V(Y_1)\\ =364(1/4)\\ =91\\ \sigma=\sqrt{91} \]
Mean of \(S_n\): \(\mu_{S_n} = \mu_{X_1}+\mu_{X_2}+\mu_{X_3}+...+\mu_{X_n} = n\mu_X = 0\)
Variance of \(S_n\): \(\sigma^2_{S_n} = \sigma^2_{X_1}+\sigma^2_{X_2}+\sigma^2_{X_3}+...+\sigma^2_{X_n} = n\sigma^2_X = n \times 1/4 = n/4\)
Standard deviation of \(S_n\): \(\sigma_{S_n} = \sqrt{n}/2\)
Consider \(n=364\); then \(S_{364} = Y_{365} - 100\) and \(Y_{365} = S_{364}+100\).
\(\sigma^2_{S_{364}} = 364/4 = 91\) and \(\sigma_{S_{364}} = \sqrt{91} \approx 9.54\)
The distribution of \(Y_{365}\) is therefore approximately \(N(100, 91)\).
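For any threshold \(a\), this normal approximation gives
\[ P(Y_{365}\ge a)\approx P\left(Z \ge \frac{a-100}{\sqrt{91}}\right), \qquad Z\sim N(0,1), \]
which is the calculation applied to each threshold below.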
\(\ge 100\):
\[ P(Y_{365}\ge100)=P(Y_{365}-100\ge0)=P\left(\frac{Y_{365}-100}{\sqrt{91}}\ge0\right)=0.5 \]
by the central limit theorem.
Since \(S_{364}\) is approximately normally distributed with mean \(0\), and the normal distribution is symmetric about its mean, exactly half of the probability lies above \(0\).
z <- (100-100)/sqrt(91)  # z-score of 100 under the N(100, 91) approximation
pnorm(z, lower.tail=FALSE)
## [1] 0.5
ANSWER: \(P(Y_{365}\ge 100) = P(S_{364}\ge 0) = 0.5\)
\(\ge 110\):
\[ P(Y_{365}\ge110)=P(Y_{365}-100\ge10)=P\left(\frac{Y_{365}-100}{\sqrt{91}}\ge \frac{10}{\sqrt{91}}\right) \]
\(\approx 0.147\) by the central limit theorem.
z <- (110-100)/sqrt(91)  # z is about 1.048
pnorm(z, lower.tail=FALSE)
## [1] 0.1472537
ANSWER: \(P(Y_{365} \ge 110) \approx 0.14725\)
\(\ge 120\):
\[ P(Y_{365}\ge120)=P(Y_{365}-100\ge20)=P\left(\frac{Y_{365}-100}{\sqrt{91}}\ge \frac{20}{\sqrt{91}}\right) \]
\(\approx 0.018\) by the central limit theorem.
z <- (120-100)/sqrt(91)  # z is about 2.097
pnorm(z, lower.tail=FALSE)
## [1] 0.01801584
ANSWER: \(P(Y_{365} \ge 120) \approx 0.01802\)
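As an optional sanity check of these normal approximations, we can simulate the price path directly. The distribution of the daily differences is not specified beyond its mean and variance, so the sketch below assumes, purely for illustration, normally distributed increments with standard deviation \(1/2\); the object names (n_sims, y365) are our own.
# Monte Carlo sketch: 364 daily changes with mean 0 and sd 1/2
# (normal increments are an assumption; only the mean and variance are given)
set.seed(1)
n_sims <- 10000
y365 <- 100 + replicate(n_sims, sum(rnorm(364, mean = 0, sd = 1/2)))
mean(y365 >= 100)  # should be close to 0.5
mean(y365 >= 110)  # should be close to 0.147
mean(y365 >= 120)  # should be close to 0.018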
Answer
For a binomial distribution, \(P(X=k) = {n \choose k} p^k q^{n-k}\), where \(q=1-p\).
We find the moment generating function, then compute the first and second derivatives of the MGF.
The moment generating function is \(M_X(t)=(q+pe^t)^n\).
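For completeness, this MGF follows from the definition and the binomial theorem:
\[ M_X(t)=E(e^{tX})=\sum_{k=0}^{n}e^{tk}{n \choose k}p^k q^{n-k}=\sum_{k=0}^{n}{n \choose k}(pe^t)^k q^{n-k}=(q+pe^t)^n \]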
The first moment is \(M'_X(t) = n(q+pe^t)^{n-1}pe^t\). The expected value is the first moment evaluated at \(t=0\): \[ \begin{split} E(X)=M'_X(0) &= n(q+pe^0)^{n-1}pe^0\\ &= n(q+p)^{n-1}p\\ &= np(1-p+p)^{n-1}\\ &= np\cdot 1^{n-1}\\ &=np \end{split} \] The second moment is \(M''_X(t) = n(n-1)(q+pe^t)^{n-2}p^2 e^{2t}+n(q+pe^t)^{n-1}pe^t\).
Evaluate the second moment at \(t=0\): \[ \begin{split} E(X^2)=M''_X(0) &= n(n-1)(q+pe^0)^{n-2}p^2 e^0+n(q+pe^0)^{n-1}pe^0\\ &= n(n-1)(1-p+p)^{n-2}p^2+n(1-p+p)^{n-1}p\\ &= n(n-1)p^2+np \end{split} \]
The variance is \(V(X)=E(X^2)-E(X)^2\): \[ \begin{split} V(X) &= n(n-1)p^2+np-n^2p^2 \\ &= np((n-1)p+1-np) \\ &= np(np-p+1-np) \\ &= np(1-p) \\ &= npq \end{split} \] This matches the known results for the binomial distribution: \(E(X)=np\) and \(V(X)=npq\).
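As a quick numerical illustration (the values of \(n\) and \(p\) below are chosen arbitrarily), the moments computed directly from the binomial pmf agree with \(np\) and \(npq\):
# Compare E(X) = np and V(X) = npq with moments computed from the pmf
n <- 10; p <- 0.3; q <- 1 - p
k <- 0:n
m1 <- sum(k * dbinom(k, n, p))     # E(X) from the pmf
m2 <- sum(k^2 * dbinom(k, n, p))   # E(X^2) from the pmf
c(pmf_mean = m1, np = n * p)             # expected value two ways
c(pmf_var = m2 - m1^2, npq = n * p * q)  # variance two ways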
Answer
For the exponential distribution, \(f(x)=\lambda e^{-\lambda x}\) for \(x \ge 0\). The moment generating function is \(M_X(t)=\frac{\lambda}{\lambda-t},\ t<\lambda\).
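This MGF comes from a direct integration, valid when \(t<\lambda\):
\[ M_X(t)=E(e^{tX})=\int_0^{\infty}e^{tx}\,\lambda e^{-\lambda x}\,dx=\lambda\int_0^{\infty}e^{-(\lambda-t)x}\,dx=\frac{\lambda}{\lambda-t} \]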
Using WolframAlpha, we get \(M'_X(t) = \frac{\lambda}{(\lambda-t)^2}\) and \(M''_X(t) = \frac{2\lambda}{(\lambda-t)^3}\).
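These derivatives can also be checked by hand with the chain rule:
\[ M'_X(t)=\frac{d}{dt}\,\lambda(\lambda-t)^{-1}=\lambda(\lambda-t)^{-2}=\frac{\lambda}{(\lambda-t)^2}, \qquad M''_X(t)=\frac{d}{dt}\,\lambda(\lambda-t)^{-2}=2\lambda(\lambda-t)^{-3}=\frac{2\lambda}{(\lambda-t)^3} \]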
Expected value: \[ \begin{split} E(X)=M'_X(0) &= \frac{\lambda}{(\lambda-0)^2} \\ &= \frac{\lambda}{\lambda^2}\\ &= \frac{1}{\lambda} \end{split} \]
Variance: \[ \begin{split} V(X) = E(X^2)-E(X)^2 &= M''_X(0)-M'_X(0)^2 \\ &=\frac{2\lambda}{(\lambda-0)^3} - \frac{1}{\lambda^2}\\ &=\frac{2\lambda}{\lambda^3} - \frac{1}{\lambda^2}\\ &=\frac{2}{\lambda^2} - \frac{1}{\lambda^2}\\ &=\frac{1}{\lambda^2} \end{split} \]
This matches the known results for the exponential distribution: \(E(X)=1/\lambda\) and \(V(X)=1/\lambda^2\).
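As with the binomial case, a quick numerical check (with an arbitrarily chosen \(\lambda\)) compares these formulas against moments obtained by numerical integration:
# Compare E(X) = 1/lambda and V(X) = 1/lambda^2 with numerically integrated moments
lambda <- 2
m1 <- integrate(function(x) x   * dexp(x, rate = lambda), 0, Inf)$value  # E(X)
m2 <- integrate(function(x) x^2 * dexp(x, rate = lambda), 0, Inf)$value  # E(X^2)
c(mgf_mean = 1 / lambda, numeric_mean = m1)
c(mgf_var = 1 / lambda^2, numeric_var = m2 - m1^2)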