The price of one share of stock in the Pilsdorff Beer Company (see Exercise 8.2.12) is given by \(Y_{n}\) on the \(n\)th day of the year. Finn observes that the differences \(X_{n}=Y_{n+1}-Y_{n}\) appear to be independent random variables with a common distribution having mean \(\mu=0\) and variance \(\sigma^2=\frac{1}{4}\). If \(Y_{1}=100\), estimate the following probabilities.
\(Y_{365}\) is \(Y_{1}\) plus the sum of the daily changes:
\(Y_{365}=Y_{1}+(Y_{2}-Y_{1})+(Y_{3}-Y_{2})+\dots+(Y_{365}-Y_{364})\)
\(Y_{365}=Y_{1}+X_{1}+X_{2}+\dots+X_{364}\)
\(Y_{365}=Y_{1}+\sum_{i=1}^{364}X_{i}\)
\(n = 364\)
Since \(\mu=0\), \(n\mu=0\).
Since \(\sigma^2=\frac{1}{4}\), \(n\sigma^2=364\cdot\frac{1}{4}=91\), so \(S_{364}=\sum_{i=1}^{364}X_{i}\) has standard deviation \(\sqrt{n\sigma^2}=\sqrt{91}\). By the Central Limit Theorem, \(S_{364}\) is approximately normal with mean \(0\) and standard deviation \(\sqrt{91}\).
\(P(Y_{365}\geq 100)\)
\(P(Y_{365}\geq 100)=P(S_{364}\geq 100-100)=P\!\left(Z\geq \frac{100-100-0}{\sqrt{91}}\right)=P(Z\geq 0)=0.5\)
mu <- 0                    # mean of each daily change X_i
n <- 364                   # number of daily changes between day 1 and day 365
sd <- sqrt(n * (1/4))      # sd of the sum of changes: sqrt(n * sigma^2) = sqrt(91)
Yn <- 100                  # threshold for Y_365
Y1 <- 100                  # starting price Y_1
pnorm(Yn - Y1, mu, sd, lower.tail = FALSE)
## [1] 0.5
\(P(Y_{365}\geq 110)\)
\(P(Y_{365}\geq 110)=P(S_{364}\geq 10)=P\!\left(Z\geq \frac{110-100-0}{\sqrt{91}}\right)=P(Z\geq 1.048)\approx 0.1473\)
Yn <- 110
Y1 <- 100
pnorm(Yn - Y1, mu, sd, lower.tail = FALSE)
## [1] 0.1472537
\(P(Y_{365}\geq 120)\)
\(P(Y_{365}\geq 120)=P(S_{364}\geq 20)=P\!\left(Z\geq \frac{120-100-0}{\sqrt{91}}\right)=P(Z\geq 2.097)\approx 0.0180\)
Yn <- 120
Y1 <- 100
pnorm(Yn - Y1, mu, sd, lower.tail = FALSE)
## [1] 0.01801584
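As a quick sanity check (not part of the original solution), the year of daily changes can be simulated directly and compared with the three normal-approximation estimates above. The distribution of the \(X_i\) is only specified through its mean and variance, so normally distributed daily changes with standard deviation \(\frac{1}{2}\) are assumed here, and the seed and trial count are arbitrary.
set.seed(123)                                   # arbitrary seed for reproducibility
trials <- 10000                                 # number of simulated years
changes <- matrix(rnorm(364 * trials, mean = 0, sd = 1/2), nrow = 364)
Y365 <- 100 + colSums(changes)                  # Y_365 = Y_1 + sum of the 364 daily changes
mean(Y365 >= 100)                               # roughly 0.5
mean(Y365 >= 110)                               # roughly 0.147
mean(Y365 >= 120)                               # roughly 0.018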
Calculate the expected value and variance of the binomial distribution using the moment generating function.
Binomial Distribution Function: \(b(x;n,p)=\frac{n!}{x!(n-x)!}p^xq^{n-x}\) where \(q=1-p\)
Moment Generating Function:
\(M_{x}(t)=\sum_{x=0}^{n}e^{xt}\frac{n!}{x!(n-x)!}p^xq^{n-x}\)
\(M_{x}(t)=\sum_{x=0}^{n}\frac{n!}{x!(n-x)!}(pe^t)^xq^{n-x}\)
\(M_{x}(t)=(pe^t+q)^n\) by the binomial theorem
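As a side check (not part of the original derivation), the closed form can be verified numerically by comparing the defining sum with \((pe^t+q)^n\); the values of \(n\), \(p\), and \(t\) below are arbitrary.
n <- 12; p <- 0.3; q <- 1 - p; t <- 0.4         # arbitrary parameter values
x <- 0:n
sum(exp(x * t) * dbinom(x, n, p))               # MGF written as the defining sum E[e^(tX)]
(p * exp(t) + q)^n                              # closed form; the two values agree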
First Derivative of the Moment Generating Function:
\(M'_{x}(t)=n(pe^t)(q+pe^t)^{n-1}\)
Expected Value
\(\mu=E[X]=M'(0)=n(pe^0)(q+pe^0)^{n-1}=np(q+p)^{n-1}\)
Since \(q+p=1\), \((q+p)^{n-1}=1^{n-1}=1\), so
\(E[X]=M'(0)=np(q+p)^{n-1}=np\)
Second Derivative of the Moment Generating Function:
\(M''_{x}(t)=(npe^t)[(n-1)(q+pe^t)^{n-2}(pe^t)]+[n(pe^t)(q+pe^t)^{n-1}]\)
\(M''_{x}(t)=(npe^t)(q+pe^t)^{n-2}[(n-1)(pe^t)+(q+pe^t)]\)
\(M''_{x}(t)=(npe^t)(q+pe^t)^{n-2}[(npe^t-pe^t)+(q+pe^t)]\)
\(M''_{x}(t)=(npe^t)(q+pe^t)^{n-2}[q+npe^t]\)
Variance
\(M''(0)=(npe^0)(q+pe^0)^{n-2}[q+npe^0]=np(q+p)^{n-2}[q+np]\)
Since \(q+p=1\), \((q+p)^{n-2}=1^{n-2}=1\).
\(M''(0)=(npe^0)(q+pe^0)^{n-2}[q+npe^0]=np(q+p)^{n-2}(q+np)=np(1)^{n-2}(q+np)=np(q+np)\)
Since \(M'(0)=np\), \([M'(0)]^2=n^2p^2\).
\(\sigma^2=Var[X]=M''(0)-[M'(0)]^2=[np(q+np)]-(n^2p^2)=[npq+n^2p^2]-(n^2p^2)=npq\)
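As a numerical illustration (not part of the original derivation), \(M'(0)\) and \(M''(0)\) can be approximated by central differences and compared with \(np\) and \(np(q+np)\); the values of \(n\) and \(p\) below are arbitrary.
n <- 12; p <- 0.3; q <- 1 - p                   # arbitrary parameter values
M <- function(t) (p * exp(t) + q)^n             # binomial MGF
h <- 1e-4
M1 <- (M(h) - M(-h)) / (2 * h)                  # approximates M'(0)
M2 <- (M(h) - 2 * M(0) + M(-h)) / h^2           # approximates M''(0)
c(M1, n * p)                                    # both about 3.6, the mean np
c(M2 - M1^2, n * p * q)                         # both about 2.52, the variance npq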
Calculate the expected value and variance of the exponential distribution using the moment generating function.
Exponential Distribution Function: \(f(x;\lambda)=\lambda e^{-\lambda x}\) for \(x\geq0\)
Moment Generating Function: \(M_{x}(t)=\int_{-\infty}^{\infty }e^{tx}\lambda e^{-\lambda x} dx\)
Since \(x\geq0\),
\(M_{x}(t)=\int_{0}^{\infty }e^{tx}\lambda e^{-\lambda x} dx\)
\(M_{x}(t)=\lambda \int_{0}^{\infty }e^{tx-\lambda x} dx\)
\(M_{x}(t)=\lambda \int_{0}^{\infty }e^{x(t-\lambda )} dx\)
\(M_{x}(t)=\lambda\left[\frac{e^{x(t-\lambda)}}{t-\lambda}\right]_{0}^{\infty}=\frac{\lambda}{\lambda-t}\) for \(t<\lambda\)
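The closed form can be confirmed numerically with base R's integrate(); the values of \(\lambda\) and \(t\) below are arbitrary, chosen so that \(t<\lambda\).
lambda <- 2; t <- 0.5                           # arbitrary values with t < lambda
integrate(function(x) exp(t * x) * lambda * exp(-lambda * x), 0, Inf)$value
lambda / (lambda - t)                           # both about 1.3333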
First Derivative of the Moment Generating Function:
\(M'_{x}(t)=\frac{\lambda}{(\lambda-t)^2}\)
Expected Value
\(\mu=E[X]=M'(0)=\frac{\lambda}{(\lambda-0)^2}=\frac{1}{\lambda}\)
Second Derivative of the Moment Generating Function:
\(M''_{x}(t)=\frac{2\lambda}{(\lambda-t)^3}\)
Variance
\(M''(0)=\frac{2\lambda}{(\lambda-0)^3}=\frac{2}{\lambda^2}\)
Since \(M'(0)=\frac{1}{\lambda}\), \([M'(0)]^2=\frac{1}{\lambda^2}\).
\(\sigma^2=Var[X]=M''(0)-[M'(0)]^2=\frac{2}{\lambda^2}-\frac{1}{\lambda^2}=\frac{1}{\lambda^2}\)
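As with the binomial case, a central-difference check of \(M'(0)=\frac{1}{\lambda}\) and \(M''(0)-[M'(0)]^2=\frac{1}{\lambda^2}\) is sketched below; the value of \(\lambda\) is arbitrary.
lambda <- 2                                     # arbitrary rate parameter
M <- function(t) lambda / (lambda - t)          # exponential MGF, valid for t < lambda
h <- 1e-4
M1 <- (M(h) - M(-h)) / (2 * h)                  # approximates M'(0) = 1/lambda
M2 <- (M(h) - 2 * M(0) + M(-h)) / h^2           # approximates M''(0) = 2/lambda^2
c(M1, 1 / lambda)                               # both about 0.5, the mean
c(M2 - M1^2, 1 / lambda^2)                      # both about 0.25, the variance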