Week 9







P. 363 # 11


The price of one share of stock in the Pilsdorff Beer Company (see Exercise 8.2.12) is given by \(Y_n\) on the nth day of the year.

Finn observes that the differences \(X_n = Y_{n+1} − Y_n\) appear to be independent random variables with a common distribution having mean \(\mu = 0\) and variance \(\sigma^2 = 1/4\). If \(Y_1 = 100\), estimate the probability that \(Y_{365}\) is


For this question, I coded a function that accepts an upper bound and, based on a mean of 0 and a variance of 0.25 for each daily difference, calculates a z-score and a cumulative percentile.


calc_upper_prob <- function(upper_b) {
  # upper_b is the threshold measured as the increase over the starting
  # price Y_1 = 100 (e.g., upper_b = 10 corresponds to Y_365 >= 110)
  
  n   <- 365    # number of daily differences (strictly 364 separate day 1
                # from day 365, but the effect on the estimate is negligible)
  ev  <- 0      # mean of each daily difference
  var <- 1/4    # variance of each daily difference
  
  # standard deviation of the sum: square root of n times var
  sr_nvar <- sqrt(n * var)           
  
  mu    <- 0    # set up the standard normal distribution
  sigma <- 1
  upper_z_score <- (upper_b - (n * ev)) / sr_nvar
  u_cpct <- pnorm(upper_z_score, mu, sigma)
  
  print(sprintf("The z score (number of standard deviations) is %.2f", upper_z_score))
  
  print(sprintf("The cumulative percentile is %.3f", u_cpct))
  
  print(sprintf("The probability of being greater than %d is %.3f", upper_b, 1 - u_cpct))
  
}


estimate the probability that \(Y_{365}\) is


\[a) \ \ge 100\]

calc_upper_prob(0)
## [1] "Z scores or standard deviations is 0.00"
## [1] "The cumalitive percentile is 0.500"
## [1] "The probability of being greater than 0 is 0.500"


estimate the probability that \(Y_{365}\) is


\[b) \ \ge 110\]

calc_upper_prob(10)
## [1] "Z scores or standard deviations is 1.05"
## [1] "The cumalitive percentile is 0.852"
## [1] "The probability of being greater than 10 is 0.148"


estimate the probability that \(Y_{365}\) is


\[c) \ \ge 120\]

calc_upper_prob(20)
## [1] "Z scores or standard deviations is 2.09"
## [1] "The cumalitive percentile is 0.982"
## [1] "The probability of being greater than 20 is 0.018"






MGF - Binomial Distribution



A Moment Generating Function (MGF), \(M_x(t) = E(e^{tx})\), is a single function that encodes a distribution; differentiating it and evaluating at \(t = 0\) produces the moments, which quantify important attributes of the distribution. It all starts with \(E(X)\), which quantifies (equals) the mean…



| Moment | Equation   | Attribute |
|--------|------------|-----------|
| 1st    | \(E(X)\)   | Mean      |
| 2nd    | \(E(X^2)\) | Variance  |
| 3rd    | \(E(X^3)\) | Skewness  |
| 4th    | \(E(X^4)\) | Kurtosis  |
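

To see how the raw moments feed these attributes, here is a quick R sketch with a simulated binomial sample (n = 10 and p = 0.3 are arbitrary choices; skewness and kurtosis similarly combine the third and fourth moments with the lower ones).


set.seed(1)
x <- rbinom(100000, size = 10, prob = 0.3)

m1 <- mean(x)     # estimate of E(X)
m2 <- mean(x^2)   # estimate of E(X^2)

m1                # close to np = 3 (the mean)
m2 - m1^2         # close to npq = 2.1 (the variance)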







Calculating an MGF for a discrete distribution involves a series.


Calculating an MGF for a continuous distribution involves an integral.
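

A quick numerical sketch of both cases in R: the binomial MGF as a finite sum over the support, and the exponential MGF as an integral over \(x \ge 0\). The parameter values are arbitrary, chosen just for illustration.


# discrete case: a finite sum over the support 0..n
mgf_binom <- function(t, n, p) sum(exp((0:n) * t) * dbinom(0:n, n, p))

# continuous case: an integral over x >= 0
mgf_exp <- function(t, lambda) {
  integrate(function(x) exp(t * x) * lambda * exp(-lambda * x), 0, Inf)$value
}

mgf_binom(0.1, n = 10, p = 0.3)   # matches (q + p e^t)^n, derived below
(0.7 + 0.3 * exp(0.1))^10

mgf_exp(0.1, lambda = 2)          # matches lambda / (lambda - t), derived below
2 / (2 - 0.1)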


The binomial PMF is

\[{n \choose k} p^k(1-p)^{n-k}\]


We will use \(x\) in place of \(k\) (from \(n \choose k\)). Multiplying the PMF by \(e^{xt}\) and summing over the support gives the MGF…


\[M_x(t) \ = \ \sum_{x=0}^n \ e^{xt} \ \frac{n!}{x!(n-x)!} p^x q^{n-x}\]

where \(q = 1-p\)


We can combine the \(p^x\) with the \(e^{xt}\) into \((pe^t)^x\)


\[M_x(t) \ = \ \sum_{x=0}^n \ (pe^t)^x \ \frac{n!}{x!(n-x)!} q^{n-x}\]


By the binomial theorem (with \(a = pe^t\) and \(b = q\), since \(\sum_{x=0}^n {n \choose x} a^x b^{n-x} = (a+b)^n\)), the series collapses to…


\[M_x(t) \ = \ (q + pe^t)^n\]


Now we differentiate with respect to \(t\), using the Chain Rule


\[\frac{dM_x(t)}{dt} \ = \ n(q+pe^t)^{n-1}pe^t\]


Derivative Calculator is a web site that will help with differentiation. Enter (q+pe^t)^n and specify that you want to differentiate with respect to t.


Evaluating at \(t=0\) (where \(e^t = 1\) and \(q + p = 1\)) we arrive at \(E(X)\)

\[E(X) \ = \ np(q+p)^{n-1} \ = \ np\]
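
Base R's D() can perform the same differentiation as a check; the values n = 10 and p = 0.3 below are only illustrative.

# symbolic first derivative of the binomial MGF with respect to t
mgf <- expression((q + p * exp(t))^n)
d1 <- D(mgf, "t")
d1    # an expression equivalent to n(q + pe^t)^(n-1) * pe^t

# evaluate at t = 0: should equal np = 3 for n = 10, p = 0.3
eval(d1, list(t = 0, n = 10, p = 0.3, q = 0.7))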

Now we go back to \[n(q+pe^t)^{n-1}pe^t\]

Let u be \(n(q+pe^t)^{n-1}\) and let v be \(pe^t\)

And calculate

\[\frac{d(uv)}{dt} \ = \ u\frac{dv}{dt} \ + \ v\frac{du}{dt}\]


This is the Product Rule

\[\frac{d^2M_x(t)}{dt^2} \ = \ npe^t(q + pe^t)^{n-2}(q + npe^t)\]


Evaluating at \(t=0\) we arrive at \(E(X^2)\)


\[E(X^2) \ = \ np(q+p)^{n-2} (q+np) \ = \ np(q+np) \]
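
Applying D() twice gives the same check for the second moment; with the illustrative values n = 10 and p = 0.3, \(np(q+np) = 3 \times 3.7 = 11.1\).

# symbolic second derivative of the binomial MGF, evaluated at t = 0
mgf <- expression((q + p * exp(t))^n)
d2 <- D(D(mgf, "t"), "t")
eval(d2, list(t = 0, n = 10, p = 0.3, q = 0.7))   # should equal np(q + np) = 11.1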


Finally, to get the variance, we need to subtract…


\[V(X) \ = \ E(X^2) -E(X)^2\]

\[V(X) \ = \ np(q+np) - n^2p^2 \ = \ npq\]
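
A direct numerical check of \(npq\) against the definition of variance, again with the illustrative values n = 10 and p = 0.3:

n <- 10; p <- 0.3; q <- 1 - p
x <- 0:n
sum((x - n * p)^2 * dbinom(x, n, p))   # variance from its definition
n * p * q                              # closed form npq = 2.1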





MGF - Exponential Distribution



The PDF for the Exponential Distribution, for \(x \ge 0\), is


\[\lambda e^{-\lambda x}\]


and again we multiply by \(e^{tx}\), and this time we integrate


\[M_x(t) \ = \ \int_{0}^{\infty} \lambda e^{-\lambda x} e^{tx} \ dx\]


Integral Calculator is a web site that will help with integration. Enter e^{tx} \lambda e^{-\lambda x} and specify that you are integrating with respect to x from 0 to infinity.


This gives the antiderivative (valid for \(t < \lambda\))


\[- \ \frac{\lambda e^{-(\lambda-t)x} }{\lambda - t}\]


Evaluating between the limits \(x = 0\) and \(x \to \infty\) gives the MGF


\[M_x(t) \ = \ \frac{\lambda}{\lambda-t} \ = \ \frac{1}{1-\frac{t}{\lambda}}\]


Differentiating \(k\) times and evaluating at \(t = 0\) gives the general form for the raw moments:


\[E(X^k) \ = \frac{k!}{\lambda^k}\]
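
This general form is easy to check numerically with integrate(); the rate \(\lambda = 2\) below is arbitrary.

lambda <- 2
raw_moment <- function(k) {
  integrate(function(x) x^k * lambda * exp(-lambda * x), 0, Inf)$value
}
sapply(1:4, raw_moment)         # numerical E(X^k) for k = 1..4
factorial(1:4) / lambda^(1:4)   # closed form k! / lambda^k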


so

Mean = \(E(X) \ = \ \frac{1}{\lambda}\)



Finally, to get the variance, we need to subtract…


\[V(X) \ = \ E(X^2) -E(X)^2\]


Variance = \(V(X) \ = \ \frac{2}{\lambda^2} - \frac{1}{\lambda^2} \ = \ \frac{1}{\lambda^2}\)
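

As a last sanity check, a quick simulation with rexp() (rate chosen arbitrarily):


set.seed(1)
lambda <- 2
x <- rexp(100000, rate = lambda)
mean(x); 1 / lambda       # mean should be close to 1/lambda = 0.5
var(x);  1 / lambda^2     # variance should be close to 1/lambda^2 = 0.25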






References


The internet has many helpful web sites for understanding MGFs.


This one helped with the binomial derivations.



This one helped with the exponential.