Assignment 8

Tom Detzel

3/24/2018

Problem 1

11. A company buys 100 lightbulbs, each of which has an exponential lifetime of 1,000 hours. What is the expected time for the first of these bulbs to burn out?

A. I didn’t realize when I chose this problem for the discussion that it was also in the assignment. Hope this is OK. Let \(X_1, \dots, X_n\) be the individual lifetimes, each exponential with mean \(\lambda = 1000\) hours. The probability density function of the minimum \(M = \min(X_1, \dots, X_n)\) is:

\({ f }_{ M }(x)\quad =\quad \frac { n }{ \lambda } e^{ -\frac { n }{ \lambda } x },\quad x\ge 0\)

which translates to an expected value of:

\(E[M]\quad =\quad \frac { \lambda }{ n }\)

which results in:

\(E[M]\quad =\quad \frac { \lambda }{ n } \quad =\quad \frac { 1000 }{ 100 } \quad =\quad 10\)

So the first bulb should burn out after about 10 hours, on average. The probability that any single bulb fails within the first 10 hours is very low:

\(P(X\le 10)\ =\ 1-{ e }^{ -.001\cdot 10 }\ \approx\ .00995\)

That makes intuitive sense given the expected lifetime. In R, using pexp():

p <- pexp(10, .001)
p
## [1] 0.009950166
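
A quick simulation backs this up (a sketch, not part of the assignment; the seed and the 10,000 replications are arbitrary choices). The average time of the first failure across many simulated batches of 100 bulbs should land near 10 hours:

set.seed(86)
# 10,000 batches of 100 bulbs, mean lifetime 1,000 hours (rate = .001);
# record the earliest failure in each batch
first.failure <- replicate(10000, min(rexp(100, rate = .001)))
mean(first.failure)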

Here’s a plot of the reliability rate for this problem over 1,000 hours:

# reliability (survival) function R(t) = exp(-t/1000) over 1,000 hours
curve(exp(-.001*x), 0, 1000,
main='Lightbulb Reliability Curve',
xlab = "Time in hours",
ylab = "Reliability", type = "l",
col='red', lwd=3)
# mark the expected time of the first failure (10 hours)
abline(v=10, col='blue', lty=3, lwd=2)
text(100, .55, 'H = 10', font=3)
grid()



Problem 2

Q. Assume that X1 and X2 are independent random variables, each having an exponential density with parameter \(λ\). Show that Z = X1 − X2 has density \({ f }_{ Z }(z)\ =\ \frac { λ }{ 2 } { e }^{ -λ|z| }\).

A. This is a convolution problem. Following the example in Grinstead & Snell (p. 292), which derives the density for the sum of two random variables, and writing \(X = X_1\) and \(Y = X_2\), the densities for X and Y are identical:

\({ f }_{ X }(x)\ =\ { f }_{ Y }(y)\ =\ \begin{cases} { λe }^{ −λx }, & x\ge 0 \\ 0, & \text{otherwise} \end{cases}\)

and the convolution formula is:

\({ f }_{ Z }(z)\ =\ \int _{ -\infty }^{ +\infty }{ { f }_{ X }(x) }\ { f }_{ -Y }(z-x)\ dx\)

Because the density of \(-Y\) is the density of \(Y\) reflected about zero, \({ f }_{ -Y }(t) = { f }_{ Y }(-t)\), so:

\({ f }_{ Z }(z)\quad =\quad \int _{ -\infty }^{ +\infty }{ { f }_{ X }(x) } \ \cdot \ { f }_{ Y }(x-z)\quad dx\)

For \(z<0\), the integrand is nonzero only when \(x\ge 0\) (and then \(x-z\ge 0\) automatically), so:

\({ f }_{ Z }(z)\ =\ \int _{ 0 }^{ +\infty }{ { λe }^{ −λx } } \cdot \ { λe }^{ −λ(x-z) }\ dx,\quad \quad z<0\)

\(=\quad { λe }^{ λz }\ \int _{ 0 }^{ +\infty }{ { λe }^{ −2λx } } \ dx,\quad \quad z<0\)

\(=\quad { λe }^{ λz }\ \left[ -\frac { 1 }{ 2 } { e }^{ −2λx } \right] _{ 0 }^{ \infty },\quad \quad z<0\)

\(=\quad \frac { λ }{ 2 } { e }^{ λz },\quad \quad z<0\)

Because \(Z = X-Y\) and \(-Z = Y-X\), and because \(X\) and \(Y\) are iid exponentials, \(Z\) and \(-Z\) have the same distribution, so \({ f }_{ Z }(z) = { f }_{ Z }(-z)\). Substituting \(-z\) for \(z\) gives the case \(z≥0\):

\(\frac { λ }{ 2 } { e }^{ -λz }, \quad \quad z≥0\)

Combining the two cases gives \({ f }_{ Z }(z)\ =\ \frac { λ }{ 2 } { e }^{ -λ|z| }\). And there you have it.
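
As a sanity check (a sketch outside the assignment; the choice of \(λ = 1\) and 10,000 draws is arbitrary), simulating Z = X1 − X2 and overlaying \(\frac { λ }{ 2 } { e }^{ -λ|z| }\) should show close agreement:

set.seed(86)
lambda <- 1
# Z = X1 - X2 for iid exponential X1, X2 with rate lambda
z <- rexp(10000, lambda) - rexp(10000, lambda)
hist(z, freq=F, breaks=100, col='lightblue',
main='Density of Z = X1 - X2', xlab='z')
# overlay the derived density (lambda/2) * exp(-lambda * |z|)
curve((lambda/2)*exp(-lambda*abs(x)), add=TRUE, col='red', lwd=3)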



Problem 3

Let X be a continuous random variable with mean \(μ = 10\) and variance \(σ^{2} = 100/3\). Using Chebyshev’s Inequality, find an upper bound for the following probabilities.

Chebyshev’s Inequality states that for any \(ε>0\), \(P(|X-μ|\ge ε)\ \le\ \frac { σ^{ 2 } }{ ε^{ 2 } }\); equivalently, for any \(k>1\), at least \(\left( 1-\frac { 1 }{ { k }^{ 2 } } \right) \cdot 100\) percent of the probability falls within \(k\) standard deviations of the mean. This gives a rough bound regardless of the true distribution of the random variable. Here \(σ^{ 2 }=100/3\), so \(σ = 10/\sqrt { 3 } \approx 5.77\).

(a) P(|X−10|≥2)

\(P(|X−10|\ge 2)\ ≤\ \frac { σ^{ 2 } }{ { 2 }^{ 2 } } =\frac { 100/3 }{ 4 } \approx 8.33\), which exceeds 1, so the best bound Chebyshev gives here is 1.

(b) P(|X−10|≥5)

\(P(|X−10|\ge 5)\ ≤\ \frac { σ^{ 2 } }{ { 5 }^{ 2 } } =\frac { 100/3 }{ 25 } =\frac { 4 }{ 3 } \approx 1.33\), again greater than 1, so the bound is 1.

(c) P(|X−10|≥9)

\(P(|X−10|\ge 9)\ ≤\ \frac { σ^{ 2 } }{ { 9 }^{ 2 } } =\frac { 100/3 }{ 81 } =\frac { 100 }{ 243 } \approx 0.41\).

(d) P(|X − 10| ≥ 20)

\(P(|X−10|\ge 20)\ ≤\ \frac { σ^{ 2 } }{ { 20 }^{ 2 } } =\frac { 100/3 }{ 400 } =\frac { 1 }{ 12 } \approx 0.083\).

The standard deviation here (\(σ \approx 5.77\)) is so large that for small deviations the Chebyshev bound exceeds 1 and tells us nothing; only for deviations well beyond one standard deviation, as in parts (c) and (d), does the inequality give a useful bound. Chebyshev’s Inequality isn’t a precise estimator, but it holds for any distribution with this mean and variance.
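
The four bounds can also be computed directly in R (a quick sketch; the variable names and the cap at 1 are my additions):

sigma2 <- 100/3
eps <- c(2, 5, 9, 20)
# Chebyshev: P(|X - 10| >= eps) <= sigma^2 / eps^2, capped at 1
bounds <- pmin(1, sigma2 / eps^2)
round(bounds, 4)
## [1] 1.0000 1.0000 0.4115 0.0833

Here’s a simulation of one distribution with this mean and variance (a normal), with the \(±2σ\) lines marked: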

set.seed(86)
# one distribution with mean 10 and variance 100/3: N(10, 100/3)
x <- rnorm(10000, 10, sqrt(100/3))
sigma <- sqrt(100/3)

# mean +/- 2 standard deviations
upper <- 10+(2*sigma)
lower <- 10-(2*sigma)

hist(x, freq=F, breaks=100, col='lightblue', main='N(10,sqrt(100/3))', font=3)
abline(v=upper, col='red', lwd=3)
abline(v=lower, col='red', lwd=3)
text(25,.05, 'k=2', font=3)
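
As a final rough check (reusing the simulated normal sample above, which is just one distribution satisfying \(μ=10\), \(σ^{2}=100/3\)), the empirical tail frequencies should sit well below the Chebyshev bounds wherever the bound is informative:

# empirical P(|X - 10| >= eps) from the simulated sample vs. the Chebyshev bounds
eps <- c(2, 5, 9, 20)
empirical <- sapply(eps, function(e) mean(abs(x - 10) >= e))
round(rbind(empirical, chebyshev = pmin(1, (100/3)/eps^2)), 4)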