Page 303 (Exercise 11): A company buys 100 light bulbs, each of which has an exponential lifetime of 1000 hours. What is the expected time for the first of these bulbs to burn out? (See Exercise 10.)
Answer:
By referring to Exercise 10, suppose \(X_1, X_2, \ldots, X_n\) are independent exponential random variables, where \(X_i\) has parameter \(\lambda_i\). Then the density of \(X_i\) is
\(f(x) = \lambda_i e^{-\lambda_i x}, \quad x \ge 0\)
and the corresponding tail probability is
\(P(X_i > x) = e^{-\lambda_i x}\)
So, since the \(X_i\) are independent,
\(P(\min(X_1, X_2, \ldots, X_n) > x) = P(X_i > x \text{ for all } i = 1, 2, \ldots, n)\)
\(= e^{-\lambda_1 x} \cdot e^{-\lambda_2 x} \cdots e^{-\lambda_n x}\)
\(= e^{-(\lambda_1 + \lambda_2 + \cdots + \lambda_n)x}\)
This shows that the minimum is itself exponential with parameter \(\sum_{i=1}^{n} \lambda_i\). Therefore, the expected time for the first bulb to burn out is
\[E(X) = \frac{1}{\sum_{i=1}^{n} \lambda_i}, \quad \text{where } X \text{ is the time at which the first of these bulbs burns out}\]
For \(n = 100\), \(\lambda_i = \frac{1}{1000}\) for \(i = 1, 2, \ldots, 100\), so
\(E(X) = \frac{1}{\lambda_1 + \lambda_2 + \cdots + \lambda_{100}}\)
\(= \frac{1}{\frac{1}{1000} + \frac{1}{1000} + \cdots + \frac{1}{1000}}\)
\(=\frac { 1000 }{ 100 }\)
\(=10\)
So, the expected time for the first of these bulbs to burn out is 10 hours.
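As a sanity check, here is a minimal Monte Carlo sketch of this result (assuming NumPy is available; the number of simulated batches is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(0)

n_bulbs = 100
mean_lifetime = 1000.0  # hours, i.e. rate lambda = 1/1000
n_batches = 100_000     # arbitrary number of simulated batches of bulbs

# Each row is one batch of 100 independent exponential lifetimes;
# the first burnout in a batch is the row minimum.
lifetimes = rng.exponential(scale=mean_lifetime, size=(n_batches, n_bulbs))
first_burnout = lifetimes.min(axis=1)

print(first_burnout.mean())  # should be close to the theoretical value of 10
```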
Page 303 (Exercise 14): Assume that \({X}_{1}\) and \({X}_{2}\) are independent random variables, each having an exponential density with parameter \(\lambda\). Show that \(Z = {X}_{1} - {X}_{2}\) has density
\[{ f }_{ Z }(z)=(1/2)\lambda { e }^{ -\lambda |z| }\]
Answer:
Assume that \({X}_{1}\) and \({X}_{2}\) are independent random variables, each having an exponential density with parameter \(\lambda\).
That is, the PDF of \(X_1\) is
\(f(x_1) = \lambda e^{-\lambda x_1}, \quad x_1 \ge 0\)
and the PDF of \(X_2\) is
\(f(x_2) = \lambda e^{-\lambda x_2}, \quad x_2 \ge 0\)
Since \({X}_{1}\) and \({X}_{2}\) are independent, the joint density function of \({X}_{1}\) and \({X}_{2}\) is given by
\(f({ x }_{ 1 },{ x }_{ 2 })=f({ x }_{ 1 })f({ x }_{ 2 })\)
\(= \lambda e^{-\lambda x_1} \cdot \lambda e^{-\lambda x_2}\)
\(= \lambda^2 e^{-\lambda(x_1 + x_2)}\)
Let \(\begin{matrix} Z = X_1 - X_2 \\ V = X_2 \end{matrix} \Longrightarrow \quad \begin{matrix} X_1 = Z + V \\ X_2 = V \end{matrix}\)
Using the Jacobian of the transformation,
\(J = \frac{\partial(x_1, x_2)}{\partial(z, v)}\)
\(= \begin{vmatrix} \partial x_1/\partial z & \partial x_1/\partial v \\ \partial x_2/\partial z & \partial x_2/\partial v \end{vmatrix}\)
\(=\begin{vmatrix} 1 & 1 \\ 0 & 1 \end{vmatrix}\)
\(=1\)
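This determinant can also be checked symbolically; below is a minimal sketch using SymPy (assuming it is installed), with the inverse transformation from the step above:

```python
import sympy as sp

z, v = sp.symbols('z v')

# Inverse transformation from the derivation: x1 = z + v, x2 = v
x1 = z + v
x2 = v

J = sp.Matrix([x1, x2]).jacobian(sp.Matrix([z, v]))
print(J)        # Matrix([[1, 1], [0, 1]])
print(J.det())  # 1
```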
Substituting \(x_1 = z + v\) and \(x_2 = v\) (so that \(x_1 + x_2 = z + 2v\)), the joint PDF of \(Z\) and \(V\) becomes:
\[g(z,v) = \lambda^2 e^{-\lambda(z+2v)} \cdot |J| = \lambda^2 e^{-\lambda(z+2v)}\]
The support conditions \(x_1 \ge 0\) and \(x_2 \ge 0\) become \(z + v \ge 0\) and \(v \ge 0\), so
\(v > -z \quad \text{if } -\infty < z < 0, \quad \text{and} \quad v > 0 \quad \text{if } z > 0\)
For \(-\infty <z<0\),
\(f_Z(z) = \int_{-z}^{\infty} g(z,v)\,dv\)
\(= \int_{-z}^{\infty} \lambda^2 e^{-\lambda(z+2v)}\,dv\)
\(= \lambda^2 e^{-\lambda z} \left[ \frac{e^{-2\lambda v}}{-2\lambda} \right]_{-z}^{\infty}\)
\(=\frac { \lambda }{ 2 } { e }^{ \lambda z }\)
For \(z>0\),
\(f_Z(z) = \int_{0}^{\infty} g(z,v)\,dv\)
\(= \int_{0}^{\infty} \lambda^2 e^{-\lambda(z+2v)}\,dv\)
\(= \lambda^2 e^{-\lambda z} \left[ \frac{e^{-2\lambda v}}{-2\lambda} \right]_{0}^{\infty}\)
\(=\frac { \lambda }{ 2 } { e }^{ -\lambda z }\)
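Both branch integrals can be verified symbolically as well; here is a minimal SymPy sketch (assuming SymPy is available), where the substitution \(w = -z > 0\) keeps all symbols positive on the \(z < 0\) branch:

```python
import sympy as sp

lam, v, z, w = sp.symbols('lam v z w', positive=True)  # w plays the role of -z when z < 0

# Branch z > 0: integrate g(z, v) over v from 0 to infinity
pos = sp.integrate(lam**2 * sp.exp(-lam * (z + 2*v)), (v, 0, sp.oo))
print(pos)  # lam*exp(-lam*z)/2

# Branch z < 0 (write z = -w): integrate over v from w to infinity
neg = sp.integrate(lam**2 * sp.exp(-lam * (-w + 2*v)), (v, w, sp.oo))
print(neg)  # lam*exp(-lam*w)/2, i.e. (lambda/2) e^{lambda z}
```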
Therefore, the PDF of \(Z = X_1 - X_2\) is given by
\(f_Z(z) = \begin{cases} \frac{\lambda}{2} e^{\lambda z}, & -\infty < z < 0 \\ \frac{\lambda}{2} e^{-\lambda z}, & 0 < z < \infty \end{cases}\)
Combining these two cases gives
\(f_Z(z) = \frac{1}{2}\lambda e^{-\lambda|z|}, \quad -\infty < z < \infty\)
which is the required density.
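As an empirical check on this density, here is a small Monte Carlo sketch (assuming NumPy; the choice \(\lambda = 1\) and the sample size are arbitrary). It compares the simulated tail probability \(P(Z > 1)\) with the value implied by the density, \(\int_{1}^{\infty} \frac{\lambda}{2} e^{-\lambda z}\,dz = \frac{1}{2} e^{-\lambda}\):

```python
import numpy as np

rng = np.random.default_rng(0)

lam = 1.0      # arbitrary rate parameter for the check
n = 1_000_000  # arbitrary sample size

x1 = rng.exponential(scale=1/lam, size=n)
x2 = rng.exponential(scale=1/lam, size=n)
z = x1 - x2

empirical = (z > 1.0).mean()
theory = 0.5 * np.exp(-lam)  # integral of (lam/2) e^{-lam z} from 1 to infinity
print(empirical, theory)     # both should be close to 0.1839
```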
Reference: Hallam, A. (2004). Transformations of Random Variables. Econometrics I. Retrieved from http://www2.econ.iastate.edu/classes/econ671/hallam/documents/Transformations.pdf
Page 320-321 (Exercise 1): Let X be a continuous random variable with mean \(\mu = 10\) and variance \(\sigma^2 = 100/3\). Using Chebyshev's Inequality, find an upper bound for the following probabilities.
Answer:
Chebyshev's Inequality: \(P(|X-\mu| \ge \epsilon) \le \frac{\sigma^2}{\epsilon^2}\)
If \(\epsilon = k\sigma\), then the inequality can be written as \(P(|X-\mu| \ge k\sigma) \le \frac{1}{k^2}\)
We are given \(\sigma^2 = \frac{100}{3}\), so \(\sigma = \frac{10}{\sqrt{3}}\). Since \(\epsilon = k\sigma\), it follows that \(k = \frac{\epsilon}{\sigma} = \frac{\epsilon\sqrt{3}}{10}\).
Let \(u\) be the upper bound in Chebyshev's Inequality. Then the upper bound for each of the probabilities below is computed as
\(u=\frac { 1 }{ { k }^{ 2 } }\)
\(=\frac { 1 }{ (\frac { \epsilon \sqrt { 3 } }{ 10 } )^{ 2 } }\)
\(=\frac { 100 }{ 3{ \epsilon }^{ 2 } }\)
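This formula translates directly into a small helper (a hypothetical sketch, not part of the exercise); the result is capped at 1 because a probability can never exceed 1, as parts a.) and b.) below illustrate:

```python
def chebyshev_bound(eps: float, var: float = 100 / 3) -> float:
    """Chebyshev upper bound for P(|X - mu| >= eps), capped at 1."""
    return min(1.0, var / eps**2)

print(chebyshev_bound(2))  # 1.0 (the raw bound 25/3 exceeds 1)
print(chebyshev_bound(9))  # 0.4115...
```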
a.) For \(P(|X - 10| \ge 2)\), take \(\epsilon = 2\); then the upper bound is
\(u=\frac { 100 }{ 3({ 2 }^{ 2 }) } =\frac { 100 }{ 12 }=\frac { 25 }{ 3 } \approx 8.3333\)
Since a probability cannot be less than 0 or greater than 1, the upper bound for \(P(|X - 10| \ge 2)\) is 1.
b.) For \(P(|X - 10| \ge 5)\), take \(\epsilon = 5\); then the upper bound is
\(u=\frac { 100 }{ 3({ 5 }^{ 2 }) } =\frac { 100 }{ 75 }=\frac { 4 }{ 3 } \approx 1.3333\)
Again, since a probability cannot exceed 1, the upper bound for \(P(|X - 10| \ge 5)\) is also 1.
c.) For \(P(|X - 10| \ge 9)\), take \(\epsilon = 9\); then the upper bound is
\(u=\frac { 100 }{ 3({ 9 }^{ 2 }) } =\frac { 100 }{ 243 } \approx 0.4115\)
d.) For \(P(|X - 10| \ge 20)\), take \(\epsilon = 20\); then the upper bound is
\(u=\frac { 100 }{ 3({ 20 }^{ 2 }) } =\frac { 100 }{ 1200 }=\frac { 1 }{ 12 } \approx 0.08333\)
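For comparison, a short simulation sketch (assuming NumPy) checks these bounds against one concrete distribution with the stated mean and variance; the uniform distribution on \([0, 20]\) is an illustrative choice, since its mean is 10 and its variance is \(20^2/12 = 100/3\):

```python
import numpy as np

rng = np.random.default_rng(0)

# Uniform on [0, 20]: mean 10, variance 20**2 / 12 = 100/3 (an illustrative choice)
x = rng.uniform(0, 20, size=1_000_000)

for eps in (2, 5, 9, 20):
    actual = (np.abs(x - 10) >= eps).mean()
    bound = min(1.0, (100 / 3) / eps**2)
    print(f"eps={eps:2d}  actual={actual:.4f}  bound={bound:.4f}")
```

In every case the observed probability stays at or below the Chebyshev bound, as the inequality guarantees.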