A company buys 100 light bulbs, each of which has an exponential lifetime of 1000 hours. What is the expected time for the first of these bulbs to burn out? (See Exercise 10.)
lambda <- 1/1000           # failure rate per bulb; mean lifetime is 1/lambda = 1000 hours
rate_min <- 100 * lambda   # the minimum of 100 independent Exp(lambda) lifetimes is Exp(100 * lambda)
exp_min <- 1/rate_min      # expected time until the first burnout
sprintf("The expected time (in hours) for the first of these bulbs to burnout is: %d", exp_min)
## [1] "The expected time (in hours) for the first of these bulbs to burnout is: 10"
Assume that \(X_{1}\) and \(X_{2}\) are independent random variables, each having an exponential density with parameter \(\lambda\). Show that \(Z = X_{1} - X_{2}\) has density \(f_{Z}(z) = \frac{1}{2}\lambda e^{-\lambda |z|}\).
The probability distribution of the sum of two or more independent random variables is the convolution of their individual distributions. For two independent integer-valued (and hence discrete) random variables, the distribution of the sum Z = X + Y is
\(P(Z=z)=\sum _{k=-\infty}^{\infty}P(X=k)\,P(Y=z-k)\)
Let \(Y = -X_{2}\), so that \(Z = X_{1} + Y\). The counterpart for independent continuously distributed random variables with density functions \(f_{X_{1}}\) and \(f_{Y} = f_{-X_{2}}\) is
\(f_{Z}(z)=\int _{-\infty}^{\infty}f_{X_{1}}(x_{1})\,f_{-X_{2}}(z-x_{1})\,dx_{1}\)
\(=\int _{-\infty}^{\infty}f_{X_{1}}(x_{1})\,f_{X_{2}}(x_{1}-z)\,dx_{1},\)
since \(f_{-X_{2}}(y)=f_{X_{2}}(-y)\).
Now, for \(z < 0\), both factors are nonzero exactly when \(x_{1} > 0\) (since \(x_{1} - z > 0\) then holds automatically), so substituting the exponential density \(\lambda e^{-\lambda x}\) gives
\(f_{Z}(z)=\int _{0}^{\infty}\lambda e^{-\lambda (x_{1}-z)}\,\lambda e^{-\lambda x_{1}}\,dx_{1}\)
\(=\lambda^{2}e^{\lambda z}\int _{0}^{\infty}e^{-2\lambda x_{1}}\,dx_{1}\)
\(=\lambda e^{\lambda z}\left[-\tfrac{1}{2}e^{-2\lambda x_{1}}\right]_{0}^{\infty}\)
\(=\tfrac{1}{2}\lambda e^{\lambda z}\)
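As a numerical check of this integral (my own sketch; lambda = 1 and z = -0.7 are arbitrary choices, not values from the exercise), R's integrate() can be compared against the closed form:
lam <- 1
z0 <- -0.7   # any z < 0
integrate(function(x1) lam * exp(-lam * (x1 - z0)) * lam * exp(-lam * x1), 0, Inf)$value
0.5 * lam * exp(lam * z0)   # closed form; should match the numerical value above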
Now, \(-Z = X_{2} - X_{1}\) has the same distribution as \(Z\), because \(X_{1}\) and \(X_{2}\) are independent and identically distributed; hence the density of \(Z\) is symmetric, \(f_{Z}(z)=f_{Z}(-z)\).
Therefore, for \(z \geq 0\), \(f_{Z}(z)=\frac {1}{2} \lambda e^{-\lambda z}\).
Combining the two cases, \(f_{Z}(z)=\frac {1}{2} \lambda e^{-\lambda |z|}\).
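A short simulation (my own check, not part of the derivation) compares the empirical density of \(X_{1} - X_{2}\) with \(\frac{1}{2}\lambda e^{-\lambda |z|}\); lambda = 1 and the sample size are arbitrary choices.
set.seed(1)
lam <- 1
z_sim <- rexp(1e5, rate = lam) - rexp(1e5, rate = lam)
dens <- density(z_sim)            # kernel density estimate of f_Z
pts <- c(-2, -1, 0, 1, 2)
rbind(empirical   = approx(dens$x, dens$y, xout = pts)$y,
      theoretical = 0.5 * lam * exp(-lam * abs(pts)))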
Let X be a continuous random variable with mean µ = 10 and variance σ^2 = 100/3. Using Chebyshev’s Inequality, find an upper bound for the following probabilities.
P (|X − µ| ≥ ε) ≤ V(X) / ε^2
Let X be any random variable with E(X) = µ and V(X) = σ^2; Chebyshev’s Inequality holds for discrete and continuous random variables alike. Setting ε = kσ, the inequality states that P (|X − µ| ≥ kσ) ≤ σ^2 / (k^2 σ^2) = 1 / k^2. Thus, for any random variable, the probability of a deviation from the mean of more than k standard deviations is at most 1/k^2, which is the upper bound we use. With this formula, we can solve each of the cases below.
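The bound can also be wrapped in a small helper function (a sketch; chebyshev_bound is my own name, and the cap at 1 reflects that no probability can exceed 1):
chebyshev_bound <- function(eps, sigma2) {
  # Chebyshev upper bound on P(|X - mu| >= eps), capped at the trivial bound of 1
  pmin(1, sigma2 / eps^2)
}
chebyshev_bound(2, 100/3)   # part a): the raw bound 25/3 exceeds 1, so this returns 1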
a). P (|X − 10| ≥ 2).
P (|X − µ| ≥ kσ) ≤ (1 / k^2).
Here, σ^2 = 100/3 and kσ = 2, so k = 2 / σ. Hence, the upper bound is 1 / k^2 = σ^2 / 4. Note that this exceeds 1, so Chebyshev’s Inequality is uninformative for this deviation; the trivial bound of 1 applies.
k <- 2/(sqrt(100/3))
upper_a <- 1/(k^2)
sprintf("The upper bound is: %f", upper_a)
## [1] "The upper bound is: 8.333333"
b). P (|X − 10| ≥ 5).
P (|X − µ| ≥ kσ) ≤ (1 / k^2).
Here, σ^2 = 100/3 and kσ = 5, so k = 5 / σ. Hence, the upper bound is 1 / k^2 = σ^2 / 25, which also exceeds 1, so again the trivial bound of 1 applies.
k <- 5/sqrt(100/3)
upper_b <- 1/k^2
sprintf("The upper bound is: %f", upper_b)
## [1] "The upper bound is: 1.333333"
c). P (|X − 10| ≥ 9).
P (|X − µ| ≥ kσ) ≤ (1 / k^2).
Here, σ^2 = 100/3 and kσ = 9, so k = 9 / σ. Hence, the upper bound is 1 / k^2.
k <- 9/sqrt(100/3)
upper_c <- 1/k^2
sprintf("The upper bound is: %f", upper_c)
## [1] "The upper bound is: 0.411523"
d). P (|X − 10| ≥ 20).
P (|X − µ| ≥ kσ) ≤ (1 / k^2).
Here, σ^2 = 100/3 and kσ = 20, so k = 20 / σ. Hence, the upper bound is 1 / k^2.
k <- 20/sqrt(100/3)
upper_d <- 1/k^2
sprintf("The upper bound is: %f", upper_d)
## [1] "The upper bound is: 0.083333"