A company buys 100 light bulbs, each of which has an exponential lifetime with mean 1000 hours. What is the expected time for the first of these bulbs to burn out? (See Exercise 10, which states: Let \(X_1, X_2, \ldots, X_n\) be \(n\) independent random variables, each of which has an exponential density with mean \(\mu\). Let \(M\) be the minimum value of the \(X_j\). Show that the density for \(M\) is exponential with mean \(\mu/n\). Hint: use cumulative distribution functions.)
Each bulb's lifetime follows an exponential distribution with mean \(\mu = 1000\) hours. Let \(X_1, X_2, \ldots, X_{100}\) be independent random variables representing the lifetimes of the 100 bulbs, and let \(M\) be the minimum of the \(X_j\), which is the time until the first bulb burns out.
By Exercise 10, the density for \(M\) is exponential with mean \(\mu/n\), where \(n\) is the number of bulbs (here \(n = 100\)) and \(\mu\) is the mean lifetime of each bulb. We derive this below using cumulative distribution functions.
The cumulative distribution function (CDF) of an exponential distribution is:
\[ F(x) = 1 - e^{-\lambda x} \]
where \(\lambda = 1/\mu\) is the rate parameter of the distribution.
To find the density of \(M\), we consider the CDF of \(M\), which is the probability that the minimum value is less than or equal to some value \(x\). Since the bulbs are independent, this is the same as 1 minus the probability that all bulbs last longer than \(x\):
\[ F_M(x) = 1 - P(M > x) = 1 - (P(X_1 > x) \times P(X_2 > x) \times \ldots \times P(X_{100} > x)) \]
Since \(P(X_j > x) = 1 - F(x) = e^{-\lambda x}\) for each bulb,
\[ F_M(x) = 1 - (e^{-\lambda x})^{100} \]
Since \(\lambda = 1/1000\):
\[ F_M(x) = 1 - e^{-100x/1000} = 1 - e^{-x/10} \]
This CDF gives an exponential distribution with a rate parameter of \(100/1000 = 1/10\), so its mean is the inverse of the rate:
\[ \mu_M = \frac{1}{\lambda_M} = \frac{1}{1/10} = 10 \text{ hours} \]
Bottom line: the expected time for the first of these bulbs to burn out is 10 hours. This illustrates that the reliability of a system depends not only on the average lifetime of its individual components but also on how many components are in use: with many components, the chance that at least one fails within a given time frame is much higher.
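As a sanity check, here is a short R simulation (a minimal sketch; the variable names and sample sizes are illustrative) that estimates the expected time of the first burnout:

# Simulate many trials of 100 exponential bulb lifetimes (mean 1000 hours)
# and record the time of the first burnout in each trial
set.seed(42)  # arbitrary seed, for reproducibility
n_bulbs <- 100
mu <- 1000
n_trials <- 100000
first_burnout <- replicate(n_trials, min(rexp(n_bulbs, rate = 1 / mu)))
mean(first_burnout)  # should be close to the theoretical value mu / n_bulbs = 10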
================================================================================
Assume that \(X_1\) and \(X_2\) are independent random variables, each having an exponential density with parameter \(\lambda\). Show that \(Z = X_1 - X_2\) has density
\[ f_Z(z) = \frac{1}{2}\lambda e^{-\lambda |z|} . \]
Let \(X_1\) and \(X_2\) be independent random variables, each with an exponential density function with parameter \(\lambda\), given by:
\[ f_{X}(x) = \lambda e^{-\lambda x}, \quad x \geq 0 \]
We want to find the density function of \(Z = X_1 - X_2\). To do this, we use the convolution of \(X_1\) and \(-X_2\). The density function of \(-X_2\) is given by:
\[ f_{-X_2}(x) = \lambda e^{\lambda x}, \quad x \leq 0 \]
For the convolution of two independent random variables, we have:
\[ f_Z(z) = \int_{-\infty}^{\infty} f_{X_1}(x) f_{-X_2}(z - x) \, dx \]
Since \(X_1 \geq 0\), the integrand vanishes unless \(x \geq 0\); and since \(f_{-X_2}(z - x)\) vanishes unless \(z - x \leq 0\), we also need \(x \geq z\). The integral therefore runs over \(x \geq \max(0, z)\), and we treat the two cases separately.
For \(z \geq 0\):
\[ f_Z(z) = \int_{z}^{\infty} \lambda e^{-\lambda x} \cdot \lambda e^{\lambda (z - x)} \, dx = \lambda^2 e^{\lambda z} \int_{z}^{\infty} e^{-2\lambda x} \, dx = \lambda^2 e^{\lambda z} \left[ -\frac{1}{2\lambda} e^{-2\lambda x} \right]_{z}^{\infty} = \frac{\lambda}{2} e^{-\lambda z} \]
For \(z < 0\), we have \(\max(0, z) = 0\), so the integral runs over all \(x \geq 0\):
\[ f_Z(z) = \int_{0}^{\infty} \lambda e^{-\lambda x} \cdot \lambda e^{\lambda (z - x)} \, dx = \lambda^2 e^{\lambda z} \int_{0}^{\infty} e^{-2\lambda x} \, dx = \lambda^2 e^{\lambda z} \left[ -\frac{1}{2\lambda} e^{-2\lambda x} \right]_{0}^{\infty} = \frac{\lambda}{2} e^{\lambda z} \]
Since \(e^{\lambda z} = e^{-\lambda |z|}\) for \(z < 0\), the function simplifies to the same expression as for \(z \geq 0\):
\[ f_Z(z) = \frac{\lambda}{2} e^{-\lambda |z|} \]
Combining the two cases, we obtain the density function for \(Z\):
\[ f_Z(z) = \frac{1}{2}\lambda e^{-\lambda |z|} \]
This shows that the difference of two independent exponential random variables with the same rate parameter \(\lambda\) has a Laplace (double exponential) distribution with mean 0 and scale parameter \(1/\lambda\).
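To check this numerically, the R code below simulates a large sample of \(Z = X_1 - X_2\) with \(\lambda = 1\), evaluates the characteristic function \(\phi_Z(t) = \phi_{X_1}(t)\,\phi_{X_2}(-t)\), and overlays the theoretical Laplace density on a histogram of the simulated values.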
lambda <- 1
# a large number of samples for X1 and X2
n_samples <- 1000000
X1 <- rexp(n_samples, rate = lambda)
X2 <- rexp(n_samples, rate = lambda)
Z <- X1 - X2
# the characteristic function for the exponential distribution
phi_X <- function(t) {
  lambda / (lambda - 1i * t)
}
# characteristic function for Z = X1 - X2
phi_Z <- function(t) {
  phi_X1 <- phi_X(t)   # characteristic function of X1
  phi_X2 <- phi_X(-t)  # characteristic function of -X2 is phi_X(-t)
  phi_X1 * phi_X2
}
# a range of t values
t_values <- seq(-10, 10, by = 0.1)
# phi_Z for the range of t values
phi_Z_values <- sapply(t_values, phi_Z)
plot(t_values, Re(phi_Z_values), type = "l", col = "blue",
     xlab = "t", ylab = "Real part of Phi_Z(t)",
     main = "Characteristic Function of Z = X1 - X2")
# histogram of Z
hist(Z, probability = TRUE, breaks = 100, col = 'skyblue', main = 'Histogram of Z', xlab = 'Z', ylab = 'Density')
# theoretical density of Z
z_vals <- seq(min(Z), max(Z), length = 300)
f_Z <- (1/2) * lambda * exp(-lambda * abs(z_vals))
lines(z_vals, f_Z, col = 'red', lwd = 2)
# legend
legend("topright", legend=c("Simulated", "Theoretical"), col=c("skyblue", "red"), lwd=2)
================================================================================
Let \(X\) be a continuous random variable with mean \(\mu = 10\) and variance \(\sigma^2 = \frac{100}{3}\). Using Chebyshev’s Inequality, find an upper bound for the following probabilities: \(P(|X - 10| \geq 2)\), \(P(|X - 10| \geq 5)\), \(P(|X - 10| \geq 9)\), and \(P(|X - 10| \geq 20)\).
Given that \(X\) is a continuous random variable with a mean \(\mu = 10\) and variance \(\sigma^2 = \frac{100}{3}\), Chebyshev’s Inequality states that for any positive number \(k\),
\[ P(|X - \mu| \geq k) \leq \frac{\sigma^2}{k^2} \]
So:
For \(P(|X - 10| \geq 2)\), we have: \[ P(|X - \mu| \geq 2) \leq \frac{\sigma^2}{2^2} = \frac{\frac{100}{3}}{4} = \frac{100}{12} = \frac{25}{3} \] Since a probability can never exceed 1, this bound is vacuous; the trivial bound 1 is sharper.
For \(P(|X - 10| \geq 5)\), we have: \[ P(|X - \mu| \geq 5) \leq \frac{\sigma^2}{5^2} = \frac{\frac{100}{3}}{25} = \frac{4}{3} \] Again this exceeds 1, so the trivial bound 1 is sharper.
For \(P(|X - 10| \geq 9)\), we have: \[ P(|X - \mu| \geq 9) \leq \frac{\sigma^2}{9^2} = \frac{\frac{100}{3}}{81} = \frac{100}{243} \]
For \(P(|X - 10| \geq 20)\), we have: \[ P(|X - \mu| \geq 20) \leq \frac{\sigma^2}{20^2} = \frac{\frac{100}{3}}{400} = \frac{1}{12} \]
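These bounds can be reproduced in R: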
mu <- 10
variance <- 100 / 3
sigma <- sqrt(variance)
# Chebyshev's inequality function
chebyshev_upper_bound <- function(k, variance) {
  return(variance / k^2)
}
upper_bound_a <- chebyshev_upper_bound(2, variance)
upper_bound_b <- chebyshev_upper_bound(5, variance)
upper_bound_c <- chebyshev_upper_bound(9, variance)
upper_bound_d <- chebyshev_upper_bound(20, variance)
upper_bound_a
## [1] 8.333333
upper_bound_b
## [1] 1.333333
upper_bound_c
## [1] 0.4115226
upper_bound_d
## [1] 0.08333333
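Since no probability can exceed 1, it is natural to cap the reported bounds at 1 (a small extension of the code above):

# Cap each bound at 1, since probabilities cannot exceed 1
pmin(1, c(upper_bound_a, upper_bound_b, upper_bound_c, upper_bound_d))
## [1] 1.00000000 1.00000000 0.41152263 0.08333333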