1. You have a biased coin: if you flip the coin once, the chance of heads is \(P(H) = 1/3\) and the chance of tails is \(P(T) = 2/3\). Now suppose you flip the coin 10 times. Note that this is a collection of 10 independent trials. For \(i = 1, 2, \ldots, 10\), let \(E_i\) be the event that the \(i\)-th toss is a head.
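A quick simulation can make this setup concrete (it is not required for the problem). The sketch below, with the illustrative helper name `simulate_flips`, flips a coin with \(P(H) = 1/3\) ten times per trial and estimates the distribution of the number of heads.

```python
import random
from collections import Counter

def simulate_flips(n_trials=100_000, n_flips=10, p_heads=1/3, seed=0):
    """Estimate the distribution of the number of heads in n_flips biased tosses."""
    rng = random.Random(seed)
    counts = Counter()
    for _ in range(n_trials):
        heads = sum(rng.random() < p_heads for _ in range(n_flips))
        counts[heads] += 1
    return {k: v / n_trials for k, v in sorted(counts.items())}

if __name__ == "__main__":
    for heads, freq in simulate_flips().items():
        print(f"P({heads} heads) ~ {freq:.4f}")
```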
2. Suppose \(A,\) \(B,\) and \(C\) are independent events. Assume \(P(A) = 2/3, P(B) = 3/4\) and \(P(C) = 1/5\). Find the probability of the following two events:
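The two events themselves are not reproduced in this excerpt. As an illustration of how independence is typically applied here, the sketch below computes two common examples (all three events occurring, and none occurring) with exact fractions; these example events are an assumption, not the problem's actual parts.

```python
from fractions import Fraction

# Given probabilities of the independent events (from the problem statement).
P_A = Fraction(2, 3)
P_B = Fraction(3, 4)
P_C = Fraction(1, 5)

# By independence, the probability that all three occur factors into a product.
P_all_three = P_A * P_B * P_C
print("P(A and B and C) =", P_all_three)   # 1/10

# Complement rule + independence: probability that none of them occurs.
P_none = (1 - P_A) * (1 - P_B) * (1 - P_C)
print("P(none occur)    =", P_none)        # 1/15
```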
3. The Wikipedia definition of Triage is the “process of determining the priority of patients’ treatments based on the severity of their condition”. In a hospital emergency room, patients were characterized as ‘critical’, ‘serious’, or ‘stable’. In this hospital, 15% of patients were critical, 45% were serious, and the rest were stable. Among these patients, the death rates for critical, serious, and stable patients were 50%, 20%, and 2%, respectively.
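The specific question is not quoted in this excerpt, but the natural tools are the law of total probability and Bayes' rule. The sketch below just lays out those computations for the numbers given above.

```python
# Law of total probability / Bayes' rule applied to the triage numbers above.
priors = {"critical": 0.15, "serious": 0.45, "stable": 0.40}
death_rate = {"critical": 0.50, "serious": 0.20, "stable": 0.02}

# P(death), obtained by conditioning on the triage category.
p_death = sum(priors[c] * death_rate[c] for c in priors)
print(f"P(death) = {p_death:.3f}")   # 0.173

# Posterior probability of each category given that the patient died.
for c in priors:
    posterior = priors[c] * death_rate[c] / p_death
    print(f"P({c} | death) = {posterior:.3f}")
```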
4. Suppose you toss a fair coin three times. Which of the following events are independent? Give mathematical justification for your answer.
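The candidate events are not reproduced in this excerpt. As a reminder of the independence check \(P(E \cap F) = P(E)P(F)\), the sketch below enumerates the 8 equally likely outcomes and tests two example pairs of events; the events chosen here are illustrative assumptions, not the problem's actual list.

```python
from itertools import product

# All 8 equally likely outcomes of three fair coin tosses.
outcomes = list(product("HT", repeat=3))

def prob(event):
    """Probability of an event given as a predicate on a single outcome."""
    return sum(1 for w in outcomes if event(w)) / len(outcomes)

# Illustrative example events (not the ones from the problem statement).
def first_is_heads(w):
    return w[0] == "H"

def third_is_heads(w):
    return w[2] == "H"

def exactly_two_heads(w):
    return w.count("H") == 2

print(prob(lambda w: first_is_heads(w) and third_is_heads(w)),
      prob(first_is_heads) * prob(third_is_heads))      # 0.25 0.25   -> independent
print(prob(lambda w: first_is_heads(w) and exactly_two_heads(w)),
      prob(first_is_heads) * prob(exactly_two_heads))   # 0.25 0.1875 -> not independent
```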
5. Shankar has decided to train to be a Carbucks Barista. Being young and inexperienced, he makes a mistake on any given order with probability 1/3 and makes the order correctly with probability 2/3, with errors independent across different orders.
Hint: Recall the formula for a geometric series: \[\text{For}\ \ -1<x<1,\ \ \ \ \ \ \ \ \frac{1}{1-x} = 1 + x + x^2 + \cdots = \sum_{k=0}^\infty x^k .\]
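The sub-questions for this problem are not reproduced in the excerpt, but the geometric-series hint suggests sums of terms like \((2/3)^{k-1}(1/3)\). The sketch below simulates the order on which Shankar's first mistake occurs and compares the observed frequencies with that geometric formula; the function name `first_mistake_order` is just illustrative.

```python
import random

def first_mistake_order(rng, p_error=1/3):
    """Return the index of the first order on which a mistake is made."""
    k = 1
    while rng.random() >= p_error:   # order made correctly, keep going
        k += 1
    return k

rng = random.Random(1)
n_trials = 200_000
counts = {}
for _ in range(n_trials):
    k = first_mistake_order(rng)
    counts[k] = counts.get(k, 0) + 1

for k in range(1, 6):
    simulated = counts.get(k, 0) / n_trials
    exact = (2/3) ** (k - 1) * (1/3)
    print(f"first mistake on order {k}: simulated {simulated:.4f}, exact {exact:.4f}")
```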
6. A small college town is supplied electricity by a company called Fluke Energy. This company has 10 different power generators supplying electricity to the town. Generator \(i\) fails with probability \(p_i\), independently of the other generators (so one generator failing does not affect the probability of another failing). Assume that any one generator produces enough power to supply the entire town. What is the probability that the town has a blackout? Here, the \(p_i\) are fixed numbers between 0 and 1. Your answer will be a formula in terms of the \(p_i\)’s.
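One natural reading is that, since any single generator can supply the whole town, a blackout requires every generator to fail at once, which by independence is a product over the \(p_i\). The sketch below just evaluates that product for some made-up failure probabilities, purely as an illustration.

```python
import math

# Made-up failure probabilities, used only to illustrate the product formula
# P(all 10 generators fail) = p_1 * p_2 * ... * p_10 under independence.
p = [0.05, 0.10, 0.02, 0.08, 0.03, 0.07, 0.04, 0.06, 0.09, 0.01]

blackout_prob = math.prod(p)
print(f"P(blackout) = {blackout_prob:.3e}")
```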
7. \(A\) and \(B\) play a series of games. Each game is independently won by \(A\) with probability \(p\) and by \(B\) with probability \(1-p\). They stop when the total number of wins of one of the players is two greater than that of the other player. The player with the greater number of total wins is the winner of the series.
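A Monte Carlo sketch of the stopping rule can help check whatever formula you derive; the value of \(p\) below is arbitrary and the helper name `a_wins_series` is just illustrative.

```python
import random

def a_wins_series(rng, p):
    """Play games until one player leads by two; return True if A wins the series."""
    lead = 0                      # (A's wins) - (B's wins)
    while abs(lead) < 2:
        lead += 1 if rng.random() < p else -1
    return lead == 2

rng = random.Random(2)
p = 0.6
n_trials = 200_000
estimate = sum(a_wins_series(rng, p) for _ in range(n_trials)) / n_trials
print(f"p = {p}: estimated P(A wins the series) ~ {estimate:.4f}")
```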
8. There is a 50-50 chance that the queen carries the gene for hemophilia. If she is a carrier, then each prince has a 50-50 chance of having hemophilia. (If she is not a carrier, the chance for each prince is zero.) If the queen has 3 princes without the disease, what is the probability that she is a carrier? If there is a fourth prince, what is the probability that he will have hemophilia (assuming the first 3 princes do not)?
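The sketch below just lays out the Bayes' rule arithmetic with exact fractions, using only the probabilities stated in the problem.

```python
from fractions import Fraction

# Prior probability that the queen is a carrier.
prior_carrier = Fraction(1, 2)

# Likelihood of 3 healthy princes under each hypothesis.
lik_if_carrier = Fraction(1, 2) ** 3      # each prince healthy with prob 1/2
lik_if_not = Fraction(1, 1)               # healthy with certainty

posterior_carrier = (prior_carrier * lik_if_carrier) / (
    prior_carrier * lik_if_carrier + (1 - prior_carrier) * lik_if_not
)
print("P(carrier | 3 healthy princes) =", posterior_carrier)   # 1/9

# A fourth prince is affected only if the queen is a carrier (prob 1/2 in that case).
p_fourth_affected = posterior_carrier * Fraction(1, 2)
print("P(4th prince has hemophilia)   =", p_fourth_affected)   # 1/18
```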