Here we calculate how many games we would expect to win by multiplying the total number of games by the probability of winning.
n <- 240              # total number of games played
p <- 1/4              # probability of winning a single game
q <- 1 - p            # probability of losing a single game
stdev <- sqrt(n*p*q)  # standard deviation of the number of wins
win <- n*p            # expected number of games won
lose <- n - win       # expected number of games lost
c(win, lose)
## [1] 60 180
Since we want to know how much money we would win or lose, we take the number of games we expect to win, multiply it by $2, and subtract the number of games we expect to lose (each loss costs $1). This means we can expect to lose $60 after playing 240 games.
ev <- win*2 - lose
ev
## [1] -60
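As a quick sanity check, here is a small Monte Carlo sketch (the seed and the 10,000 replications are arbitrary choices, not part of the original analysis): simulate the number of wins over 240 games many times and average the resulting winnings.

set.seed(1)                               # arbitrary seed for reproducibility
sim_wins <- rbinom(10000, n, p)           # simulated number of wins in each run of 240 games
sim_cash <- sim_wins*2 - (n - sim_wins)   # +$2 per win, -$1 per loss
mean(sim_cash)                            # should come out close to the theoretical -60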
Now let's calculate the probability of breaking even or coming out ahead. To break even we need to win at least 80 of the 240 games, since 80 wins pay $160, exactly offsetting the $160 lost on the other 160 games. Using a normal approximation with a continuity correction, the probability of being cash positive comes out to roughly 0.2%.
break_even_wins <- 80
z <- (break_even_wins - (1/2) - win)/stdev   # z-score with a continuity correction
z
## [1] 2.906888
1 - pnorm(z)
## [1] 0.001825217
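For comparison, the exact tail probability can be computed directly from the binomial distribution with pbinom; it should be in the same ballpark as the normal approximation above.

pbinom(break_even_wins - 1, size = n, prob = 1/4, lower.tail = FALSE)   # P(X >= 80) exactly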
The moment generating function of a discrete random variable is \[ M_x(t) = \sum\limits_{j=1}^\infty e^{tx_j}p(x_j) \]

We know that the probability function for the binomial distribution is \[ p(x) = \frac{n!}{x!(n-x)!}p^x q^{n-x} \]

Plugging this into the moment generating function gives \[ M_x(t) = \sum\limits_{x=0}^n e^{tx}\frac{n!}{x!(n-x)!}p^x q^{n-x} \]

By the binomial theorem this reduces to \[ M_x(t) = (q+pe^t)^n \]

To find the expected value and variance we take the first and second derivatives of the moment generating function: \[ M'_x(t) = n(q+pe^t)^{n-1}pe^t \] \[ M''_x(t) = npe^t(q+pe^t)^{n-2}(q+npe^t) \]

Evaluating the first derivative at t=0 gives the expected value \[ E(x) = np(q+p)^{n-1}=np \] np is what we expect, since q+p=1. Now let's calculate the variance: \[ V(x) = E(x^2) - E(x)^2 = np(q+np)-n^2p^2 = npq \] This is also as expected.
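As an informal check of this derivation (a sketch: the step size h is an arbitrary choice), we can differentiate the moment generating function numerically at t = 0, using the n, p and q from the game above, and compare the results against np = 60 and npq = 45.

M <- function(t) (q + p*exp(t))^n          # binomial MGF with the n, p, q defined earlier
h <- 1e-5                                  # small step for the finite differences
m1 <- (M(h) - M(-h)) / (2*h)               # approximates M'(0)  = E(x)
m2 <- (M(h) - 2*M(0) + M(-h)) / h^2        # approximates M''(0) = E(x^2)
c(m1, m2 - m1^2)                           # should be close to np = 60 and npq = 45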
Start with the moment generating function again, this time in its continuous form, since the exponential distribution has a density rather than a probability mass function:
\[ M_x(t) = \int_{-\infty}^{\infty} e^{tx}f(x)\,dx \]
Let's plug in the exponential density \[ f(x) = \lambda e^{-\lambda x}, \qquad x \ge 0 \] which gives \[ M_x(t) = \int_{0}^{\infty} e^{tx}\lambda e^{-\lambda x}\,dx = \lambda\int_{0}^{\infty} e^{(t-\lambda)x}\,dx = \frac{\lambda}{\lambda - t}, \qquad t < \lambda \]
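A quick numerical check (a sketch: lambda = 2 and t0 = 0.5 are arbitrary illustration values with t0 < lambda) compares this closed form against numerical integration of the defining integral.

lambda <- 2; t0 <- 0.5                                            # arbitrary values with t0 < lambda
integrate(function(x) exp(t0*x) * lambda * exp(-lambda*x), 0, Inf)$value
lambda / (lambda - t0)                                            # closed-form MGF; the two should agree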