Suppose that if a signal of intensity \(\mu\) is emitted from a particular star, then the value received at an observatory on earth is a normal random variable with mean \(\mu\) and standard deviation 4. In other words, the emitted signal value is altered by random noise that is normally distributed with mean 0 and standard deviation 4. It is suspected that the intensity of the signal is equal to 10.
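In symbols (restating the model, with \(X_i\) denoting the \(i\)-th received value), each \(X_i\) is \(\mathcal{N}(\mu, 4^2)\), so the average of \(n\) independent receptions satisfies
\[
\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i \sim \mathcal{N}\!\left(\mu,\ \frac{16}{n}\right).
\]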
Test whether this hypothesis is plausible if the same signal is independently received 20 times and the average of the 20 values received is 11.6. Use the 5 percent level of significance.
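A sketch of the standard two-sided \(z\)-test for a normal mean with known \(\sigma\), under the stated model (\(\sigma = 4\), \(n = 20\), \(\bar{x} = 11.6\), and \(H_0\colon \mu = 10\)):
\[
z = \frac{\bar{x} - \mu_0}{\sigma/\sqrt{n}} = \frac{11.6 - 10}{4/\sqrt{20}} \approx 1.789 < 1.96 = z_{0.025},
\]
so the data do not reject \(H_0\) at the 5 percent level; the two-sided \(p\)-value is roughly \(2\,\bigl(1 - \Phi(1.789)\bigr) \approx 0.074\).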
Also find a 95% confidence interval for \(\mu\) based on these 20 observations.
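Under the same assumptions, the usual 95 percent interval for a normal mean with known \(\sigma\) is
\[
\bar{x} \pm z_{0.025}\,\frac{\sigma}{\sqrt{n}} = 11.6 \pm 1.96 \cdot \frac{4}{\sqrt{20}} \approx 11.6 \pm 1.753,
\]
i.e. approximately \((9.85,\ 13.35)\), which agrees with the test above since the interval contains 10.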