A surveyor is measuring the height of a cliff known to be about 1000 feet. He assumes his instrument is properly calibrated and that his measurement errors are independent, with mean \(\mu = 0\) and variance \(\sigma^2 = 10\). He plans to take \(n\) measurements and form the average. Estimate, using (a) Chebyshev's inequality and (b) the normal approximation, how large \(n\) should be if he wants to be 95 percent sure that his average falls within 1 foot of the true value. Now estimate, using (a) and (b), what value the variance should have if he wants to make only 10 measurements with the same confidence.
(a) Chebyshev's inequality. The average \(\bar{X}_n\) of \(n\) measurements has mean \(\mu\) and variance \(\sigma^2/n = 10/n\), so

\(P\left(|\bar{X}_n - \mu| \ge 1\right) \le \frac{10/n}{1^2} = \frac{10}{n}\)

Being 95 percent sure that the average falls within 1 foot means this tail probability must be at most \(0.05\):

\(\frac{10}{n} \le 0.05 \quad\Rightarrow\quad n \ge 200\)
(b) Normal approximation. By the central limit theorem, \(\bar{X}_n\) is approximately \(N(\mu,\, 10/n)\). For a two-sided 95 percent interval the relevant quantile is \(\Phi^{-1}(0.975) = 1.96\), so we need

\(1.96\sqrt{10/n} \le 1 \quad\Rightarrow\quad n \ge (1.96)^2 \cdot 10 \approx 38.4\)

so \(n = 39\) measurements suffice.
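As a quick sanity check, here is a minimal Monte Carlo sketch of the first part. It assumes normally distributed errors, which the problem does not actually specify (only the mean and variance are given), and confirms that \(n = 39\) averages land within 1 foot of the truth about 95 percent of the time:

```python
import numpy as np

rng = np.random.default_rng(0)
true_height = 1000.0   # feet
sigma2 = 10.0          # error variance given in the problem
n = 39                 # sample size from the normal approximation

# Simulate many surveys; each survey averages n noisy measurements.
trials = 100_000
errors = rng.normal(0.0, np.sqrt(sigma2), size=(trials, n))
averages = true_height + errors.mean(axis=1)

coverage = np.mean(np.abs(averages - true_height) <= 1.0)
print(f"P(|average - truth| <= 1 ft) ~= {coverage:.3f}")  # ~0.95
```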
Now fix \(n = 10\) and solve for the variance.

(a) Chebyshev's inequality:

\(\frac{\sigma^2}{10 \cdot 1^2} \le 0.05 \quad\Rightarrow\quad \sigma^2 \le 0.5\)
(b) Normal approximation:

\(1.96\sqrt{\sigma^2/10} \le 1 \quad\Rightarrow\quad \sigma^2 \le \frac{10}{(1.96)^2} \approx 2.60\)
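The same sketch, again under the normality assumption, verifies the second answer: with only 10 measurements, a variance of about 2.60 still gives roughly 95 percent coverage.

```python
import numpy as np

rng = np.random.default_rng(1)
true_height = 1000.0
n = 10                       # only 10 measurements allowed
sigma2 = 10.0 / 1.96**2      # ~2.60, from the normal approximation

trials = 100_000
errors = rng.normal(0.0, np.sqrt(sigma2), size=(trials, n))
averages = true_height + errors.mean(axis=1)

coverage = np.mean(np.abs(averages - true_height) <= 1.0)
print(f"P(|average - truth| <= 1 ft) ~= {coverage:.3f}")  # ~0.95
```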