Review concepts

Q1. Derivation of the least squares estimates

Q2. Derivation from the mean form
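
For reference, a sketch of the least squares estimates that Q1 asks you to derive, assuming the simple regression model \(y_{i}=\beta_{1}+\beta_{2}x_{i}+e_{i}\) with sample means \(\bar{x}\) and \(\bar{y}\); the same \(b_{1}\) and \(b_{2}\) are used in the exercises below:

\[b_{2}=\frac{\sum \left ( x_{i}-\bar{x} \right )\left ( y_{i}-\bar{y} \right )}{\sum \left ( x_{i}-\bar{x} \right )^{2}},\qquad b_{1}=\bar{y}-b_{2}\bar{x}\]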

Q3. Explain the difference between estimators and estimates.

Q4. What does “linear” stand for in a linear regression? Identify which of the cases below are “linear” regressions.

Q5. What is elasticity? (Give its formula for a linear relationship and explain it with a simple example.)

Q6. Fill in the blanks in the table below and complete the calculations.

Some useful functions, their derivatives, and elasticities

| Name | Function | Slope | Elasticity |
|------------|--------------------------------------|---------------|--------------------------|
| linear | \(y=\beta_{1}+\beta_{2}x\) | \(\beta_{2}\) | \(\beta_{2}\frac{x}{y}\) |
| quadratic | \(y=\beta_{1}+\beta_{2}x^{2}\) | | |
| cubic | \(y=\beta_{1}+\beta_{2}x^{3}\) | | |
| log-log | \(\ln(y)=\beta_{1}+\beta_{2}\ln(x)\) | | |
| log-linear | \(\ln(y)=\beta_{1}+\beta_{2}x\) | | |
| linear-log | \(y=\beta_{1}+\beta_{2}\ln(x)\) | | |

Example: linear

\[\frac{\mathrm{d}y }{\mathrm{d} x}=\frac{\mathrm{d}\left ( \beta_{1}+\beta_{2}x \right ) }{\mathrm{d} x}=\beta_{2}\]

\[\varepsilon =\frac{\mathrm{d}y/y}{\mathrm{d}x/x}=\frac{\mathrm{d}y }{\mathrm{d} x}\times \frac{x}{y}=\mathrm{slope}\times \frac{x}{y}=\beta_{2}\frac{x}{y}\]
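
The remaining rows can be derived the same way. A quick way to check the Q6 entries is a Python/SymPy sketch that differentiates each functional form and computes the elasticity as slope \(\times\ x/y\):

```python
import sympy as sp

x, b1, b2 = sp.symbols("x beta_1 beta_2", positive=True)

# Each functional form with y written explicitly in terms of x.
# For the log-log and log-linear forms, y is recovered by exponentiating.
forms = {
    "linear":     b1 + b2 * x,
    "quadratic":  b1 + b2 * x**2,
    "cubic":      b1 + b2 * x**3,
    "log-log":    sp.exp(b1 + b2 * sp.log(x)),  # ln(y) = beta_1 + beta_2 ln(x)
    "log-linear": sp.exp(b1 + b2 * x),          # ln(y) = beta_1 + beta_2 x
    "linear-log": b1 + b2 * sp.log(x),          # y = beta_1 + beta_2 ln(x)
}

for name, y in forms.items():
    slope = sp.diff(y, x)                    # dy/dx
    elasticity = sp.simplify(slope * x / y)  # (dy/dx) * (x/y)
    print(f"{name:10s}  slope = {sp.simplify(slope)}   elasticity = {elasticity}")
```

For the linear row this reproduces the slope \(\beta_{2}\) and elasticity \(\beta_{2}\frac{x}{y}\) derived above; the other rows fill in the blanks.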

Exercise

PROBLEM 1

Consider the following five observations. You are to do all the parts of this exercise using only a calculator.

[Table T2]

  (a) Complete the entries in the table. Put the sums in the last row. What are the sample means \(\bar{x}\) and \(\bar{y}\)?

  (b) Calculate \(b_{1}\) and \(b_{2}\) using the results of Q1 and Q2.

  (c) Compute \(\sum_{i=1}^{5}x_{i}^{2}\) and \(\sum_{i=1}^{5}x_{i}y_{i}\). Using these numerical values, show that \(\sum \left ( x_{i}-\bar{x} \right )^{2}=\sum x_{i}^{2}-N\bar{x}^{2}\) and \(\sum \left ( x_{i}-\bar{x} \right )\left ( y_{i}-\bar{y} \right )=\sum x_{i}y_{i}-N\bar{x}\bar{y}\). (A numerical check is sketched after part (h).)

  (d) Use the least squares estimates to compute the fitted values of \(y\), and complete the remainder of the table below. Put the sums in the last row.

[Table T3]

  (e) On graph paper, plot the data points and sketch the fitted regression line \(\hat{y}_{i}=b_{1}+b_{2}x_{i}\).

  (f) On the sketch in part (e), locate the point of the means \((\bar{x},\bar{y})\). Does your fitted line pass through that point? If not, go back to the drawing board, literally.

  (g) Show that for these numerical values \(\bar{y}=b_{1}+b_{2}\bar{x}\).

  (h) Show that for these numerical values \(\bar{\hat{y}}=\bar{y}\), where \(\bar{\hat{y}}=\frac{\sum \hat{y}_{i} }{N}\).
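
A minimal Python sketch of the calculator work in parts (a)–(d) and the checks in parts (c), (g), and (h). The `x` and `y` arrays are placeholders, since the Table T2 values are not reproduced here; substitute the five observations from the table.

```python
import numpy as np

# Placeholder values -- replace with the five observations from Table T2.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.0, 5.0, 4.0, 6.0])
N = len(x)

x_bar, y_bar = x.mean(), y.mean()

# Parts (a)-(b): sample means and the least squares estimates.
b2 = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
b1 = y_bar - b2 * x_bar

# Part (c): the two algebraic identities, checked numerically.
lhs1 = np.sum((x - x_bar) ** 2)
rhs1 = np.sum(x ** 2) - N * x_bar ** 2
lhs2 = np.sum((x - x_bar) * (y - y_bar))
rhs2 = np.sum(x * y) - N * x_bar * y_bar

# Part (d): fitted values. Parts (g)-(h): the fitted line passes through
# the point of the means, and the mean of the fitted values equals y_bar.
y_hat = b1 + b2 * x

print("x_bar =", x_bar, " y_bar =", y_bar)
print("b1 =", b1, " b2 =", b2)
print("identity 1:", lhs1, "=", rhs1)
print("identity 2:", lhs2, "=", rhs2)
print("b1 + b2*x_bar =", b1 + b2 * x_bar, " mean of fitted values =", y_hat.mean())
```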

PROBLEM 2

Graph the following observations of x and y on graph paper (optional: in Excel or by hand).

| x | 1 | 2 | 3 | 4 | 5 | 6 |
|---|---|---|---|---|---|---|
| y | 4 | 6 | 7 | 7 | 9 | 10 |

  (a) Using a ruler, draw a line that fits through the data. Measure the slope and intercept of the line you have drawn.

  (b) Use the results of Q1 and Q2 to compute the least squares estimates of the slope and the intercept. Plot this line on your graph. (A checking sketch follows part (e).)

  (c) Obtain the sample means \(\bar{y}= \frac{\sum y_{i}}{N}\) and \(\bar{x}= \frac{\sum x_{i}}{N}\). Obtain the predicted value of \(y\) for \(x=\bar{x}\) and plot it on your graph. What do you observe about this predicted value?

  (d) Using the least squares estimates from (b), compute the least squares residuals \(\hat{\epsilon}_{i}\). Find their sum.

  (e) Calculate \(\sum x_{i}\hat{\epsilon}_{i}\).
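
A minimal Python sketch for checking parts (b)–(e) with the data above:

```python
import numpy as np

# Data from the table above.
x = np.array([1, 2, 3, 4, 5, 6], dtype=float)
y = np.array([4, 6, 7, 7, 9, 10], dtype=float)

x_bar, y_bar = x.mean(), y.mean()

# Part (b): least squares estimates of the slope and the intercept.
b2 = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
b1 = y_bar - b2 * x_bar

# Part (c): the predicted value of y at x = x_bar lies on the fitted line,
# so it should coincide with y_bar.
y_pred_at_mean = b1 + b2 * x_bar

# Parts (d)-(e): least squares residuals and the two sums; with an intercept
# in the model, both sums are zero up to floating-point rounding.
e_hat = y - (b1 + b2 * x)

print("b1 =", b1, " b2 =", b2)
print("prediction at x_bar =", y_pred_at_mean, " y_bar =", y_bar)
print("sum of residuals =", e_hat.sum())
print("sum of x_i * residual_i =", np.sum(x * e_hat))
```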