class: center, middle, inverse, title-slide
.title[ # Static Panel Models ]
.author[ ### Rogers Ochenge ]
.date[ ### 2023/07/09 (updated: 2023-11-27) ]

---

---

# Data Types

Sample observations can be:

--

1. **Cross-sectional data**: observations refer to different individuals `\(i\)` (countries, families, companies, stocks, etc.) observed at a specific time (for example, in the year 2020)

--

2. **Time series data**: observations refer to the same *individual* (economic phenomenon) observed at different times `\(t\)` (for example, inflation in Kenya over the 2000-2022 period)

--

3. **Panel data**: observations have two dimensions, `\(it\)`

--

---

# Some notations on panel data

--

- Panel data gather information about several individuals (cross-sectional units) over several periods.
- The panel is **balanced** if all units are observed in all periods; if some units are missing in some periods, the panel is **unbalanced**.

--

- Equation 1 gives the form of a pooled panel data model, where the subscript `\(i=1,\ldots,N\)` denotes an individual (cross-sectional unit), and `\(t=1,\ldots,T\)` denotes the time period, or longitudinal unit. The total number of observations in the panel is `\(N\times T\)`.

--

`\begin{equation} y_{it}=\beta_{1}+\beta_{2}x_{2it}+\ldots +\beta_{K}x_{Kit}+e_{it} \end{equation}`

--

- A **wide panel** has the cross-sectional dimension (N) much larger than the longitudinal dimension (T); when the opposite is true, we have a **long** panel.

--

- Normally, the same units are observed in all periods; when this is not the case and each period samples mostly different units, the result is not a proper panel, but a **pooled cross-sections** model.

--

---

# Example: Returns to Schooling

--

```r
## Load required packages
rm(list=ls()) # Removes all items in Environment!
#install.packages("remotes") ## Preparing to install PoE data
#remotes::install_github("ccolonescu/PoEdata") ## PoE data
library(plm) # New package: plm (Croissant and Millo 2015).
library(PoEdata) #for PoE4 datasets library(knitr) #for `kable()` ``` -- --- -- ```r library(AER) ``` ``` ## Loading required package: car ``` ``` ## Loading required package: carData ``` ``` ## Loading required package: lmtest ``` ``` ## Warning: package 'lmtest' was built under R version 4.0.5 ``` ``` ## Loading required package: zoo ``` ``` ## ## Attaching package: 'zoo' ``` ``` ## The following objects are masked from 'package:base': ## ## as.Date, as.Date.numeric ``` ``` ## Loading required package: sandwich ``` ``` ## Loading required package: survival ``` ```r library(xtable) ``` -- --- -- ```r data(nls_panel, package="PoEdata") head(nls_panel) ``` ``` ## id year lwage hours age educ collgrad msp nev_mar not_smsa c_city south ## 1 1 82 1.808289 38 30 12 0 1 0 0 1 0 ## 2 1 83 1.863417 38 31 12 0 1 0 0 1 0 ## 3 1 85 1.789367 38 33 12 0 0 0 0 1 0 ## 4 1 87 1.846530 40 35 12 0 0 0 0 1 0 ## 5 1 88 1.856449 40 37 12 0 0 0 0 1 0 ## 6 2 82 1.280933 48 36 17 1 1 0 0 0 0 ## black union exper exper2 tenure tenure2 ## 1 1 1 7.666667 58.77777 7.666667 58.777770 ## 2 1 1 8.583333 73.67361 8.583333 73.673610 ## 3 1 1 10.179490 103.62200 1.833333 3.361111 ## 4 1 1 12.179490 148.33990 3.750000 14.062500 ## 5 1 1 13.621790 185.55330 5.250000 27.562500 ## 6 0 0 7.576923 57.40976 2.416667 5.840278 ``` -- --- -- ### Set the data as panel ```r nlspd <- pdata.frame(nls_panel, index=c("id", "year")) ``` -- --- -- ## Display a sample of the data ```r smpl <- nlspd[nlspd$id %in% c(1,2),c(1:6, 14:15)] tbl <- xtable(smpl) kable(tbl, digits=4, align="c", caption="A data sample") ``` Table: A data sample | | id | year | lwage | hours | age | educ | union | exper | |:----|:--:|:----:|:------:|:-----:|:---:|:----:|:-----:|:-------:| |1-82 | 1 | 82 | 1.8083 | 38 | 30 | 12 | 1 | 7.6667 | |1-83 | 1 | 83 | 1.8634 | 38 | 31 | 12 | 1 | 8.5833 | |1-85 | 1 | 85 | 1.7894 | 38 | 33 | 12 | 1 | 10.1795 | |1-87 | 1 | 87 | 1.8465 | 40 | 35 | 12 | 1 | 12.1795 | |1-88 | 1 | 88 | 1.8564 | 40 | 37 | 12 | 1 | 13.6218 | |2-82 | 2 | 82 | 1.2809 | 48 | 36 | 17 | 0 | 7.5769 | |2-83 | 2 | 83 | 1.5159 | 43 | 37 | 17 | 0 | 8.3846 | |2-85 | 2 | 85 | 1.9302 | 35 | 39 | 17 | 0 | 10.3846 | |2-87 | 2 | 87 | 1.9190 | 42 | 41 | 17 | 1 | 12.0385 | |2-88 | 2 | 88 | 2.2010 | 42 | 43 | 17 | 1 | 13.2115 | -- --- - Function `pdim()` extracts the dimensions of the panel data: ```r pdim(nlspd) ``` ``` ## Balanced Panel: n = 716, T = 5, N = 3580 ``` --- ### Example: Firm investment -- - Suppose `\(y\)` is investment and `\(x\)` is a measure of profit. We have `\(i=1,\ldots N\)` companies and `\(t=1,\ldots T\)` time periods. Suppose we specify a simple econometric model which says that investment depends on profit: `\begin{equation} y_{it}=a_{0}+a_{1}x_{it}+u_{it} \hspace{5cm} (1) \end{equation}` `\(u_{it}\)` is a random error term: is the `\(E(u_{it}) \sim N(0,\sigma^{2})\)` -- -- - Estimation of this firm investment equation (1) depends on the assumptions that we make about the intercept `\((a_{0})\)`, the slope coefficient `\((a_{1})\)` and the error term `\((u_{it})\)` -- --- -- Several possible assumptions can be made in order to estimate (1): -- -- 1. Assume that the intercept and slope coefficients are constant across time and firms and that the error term captures differences over time and over firms. -- -- 2. The slope coefficient is constant but the intercept varies over firms. -- -- 3. The slope coefficient is constant but the intercept varies over firms and over time. -- -- 4. All coefficients (intercept and slope) vary over firms. -- -- 5. 
The intercept as well as the slope vary over firms and time.

--

---

```r
knitr::include_graphics("p1.png")
```

<img src="p1.png" width="100%" />

---

# Unobserved Heterogeneity

--

- Unobservable individual differences are called unobserved heterogeneity in the economics and econometrics literature.

--

- When using panel data, it is important to separate this component of the random error term from the other components if we can argue that the factors causing the individual differences are unchanging over time.

--

- The beauty of having panel data is that we can control for omitted variables bias caused by time-invariant omitted variables.

--

---

# Benefits of Panel Data

--

- Panel data have several advantages over cross-sectional or time series data. We can mention the following advantages of using panel data.

--

- First, in panel data the number of data points is increased. If there are N cross-sectional units and T time periods, the total number of observations is NT. Panel data therefore provide more degrees of freedom and more variability than either cross-sectional or time series data, so the econometric estimates are more efficient.

--

- Second, panel data are helpful in constructing and testing more complicated behavioral hypotheses. One can control for the unobserved heterogeneity among the individual cross-sectional units by using panel data.

--

- Third, panel data contain information on intertemporal dynamics and may allow us to control for the effects of unobserved variables when estimating a model. The collinearity between current and lagged variables can be reduced by using panel data. A long panel is useful for carrying out dynamic analysis.

--

---

--

- Fourth, panel data are helpful in providing micro foundations for aggregate data analysis. If micro units are heterogeneous, the time series properties of aggregate data will be very different from those of disaggregated data. In this case, predicting aggregate outcomes using aggregate time series may be misleading. The use of panel data can resolve this problem by capturing the heterogeneity.

--

- Fifth, in panel data, if observations among cross-sectional units are independent, one can show using the central limit theorem that the limiting distributions of many estimators remain asymptotically normal even for non-stationary series.

--

---

# Sources of Variation in Panel Data

--

- Panel data can capture the within-group variation, the between-group variation and the total variation of a variable by using different types of means.

--

- We can calculate the mean over time for each entity separately:

--

- `\(\overline{x}_{i.}=\dfrac{1}{T}\sum_{t=1}^{T}x_{it}\)`

--

- `\(\overline{y}_{i.}=\dfrac{1}{T}\sum_{t=1}^{T}y_{it}\)`

--

- Similarly, the mean across entities can be calculated for every time period.
--

- `\(\overline{x}_{.t}=\dfrac{1}{N}\sum_{i=1}^{N}x_{it}\)`

--

- `\(\overline{y}_{.t}=\dfrac{1}{N}\sum_{i=1}^{N}y_{it}\)`

--

---

--

- By taking all entities over the total period, we can calculate the overall mean:

--

- `$$\overline{x}_{..}=\dfrac{1}{NT}\sum_{i=1}^{N}\sum_{t=1}^{T}x_{it}$$`
`$$\overline{y}_{..}=\dfrac{1}{NT}\sum_{i=1}^{N}\sum_{t=1}^{T}y_{it}$$`

--

---

--

- The within-entity variation of a variable `\(x\)` for a particular cross-sectional unit `\(i\)` is defined as

--

- `\(S_{XXi}^{w}=\sum_{t=1}^{T}(x_{it}-\overline{x}_{i.})^{2}\)`

--

- For all cross-sectional units, the sum of squares measuring the within-entity variation of `\(x\)` is

--

- `\(S_{XX}^{w}=\sum_{i=1}^{N}\sum_{t=1}^{T}(x_{it}-\overline{x}_{i.})^{2}\)`

--

- Similarly, the sum of cross products measuring the covariance between two variables X and Y within a particular cross-sectional unit `\(i\)` is defined as

--

- `\(S_{XYi}^{w}=\sum_{t=1}^{T}(X_{it}-\overline{X}_{i.})(Y_{it}-\overline{Y}_{i.})\)`

--

- Thus, the sum of cross products measuring the within-group covariance between X and Y over all cross-sectional units is

--

- `\(S_{XY}^{w}=\sum_{i=1}^{N}\sum_{t=1}^{T}(X_{it}-\overline{X}_{i.})(Y_{it}-\overline{Y}_{i.})\)`

--

---

--

- The sum of squares measuring the between-entity variation of a variable X is:

--

- `\(S_{XX}^{B}=\sum_{i=1}^{N}\sum_{t=1}^{T}(\overline{X}_{i.}-\overline{X}_{..})^{2}\)`

--

- The cross product measuring the covariance of two variables between groups is

--

- `\(S_{XY}^{B}=\sum_{i=1}^{N}\sum_{t=1}^{T}(\overline{X}_{i.}-\overline{X}_{..})(\overline{Y}_{i.}-\overline{Y}_{..})\)`

--

- The total variation of X is defined as the sum of squared deviations of the variable from its overall mean:

--

- `\(S_{XX}^{T}=\sum_{i=1}^{N}\sum_{t=1}^{T}(X_{it}-\overline{X}_{..})^{2}\)`

--

- Similarly, the total covariance between X and Y is

--

- `\(S_{XY}^{T}=\sum_{i=1}^{N}\sum_{t=1}^{T}(X_{it}-\overline{X}_{..})(Y_{it}-\overline{Y}_{..})\)`

--

---

--

- We can prove that
`$$S_{XX}^{T}=S_{XX}^{w}+S_{XX}^{B}$$`

--

- See here: http://rizaudinsahlan.blogspot.com/2016/06/within-and-between-variation-in-panel.html

--

---

# Pooled OLS

--

- If we assume that cross-sectional units are homogeneous, then we can estimate a pooled regression, that is, a multiple linear regression model applied to the pooled panel data.

--

- This model is based on the assumptions needed for the multiple linear regression model: exogeneity, homoskedasticity, non-autocorrelation and full rank.

--

- Under these assumptions, OLS produces efficient and consistent parameter estimates provided that the conditional distribution of the errors does not vary across entities (`\(i\)`) or over time (`\(t\)`).

--

- A pooled model has the specification in Equation 1, which does not allow for intercept or slope differences among individuals. Such a model can be estimated in R using the specification `model="pooling"` in the `plm()` function, as the following code sequence illustrates.
--

---

```r
library(broom) # for `glance()` and `tidy()`
wage.pooled <- plm(lwage~educ + exper+I(exper^2)+
                   tenure+I(tenure^2)+black+south+union,
                   model="pooling", data=nlspd)
kable(tidy(wage.pooled), digits=3, caption="Pooled model")
```

Table: Pooled model

|term | estimate| std.error| statistic| p.value|
|:-----------|--------:|---------:|---------:|-------:|
|(Intercept) | 0.477| 0.056| 8.487| 0.000|
|educ | 0.071| 0.003| 26.567| 0.000|
|exper | 0.056| 0.009| 6.470| 0.000|
|I(exper^2) | -0.001| 0.000| -3.176| 0.002|
|tenure | 0.015| 0.004| 3.394| 0.001|
|I(tenure^2) | 0.000| 0.000| -1.886| 0.059|
|black | -0.117| 0.016| -7.426| 0.000|
|south | -0.106| 0.014| -7.465| 0.000|
|union | 0.132| 0.015| 8.839| 0.000|

---

## Error Component Model

--

- One way to restore homogeneity across `\(i\)` or over `\(t\)`, and to solve the endogeneity problem, is to decompose the random error; the model so developed is known as the error component model.

--

- If the error is decomposed in one way, either cross-section-specific or time-specific, it is called a one-way error component model.

--

- When the error is decomposed into both cross-section-specific and time-specific components, it is a two-way error component model.

--

- In a one-way error component model, the random disturbance is decomposed into a cross-section-specific error `\(\mu_{i}\)` (or a time-specific error `\(\lambda_{t}\)`) and an idiosyncratic error `\(\epsilon_{it}\)`.

--

- One-way (cross-section):
`$$u_{it}=\mu_{i}+\epsilon_{it}$$`

--

- One-way (time-specific):
`$$u_{it}=\lambda_{t}+\epsilon_{it}$$`

--

---

--

- Two-way (cross-section and time):
`$$u_{it}=\mu_{i}+\lambda_{t}+\epsilon_{it}$$`

--

- In general, a two-way error component model is expressed as
`$$y_{it}=X_{it}\beta +\mu_{i}+\lambda_{t}+\epsilon_{it}$$`

--

- The error component model can be estimated by applying either fixed effects or random effects. When the error component is assumed to be non-stochastic, it is a fixed effects model; when the error component is treated as random, it becomes a random effects model.

--

---

# Fixed Effects Estimation

--

- It is evident that a better way to model the data would be to allow each cross-sectional unit (each individual in the wage example, each firm in the investment example) to have its own intercept. This implies we modify the model as follows:
`\begin{equation} y_{it}=\beta_{2}x_{2it}+\ldots +\beta_{K}x_{Kit}+\mu_{i}+e_{it} \end{equation}`
- This is known as the (One-Way) Fixed Effects Model.

--

- The simplest way to allow each unit to have its own intercept is to create a set of dummy (binary) variables, one for each cross-sectional unit, and include them as regressors:
`\begin{equation} y_{it}=\sum_{j=1}^{N}\beta_{1j}D_{ji}+\beta_{2}x_{2it}+\ldots +\beta_{K}x_{Kit}+e_{it} \end{equation}`
- Consequently, this form of estimation is also known as Least Squares Dummy Variables (LSDV). (Note that there is no constant in this regression.)
- Note: because the constant is excluded, all `\(N\)` dummies can be included; if a constant were kept, only `\(N-1\)` dummies could be used, to avoid the dummy variable trap!
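- As a quick illustration (a sketch added here, not part of the original code), `model.matrix()` can be used to inspect the dummy columns that `factor(id)` generates when the intercept is removed with `-1`; it assumes the `nlspd` panel created earlier is in memory.

```r
## Sketch: the dummy (binary) columns that factor(id) expands into when the
## intercept is dropped with -1 (one column per cross-sectional unit).
D <- model.matrix(~ factor(id) - 1, data = nlspd)
head(D[, 1:4])   # first rows of the first four individual dummies
```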
-- --- - We create a small sample from the original nls_panel data (specifically we select the first 10 individuals) -- ```r nls10 <- pdata.frame(nls_panel[nls_panel$id %in% 1:10,]) pdim(nls10) ``` ``` ## Balanced Panel: n = 10, T = 5, N = 50 ``` -- --- ```r wage.lsdv <- lm(lwage~educ+ exper+I(exper^2)+ tenure+I(tenure^2)+union+factor(id)-1, data=nlspd) kable(tidy(wage.lsdv), digits=3, caption="Least squares dummy variable model") ``` Table: Least squares dummy variable model |term | estimate| std.error| statistic| p.value| |:-------------|--------:|---------:|---------:|-------:| |educ | 0.074| 0.008| 9.262| 0.000| |exper | 0.041| 0.007| 6.217| 0.000| |I(exper^2) | 0.000| 0.000| -1.510| 0.131| |tenure | 0.014| 0.003| 4.231| 0.000| |I(tenure^2) | -0.001| 0.000| -4.339| 0.000| |union | 0.064| 0.014| 4.472| 0.000| |factor(id)1 | 0.460| 0.125| 3.689| 0.000| |factor(id)2 | 0.086| 0.153| 0.562| 0.574| |factor(id)3 | 0.546| 0.125| 4.363| 0.000| |factor(id)4 | 0.869| 0.126| 6.880| 0.000| |factor(id)5 | 1.058| 0.135| 7.862| 0.000| |factor(id)6 | 0.849| 0.147| 5.784| 0.000| |factor(id)7 | 0.774| 0.141| 5.504| 0.000| |factor(id)8 | 0.595| 0.140| 4.240| 0.000| |factor(id)9 | 0.510| 0.140| 3.632| 0.000| |factor(id)10 | 0.756| 0.134| 5.620| 0.000| |factor(id)11 | 0.669| 0.134| 4.982| 0.000| |factor(id)12 | 0.869| 0.149| 5.848| 0.000| |factor(id)13 | 1.231| 0.159| 7.737| 0.000| |factor(id)14 | 0.851| 0.126| 6.763| 0.000| |factor(id)15 | 0.050| 0.130| 0.381| 0.703| |factor(id)16 | 0.336| 0.124| 2.706| 0.007| |factor(id)17 | 1.028| 0.120| 8.581| 0.000| |factor(id)18 | 1.187| 0.124| 9.609| 0.000| |factor(id)19 | 0.941| 0.125| 7.522| 0.000| |factor(id)20 | 0.601| 0.127| 4.728| 0.000| |factor(id)21 | 0.740| 0.126| 5.857| 0.000| |factor(id)22 | 0.702| 0.125| 5.595| 0.000| |factor(id)23 | 0.620| 0.124| 5.008| 0.000| |factor(id)24 | 0.279| 0.124| 2.249| 0.025| |factor(id)25 | 0.151| 0.160| 0.947| 0.344| |factor(id)26 | 0.824| 0.120| 6.862| 0.000| |factor(id)27 | -0.062| 0.153| -0.402| 0.687| |factor(id)28 | 0.570| 0.124| 4.595| 0.000| |factor(id)29 | 0.936| 0.125| 7.509| 0.000| |factor(id)30 | 0.771| 0.159| 4.839| 0.000| |factor(id)31 | 0.528| 0.114| 4.612| 0.000| |factor(id)32 | 0.421| 0.129| 3.273| 0.001| |factor(id)33 | 0.550| 0.146| 3.764| 0.000| |factor(id)34 | 1.061| 0.148| 7.178| 0.000| |factor(id)35 | 0.199| 0.147| 1.351| 0.177| |factor(id)36 | 1.396| 0.161| 8.699| 0.000| |factor(id)37 | -0.003| 0.154| -0.021| 0.984| |factor(id)38 | -0.082| 0.124| -0.661| 0.509| |factor(id)39 | 0.631| 0.141| 4.474| 0.000| |factor(id)40 | 0.512| 0.124| 4.125| 0.000| |factor(id)41 | 0.367| 0.125| 2.942| 0.003| |factor(id)42 | 0.184| 0.124| 1.483| 0.138| |factor(id)43 | 0.667| 0.124| 5.379| 0.000| |factor(id)44 | 0.591| 0.159| 3.720| 0.000| |factor(id)45 | 0.392| 0.135| 2.908| 0.004| |factor(id)46 | 1.396| 0.147| 9.529| 0.000| |factor(id)47 | 0.449| 0.124| 3.623| 0.000| |factor(id)48 | 0.565| 0.148| 3.827| 0.000| |factor(id)49 | 0.366| 0.136| 2.689| 0.007| |factor(id)50 | 0.649| 0.134| 4.837| 0.000| |factor(id)51 | 0.710| 0.136| 5.229| 0.000| |factor(id)52 | 0.418| 0.134| 3.110| 0.002| |factor(id)53 | 0.189| 0.130| 1.456| 0.145| |factor(id)54 | 0.182| 0.147| 1.236| 0.217| |factor(id)55 | 0.267| 0.125| 2.135| 0.033| |factor(id)56 | 0.832| 0.124| 6.708| 0.000| |factor(id)57 | 0.724| 0.125| 5.785| 0.000| |factor(id)58 | 0.388| 0.131| 2.972| 0.003| |factor(id)59 | 0.097| 0.147| 0.660| 0.510| |factor(id)60 | 0.443| 0.124| 3.580| 0.000| |factor(id)61 | 0.434| 0.124| 3.500| 0.000| |factor(id)62 | 0.643| 0.135| 4.760| 0.000| 
|factor(id)63 | 0.002| 0.146| 0.016| 0.987| |factor(id)64 | -0.740| 0.119| -6.228| 0.000| |factor(id)65 | 0.518| 0.124| 4.182| 0.000| |factor(id)66 | 0.549| 0.147| 3.740| 0.000| |factor(id)67 | 0.616| 0.125| 4.920| 0.000| |factor(id)68 | 0.634| 0.124| 5.119| 0.000| |factor(id)69 | 0.820| 0.134| 6.108| 0.000| |factor(id)70 | 0.388| 0.119| 3.266| 0.001| |factor(id)71 | 0.502| 0.124| 4.034| 0.000| |factor(id)72 | 0.500| 0.124| 4.036| 0.000| |factor(id)73 | 0.442| 0.148| 2.990| 0.003| |factor(id)74 | 0.886| 0.123| 7.176| 0.000| |factor(id)75 | 0.441| 0.148| 2.985| 0.003| |factor(id)76 | 0.230| 0.137| 1.683| 0.092| |factor(id)77 | 0.698| 0.125| 5.565| 0.000| |factor(id)78 | 0.143| 0.124| 1.150| 0.250| |factor(id)79 | 0.483| 0.125| 3.869| 0.000| |factor(id)80 | 0.253| 0.136| 1.866| 0.062| |factor(id)81 | -0.117| 0.124| -0.942| 0.346| |factor(id)82 | 0.270| 0.124| 2.178| 0.029| |factor(id)83 | 1.028| 0.124| 8.315| 0.000| |factor(id)84 | 1.168| 0.148| 7.901| 0.000| |factor(id)85 | 0.771| 0.125| 6.181| 0.000| |factor(id)86 | 0.839| 0.124| 6.740| 0.000| |factor(id)87 | 0.514| 0.125| 4.124| 0.000| |factor(id)88 | 0.934| 0.147| 6.373| 0.000| |factor(id)89 | 0.579| 0.106| 5.472| 0.000| |factor(id)90 | 0.681| 0.148| 4.587| 0.000| |factor(id)91 | 0.399| 0.124| 3.220| 0.001| |factor(id)92 | 0.765| 0.154| 4.971| 0.000| |factor(id)93 | 0.797| 0.124| 6.423| 0.000| |factor(id)94 | 0.616| 0.160| 3.848| 0.000| |factor(id)95 | 1.250| 0.124| 10.118| 0.000| |factor(id)96 | 0.587| 0.124| 4.747| 0.000| |factor(id)97 | 0.285| 0.153| 1.863| 0.063| |factor(id)98 | 0.764| 0.124| 6.141| 0.000| |factor(id)99 | 0.974| 0.125| 7.826| 0.000| |factor(id)100 | 0.266| 0.124| 2.149| 0.032| |factor(id)101 | 0.648| 0.143| 4.530| 0.000| |factor(id)102 | 0.711| 0.119| 5.987| 0.000| |factor(id)103 | 0.557| 0.124| 4.493| 0.000| |factor(id)104 | 0.404| 0.161| 2.516| 0.012| |factor(id)105 | 0.432| 0.159| 2.715| 0.007| |factor(id)106 | 0.262| 0.147| 1.786| 0.074| |factor(id)107 | 0.815| 0.124| 6.562| 0.000| |factor(id)108 | 0.795| 0.128| 6.220| 0.000| |factor(id)109 | 0.426| 0.120| 3.561| 0.000| |factor(id)110 | 0.716| 0.115| 6.238| 0.000| |factor(id)111 | 0.346| 0.124| 2.788| 0.005| |factor(id)112 | 0.622| 0.127| 4.910| 0.000| |factor(id)113 | 0.845| 0.123| 6.845| 0.000| |factor(id)114 | 0.426| 0.124| 3.453| 0.001| |factor(id)115 | 1.921| 0.125| 15.360| 0.000| |factor(id)116 | 0.357| 0.125| 2.849| 0.004| |factor(id)117 | 0.686| 0.134| 5.104| 0.000| |factor(id)118 | 0.524| 0.150| 3.492| 0.000| |factor(id)119 | 0.850| 0.124| 6.862| 0.000| |factor(id)120 | 0.573| 0.124| 4.619| 0.000| |factor(id)121 | 0.193| 0.124| 1.555| 0.120| |factor(id)122 | 0.237| 0.114| 2.080| 0.038| |factor(id)123 | 0.343| 0.147| 2.328| 0.020| |factor(id)124 | 0.438| 0.146| 2.991| 0.003| |factor(id)125 | 0.190| 0.124| 1.533| 0.125| |factor(id)126 | 0.683| 0.159| 4.300| 0.000| |factor(id)127 | 0.275| 0.124| 2.211| 0.027| |factor(id)128 | 0.265| 0.129| 2.056| 0.040| |factor(id)129 | 0.064| 0.124| 0.514| 0.608| |factor(id)130 | 0.357| 0.135| 2.634| 0.008| |factor(id)131 | 0.678| 0.129| 5.252| 0.000| |factor(id)132 | 0.352| 0.129| 2.717| 0.007| |factor(id)133 | 0.816| 0.124| 6.582| 0.000| |factor(id)134 | 0.559| 0.115| 4.856| 0.000| |factor(id)135 | 0.744| 0.135| 5.516| 0.000| |factor(id)136 | 0.257| 0.129| 1.993| 0.046| |factor(id)137 | 0.739| 0.124| 5.955| 0.000| |factor(id)138 | 0.532| 0.143| 3.716| 0.000| |factor(id)139 | 0.698| 0.143| 4.864| 0.000| |factor(id)140 | 0.547| 0.115| 4.761| 0.000| |factor(id)141 | 0.926| 0.134| 6.897| 0.000| |factor(id)142 | 0.446| 0.129| 
3.453| 0.001| |factor(id)143 | 0.354| 0.129| 2.750| 0.006| |factor(id)144 | 0.378| 0.146| 2.592| 0.010| |factor(id)145 | 0.236| 0.160| 1.470| 0.142| |factor(id)146 | 0.412| 0.124| 3.322| 0.001| |factor(id)147 | 0.258| 0.123| 2.087| 0.037| |factor(id)148 | 1.233| 0.142| 8.661| 0.000| |factor(id)149 | 0.968| 0.147| 6.588| 0.000| |factor(id)150 | 0.750| 0.120| 6.264| 0.000| |factor(id)151 | 0.839| 0.135| 6.218| 0.000| |factor(id)152 | -0.230| 0.153| -1.500| 0.134| |factor(id)153 | 0.497| 0.124| 4.022| 0.000| |factor(id)154 | 0.783| 0.124| 6.325| 0.000| |factor(id)155 | 0.552| 0.126| 4.376| 0.000| |factor(id)156 | 1.066| 0.146| 7.280| 0.000| |factor(id)157 | 0.370| 0.124| 2.986| 0.003| |factor(id)158 | 0.925| 0.136| 6.781| 0.000| |factor(id)159 | 1.008| 0.125| 8.093| 0.000| |factor(id)160 | 0.412| 0.140| 2.940| 0.003| |factor(id)161 | 0.401| 0.124| 3.237| 0.001| |factor(id)162 | 0.033| 0.148| 0.221| 0.825| |factor(id)163 | 0.881| 0.129| 6.816| 0.000| |factor(id)164 | -0.061| 0.124| -0.489| 0.625| |factor(id)165 | 1.487| 0.126| 11.799| 0.000| |factor(id)166 | 0.617| 0.125| 4.921| 0.000| |factor(id)167 | 0.158| 0.159| 0.997| 0.319| |factor(id)168 | 0.832| 0.124| 6.720| 0.000| |factor(id)169 | 0.386| 0.124| 3.121| 0.002| |factor(id)170 | 1.286| 0.124| 10.409| 0.000| |factor(id)171 | 0.100| 0.126| 0.795| 0.427| |factor(id)172 | 0.917| 0.126| 7.276| 0.000| |factor(id)173 | 0.320| 0.124| 2.574| 0.010| |factor(id)174 | 1.711| 0.159| 10.778| 0.000| |factor(id)175 | 0.652| 0.159| 4.101| 0.000| |factor(id)176 | 0.601| 0.124| 4.850| 0.000| |factor(id)177 | 0.466| 0.124| 3.762| 0.000| |factor(id)178 | 0.640| 0.124| 5.139| 0.000| |factor(id)179 | 0.484| 0.119| 4.052| 0.000| |factor(id)180 | 0.592| 0.124| 4.762| 0.000| |factor(id)181 | 0.626| 0.124| 5.050| 0.000| |factor(id)182 | 0.875| 0.152| 5.754| 0.000| |factor(id)183 | 0.773| 0.124| 6.250| 0.000| |factor(id)184 | 0.680| 0.119| 5.695| 0.000| |factor(id)185 | 0.811| 0.127| 6.377| 0.000| |factor(id)186 | 0.720| 0.161| 4.474| 0.000| |factor(id)187 | -0.015| 0.124| -0.121| 0.903| |factor(id)188 | 0.708| 0.125| 5.667| 0.000| |factor(id)189 | 0.515| 0.141| 3.663| 0.000| |factor(id)190 | 0.649| 0.153| 4.243| 0.000| |factor(id)191 | 0.065| 0.147| 0.441| 0.659| |factor(id)192 | 0.671| 0.153| 4.391| 0.000| |factor(id)193 | 0.274| 0.124| 2.210| 0.027| |factor(id)194 | 0.712| 0.126| 5.673| 0.000| |factor(id)195 | 0.143| 0.146| 0.980| 0.327| |factor(id)196 | 0.682| 0.123| 5.524| 0.000| |factor(id)197 | 0.363| 0.124| 2.939| 0.003| |factor(id)198 | 0.755| 0.136| 5.565| 0.000| |factor(id)199 | 0.374| 0.134| 2.785| 0.005| |factor(id)200 | 0.248| 0.147| 1.693| 0.091| |factor(id)201 | 0.008| 0.152| 0.055| 0.956| |factor(id)202 | 0.281| 0.126| 2.230| 0.026| |factor(id)203 | 0.134| 0.119| 1.132| 0.258| |factor(id)204 | 0.621| 0.102| 6.105| 0.000| |factor(id)205 | -0.034| 0.124| -0.272| 0.786| |factor(id)206 | 0.126| 0.130| 0.970| 0.332| |factor(id)207 | 0.404| 0.146| 2.761| 0.006| |factor(id)208 | 0.304| 0.147| 2.069| 0.039| |factor(id)209 | 0.748| 0.124| 6.055| 0.000| |factor(id)210 | 0.635| 0.146| 4.344| 0.000| |factor(id)211 | 1.079| 0.131| 8.208| 0.000| |factor(id)212 | 0.147| 0.127| 1.160| 0.246| |factor(id)213 | 0.622| 0.152| 4.081| 0.000| |factor(id)214 | 1.290| 0.153| 8.436| 0.000| |factor(id)215 | 0.460| 0.124| 3.719| 0.000| |factor(id)216 | 0.296| 0.124| 2.384| 0.017| |factor(id)217 | 0.215| 0.123| 1.739| 0.082| |factor(id)218 | 0.711| 0.135| 5.261| 0.000| |factor(id)219 | 0.799| 0.146| 5.473| 0.000| |factor(id)220 | -0.311| 0.141| -2.208| 0.027| |factor(id)221 | 
0.079| 0.124| 0.638| 0.523| |factor(id)222 | -0.227| 0.124| -1.835| 0.067| |factor(id)223 | 0.640| 0.124| 5.170| 0.000| |factor(id)224 | 0.260| 0.149| 1.745| 0.081| |factor(id)225 | 0.449| 0.124| 3.618| 0.000| |factor(id)226 | 0.652| 0.147| 4.439| 0.000| |factor(id)227 | 0.040| 0.154| 0.259| 0.795| |factor(id)228 | -0.166| 0.159| -1.048| 0.295| |factor(id)229 | 0.242| 0.124| 1.951| 0.051| |factor(id)230 | 0.845| 0.121| 6.994| 0.000| |factor(id)231 | 0.915| 0.127| 7.225| 0.000| |factor(id)232 | 0.845| 0.125| 6.770| 0.000| |factor(id)233 | 0.702| 0.135| 5.221| 0.000| |factor(id)234 | 0.153| 0.119| 1.287| 0.198| |factor(id)235 | 0.369| 0.124| 2.989| 0.003| |factor(id)236 | 0.671| 0.147| 4.573| 0.000| |factor(id)237 | 0.800| 0.141| 5.661| 0.000| |factor(id)238 | 0.671| 0.126| 5.327| 0.000| |factor(id)239 | 0.419| 0.125| 3.362| 0.001| |factor(id)240 | 0.604| 0.137| 4.403| 0.000| |factor(id)241 | 0.715| 0.147| 4.863| 0.000| |factor(id)242 | 0.892| 0.125| 7.153| 0.000| |factor(id)243 | 0.330| 0.159| 2.072| 0.038| |factor(id)244 | 0.833| 0.119| 6.973| 0.000| |factor(id)245 | 0.502| 0.125| 4.027| 0.000| |factor(id)246 | 1.720| 0.126| 13.673| 0.000| |factor(id)247 | 0.493| 0.124| 3.987| 0.000| |factor(id)248 | 0.963| 0.141| 6.816| 0.000| |factor(id)249 | 0.308| 0.120| 2.573| 0.010| |factor(id)250 | 0.118| 0.100| 1.179| 0.239| |factor(id)251 | -0.025| 0.123| -0.202| 0.840| |factor(id)252 | 0.815| 0.124| 6.576| 0.000| |factor(id)253 | 0.116| 0.110| 1.061| 0.289| |factor(id)254 | 0.262| 0.147| 1.787| 0.074| |factor(id)255 | 0.506| 0.124| 4.075| 0.000| |factor(id)256 | 0.588| 0.129| 4.564| 0.000| |factor(id)257 | 0.820| 0.129| 6.353| 0.000| |factor(id)258 | 0.070| 0.154| 0.457| 0.648| |factor(id)259 | 0.428| 0.125| 3.434| 0.001| |factor(id)260 | 0.161| 0.148| 1.092| 0.275| |factor(id)261 | 0.605| 0.147| 4.121| 0.000| |factor(id)262 | -0.061| 0.124| -0.489| 0.625| |factor(id)263 | 0.666| 0.146| 4.564| 0.000| |factor(id)264 | 0.332| 0.124| 2.689| 0.007| |factor(id)265 | 0.515| 0.124| 4.135| 0.000| |factor(id)266 | 1.108| 0.129| 8.577| 0.000| |factor(id)267 | 0.725| 0.153| 4.735| 0.000| |factor(id)268 | 0.292| 0.106| 2.763| 0.006| |factor(id)269 | 0.145| 0.124| 1.175| 0.240| |factor(id)270 | 0.402| 0.125| 3.206| 0.001| |factor(id)271 | 0.580| 0.126| 4.590| 0.000| |factor(id)272 | 0.430| 0.152| 2.823| 0.005| |factor(id)273 | 0.500| 0.124| 4.040| 0.000| |factor(id)274 | 0.599| 0.110| 5.464| 0.000| |factor(id)275 | 0.215| 0.106| 2.027| 0.043| |factor(id)276 | 0.645| 0.135| 4.785| 0.000| |factor(id)277 | -0.584| 0.146| -4.000| 0.000| |factor(id)278 | 1.069| 0.146| 7.314| 0.000| |factor(id)279 | 0.596| 0.130| 4.573| 0.000| |factor(id)280 | 0.440| 0.153| 2.882| 0.004| |factor(id)281 | 0.979| 0.124| 7.868| 0.000| |factor(id)282 | 0.408| 0.154| 2.646| 0.008| |factor(id)283 | 0.580| 0.124| 4.659| 0.000| |factor(id)284 | -0.006| 0.123| -0.052| 0.959| |factor(id)285 | -0.436| 0.159| -2.746| 0.006| |factor(id)286 | 0.615| 0.124| 4.959| 0.000| |factor(id)287 | 0.277| 0.124| 2.237| 0.025| |factor(id)288 | 0.225| 0.129| 1.747| 0.081| |factor(id)289 | 0.214| 0.119| 1.797| 0.072| |factor(id)290 | 0.358| 0.136| 2.635| 0.008| |factor(id)291 | 0.789| 0.124| 6.371| 0.000| |factor(id)292 | 0.601| 0.123| 4.875| 0.000| |factor(id)293 | 0.288| 0.129| 2.233| 0.026| |factor(id)294 | 0.328| 0.127| 2.580| 0.010| |factor(id)295 | 0.429| 0.124| 3.464| 0.001| |factor(id)296 | 0.022| 0.149| 0.147| 0.883| |factor(id)297 | 0.625| 0.125| 4.981| 0.000| |factor(id)298 | 0.294| 0.124| 2.378| 0.017| |factor(id)299 | 0.519| 0.140| 3.704| 0.000| 
|factor(id)300 | 0.720| 0.127| 5.655| 0.000| |factor(id)301 | 0.781| 0.125| 6.267| 0.000| |factor(id)302 | 1.255| 0.125| 10.055| 0.000| |factor(id)303 | 0.372| 0.130| 2.869| 0.004| |factor(id)304 | 0.126| 0.124| 1.018| 0.309| |factor(id)305 | -0.136| 0.129| -1.053| 0.292| |factor(id)306 | 0.755| 0.124| 6.098| 0.000| |factor(id)307 | 0.488| 0.124| 3.937| 0.000| |factor(id)308 | 0.854| 0.123| 6.919| 0.000| |factor(id)309 | 0.452| 0.124| 3.651| 0.000| |factor(id)310 | -0.064| 0.124| -0.514| 0.607| |factor(id)311 | 0.380| 0.154| 2.475| 0.013| |factor(id)312 | 1.232| 0.124| 9.969| 0.000| |factor(id)313 | 0.294| 0.119| 2.473| 0.013| |factor(id)314 | 0.676| 0.159| 4.254| 0.000| |factor(id)315 | 0.687| 0.119| 5.767| 0.000| |factor(id)316 | 0.413| 0.124| 3.334| 0.001| |factor(id)317 | 0.449| 0.135| 3.328| 0.001| |factor(id)318 | 0.445| 0.130| 3.430| 0.001| |factor(id)319 | 0.070| 0.114| 0.613| 0.540| |factor(id)320 | 0.811| 0.129| 6.286| 0.000| |factor(id)321 | 1.160| 0.100| 11.553| 0.000| |factor(id)322 | 0.173| 0.114| 1.515| 0.130| |factor(id)323 | 0.413| 0.124| 3.342| 0.001| |factor(id)324 | 0.782| 0.124| 6.327| 0.000| |factor(id)325 | 0.844| 0.124| 6.789| 0.000| |factor(id)326 | 0.307| 0.124| 2.476| 0.013| |factor(id)327 | 0.496| 0.160| 3.106| 0.002| |factor(id)328 | 0.355| 0.120| 2.966| 0.003| |factor(id)329 | 0.599| 0.125| 4.791| 0.000| |factor(id)330 | 0.788| 0.126| 6.275| 0.000| |factor(id)331 | 0.427| 0.096| 4.463| 0.000| |factor(id)332 | 0.372| 0.124| 3.004| 0.003| |factor(id)333 | 0.596| 0.140| 4.253| 0.000| |factor(id)334 | 0.925| 0.123| 7.489| 0.000| |factor(id)335 | 0.947| 0.110| 8.575| 0.000| |factor(id)336 | 0.367| 0.124| 2.965| 0.003| |factor(id)337 | 0.324| 0.124| 2.619| 0.009| |factor(id)338 | 0.279| 0.124| 2.261| 0.024| |factor(id)339 | 1.072| 0.124| 8.658| 0.000| |factor(id)340 | 0.405| 0.160| 2.530| 0.011| |factor(id)341 | 0.198| 0.159| 1.245| 0.213| |factor(id)342 | 0.732| 0.159| 4.594| 0.000| |factor(id)343 | 0.324| 0.146| 2.215| 0.027| |factor(id)344 | 0.553| 0.153| 3.614| 0.000| |factor(id)345 | 0.731| 0.131| 5.574| 0.000| |factor(id)346 | 0.392| 0.127| 3.090| 0.002| |factor(id)347 | 0.432| 0.135| 3.206| 0.001| |factor(id)348 | 0.614| 0.146| 4.196| 0.000| |factor(id)349 | 0.002| 0.111| 0.023| 0.982| |factor(id)350 | 0.376| 0.106| 3.534| 0.000| |factor(id)351 | 0.305| 0.120| 2.547| 0.011| |factor(id)352 | 0.657| 0.125| 5.268| 0.000| |factor(id)353 | 0.825| 0.136| 6.044| 0.000| |factor(id)354 | 0.246| 0.120| 2.059| 0.040| |factor(id)355 | 0.895| 0.159| 5.624| 0.000| |factor(id)356 | 0.683| 0.147| 4.636| 0.000| |factor(id)357 | 0.909| 0.146| 6.227| 0.000| |factor(id)358 | 0.679| 0.125| 5.447| 0.000| |factor(id)359 | 0.573| 0.125| 4.595| 0.000| |factor(id)360 | 0.382| 0.124| 3.084| 0.002| |factor(id)361 | 1.063| 0.129| 8.253| 0.000| |factor(id)362 | 0.469| 0.124| 3.774| 0.000| |factor(id)363 | 0.456| 0.124| 3.675| 0.000| |factor(id)364 | 0.393| 0.123| 3.181| 0.001| |factor(id)365 | 0.132| 0.119| 1.115| 0.265| |factor(id)366 | 0.514| 0.154| 3.346| 0.001| |factor(id)367 | 0.593| 0.146| 4.059| 0.000| |factor(id)368 | -0.212| 0.141| -1.500| 0.134| |factor(id)369 | 0.107| 0.131| 0.816| 0.415| |factor(id)370 | 0.253| 0.124| 2.032| 0.042| |factor(id)371 | -0.131| 0.159| -0.827| 0.408| |factor(id)372 | 0.061| 0.124| 0.492| 0.623| |factor(id)373 | 0.538| 0.153| 3.528| 0.000| |factor(id)374 | 0.194| 0.135| 1.435| 0.151| |factor(id)375 | 0.465| 0.124| 3.755| 0.000| |factor(id)376 | 0.131| 0.124| 1.056| 0.291| |factor(id)377 | 0.017| 0.124| 0.140| 0.889| |factor(id)378 | 0.561| 0.152| 
3.677| 0.000| |factor(id)379 | 0.685| 0.124| 5.517| 0.000| |factor(id)380 | 1.360| 0.146| 9.292| 0.000| |factor(id)381 | 0.511| 0.147| 3.479| 0.001| |factor(id)382 | 0.288| 0.124| 2.314| 0.021| |factor(id)383 | 0.791| 0.127| 6.250| 0.000| |factor(id)384 | 0.461| 0.135| 3.425| 0.001| |factor(id)385 | 0.498| 0.123| 4.034| 0.000| |factor(id)386 | 0.195| 0.160| 1.224| 0.221| |factor(id)387 | -0.201| 0.125| -1.611| 0.107| |factor(id)388 | 0.072| 0.115| 0.628| 0.530| |factor(id)389 | 0.354| 0.106| 3.341| 0.001| |factor(id)390 | 0.126| 0.111| 1.139| 0.255| |factor(id)391 | 0.597| 0.154| 3.886| 0.000| |factor(id)392 | 0.965| 0.125| 7.730| 0.000| |factor(id)393 | 0.811| 0.153| 5.285| 0.000| |factor(id)394 | 0.136| 0.130| 1.042| 0.298| |factor(id)395 | 1.248| 0.153| 8.169| 0.000| |factor(id)396 | 0.951| 0.124| 7.674| 0.000| |factor(id)397 | 0.813| 0.130| 6.260| 0.000| |factor(id)398 | 0.851| 0.146| 5.809| 0.000| |factor(id)399 | 0.250| 0.134| 1.861| 0.063| |factor(id)400 | 0.388| 0.158| 2.448| 0.014| |factor(id)401 | 1.039| 0.124| 8.388| 0.000| |factor(id)402 | -0.086| 0.127| -0.681| 0.496| |factor(id)403 | 0.086| 0.124| 0.697| 0.486| |factor(id)404 | 0.390| 0.125| 3.128| 0.002| |factor(id)405 | 0.498| 0.146| 3.399| 0.001| |factor(id)406 | 0.278| 0.124| 2.243| 0.025| |factor(id)407 | 0.381| 0.125| 3.053| 0.002| |factor(id)408 | 0.430| 0.125| 3.442| 0.001| |factor(id)409 | 0.643| 0.125| 5.159| 0.000| |factor(id)410 | 0.811| 0.125| 6.464| 0.000| |factor(id)411 | 1.057| 0.147| 7.209| 0.000| |factor(id)412 | -0.349| 0.124| -2.819| 0.005| |factor(id)413 | 0.307| 0.124| 2.476| 0.013| |factor(id)414 | 1.041| 0.124| 8.378| 0.000| |factor(id)415 | 0.481| 0.127| 3.779| 0.000| |factor(id)416 | 0.280| 0.119| 2.362| 0.018| |factor(id)417 | -0.134| 0.120| -1.124| 0.261| |factor(id)418 | 0.641| 0.146| 4.379| 0.000| |factor(id)419 | 0.394| 0.114| 3.442| 0.001| |factor(id)420 | 1.084| 0.112| 9.723| 0.000| |factor(id)421 | 0.445| 0.124| 3.579| 0.000| |factor(id)422 | 0.857| 0.153| 5.604| 0.000| |factor(id)423 | 0.188| 0.106| 1.781| 0.075| |factor(id)424 | 0.250| 0.148| 1.685| 0.092| |factor(id)425 | 0.253| 0.124| 2.044| 0.041| |factor(id)426 | 0.487| 0.102| 4.772| 0.000| |factor(id)427 | 0.989| 0.153| 6.451| 0.000| |factor(id)428 | -0.072| 0.124| -0.576| 0.565| |factor(id)429 | 0.240| 0.119| 2.011| 0.044| |factor(id)430 | 0.432| 0.105| 4.095| 0.000| |factor(id)431 | -0.018| 0.124| -0.144| 0.886| |factor(id)432 | 0.545| 0.110| 4.965| 0.000| |factor(id)433 | 0.312| 0.160| 1.948| 0.052| |factor(id)434 | 0.418| 0.135| 3.100| 0.002| |factor(id)435 | -0.001| 0.125| -0.006| 0.995| |factor(id)436 | 0.583| 0.113| 5.182| 0.000| |factor(id)437 | 0.376| 0.135| 2.776| 0.006| |factor(id)438 | 0.166| 0.161| 1.026| 0.305| |factor(id)439 | 0.333| 0.159| 2.100| 0.036| |factor(id)440 | 0.475| 0.134| 3.536| 0.000| |factor(id)441 | 0.255| 0.125| 2.050| 0.040| |factor(id)442 | 0.194| 0.125| 1.560| 0.119| |factor(id)443 | 0.029| 0.124| 0.237| 0.813| |factor(id)444 | 0.184| 0.125| 1.480| 0.139| |factor(id)445 | 0.460| 0.159| 2.887| 0.004| |factor(id)446 | 0.042| 0.115| 0.370| 0.712| |factor(id)447 | 0.325| 0.124| 2.629| 0.009| |factor(id)448 | 1.189| 0.126| 9.471| 0.000| |factor(id)449 | 0.579| 0.147| 3.947| 0.000| |factor(id)450 | 0.464| 0.124| 3.748| 0.000| |factor(id)451 | 0.659| 0.153| 4.311| 0.000| |factor(id)452 | 0.227| 0.124| 1.828| 0.068| |factor(id)453 | 0.373| 0.125| 2.989| 0.003| |factor(id)454 | -0.200| 0.129| -1.553| 0.121| |factor(id)455 | 0.768| 0.110| 6.988| 0.000| |factor(id)456 | 0.166| 0.124| 1.335| 0.182| |factor(id)457 
| 0.536| 0.160| 3.351| 0.001| |factor(id)458 | 0.304| 0.110| 2.773| 0.006| |factor(id)459 | 0.228| 0.115| 1.993| 0.046| |factor(id)460 | 0.165| 0.124| 1.338| 0.181| |factor(id)461 | 0.281| 0.155| 1.818| 0.069| |factor(id)462 | -0.279| 0.125| -2.241| 0.025| |factor(id)463 | -0.198| 0.125| -1.592| 0.112| |factor(id)464 | 0.446| 0.114| 3.911| 0.000| |factor(id)465 | 0.366| 0.125| 2.935| 0.003| |factor(id)466 | -0.059| 0.114| -0.517| 0.605| |factor(id)467 | 0.181| 0.114| 1.585| 0.113| |factor(id)468 | 0.192| 0.123| 1.552| 0.121| |factor(id)469 | 0.102| 0.153| 0.667| 0.505| |factor(id)470 | 0.380| 0.124| 3.067| 0.002| |factor(id)471 | -0.170| 0.147| -1.160| 0.246| |factor(id)472 | 0.339| 0.114| 2.970| 0.003| |factor(id)473 | 0.289| 0.147| 1.970| 0.049| |factor(id)474 | 0.612| 0.129| 4.742| 0.000| |factor(id)475 | 0.445| 0.147| 3.035| 0.002| |factor(id)476 | 0.382| 0.123| 3.090| 0.002| |factor(id)477 | 0.413| 0.124| 3.321| 0.001| |factor(id)478 | 0.562| 0.141| 3.976| 0.000| |factor(id)479 | 0.173| 0.124| 1.400| 0.162| |factor(id)480 | 0.092| 0.124| 0.746| 0.455| |factor(id)481 | 0.774| 0.126| 6.147| 0.000| |factor(id)482 | 0.100| 0.124| 0.810| 0.418| |factor(id)483 | 0.630| 0.147| 4.297| 0.000| |factor(id)484 | 0.222| 0.124| 1.795| 0.073| |factor(id)485 | -0.004| 0.124| -0.032| 0.975| |factor(id)486 | -0.015| 0.124| -0.118| 0.906| |factor(id)487 | 0.056| 0.134| 0.416| 0.677| |factor(id)488 | 0.411| 0.146| 2.817| 0.005| |factor(id)489 | 0.340| 0.110| 3.104| 0.002| |factor(id)490 | 0.264| 0.140| 1.882| 0.060| |factor(id)491 | 0.581| 0.130| 4.473| 0.000| |factor(id)492 | 0.135| 0.147| 0.916| 0.360| |factor(id)493 | 0.463| 0.110| 4.222| 0.000| |factor(id)494 | 0.196| 0.125| 1.567| 0.117| |factor(id)495 | 0.172| 0.125| 1.374| 0.170| |factor(id)496 | 0.847| 0.119| 7.096| 0.000| |factor(id)497 | 0.653| 0.119| 5.494| 0.000| |factor(id)498 | 0.175| 0.135| 1.298| 0.194| |factor(id)499 | 0.263| 0.123| 2.130| 0.033| |factor(id)500 | 0.211| 0.124| 1.706| 0.088| |factor(id)501 | -0.311| 0.159| -1.963| 0.050| |factor(id)502 | 0.671| 0.110| 6.111| 0.000| |factor(id)503 | 0.503| 0.148| 3.409| 0.001| |factor(id)504 | 1.642| 0.123| 13.300| 0.000| |factor(id)505 | 0.289| 0.100| 2.876| 0.004| |factor(id)506 | 0.913| 0.115| 7.970| 0.000| |factor(id)507 | 0.895| 0.136| 6.574| 0.000| |factor(id)508 | 0.422| 0.124| 3.406| 0.001| |factor(id)509 | 0.777| 0.147| 5.299| 0.000| |factor(id)510 | 0.669| 0.147| 4.561| 0.000| |factor(id)511 | 1.039| 0.124| 8.402| 0.000| |factor(id)512 | -0.041| 0.119| -0.345| 0.730| |factor(id)513 | 0.425| 0.161| 2.635| 0.008| |factor(id)514 | 0.266| 0.147| 1.809| 0.071| |factor(id)515 | 0.612| 0.124| 4.935| 0.000| |factor(id)516 | 1.028| 0.125| 8.234| 0.000| |factor(id)517 | 0.771| 0.106| 7.267| 0.000| |factor(id)518 | 0.549| 0.124| 4.435| 0.000| |factor(id)519 | 0.856| 0.141| 6.087| 0.000| |factor(id)520 | 1.096| 0.148| 7.416| 0.000| |factor(id)521 | 0.625| 0.124| 5.047| 0.000| |factor(id)522 | 0.919| 0.125| 7.344| 0.000| |factor(id)523 | 0.769| 0.124| 6.221| 0.000| |factor(id)524 | 0.664| 0.136| 4.872| 0.000| |factor(id)525 | 0.487| 0.142| 3.419| 0.001| |factor(id)526 | 0.492| 0.124| 3.958| 0.000| |factor(id)527 | 0.762| 0.141| 5.394| 0.000| |factor(id)528 | 0.962| 0.115| 8.400| 0.000| |factor(id)529 | 0.205| 0.149| 1.380| 0.168| |factor(id)530 | 0.419| 0.135| 3.107| 0.002| |factor(id)531 | 1.066| 0.128| 8.356| 0.000| |factor(id)532 | 0.815| 0.125| 6.547| 0.000| |factor(id)533 | 1.620| 0.125| 12.932| 0.000| |factor(id)534 | 0.856| 0.119| 7.204| 0.000| |factor(id)535 | 0.147| 0.124| 1.184| 
0.236| |factor(id)536 | 0.577| 0.124| 4.652| 0.000| |factor(id)537 | 0.702| 0.124| 5.662| 0.000| |factor(id)538 | 0.227| 0.153| 1.487| 0.137| |factor(id)539 | 0.273| 0.130| 2.101| 0.036| |factor(id)540 | 0.791| 0.153| 5.170| 0.000| |factor(id)541 | 1.220| 0.125| 9.769| 0.000| |factor(id)542 | 0.519| 0.124| 4.186| 0.000| |factor(id)543 | 0.548| 0.143| 3.822| 0.000| |factor(id)544 | 0.376| 0.119| 3.155| 0.002| |factor(id)545 | 0.603| 0.124| 4.877| 0.000| |factor(id)546 | 0.271| 0.129| 2.093| 0.036| |factor(id)547 | 0.702| 0.125| 5.616| 0.000| |factor(id)548 | -0.007| 0.124| -0.054| 0.957| |factor(id)549 | 0.752| 0.124| 6.063| 0.000| |factor(id)550 | 0.723| 0.125| 5.804| 0.000| |factor(id)551 | 0.076| 0.124| 0.614| 0.539| |factor(id)552 | 0.680| 0.135| 5.037| 0.000| |factor(id)553 | 0.201| 0.130| 1.553| 0.120| |factor(id)554 | 0.612| 0.159| 3.847| 0.000| |factor(id)555 | 0.484| 0.124| 3.889| 0.000| |factor(id)556 | 0.635| 0.126| 5.045| 0.000| |factor(id)557 | 0.649| 0.154| 4.228| 0.000| |factor(id)558 | 0.557| 0.136| 4.081| 0.000| |factor(id)559 | 0.525| 0.124| 4.222| 0.000| |factor(id)560 | 0.675| 0.115| 5.882| 0.000| |factor(id)561 | 0.781| 0.124| 6.320| 0.000| |factor(id)562 | 0.706| 0.146| 4.818| 0.000| |factor(id)563 | 0.718| 0.135| 5.303| 0.000| |factor(id)564 | 0.300| 0.124| 2.416| 0.016| |factor(id)565 | 0.678| 0.120| 5.646| 0.000| |factor(id)566 | -0.150| 0.126| -1.191| 0.234| |factor(id)567 | 0.477| 0.124| 3.850| 0.000| |factor(id)568 | 0.885| 0.119| 7.448| 0.000| |factor(id)569 | 0.544| 0.131| 4.147| 0.000| |factor(id)570 | 1.315| 0.124| 10.611| 0.000| |factor(id)571 | 0.115| 0.124| 0.925| 0.355| |factor(id)572 | 0.800| 0.160| 4.984| 0.000| |factor(id)573 | 0.892| 0.134| 6.638| 0.000| |factor(id)574 | 0.409| 0.123| 3.310| 0.001| |factor(id)575 | 0.514| 0.124| 4.164| 0.000| |factor(id)576 | 0.671| 0.124| 5.401| 0.000| |factor(id)577 | 0.644| 0.147| 4.396| 0.000| |factor(id)578 | 0.539| 0.135| 4.000| 0.000| |factor(id)579 | 0.553| 0.129| 4.298| 0.000| |factor(id)580 | 0.437| 0.131| 3.345| 0.001| |factor(id)581 | 0.584| 0.125| 4.668| 0.000| |factor(id)582 | 0.962| 0.135| 7.111| 0.000| |factor(id)583 | 0.325| 0.124| 2.624| 0.009| |factor(id)584 | 0.497| 0.125| 3.989| 0.000| |factor(id)585 | 0.599| 0.119| 5.035| 0.000| |factor(id)586 | 0.100| 0.123| 0.814| 0.416| |factor(id)587 | 0.969| 0.124| 7.841| 0.000| |factor(id)588 | 0.977| 0.125| 7.805| 0.000| |factor(id)589 | 0.347| 0.116| 2.996| 0.003| |factor(id)590 | 0.100| 0.125| 0.805| 0.421| |factor(id)591 | 0.554| 0.154| 3.609| 0.000| |factor(id)592 | -0.163| 0.124| -1.310| 0.190| |factor(id)593 | 0.417| 0.124| 3.353| 0.001| |factor(id)594 | 0.584| 0.142| 4.098| 0.000| |factor(id)595 | 0.562| 0.124| 4.532| 0.000| |factor(id)596 | 0.571| 0.110| 5.172| 0.000| |factor(id)597 | 0.369| 0.114| 3.238| 0.001| |factor(id)598 | 0.458| 0.110| 4.170| 0.000| |factor(id)599 | 0.547| 0.106| 5.179| 0.000| |factor(id)600 | 0.075| 0.124| 0.604| 0.546| |factor(id)601 | -0.189| 0.124| -1.528| 0.127| |factor(id)602 | 0.771| 0.126| 6.127| 0.000| |factor(id)603 | -0.025| 0.125| -0.196| 0.845| |factor(id)604 | 0.420| 0.124| 3.377| 0.001| |factor(id)605 | 0.318| 0.114| 2.785| 0.005| |factor(id)606 | 0.298| 0.125| 2.387| 0.017| |factor(id)607 | 0.565| 0.110| 5.153| 0.000| |factor(id)608 | 0.787| 0.124| 6.350| 0.000| |factor(id)609 | 0.488| 0.160| 3.056| 0.002| |factor(id)610 | 0.676| 0.147| 4.591| 0.000| |factor(id)611 | 0.953| 0.125| 7.621| 0.000| |factor(id)612 | 0.472| 0.153| 3.078| 0.002| |factor(id)613 | 0.551| 0.124| 4.453| 0.000| |factor(id)614 | 0.430| 
0.111| 3.893| 0.000| |factor(id)615 | 0.061| 0.136| 0.451| 0.652| |factor(id)616 | 0.362| 0.124| 2.925| 0.003| |factor(id)617 | 0.116| 0.135| 0.859| 0.391| |factor(id)618 | -0.345| 0.140| -2.456| 0.014| |factor(id)619 | 0.221| 0.124| 1.776| 0.076| |factor(id)620 | 0.147| 0.126| 1.164| 0.245| |factor(id)621 | 0.638| 0.119| 5.371| 0.000| |factor(id)622 | 0.483| 0.124| 3.895| 0.000| |factor(id)623 | 0.401| 0.103| 3.891| 0.000| |factor(id)624 | -0.214| 0.124| -1.733| 0.083| |factor(id)625 | 0.396| 0.152| 2.603| 0.009| |factor(id)626 | 0.461| 0.124| 3.733| 0.000| |factor(id)627 | 1.375| 0.143| 9.592| 0.000| |factor(id)628 | 0.162| 0.124| 1.302| 0.193| |factor(id)629 | 1.144| 0.125| 9.184| 0.000| |factor(id)630 | 0.912| 0.099| 9.210| 0.000| |factor(id)631 | 0.672| 0.124| 5.422| 0.000| |factor(id)632 | 0.838| 0.124| 6.764| 0.000| |factor(id)633 | 0.535| 0.123| 4.336| 0.000| |factor(id)634 | 1.235| 0.124| 9.979| 0.000| |factor(id)635 | 0.239| 0.119| 2.010| 0.045| |factor(id)636 | 0.357| 0.119| 3.001| 0.003| |factor(id)637 | 0.751| 0.110| 6.854| 0.000| |factor(id)638 | 0.807| 0.141| 5.745| 0.000| |factor(id)639 | 0.210| 0.124| 1.695| 0.090| |factor(id)640 | 0.908| 0.153| 5.926| 0.000| |factor(id)641 | 0.483| 0.160| 3.022| 0.003| |factor(id)642 | 0.134| 0.131| 1.026| 0.305| |factor(id)643 | 0.678| 0.136| 4.976| 0.000| |factor(id)644 | 0.998| 0.125| 8.005| 0.000| |factor(id)645 | 0.740| 0.132| 5.630| 0.000| |factor(id)646 | 1.148| 0.159| 7.233| 0.000| |factor(id)647 | 0.404| 0.124| 3.247| 0.001| |factor(id)648 | 0.286| 0.123| 2.320| 0.020| |factor(id)649 | 0.262| 0.147| 1.784| 0.075| |factor(id)650 | 0.049| 0.129| 0.380| 0.704| |factor(id)651 | 0.662| 0.102| 6.485| 0.000| |factor(id)652 | 0.387| 0.124| 3.108| 0.002| |factor(id)653 | -0.320| 0.123| -2.595| 0.009| |factor(id)654 | 0.516| 0.125| 4.132| 0.000| |factor(id)655 | -0.045| 0.135| -0.332| 0.740| |factor(id)656 | 0.105| 0.146| 0.719| 0.472| |factor(id)657 | -0.202| 0.124| -1.633| 0.103| |factor(id)658 | 0.347| 0.115| 3.031| 0.002| |factor(id)659 | -0.136| 0.123| -1.101| 0.271| |factor(id)660 | 0.618| 0.141| 4.374| 0.000| |factor(id)661 | 0.312| 0.124| 2.516| 0.012| |factor(id)662 | 0.527| 0.114| 4.621| 0.000| |factor(id)663 | 0.316| 0.160| 1.973| 0.049| |factor(id)664 | 0.332| 0.124| 2.673| 0.008| |factor(id)665 | 0.014| 0.124| 0.114| 0.909| |factor(id)666 | 0.114| 0.124| 0.922| 0.357| |factor(id)667 | -0.189| 0.129| -1.470| 0.142| |factor(id)668 | -0.254| 0.124| -2.052| 0.040| |factor(id)669 | 0.197| 0.125| 1.584| 0.113| |factor(id)670 | 0.250| 0.124| 2.020| 0.044| |factor(id)671 | 0.404| 0.124| 3.269| 0.001| |factor(id)672 | -0.232| 0.125| -1.856| 0.064| |factor(id)673 | 0.280| 0.124| 2.270| 0.023| |factor(id)674 | 0.548| 0.105| 5.200| 0.000| |factor(id)675 | -0.018| 0.124| -0.144| 0.886| |factor(id)676 | -0.240| 0.124| -1.940| 0.053| |factor(id)677 | 0.314| 0.110| 2.865| 0.004| |factor(id)678 | 0.208| 0.124| 1.677| 0.094| |factor(id)679 | 0.167| 0.124| 1.352| 0.177| |factor(id)680 | -0.278| 0.119| -2.339| 0.019| |factor(id)681 | 0.126| 0.120| 1.053| 0.293| |factor(id)682 | 0.138| 0.124| 1.118| 0.263| |factor(id)683 | 0.181| 0.160| 1.138| 0.255| |factor(id)684 | 0.187| 0.119| 1.567| 0.117| |factor(id)685 | 1.149| 0.129| 8.872| 0.000| |factor(id)686 | 0.532| 0.147| 3.625| 0.000| |factor(id)687 | 0.361| 0.141| 2.568| 0.010| |factor(id)688 | 0.206| 0.124| 1.662| 0.097| |factor(id)689 | 0.432| 0.119| 3.616| 0.000| |factor(id)690 | 0.993| 0.124| 8.014| 0.000| |factor(id)691 | 0.511| 0.124| 4.134| 0.000| |factor(id)692 | 0.609| 0.124| 4.910| 0.000| 
|factor(id)693 | 0.066| 0.123| 0.536| 0.592|
|factor(id)694 | 0.109| 0.147| 0.743| 0.457|
|factor(id)695 | 0.093| 0.126| 0.734| 0.463|
|factor(id)696 | 0.768| 0.125| 6.169| 0.000|
|factor(id)697 | 0.723| 0.124| 5.853| 0.000|
|factor(id)698 | 1.436| 0.123| 11.636| 0.000|
|factor(id)699 | 0.462| 0.123| 3.744| 0.000|
|factor(id)700 | 0.170| 0.147| 1.159| 0.247|
|factor(id)701 | 0.406| 0.123| 3.288| 0.001|
|factor(id)702 | 0.102| 0.102| 0.995| 0.320|
|factor(id)703 | 0.368| 0.135| 2.736| 0.006|
|factor(id)704 | 0.125| 0.147| 0.850| 0.396|
|factor(id)705 | 0.404| 0.154| 2.624| 0.009|
|factor(id)706 | 0.708| 0.153| 4.623| 0.000|
|factor(id)707 | 0.456| 0.124| 3.690| 0.000|
|factor(id)708 | 0.178| 0.110| 1.618| 0.106|
|factor(id)709 | 0.631| 0.124| 5.096| 0.000|
|factor(id)710 | 0.191| 0.160| 1.194| 0.233|
|factor(id)711 | 0.284| 0.125| 2.280| 0.023|
|factor(id)712 | 0.945| 0.124| 7.627| 0.000|
|factor(id)713 | 0.456| 0.103| 4.424| 0.000|
|factor(id)714 | 0.721| 0.147| 4.906| 0.000|
|factor(id)715 | 0.136| 0.129| 1.051| 0.293|
|factor(id)716 | NA| NA| NA| NA|

---

- The Table above displays the results of the LSDV regression, estimated by OLS on the full nlspd panel.
- The table is generated by the previous code sequence, where the novelty is using the factor variable id. The function factor() generates dummy variables for all categories of the variable, taking the first category as the reference.
- To include the reference category in the output, one needs to exclude the constant from the regression model by including the term −1 in the regression formula. When the constant is not excluded, the coefficients of the dummy variables represent, as usual, the difference between the respective category and the benchmark one.

---

## Problem with LSDV

--

- Incidental parameter problem: the degrees of freedom reduce to `\(NT-N-K\)` because of the extra parameters estimated. This can be severe if N is large; LSDV can therefore be numerically very inefficient.

--

- We consider alternative estimation procedures that employ a transformation to eliminate the individual heterogeneity from the estimation equation and thus solve the common endogeneity problem:
  - The difference estimator
  - The within estimator
  - The fixed effects estimator

--

---

## The Difference Estimator: T=2

--

- Suppose we observe each individual in two different time periods, t = 1 and t = 2. The two observations, written out as in (1), are
`\begin{equation} y_{i1}=\beta_{1}+\beta_{2}x_{2i1}+\ldots +\beta_{K}x_{Ki1}+\mu_{i}+ e_{i1} \end{equation}`
`\begin{equation} y_{i2}=\beta_{1}+\beta_{2}x_{2i2}+\ldots +\beta_{K}x_{Ki2}+\mu_{i}+ e_{i2} \end{equation}`
- Subtracting these two equations creates a new equation
`\begin{equation} (y_{i2}-y_{i1})=\beta_{2}(x_{2i2}-x_{2i1})+\ldots +\beta_{K}(x_{Ki2}-x_{Ki1})+ (e_{i2}-e_{i1}) \end{equation}`
- Simplifying,
`$$\Delta y_{i}=\beta_{2}\Delta x_{2i}+\ldots +\beta_{K}\Delta x_{Ki}+\Delta e_{i}$$`
- The OLS estimator of this final equation is called the **difference estimator**. Note that the constant and the individual effect `\(\mu_{i}\)` drop out with the differencing.

--

---

- Let us consider differences between the first and last years in the sample. Since there are 5 years in the wage data, we are interested in `\(\Delta_{5}y_{it}=y_{it}-y_{i,t-5}\)`. In the `plm` package this is accomplished with `diff(y,5)`. We revert to the entire data.
```r
wage.diff <- plm(diff(lwage,5)~diff(educ,5)+
                 diff(exper,5)+diff(I(exper^2),5)+
                 diff(tenure,5)+diff(I(tenure^2),5),
                 data=nlspd, model="pooling")
kable(tidy(wage.diff), digits=3, caption="Difference estimator")
```

Table: Difference estimator

|term | estimate| std.error| statistic| p.value|
|:--------------------|--------:|---------:|---------:|-------:|
|(Intercept) | -0.057| 0.068| -0.828| 0.408|
|diff(exper, 5) | 0.053| 0.017| 3.029| 0.002|
|diff(I(exper^2), 5) | 0.000| 0.000| -1.303| 0.193|
|diff(tenure, 5) | 0.013| 0.004| 3.207| 0.001|
|diff(I(tenure^2), 5) | -0.001| 0.000| -3.650| 0.000|

---

## The Within estimator

--

- The advantage of the within transformation is that it generalizes nicely to situations where we have more than T = 2 time observations on each individual.

--

- Averaging the model over all time observations for individual `\(i\)`:
`\begin{equation} \dfrac{1}{T}\sum_{t=1}^{T}y_{it}=\beta_{1}+\beta_{2}\dfrac{1}{T}\sum_{t=1}^{T}x_{2it}+\ldots +\beta_{K}\dfrac{1}{T}\sum_{t=1}^{T}x_{Kit}+\mu_{i}+ \dfrac{1}{T}\sum_{t=1}^{T}e_{it} \end{equation}`

--

- The time-averaged model, for `\(i = 1, \ldots,N,\)` is:
`\begin{equation} \bar{y}_{i}=\beta_{1}+\beta_{2}\bar{x}_{2i}+\ldots +\beta_{K}\bar{x}_{Ki}+\mu_{i}+\bar{e}_{i} \end{equation}`

--

- Now subtract this equation from the original equation; the individual effect `\(\mu_{i}\)` cancels:
`\begin{equation} (y_{it}-\bar{y}_{i})=\beta_{2}(x_{2it}-\bar{x}_{2i})+\ldots +\beta_{K}(x_{Kit}-\bar{x}_{Ki})+ (e_{it}-\bar{e}_{i}) \end{equation}`

--

- The within-transformed model is:
`\begin{equation} \tilde{y}_{it}=\beta_{2}\tilde{x}_{2it}+\ldots +\beta_{K}\tilde{x}_{Kit}+ \tilde{e}_{it} \end{equation}`

--

---

```r
wage.within <- plm(lwage~educ+ exper+I(exper^2)+
                   tenure+I(tenure^2)+union+factor(id)-1,
                   data=nlspd, model="within")
kable(tidy(wage.within), digits=3,
      caption="Fixed effects using the 'within' model option")
```

Table: Fixed effects using the 'within' model option

|term | estimate| std.error| statistic| p.value|
|:-----------|--------:|---------:|---------:|-------:|
|exper | 0.041| 0.007| 6.217| 0.000|
|I(exper^2) | 0.000| 0.000| -1.510| 0.131|
|tenure | 0.014| 0.003| 4.231| 0.000|
|I(tenure^2) | -0.001| 0.000| -4.339| 0.000|
|union | 0.064| 0.014| 4.472| 0.000|

- The Table presents the fixed effects model results for the full sample of the dataset nls_panel.
- The within method is equivalent to including the individual dummies in the model, as in the LSDV regression above.
- An interesting comparison is between the pooled and fixed effects models: one can notice that accounting for individual heterogeneity significantly lowers the marginal effects of the variables.

---

# Poolability test

- Testing whether fixed effects are necessary amounts to comparing the fixed effects model `wage.within` with the pooled model `wage.pooled`. The function pFtest() does this comparison, as in the following code lines.

```r
kable(tidy(pFtest(wage.within, wage.pooled)), caption=
        "Fixed effects test: Ho:'No fixed effects'")
```

```
## Multiple parameters; naming those columns df1, df2
```

Table: Fixed effects test: Ho:'No fixed effects'

| df1| df2| statistic| p.value|method |alternative |
|---:|----:|---------:|-------:|:-----------------------------|:-------------------|
| 712| 2859| 15.2128| 0|F test for individual effects |significant effects |

- The Table shows that the null hypothesis of no fixed effects is rejected.

---

# The Random Effects Model

--

- The random effects model elaborates on the fixed effects model by recognizing that, since the individuals in the panel are randomly selected, their characteristics, measured by the intercept `\(\beta_{1i}\)`, should be random.
--

- Thus, the random effects model assumes that the intercept has the form
`$$\beta_{1i}=\overline{\beta}_{1}+\mu_{i}$$`

--

- Therefore the general model is now
`$$y_{it}=x_{it}^{\prime}\beta +u_{it}$$`

--

- where the residual now consists of two components:
`$$u_{it}=\mu_{i} +\epsilon_{it}$$`
with `\(\mu_{i}\)` random.

--

---

--

- Therefore, the one-way error component random effects model incorporates a composite error term:
`$$y_{it}=x_{it}^{\prime}\beta +u_{it}$$`
`$$y_{it}=x_{it}^{\prime}\beta +\mu_{i}+\epsilon_{it}$$`

--

- As `\(\mu_{i}\)` is considered a component of the composite error, a random effects model is also called an error component model.

--

- The assumptions on the components of the error are the following:

--

- `\(E(\mu_{i})=0\)`

--

- `\(V(\mu_{i})=E(\mu_{i}^{2})=\sigma_{\mu}^{2}\)`

--

- `\(E(\mu_{i}x_{it})=0\)`

--

- `\(E(\mu_{i}\mu_{j})=0\)` for `\(i\neq j\)`

--

- `\(E(\epsilon_{it})=0\)`

--

- `\(V(\epsilon_{it})=E(\epsilon_{it}^{2})=\sigma_{\epsilon}^{2}\)`

--

- `\(E(\epsilon_{it}\epsilon_{js})=0\)` for `\(i\neq j\)` or `\(t\neq s\)`

--

---

--

- The components of the error are not correlated:
`$$E(\mu_{i}\epsilon_{it})=0$$`
- The `\(\mu_{i}\)` are independent of the error term `\(\epsilon_{it}\)` and of the regressors `\(x_{it}\)`, for all `\(i\)` and `\(t\)`.

--

- Therefore, the mean and variance of the composite error are `\(E(u_{it})=0\)` and `\(V(u_{it})=V(y_{it})=\sigma_{y}^{2}=\sigma_{\mu}^{2}+\sigma_{\epsilon}^{2}\)`.

--

- The variances `\(\sigma_{\mu}^{2}\)` and `\(\sigma_{\epsilon}^{2}\)` are called the variance components of `\(\sigma_{y}^{2}\)`. For this reason, the random effects model is also called the variance component or error component model.

--

- The covariance of the composite error is
`$$Cov(u_{it},u_{js})=E(u_{it}u_{js})=E(\mu_{i}+\epsilon_{it})(\mu_{j}+\epsilon_{js})$$`
`$$=E(\mu_{i}\mu_{j}+\mu_{i}\epsilon_{js}+\mu_{j}\epsilon_{it}+\epsilon_{it}\epsilon_{js})$$`

--

- Or,

--

- `$$Cov(u_{it},u_{js})=\sigma_{\mu}^{2}+\sigma_{\epsilon}^{2}, \hspace{0.3cm} \text{for} \hspace{0.3cm} i =j,\, t=s$$`

--

- `\(=\sigma_{\mu}^{2}, \hspace{0.3cm} \text{for} \hspace{0.3cm} i =j,\, t\neq s\)`

--

- `\(=0, \hspace{0.3cm} \text{for} \hspace{0.3cm} i \neq j\)`

---

--

- For cross-sectional unit `\(i\)`, we can write the model

--

<img src="s1.png" width="656" />

--

<img src="s2.png" width="520" />

---

## The GLS Estimation

--

- Generalized least squares (GLS) is used to estimate the random effects model when the error covariance matrix `\(U\)` is known.

--

- Suppose that U is a known, symmetric and positive definite matrix. This assumption may occasionally be true, but in most cases `\(U\)` contains unknown parameters. In this case the feasible generalized least squares (FGLS) method is used, estimating the entire variance-covariance matrix `\(\Omega\)` (for all `\(i\)`).

--

- Premultiply the equation `\(y_{i}=X_{i}\beta +u_{i}\)` by `\(U^{-\dfrac{1}{2}}\)` to get

--

- `\(U^{-\dfrac{1}{2}}y_{i}=U^{-\dfrac{1}{2}}X_{i}\beta+U^{-\dfrac{1}{2}}u_{i}\)`

--

- Or,
`$$y_{i}^{*}=X_{i}^{*}\beta+u_{i}^{*}$$`

--

- Now we can apply OLS to the transformed model; this is GLS.
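- To make the (F)GLS idea concrete for the one-way random effects case, the following sketch (added for illustration, not part of the original slides; the two variance components are assumed placeholder values, not estimates) applies the usual quasi-demeaning transformation `\(y_{it}-\theta\bar{y}_{i.}\)` with `\(\theta=1-\sqrt{\sigma_{\epsilon}^{2}/(T\sigma_{\mu}^{2}+\sigma_{\epsilon}^{2})}\)`:

```r
## A minimal sketch of the one-way random effects (quasi-demeaning) transformation.
## The variance components below are illustrative placeholders; in practice they
## are estimated, e.g. by the Swamy-Arora method used on the next slide.
T.periods <- 5       # time periods per individual in the balanced nls panel
sigma2.e  <- 0.05    # assumed idiosyncratic error variance
sigma2.mu <- 0.10    # assumed individual-effect variance
theta <- 1 - sqrt(sigma2.e / (T.periods * sigma2.mu + sigma2.e))
# Quasi-demean a variable: subtract theta times its individual mean
quasi.demean <- function(x, id) x - theta * ave(x, id)
lwage.star <- quasi.demean(nlspd$lwage, nlspd$id)
# Applying OLS to all quasi-demeaned variables yields the GLS/random effects estimates.
```

- Note that when `\(\sigma_{\mu}^{2}=0\)` the transformation collapses to pooled OLS (`\(\theta=0\)`), while for large `\(T\sigma_{\mu}^{2}\)` it approaches the within (fixed effects) transformation (`\(\theta\rightarrow 1\)`).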
--

---

# Random Effects: Wage Equation

```r
wage.random <- plm(lwage~educ+exper+I(exper^2)+
                   tenure+I(tenure^2)+black+south+union,
                   data=nlspd, random.method="swar", model="random")
kable(tidy(wage.random), digits=4, caption=
        "The random effects results for the wage equation")
```

Table: The random effects results for the wage equation

|term | estimate| std.error| statistic| p.value|
|:-----------|--------:|---------:|---------:|-------:|
|(Intercept) | 0.5339| 0.0799| 6.6839| 0.0000|
|educ | 0.0733| 0.0053| 13.7417| 0.0000|
|exper | 0.0436| 0.0064| 6.8606| 0.0000|
|I(exper^2) | -0.0006| 0.0003| -2.1361| 0.0327|
|tenure | 0.0142| 0.0032| 4.4699| 0.0000|
|I(tenure^2) | -0.0008| 0.0002| -3.8790| 0.0001|
|black | -0.1167| 0.0302| -3.8643| 0.0001|
|south | -0.0818| 0.0224| -3.6505| 0.0003|
|union | 0.0802| 0.0132| 6.0724| 0.0000|

---

--

- The random effects estimator is reliable under the assumption that individual characteristics (heterogeneity) are exogenous, that is, independent of the regressors in the random effects equation.

--

- The same Hausman test for endogeneity we have already used in another chapter can be used here as well, with the **null hypothesis that the individual random effects are exogenous**.

--

- The test function phtest() compares the fixed effects and the random effects models; the next code lines perform the Hausman endogeneity test on the random effects model estimated above.

---

--

```r
kable(tidy(phtest(wage.within, wage.random)), caption=
        "Hausman endogeneity test for the random effects wage model")
```

Table: Hausman endogeneity test for the random effects wage model

| statistic| p.value| parameter|method |alternative |
|---------:|---------:|---------:|:------------|:-------------------------|
| 16.58789| 0.0053515| 5|Hausman Test |one model is inconsistent |

--

- The Table above shows a low p-value for the test, indicating that the null hypothesis that the individual random effects are exogenous is rejected; the random effects estimator is therefore inconsistent. In this case the fixed effects model is the correct solution. (The number of parameters in the Table is given for the time-varying variables only.)

---

--

- The fixed effects model, however, does not allow time-invariant variables such as educ or black.

--

- Since the problem of the random effects model is endogeneity, one can use instrumental variables methods when time-invariant regressors must be in the model.

--

- The Hausman-Taylor estimator uses instrumental variables in a random effects model; it assumes four categories of regressors: time-varying exogenous, time-varying endogenous, time-invariant exogenous, and time-invariant endogenous.

--

- The number of time-varying exogenous variables must be at least equal to the number of time-invariant endogenous ones. In our wage model, suppose exper, tenure and union are time-varying exogenous, south is time-varying endogenous, black is time-invariant exogenous, and educ is time-invariant endogenous.

--

- The same plm() function allows carrying out Hausman-Taylor estimation by setting model="ht".
---


```r
wage.HT <- plm(lwage~educ+exper+I(exper^2)+
                 tenure+I(tenure^2)+black+south+union |
                 exper+I(exper^2)+tenure+I(tenure^2)+union+black,
               data=nlspd, model="ht")
```

```
## Warning: uses of 'pht()' and 'plm(., model = "ht")' are discouraged, better use
## 'plm(., model = "random", random.method = "ht", inst.method =
## "baltagi"/"am"/"bms")' for Hausman-Taylor, Amemiya-MaCurdy, and
## Breusch-Mizon-Schmidt estimator
```

```r
kable(tidy(wage.HT), digits=5, caption=
        "Hausman-Taylor estimates for the wage equation, with the largest changes taking place for educ and black.")
```



Table: Hausman-Taylor estimates for the wage equation, with the largest changes taking place for educ and black.

|term        | estimate| std.error| statistic| p.value|
|:-----------|--------:|---------:|---------:|-------:|
|(Intercept) | -0.75077|   0.58624|  -1.28066| 0.20031|
|educ        |  0.17051|   0.04446|   3.83485| 0.00013|
|exper       |  0.03991|   0.00647|   6.16382| 0.00000|
|I(exper^2)  | -0.00039|   0.00027|  -1.46222| 0.14368|
|tenure      |  0.01433|   0.00316|   4.53388| 0.00001|
|I(tenure^2) | -0.00085|   0.00020|  -4.31885| 0.00002|
|black       | -0.03591|   0.06007|  -0.59788| 0.54992|
|south       | -0.03171|   0.03485|  -0.91003| 0.36281|
|union       |  0.07197|   0.01345|   5.34910| 0.00000|

---

# Grunfeld’s Investment Example

--

- The dataset grunfeld2 is a subset of the original Grunfeld dataset; it includes two firms, GE and WE, observed over the period 1935 to 1954.

--

--

- The purpose of this example is to identify various issues that should be taken into account when building a panel data econometric model.

--

- The problem is to find the determinants of a firm's investment, `\(inv_{it}\)`, among regressors such as the value of the firm, `\(v_{it}\)`, and the capital stock, `\(k_{it}\)`.

--

--

- The table below gives a glimpse of the grunfeld2 panel data.

---

--


```r
data("grunfeld2", package="PoEdata")
grun <- pdata.frame(grunfeld2, index=c("firm","year"))
kable(head(grun), align="c", caption=
        "The head of the grunfeld2 dataset organized as a panel")
```



Table: The head of the grunfeld2 dataset organized as a panel

| inv  |   v    |   k   | firm | year |
|:----:|:------:|:-----:|:----:|:----:|
| 33.1 | 1170.6 | 97.8  |  1   | 1935 |
| 45.0 | 2015.8 | 104.4 |  1   | 1936 |
| 77.2 | 2803.3 | 118.0 |  1   | 1937 |
| 44.6 | 2039.7 | 156.2 |  1   | 1938 |
| 48.1 | 2256.2 | 172.6 |  1   | 1939 |
| 74.4 | 2132.2 | 186.6 |  1   | 1940 |

--

--

- Let us consider a pooling model first, assuming that the coefficients of the regression equation, as well as the error variances, are the same for both firms (no individual heterogeneity).

---


```r
grun.pool <- plm(inv~v+k, model="pooling", data=grun)
kable(tidy(grun.pool), digits=5, caption=
        "Grunfeld dataset, pooling panel data results")
```



Table: Grunfeld dataset, pooling panel data results

|term        | estimate| std.error| statistic| p.value|
|:-----------|--------:|---------:|---------:|-------:|
|(Intercept) | 17.87200|   7.02408|   2.54439| 0.01525|
|v           |  0.01519|   0.00620|   2.45191| 0.01905|
|k           |  0.14358|   0.01860|   7.71890| 0.00000|

--


```r
SSE.pool <- sum(resid(grun.pool)^2)
sigma2.pool <- SSE.pool/(grun.pool$df.residual)
```

--

--

- For the pooling model, `\(SSE=16563.003385\)` and `\(\sigma^{2}=447.64874\)`.

--

--

- Allowing for different coefficients across firms, but the same error structure, gives the fixed effects model summarized in the table below. Note that the fixed effects are modeled through the firm indicator (a factor) interacted with the regressors.
---


```r
grun.fe <- plm(inv~v*grun$firm+k*grun$firm,
               model="pooling", data=grun)
kable(tidy(grun.fe), digits=4, caption=
        "Grunfeld dataset, 'pooling' panel data results")
```



Table: Grunfeld dataset, 'pooling' panel data results

|term         | estimate| std.error| statistic| p.value|
|:------------|--------:|---------:|---------:|-------:|
|(Intercept)  |  -9.9563|   23.6264|   -0.4214|  0.6761|
|v            |   0.0266|    0.0117|    2.2651|  0.0300|
|grun$firm2   |   9.4469|   28.8054|    0.3280|  0.7450|
|k            |   0.1517|    0.0194|    7.8369|  0.0000|
|v:grun$firm2 |   0.0263|    0.0344|    0.7668|  0.4485|
|grun$firm2:k |  -0.0593|    0.1169|   -0.5070|  0.6155|


```r
SSE.fe <- sum(resid(grun.fe)^2)
sigma2.fe <- SSE.fe/(grun.fe$df.residual)
```

--

- For the fixed effects model with firm dummies, `\(SSE=14989.821701\)` and `\(\sigma^{2}=440.877109\)`.

--

--

- A test of whether the coefficients differ significantly across firms (pooling versus firm-specific coefficients) can be carried out with the function pooltest from package plm; to perform it, the firm-by-firm model must first be estimated with the function pvcm using the argument `model="within"`, as the next code lines show.

---

--


```r
grun.pvcm <- pvcm(inv~v+k, model="within", data=grun)
coef(grun.pvcm)
```

```
##   (Intercept)          v          k
## 1  -9.9563065 0.02655119 0.15169387
## 2  -0.5093902 0.05289413 0.09240649
```

--

--


```r
pooltest(grun.pool, grun.pvcm)
```

```
## 
##  F statistic
## 
## data:  inv ~ v + k
## F = 1.1894, df1 = 3, df2 = 34, p-value = 0.3284
## alternative hypothesis: unstability
```

--

--

- The result shows that the null hypothesis that the coefficients are the same for both firms (that is, that the data are ‘poolable’) cannot be rejected. (However, the pvcm function is not equivalent to the fixed effects model that uses individual dummies; it is, though, useful for testing the ‘poolability’ of a dataset.)

--
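---

## Reproducing the Poolability Test by Hand

- The pooltest statistic above can be reproduced as a standard restricted-versus-unrestricted F test from the two residual sums of squares already computed. A minimal sketch, where the number of restrictions `J = 3` (the firm dummy and its two interactions) matches `df1` in the pooltest output:


```r
## Chow-type poolability F-test from the stored sums of squared residuals
J   <- 3                       # restrictions: firm dummy + two interactions
df2 <- grun.fe$df.residual     # 40 observations - 6 coefficients = 34
Fstat <- ((SSE.pool - SSE.fe) / J) / (SSE.fe / df2)
Fstat                                   # approximately 1.19, as in pooltest()
pf(Fstat, J, df2, lower.tail = FALSE)   # p-value, approximately 0.33
```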