I didn’t do much coding since the last update, so I’m just going to use this one for our meeting.
- Notes for this update
- I had some difficulties coding these last two weeks and didn’t get as far along as I had hoped, but I at least got some things mapped out. I also wrote some more (I’m up to 4-5 pages now, and I think it’s helped my coding). I created a new variable that counts how many times a census tract had an increasing number of issued building permits from one 5-year period to the next, so the range is 0-3 (a rough sketch of how that kind of count can be computed follows these notes). This new variable brought the number of dimensions in the PCA from three to four, which I think is helpful for the LPA.
- I’ve been having a hard time getting separation of the classes in the LPA. I think I may need to introduce another new variable to help with this. Hopefully we can talk about that at our meeting.
- Do you know anything about POMS (proportion of maximum)? I’ve found that when I use it for the LPA instead of scaling the variables, I get better separation of the classes. I’ve done some searching, but I can’t figure out why it helps. Results from both scaling and POMS on the same data are below (see the short note under “Scale vs POMS”).
- I mapped out the results with the best separation in the three cities I’m examining. The results don’t make sense to me, which is why I think I may need to introduce another variable; I have some thoughts about that.
- See you in the morning.
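
Since the permit-increase variable came up above, here is a rough sketch of how that kind of count can be computed. The permits_p1 through permits_p4 columns and the helper function are placeholders for illustration, not the actual field names in bpsw_comb_inc.

library(dplyr)

# Count, for each tract, how many period-to-period transitions showed an
# increase in issued permits; with four 5-year periods there are three
# transitions, so the count ranges from 0 to 3.
permit_increase_count <- function(df) {
  df %>%
    mutate(
      n_increases = (permits_p2 > permits_p1) +
        (permits_p3 > permits_p2) +
        (permits_p4 > permits_p3)
    )
}

# Toy example with made-up permit counts
toy <- tibble(
  tract      = c("A", "B"),
  permits_p1 = c(10, 40),
  permits_p2 = c(15, 35),
  permits_p3 = c(12, 30),
  permits_p4 = c(20, 25)
)
permit_increase_count(toy)   # tract A -> 2 increases, tract B -> 0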




# Comparing models (functions below come from dplyr and tidyLPA)
library(dplyr)
library(tidyLPA)

# pca_orig_diminc <- as.data.frame(pca_orig_diminc)
dat_dim %>%
  dplyr::select(dim1, dim2, dim3, dim4) %>%
  single_imputation() %>%
  estimate_profiles(1:4,
    variances = c("equal", "varying"),
    covariances = c("zero", "varying")
  ) %>%
  compare_solutions(statistics = c("AIC", "BIC"))
## Compare tidyLPA solutions:
##
## Model Classes AIC BIC Warnings
## 1 1 19404.74 19445.11
## 1 2 19415.26 19480.87
## 1 3 18785.12 18875.96
## 1 4 17374.01 17490.09 Warning
## 6 1 19416.74 19487.39
## 6 2 16420.67 16567.03
## 6 3 15474.08 15696.14
## 6 4 14910.16 15207.92
##
## Best model according to AIC is Model 6 with 4 classes.
## Best model according to BIC is Model 6 with 4 classes.
##
## An analytic hierarchy process, based on the fit indices AIC, AWE, BIC, CLC, and KIC (Akogul & Erisoglu, 2017), suggests the best solution is Model 6 with 4 classes.
Scale vs POMS
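Quick reference for this comparison: in the model tables, Model 1 is tidyLPA’s equal-variances, zero-covariances specification and Model 6 allows varying variances and varying covariances. On the scaling question: scale() converts each dimension to z-scores (mean 0, SD 1), while POMS, as I understand it, rescales each variable to the 0-1 range via (x - min) / (max - min). The snippet below is just a toy illustration of the two rescalings; the manual formula is my assumption about what poms() does, not code pulled from tidyLPA.

# Minimal illustration (not tidyLPA's internals): z-scoring vs. a POMS-style
# rescaling of the same toy vector, assuming POMS = (x - min) / (max - min).
x <- c(2, 5, 9, 14, 30)

z_scored  <- as.numeric(scale(x))               # mean 0, SD 1
poms_like <- (x - min(x)) / (max(x) - min(x))   # bounded on [0, 1]

round(z_scored, 2)
round(poms_like, 2)

Because POMS keeps every indicator bounded on the same 0-1 scale rather than standardizing its spread, the profile plots can look more separated; it’s probably worth checking whether the separation also holds up in classification diagnostics (e.g., entropy) and not just in the plots.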
Scale
y1 <- dat_dim %>%
  dplyr::select(dim1, dim2, dim3, dim4) %>%
  single_imputation() %>%
  scale() %>%   # z-score the dimensions for the "Scale" arm of the comparison
  estimate_profiles(4,
    variances = "varying",
    covariances = "varying") %>%
  plot_profiles()

y2 <- dat_dim %>%
  dplyr::select(dim1, dim2, dim3, dim4) %>%
  single_imputation() %>%
  scale() %>%
  estimate_profiles(4, package = "MplusAutomation",
    variances = "varying",
    covariances = "varying") %>%
  plot_profiles()

POMS
y3 <- dat_dim %>%
  dplyr::select(dim1, dim2, dim3, dim4) %>%
  single_imputation() %>%
  poms() %>%   # rescale each dimension to its proportion of maximum (0-1)
  estimate_profiles(4,
    variances = "varying",
    covariances = "varying") %>%
  plot_profiles()

y4 <- dat_dim %>%
  dplyr::select(dim1, dim2, dim3, dim4) %>%
  single_imputation() %>%
  poms() %>%
  estimate_profiles(4, package = "MplusAutomation",
    variances = "varying",
    covariances = "varying") %>%
  plot_profiles()

Working with model y3
v3 <- dat_dim %>%
  dplyr::select(dim1, dim2, dim3, dim4) %>%
  single_imputation() %>%
  poms() %>%
  estimate_profiles(4,
    variances = "varying",
    covariances = "varying") %>%
  get_data()

# attach the class assignments back onto the tract-level data
dat_dim$class <- v3$Class
Table of counts within each class
table(v3$Class)
##
## 1 2 3 4
## 539 334 239 37
Mapping out the results in the three cities
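The maps themselves aren’t in this write-up, so as a placeholder, here is a rough sketch of how the class assignments can be joined to tract geometries and mapped. The tracts_sf layer and the GEOID key are stand-ins for illustration, not the actual objects I’m using.

library(sf)
library(dplyr)
library(ggplot2)

# Join the LPA class labels onto a tract polygon layer and map them.
# tracts_sf (an sf polygon layer) and the GEOID key are hypothetical here.
tracts_classed <- tracts_sf %>%
  left_join(dat_dim %>% dplyr::select(GEOID, class), by = "GEOID")

ggplot(tracts_classed) +
  geom_sf(aes(fill = factor(class)), color = NA) +
  scale_fill_brewer(palette = "Set2", name = "LPA class") +
  theme_void()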
San Antonio
Trying out five dimensions
# Pull the first five PCA coordinates and bind them to the tract-level data
pca_orig_diminc <- as.data.frame(pca_orig_inc$ind$coord[, 1:5])
dat_dim5 <- cbind(bpsw_comb_inc, pca_orig_diminc)
dat_dim5 <- dat_dim5 %>%
  mutate(
    dim1 = Dim.1,
    dim2 = Dim.2,
    dim3 = Dim.3,
    dim4 = Dim.4,
    dim5 = Dim.5
  )
# Comparing models
# pca_orig_diminc <- as.data.frame(pca_orig_diminc)
dat_dim5 %>%
  dplyr::select(dim1, dim2, dim3, dim4, dim5) %>%
  single_imputation() %>%
  estimate_profiles(1:5,
    variances = c("equal", "varying"),
    covariances = c("zero", "varying")
  ) %>%
  compare_solutions(statistics = c("AIC", "BIC"))
## Warning:
## One or more analyses resulted in warnings! Examine these analyses carefully: model_1_class_5
## Compare tidyLPA solutions:
##
## Model Classes AIC BIC Warnings
## 1 1 22779.16 22829.63
## 1 2 22316.16 22396.90
## 1 3 22146.09 22257.12
## 1 4 21554.44 21695.75
## 1 5 19867.71 20039.30 Warning
## 6 1 22799.16 22900.10
## 6 2 19397.94 19604.85
## 6 3 18655.28 18968.17
## 6 4 17571.80 17990.67
## 6 5 17184.59 17709.44
##
## Best model according to AIC is Model 6 with 5 classes.
## Best model according to BIC is Model 6 with 5 classes.
##
## An analytic hierarchy process, based on the fit indices AIC, AWE, BIC, CLC, and KIC (Akogul & Erisoglu, 2017), suggests the best solution is Model 6 with 5 classes.
y5 <- dat_dim5 %>%
  dplyr::select(dim1, dim2, dim3, dim4, dim5) %>%
  single_imputation() %>%
  scale() %>%
  estimate_profiles(5,
    variances = "varying",
    covariances = "varying") %>%
  plot_profiles()

y6 <- dat_dim5 %>%
  dplyr::select(dim1, dim2, dim3, dim4, dim5) %>%
  single_imputation() %>%
  scale() %>%
  estimate_profiles(5, package = "MplusAutomation",
    variances = "varying",
    covariances = "varying") %>%
  plot_profiles()
## Warning:
## One or more analyses resulted in warnings! Examine these analyses carefully: model_6_class_5

y7 <- dat_dim5 %>%
  dplyr::select(dim1, dim2, dim3, dim4, dim5) %>%
  single_imputation() %>%
  poms() %>%
  estimate_profiles(5,
    variances = "varying",
    covariances = "varying") %>%
  plot_profiles()

y8 <- dat_dim5 %>%
  dplyr::select(dim1, dim2, dim3, dim4, dim5) %>%
  single_imputation() %>%
  poms() %>%
  estimate_profiles(5, package = "MplusAutomation",
    variances = "varying",
    covariances = "varying") %>%
  plot_profiles()
## Warning:
## One or more analyses resulted in warnings! Examine these analyses carefully: model_6_class_5

Looking at five dimensions with three classes
y9 <- dat_dim5 %>%
  dplyr::select(dim1, dim2, dim3, dim4, dim5) %>%
  single_imputation() %>%
  scale() %>%
  estimate_profiles(3,
    variances = "varying",
    covariances = "varying") %>%
  plot_profiles()

y10 <- dat_dim5 %>%
  dplyr::select(dim1, dim2, dim3, dim4, dim5) %>%
  single_imputation() %>%
  scale() %>%
  estimate_profiles(3, package = "MplusAutomation",
    variances = "varying",
    covariances = "varying") %>%
  plot_profiles()
## Warning:
## One or more analyses resulted in warnings! Examine these analyses carefully: model_6_class_3

y11 <- dat_dim5 %>%
  dplyr::select(dim1, dim2, dim3, dim4, dim5) %>%
  single_imputation() %>%
  poms() %>%
  estimate_profiles(3,
    variances = "varying",
    covariances = "varying") %>%
  plot_profiles()

y12 <- dat_dim5 %>%
  dplyr::select(dim1, dim2, dim3, dim4, dim5) %>%
  single_imputation() %>%
  poms() %>%
  estimate_profiles(3, package = "MplusAutomation",
    variances = "varying",
    covariances = "varying") %>%
  plot_profiles()
## Warning:
## One or more analyses resulted in warnings! Examine these analyses carefully: model_6_class_3

Five dimensions, four classes
y13 <- dat_dim5 %>%
  dplyr::select(dim1, dim2, dim3, dim4, dim5) %>%
  single_imputation() %>%
  poms() %>%
  estimate_profiles(4,
    variances = "varying",
    covariances = "varying") %>%
  plot_profiles()

y14 <- dat_dim5 %>%
  dplyr::select(dim1, dim2, dim3, dim4, dim5) %>%
  single_imputation() %>%
  poms() %>%
  estimate_profiles(4, package = "MplusAutomation",
    variances = "varying",
    covariances = "varying") %>%
  plot_profiles()
## Warning:
## One or more analyses resulted in warnings! Examine these analyses carefully: model_6_class_4

Five dimensions, five classes
y15 <- dat_dim5 %>%
  dplyr::select(dim1, dim2, dim3, dim4, dim5) %>%
  single_imputation() %>%
  poms() %>%
  estimate_profiles(5,
    variances = "varying",
    covariances = "varying") %>%
  plot_profiles()

y16 <- dat_dim5 %>%
  dplyr::select(dim1, dim2, dim3, dim4, dim5) %>%
  single_imputation() %>%
  poms() %>%
  estimate_profiles(5, package = "MplusAutomation",
    variances = "varying",
    covariances = "varying") %>%
  plot_profiles()
## Warning:
## One or more analyses resulted in warnings! Examine these analyses carefully: model_6_class_5
