Main Bayesian GLM analysis: model 1 (all infants)
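The code below assumes the packages used throughout this section were attached earlier in the script; a minimal sketch of that (assumed, not shown here) preamble, where the exact set and order may differ:

library(dplyr)          # data manipulation (%>%, group_by, summarize)
library(ggplot2)        # plots
library(binom)          # binom.confint for Wilson intervals
library(brms)           # Bayesian GLMs via Stan
library(bridgesampling) # attached after brms so bf() dispatches on bridge-sampler objects
library(ICCbin)         # iccbin for intraclass correlations
library(meta)           # metaprop for the meta-analysis of proportions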
Plots
Age plot.
ggplot(all_habitation_data,
       aes(x = age_months, y = choice, group = 1)) +
  stat_smooth(method = "lm") +
  geom_point(position = position_jitter(height = .03, width = 0)) +
  xlab("Age (months)") +
  ylab("Choice")

Plot between-lab variation.
by_lab <- all_habitation_data %>%
  group_by(lab_id) %>%
  summarize(tested = n(),
            chose_helper_mean = mean(chose_helper),
            chose_helper = sum(chose_helper),
            ci_lower = binom.confint(x = chose_helper,
                                     n = tested,
                                     methods = "wilson")$lower,
            ci_upper = binom.confint(x = chose_helper,
                                     n = tested,
                                     methods = "wilson")$upper)
ggplot(by_lab,
       aes(x = lab_id, y = chose_helper_mean,
           ymin = ci_lower, ymax = ci_upper)) +
  geom_point(aes(size = tested)) +
  geom_linerange() +
  coord_flip() +
  xlab("Lab") +
  ylab("Proportion Choosing Helper") +
  ylim(0, 1) +
  geom_hline(yintercept = .5, col = "black", lty = 2) +
  scale_size_continuous(name = "N",
                        breaks = function(x) c(min(x), mean(x), max(x))) +
  theme(legend.position = "bottom")

Bayesian Analysis
Set partially informative priors, with the intercept prior centered on the Margoni & Surian meta-analytic estimate (transformed to the logit scale).
priors.full <- c(set_prior("normal(.5753641, .1)", # From Margoni & Surian, through logit
                           class = "Intercept"),
                 set_prior("normal(0, .5)",
                           class = "b"),
                 set_prior("student_t(3, 0, 2)",
                           class = "sd"))
priors.nointercept <- c(set_prior("normal(0, .5)",
                                  class = "b"),
                        set_prior("student_t(3, 0, 2)",
                                  class = "sd"))
priors.noage <- c(set_prior("normal(.5753641, .1)", # From Margoni & Surian, through logit
                            class = "Intercept"),
                  set_prior("student_t(3, 0, 2)",
                            class = "sd"))
Fit three models: the full model and two nested models (dropping the intercept or the age effect) used for Bayes factor comparisons.
bayes_mod <- brm(chose_helper_num ~ z_age_days + (z_age_days | lab_id),
                 family = bernoulli, data = all_habitation_data,
                 iter = 5000,
                 prior = priors.full,
                 control = list(adapt_delta = .99, max_treedepth = 20),
                 chains = 4, save_all_pars = TRUE)
## Compiling the C++ model
## Start sampling
## [Stan sampling progress omitted: 4 chains, 5000 iterations each (2500 warmup, 2500 sampling); elapsed time roughly 2-4 seconds per chain]
bayes_mod_nointercept <- brm(chose_helper_num ~ z_age_days - 1 +
                               (z_age_days - 1 | lab_id),
                             family = bernoulli, data = all_habitation_data,
                             iter = 5000,
                             prior = priors.nointercept,
                             control = list(adapt_delta = .99, max_treedepth = 20),
                             chains = 4, save_all_pars = TRUE)
## Compiling the C++ model
## Start sampling
## [Stan sampling progress omitted: 4 chains, 5000 iterations each (2500 warmup, 2500 sampling); elapsed time under 1 second per chain]
bayes_mod_noage <- brm(chose_helper_num ~ 1 +
                         (1 | lab_id),
                       family = bernoulli, data = all_habitation_data,
                       iter = 5000,
                       prior = priors.noage,
                       control = list(adapt_delta = .99, max_treedepth = 20),
                       chains = 4, save_all_pars = TRUE)
## Compiling the C++ model
## Start sampling
## [Stan sampling progress omitted: 4 chains, 5000 iterations each (2500 warmup, 2500 sampling); elapsed time under 1 second per chain]
Now compute marginal likelihoods by bridge sampling, for Bayes factors on the presence/absence of the intercept and of the age effect.
H0_nointercept.bridge <- bridge_sampler(bayes_mod_nointercept, silent = TRUE)
H0_noage.bridge <- bridge_sampler(bayes_mod_noage, silent = TRUE)
H1.bridge <- bridge_sampler(bayes_mod, silent = TRUE)
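Before comparing marginal likelihoods it can be worth confirming that each model converged (Rhat near 1, adequate effective sample sizes); a minimal check, not part of the original script:

summary(bayes_mod)              # Rhat and effective sample sizes for the full model
summary(bayes_mod_nointercept)  # and for the two nested models
summary(bayes_mod_noage)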
Bayes factor for the intercept (full model vs. no-intercept model).
bf10 <- bf(H1.bridge, H0_nointercept.bridge)
print(bf10)
## Estimated Bayes factor in favor of H1.bridge over H0_nointercept.bridge: 1.76013
Bayes factor for the age effect (full model vs. no-age model).
bf10 <- bf(H1.bridge, H0_noage.bridge)
print(bf10)
## Estimated Bayes factor in favor of H1.bridge over H0_noage.bridge: 2.82831
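Equivalently, the same comparisons can be run directly from the fitted models, since brms wraps the bridge-sampling computation; an alternative sketch, assuming the models above are still in memory:

bayes_factor(bayes_mod, bayes_mod_nointercept)  # BF for the intercept
bayes_factor(bayes_mod, bayes_mod_noage)        # BF for the age effect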
The same analyses, restricted to the habituators subset (habituators_data).
Plots
Age plot.
ggplot(habituators_data,
       aes(x = age_months, y = chose_helper, group = 1)) +
  stat_smooth(method = "lm") +
  geom_point(position = position_jitter(height = .03, width = 0)) +
  xlab("Age (months)") +
  ylab("Pr (chose helper)")

Plot between-lab variation.
by_lab <- habituators_data %>%
  group_by(lab_id) %>%
  summarize(tested = n(),
            chose_helper_mean = mean(chose_helper),
            chose_helper = sum(chose_helper),
            ci_lower = binom.confint(x = chose_helper,
                                     n = tested,
                                     methods = "wilson")$lower,
            ci_upper = binom.confint(x = chose_helper,
                                     n = tested,
                                     methods = "wilson")$upper)
ggplot(by_lab,
       aes(x = lab_id, y = chose_helper_mean,
           ymin = ci_lower, ymax = ci_upper)) +
  geom_point(aes(size = tested)) +
  geom_linerange() +
  coord_flip() +
  xlab("Lab") +
  ylab("Proportion Choosing Helper") +
  ylim(0, 1) +
  geom_hline(yintercept = .5, col = "black", lty = 2) +
  scale_size_continuous(name = "N",
                        breaks = function(x) c(min(x), mean(x), max(x))) +
  theme(legend.position = "bottom")

Bayesian Analysis
Fit three models: the full model and two nested models (dropping the intercept or the age effect) used for Bayes factor comparisons.
bayes_mod <- brm(chose_helper_num ~ z_age_days + (z_age_days | lab_id),
                 family = bernoulli, data = habituators_data,
                 iter = 5000,
                 prior = priors.full,
                 control = list(adapt_delta = .99, max_treedepth = 20),
                 chains = 4, save_all_pars = TRUE)
## Compiling the C++ model
## recompiling to avoid crashing R session
## Start sampling
## [Stan sampling progress omitted: 4 chains, 5000 iterations each (2500 warmup, 2500 sampling); elapsed time roughly 2-3 seconds per chain]
bayes_mod_nointercept <- brm(chose_helper_num ~ z_age_days - 1 +
                               (z_age_days - 1 | lab_id),
                             family = bernoulli, data = habituators_data,
                             iter = 5000,
                             prior = priors.nointercept,
                             control = list(adapt_delta = .99, max_treedepth = 20),
                             chains = 4, save_all_pars = TRUE)
## Compiling the C++ model
## recompiling to avoid crashing R session
## Start sampling
## [Stan sampling progress omitted: 4 chains, 5000 iterations each (2500 warmup, 2500 sampling); elapsed time under 1 second per chain]
bayes_mod_noage <- brm(chose_helper_num ~ 1 +
                         (1 | lab_id),
                       family = bernoulli, data = habituators_data,
                       iter = 5000,
                       prior = priors.noage,
                       control = list(adapt_delta = .99, max_treedepth = 20),
                       chains = 4, save_all_pars = TRUE)
## Compiling the C++ model
## recompiling to avoid crashing R session
## Start sampling
## [Stan sampling progress omitted: 4 chains, 5000 iterations each (2500 warmup, 2500 sampling); elapsed time under 1 second per chain]
Now compute marginal likelihoods by bridge sampling, for Bayes factors on the presence/absence of the intercept and of the age effect.
H0_nointercept.bridge <- bridge_sampler(bayes_mod_nointercept, silent = TRUE)
H0_noage.bridge <- bridge_sampler(bayes_mod_noage, silent = TRUE)
H1.bridge <- bridge_sampler(bayes_mod, silent = TRUE)
Bayes factor for the intercept (full model vs. no-intercept model).
bf10 <- bf(H1.bridge, H0_nointercept.bridge)
print(bf10)
## Estimated Bayes factor in favor of H1.bridge over H0_nointercept.bridge: 2.35192
Bayes factor for the age effect (full model vs. no-age model).
bf10 <- bf(H1.bridge, H0_noage.bridge)
print(bf10)
## Estimated Bayes factor in favor of H1.bridge over H0_noage.bridge: 1.68801
ICCs for variability
Calculate the intraclass correlation (ICC) for the random intercepts of the mixed-effects model. Note that because the lab_id variable leads to a singular fit, this cannot be computed for the pilot data. (An alternative latent-scale calculation from the Bayesian intercept-only model is sketched after the output below.)
iccbin(cid = lab_id, y = chose_helper,
       data = all_habitation_data,
       alpha = 0.05)
## [iccbin() warnings omitted: the ICC is reported as not estimable by the ANOVA, modified ANOVA, moment-based, stabilized moment, unbiased estimating equation, Fleiss-Cuzick kappa, Mak's unweighted, correlation-based, and resampling methods, and the corresponding confidence intervals (Smith, Zou and Donner, Fleiss-Cuzick, Pearson, resampling) are likewise not estimable]
## boundary (singular) fit: see ?isSingular
## $estimates
##    Methods                                                                                         ICC
## 1  ANOVA Estimate                                                                                    -
## 2  Modified ANOVA Estimate                                                                           -
## 3  Moment Estimate with Equal Weights                                                                -
## 4  Moment Estimate with Weights Proportional to Cluster Size                                         -
## 5  Modified Moment Estimate with Equal Weights                                                       -
## 6  Modified Moment Estimate with Weights Proportional to Cluster Size                                -
## 7  Stabilized Moment Estimate                                                                        -
## 8  Moment Estimate from Unbiased Estimating Equation                                                 -
## 9  Fleiss-Cuzick Kappa Type Estimate                                                                 -
## 10 Mak's Unweighted Average Estimate                                                                 -
## 11 Correlation Estimate with Equal Weight to Every Pair of Observations                              -
## 12 Correlation Estimate with Equal Weight to Each Cluster Irrespective of Size                       -
## 13 Correlation Estimate with Weighting Each Pair According to Number of Pairs individuals Appear     -
## 14 Resampling Estimate                                                                               -
## 15 First-order Model Linearized Estimate                                                             0
## 16 Monte Carlo Simulation Estimate                                                                   0
##
## $ci
## Type LowerCI UpperCI
## 1 Smith's Large Sample Confidence Interval - -
## 2 Zou and Donner's Modified Wald Confidence Interval - -
## 3 Fleiss-Cuzick Confidence Interval - -
## 4 Pearson Correlation Type Confidence Interval - -
## 5 Resampling Based Confidence Interval - -
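When the frequentist fit is singular, the between-lab ICC can instead be approximated on the latent logit scale from the Bayesian intercept-only model, using the standard logistic-model formula ICC = sd_lab^2 / (sd_lab^2 + pi^2/3). A sketch, not part of the original analysis; the parameter name below follows brms's naming convention and should be checked against parnames(bayes_mod_noage):

draws <- posterior_samples(bayes_mod_noage)    # posterior draws from the intercept-only model
sd_lab <- draws$sd_lab_id__Intercept           # between-lab SD on the logit scale
icc_draws <- sd_lab^2 / (sd_lab^2 + pi^2 / 3)  # latent-scale ICC for a logistic model, per draw
quantile(icc_draws, c(.025, .5, .975))         # posterior median and 95% credible interval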
Other analyses
Calculate a meta-analysis of single proportions (by lab).
ma <- metaprop(event = by_lab$chose_helper,
               n = by_lab$tested,
               studlab = by_lab$lab_id)
## Warning in rma.glmm(xi = event[!exclude], ni = n[!exclude], method =
## "FE", : Cannot invert Hessian for saturated model.
## Warning in rma.glmm(xi = event[!exclude], ni = n[!exclude], method =
## method.tau, : Cannot invert Hessian for saturated model.
ma
## proportion 95%-CI %W(fixed) %W(random)
## InfantCog-UBC 0.7143 [0.2904; 0.9633] -- --
## UCSB 0.8000 [0.2836; 0.9949] -- --
## UWASH 0.6667 [0.0943; 0.9916] -- --
##
## Number of studies combined: k = 3
##
## proportion 95%-CI z p-value
## Fixed effect model 0.7333 [0.4669; 0.8962] -- --
## Random effects model 0.7333 [0.4669; 0.8962] -- --
##
## Quantifying heterogeneity:
## tau^2 = 0; H = 1.00; I^2 = 0.0%
##
## Test of heterogeneity:
## Q d.f. p-value Test
## NA 2 -- Wald-type
## 0.20 2 0.9055 Likelihood-Ratio
##
## Details on meta-analytical method:
## - Random intercept logistic regression model
## - Maximum-likelihood estimator for tau^2
## - Logit transformation
## - Clopper-Pearson confidence interval for individual studies
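A forest plot of the by-lab proportions and the pooled estimate can be drawn from the same object (optional; forest() is the meta package's plotting method):

forest(ma)  # per-lab proportions with confidence intervals and the pooled estimate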