8E1. For each of the causal relationships below, name a hypothetical third variable that would lead to an interaction effect:
#1. Commercial baking yeast is inactive below 40°F and dies above 130°F, so temperature would likely interact with yeast to predict the amount of bread dough rising.
#2. Different fields have vastly different distributions of expected income, so field of work would likely interact with years of education to predict income.
#3. Gasoline will only make a working car go, so a variable representing each car’s engine functioning would likely interact with gasoline to predict movement.
8E2. Which of the following explanations invokes an interaction?
#1. This explanation invokes an interaction between heat and dryness in predicting onion caramelization. Specifically, it implies that caramelization will only occur when both heat and dryness are low.
#2. This explanation invokes main effects of number of cylinders and fuel injector quality on car speed but does not explicitly invoke an interaction. Specifically, it seems to imply that adding cylinders and increasing the quality of the fuel injector are independent routes to increasing car speed.
#3. This explanation seems to imply that there are two types of people: those who acquire their beliefs from their parents and those who acquire their beliefs from their friends. The implied model seems to predict individuals’ political beliefs using a linear combination of the interactions between type and parents’ beliefs on the one hand and between type and friends’ beliefs on the other hand.
#4. This explanation seems to invoke an interaction between sociality and the possession of manipulative appendages in predicting a species’ intelligence. Specifically, it seems to imply that intelligent species are high on sociality or have manipulative appendages but are not both high on sociality and in possession of manipulative appendages.
8E3. For each of the explanations in 8E2, write a linear model that expresses the stated relationship.
#1. $\mu_i = \beta_H H_i + \beta_D D_i + \beta_{HD} H_i D_i$
#2. $\mu_i = \beta_C C_i + \beta_Q Q_i$
#3. $\mu_i = \beta_{TP} T_i P_i + \beta_{TF} T_i F_i$
#4. $\mu_i = \beta_S S_i + \beta_A A_i + \beta_{SA} S_i A_i$
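#As a quick illustration, model #1 can also be expressed directly as simulation code; the names H (heat) and D (dryness) and the coefficient values below are assumptions for the sketch, not part of the exercise:
# Minimal sketch of model #1: a heat-by-dryness interaction in base R.
# The interaction term bHD*H*D lets the effect of heat depend on dryness.
bH <- 0.5; bD <- 0.5; bHD <- 0.5            # hypothetical coefficients
H <- rnorm(100); D <- rnorm(100)            # hypothetical standardized predictors
mu <- bH*H + bD*D + bHD*H*D                 # model #1's linear predictor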
8M1. Recall the tulips example from the chapter. Suppose another set of treatments adjusted the temperature in the greenhouse over two levels: cold and hot. The data in the chapter were collected at the cold temperature. You find none of the plants grown under the hot temperature developed any blooms at all, regardless of the water and shade levels. Can you explain this result in terms of interactions between water, shade, and temperature?
#An interaction allows the relationship between a predictor and an outcome to depend on the value of another predictor. Since the question states that the relationships between blooms and water and between blooms and shade depend on the value of temperature, interaction effects are appropriate here. With three predictor variables (water, shade, and temperature), the full model would have a single three-way interaction and three two-way interactions ($W \times S \times T$, plus $W \times S$, $W \times T$, and $S \times T$).
8M2. Can you invent a regression equation that would make the bloom size zero whenever the temperature is hot?
#The algebraic form of the full regression equation would be: $\mu_i = \alpha + \beta_W W_i + \beta_S S_i + \beta_T T_i + \beta_{WS} W_i S_i + \beta_{WT} W_i T_i + \beta_{ST} S_i T_i + \beta_{WST} W_i S_i T_i$
#Coding hot as $T_i = 1$, requiring $\mu_i = 0$ whenever $T_i = 1$, for every value of $W_i$ and $S_i$, forces $\beta_T = -\alpha$, $\beta_{WT} = -\beta_W$, $\beta_{ST} = -\beta_S$, and $\beta_{WST} = -\beta_{WS}$.
#Substituting these constraints gives: $\mu_i = \alpha + \beta_W W_i + \beta_S S_i - \alpha T_i + \beta_{WS} W_i S_i - \beta_W W_i T_i - \beta_S S_i T_i - \beta_{WS} W_i S_i T_i = (1 - T_i)(\alpha + \beta_W W_i + \beta_S S_i + \beta_{WS} W_i S_i)$, which reduces to the original model when $T_i = 0$ (cold) and is identically zero when $T_i = 1$ (hot).
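#A quick numerical check of the factored form (the parameter values below are arbitrary, for illustration only):
# Verify that the factored regression is zero whenever T = 1 (hot):
mu_bloom <- function(W, S, T, a = 0.3, bW = 0.2, bS = -0.1, bWS = -0.05) {
  (1 - T) * (a + bW*W + bS*S + bWS*W*S)
}
mu_bloom(W = 1, S = 1, T = 1)  # 0 regardless of water and shade
mu_bloom(W = 1, S = 1, T = 0)  # reduces to the cold-temperature model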
8M3. In parts of North America, ravens depend upon wolves for their food. This is because ravens are carnivorous but cannot usually kill or open carcasses of prey. Wolves however can and do kill and tear open animals, and they tolerate ravens co-feeding at their kills. This species relationship is generally described as a “species interaction.” Can you invent a hypothetical set of data on raven population size in which this relationship would manifest as a statistical interaction? Do you think the biological interaction could be linear? Why or why not?
library(rethinking)
set.seed(158)
N <- 300 # simulation size
rPW <- 0.6 # correlation between prey and wolf
bP <- 0.3 # regression coefficient for prey
bW <- 0.1 # regression coefficient for wolf
bPW <- 0.5 # regression coefficient for prey-by-wolf interaction
# Simulate data
prey <- rnorm(
n = N,
mean = 0,
sd = 1
)
wolf <- rnorm(
n = N,
mean = rPW * prey,
sd = sqrt(1 - rPW^2)
)
raven <- rnorm(
n = N,
mean = bP*prey + bW*wolf + bPW*prey*wolf,
sd = 1
)
d <- data.frame(raven, prey, wolf)
str(d)
## 'data.frame': 300 obs. of 3 variables:
## $ raven: num 0.0187 2.0273 -0.7128 -0.3665 -1.2788 ...
## $ prey : num 0.832 0.286 -0.357 -1.161 0.487 ...
## $ wolf : num -0.636 1.063 0.164 -2.328 -0.29 ...
m <- map(
alist(
raven ~ dnorm(mu, sigma),
mu <- a + bP*prey + bW*wolf + bPW*prey*wolf,
a ~ dnorm(0, 1),
bW ~ dnorm(0, 1),
bP ~ dnorm(0, 1),
bPW ~ dnorm(0, 1),
sigma ~ dunif(0, 5)
),
data = d,
start = list(a = 0, bP = 0, bW = 0, bPW = 0, sigma = 1)
)
precis(m)
## mean sd 5.5% 94.5%
## a -0.2116555 0.05795311 -0.30427573 -0.1190352
## bP 0.3298847 0.05969840 0.23447515 0.4252943
## bW 0.1886810 0.06083568 0.09145385 0.2859082
## bPW 0.5004650 0.04546211 0.42780779 0.5731222
## sigma 0.8945074 0.03651798 0.83614466 0.9528702
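#The exercise also asks whether the biological interaction could be linear. Probably not, at least not globally: population sizes are non-negative counts, ravens gain nothing from wolves when there is no prey at all, and real populations saturate rather than grow without bound. A hedged sketch of a more plausible data-generating process, reusing the simulated prey and wolf values above but with a Poisson outcome and a log link (the intercept and the use of rpois are assumptions for illustration):
# Non-linear alternative: raven counts from a Poisson with a log link,
# so effects multiply and simulated populations can never be negative.
lambda <- exp(1 + bP*prey + bW*wolf + bPW*prey*wolf)
raven_count <- rpois(N, lambda)
summary(raven_count)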
8M4. Repeat the tulips analysis, but this time use priors that constrain the effect of water to be positive and the effect of shade to be negative. Use prior predictive simulation. What do these prior assumptions mean for the interaction prior, if anything?
data(tulips)
d <- tulips
str(d)
## 'data.frame': 27 obs. of 4 variables:
## $ bed : Factor w/ 3 levels "a","b","c": 1 1 1 1 1 1 1 1 1 2 ...
## $ water : int 1 1 1 2 2 2 3 3 3 1 ...
## $ shade : int 1 2 3 1 2 3 1 2 3 1 ...
## $ blooms: num 0 0 111 183.5 59.2 ...
d$blooms_std <- d$blooms / max(d$blooms)
d$water_cent <- d$water - mean(d$water)
d$shade_cent <- d$shade - mean(d$shade)
# Fraction of a ~ Normal(0.5, 1) prior mass outside the observable
# 0-1 range of standardized blooms -- far too wide a prior
a <- rnorm( 1e4 , 0.5 , 1 ); sum( a < 0 | a > 1 ) / length( a )
## [1] 0.6179
# Tightening to Normal(0.5, 0.25) keeps most of the prior mass inside 0-1
a <- rnorm( 1e4 , 0.5 , 0.25 ); sum( a < 0 | a > 1 ) / length( a )
## [1] 0.0434
m8.4 <- quap(
alist(
blooms_std ~ dnorm( mu , sigma ) ,
mu <- a + bw*water_cent - bs*shade_cent ,
a ~ dnorm( 0.5 , 0.25 ) ,
bw ~ dnorm( 0 , 0.25 ) ,
bs ~ dnorm( 0 , 0.25 ) ,
sigma ~ dexp( 1 )
) , data=d )
m8.5 <- quap(
alist(
blooms_std ~ dnorm( mu , sigma ) ,
mu <- a + bw*water_cent - bs*shade_cent + bws*water_cent*shade_cent ,
a ~ dnorm( 0.5 , 0.25 ) ,
bw ~ dnorm( 0 , 0.25 ) ,
bs ~ dnorm( 0 , 0.25 ) ,
bws ~ dnorm( 0 , 0.25 ) ,
sigma ~ dexp( 1 )
) , data=d )
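#The exercise asks for prior predictive simulation; a minimal sketch using rethinking's extract.prior, drawing regression lines over water at average shade before the model sees any data (20 lines is an arbitrary choice). Note also what the sign constraints imply for the interaction prior: if water's slope is to stay positive at every shade level, the interaction coefficient must be negative (more shade weakens water's benefit) and no larger in magnitude than the water slope itself.
set.seed(7)
prior <- extract.prior( m8.5 )
# Prior-implied regression lines at average shade (shade_cent = 0):
mu_prior <- link( m8.5 , post=prior , data=data.frame( water_cent=-1:1 , shade_cent=0 ) )
plot( NULL , xlim=c(-1,1) , ylim=c(-0.5,1.5) , xlab="water" , ylab="blooms" )
abline( h=c(0,1) , lty=2 )  # observable range of standardized blooms
for ( i in 1:20 ) lines( -1:1 , mu_prior[i,] , col=col.alpha("black",0.3) )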
par(mfrow=c(1,3)) # 3 plots in 1 row
for ( s in -1:1 ) {
  idx <- which( d$shade_cent==s )
  plot( d$water_cent[idx] , d$blooms_std[idx] , xlim=c(-1,1) , ylim=c(0,1) ,
    xlab="water" , ylab="blooms" , pch=16 , col=rangi2 )
  mu <- link( m8.4 , data=data.frame( shade_cent=s , water_cent=-1:1 ) )
  for ( i in 1:20 ) lines( -1:1 , mu[i,] , col=col.alpha("black",0.3) )
}
8H1. Return to the data(tulips) example in the chapter. Now include the bed variable as a predictor in the interaction model. Don’t interact bed with the other predictors; just include it as a main effect. Note that bed is categorical. So to use it properly, you will need to either construct dummy variables or an index variable, as explained in Chapter 5.
d <- tulips
d$shade.c <- d$shade - mean(d$shade)
d$water.c <- d$water - mean(d$water)
# Dummy variables
d$bedb <- d$bed == "b"
d$bedc <- d$bed == "c"
# Index variable
d$bedx <- coerce_index(d$bed)
m_dummy <- map(
alist(
blooms ~ dnorm(mu, sigma),
mu <- a + bW*water.c + bS*shade.c + bWS*water.c*shade.c + bBb*bedb + bBc*bedc,
a ~ dnorm(130, 100),
bW ~ dnorm(0, 100),
bS ~ dnorm(0, 100),
bWS ~ dnorm(0, 100),
bBb ~ dnorm(0, 100),
bBc ~ dnorm(0, 100),
sigma ~ dunif(0, 100)
),
data = d,
start = list(a = mean(d$blooms), bW = 0, bS = 0, bWS = 0, bBb = 0, bBc = 0, sigma = sd(d$blooms))
)
precis(m_dummy)
## mean sd 5.5% 94.5%
## a 99.36131 12.757521 78.97233 119.75029
## bW 75.12433 9.199747 60.42136 89.82730
## bS -41.23103 9.198481 -55.93198 -26.53008
## bWS -52.15060 11.242951 -70.11901 -34.18219
## bBb 42.41139 18.039255 13.58118 71.24160
## bBc 47.03141 18.040136 18.19979 75.86303
## sigma 39.18964 5.337920 30.65862 47.72067
m_index <- map(
alist(
blooms ~ dnorm(mu, sigma),
mu <- a[bedx] + bW*water.c + bS*shade.c + bWS*water.c*shade.c,
a[bedx] ~ dnorm(130, 100),
bW ~ dnorm(0, 100),
bS ~ dnorm(0, 100),
bWS ~ dnorm(0, 100),
sigma ~ dunif(0, 200)
),
data = d
)
## Caution, model may not have converged.
## Code 1: Maximum iterations reached.
precis(m_index, depth = 2)
## Warning in sqrt(diag(vcov(model))): NaNs produced
## Warning in sqrt(diag(vcov(model))): NaNs produced
## Warning in sqrt(diag(vcov(model))): NaNs produced
## mean sd 5.5% 94.5%
## a[1] 121.29772 50.14583 41.15499 201.44045
## a[2] 160.64077 50.39190 80.10478 241.17676
## a[3] 124.58099 50.21190 44.33267 204.82931
## bW 88.11262 38.15424 27.13478 149.09046
## bS -49.60890 38.29830 -110.81699 11.59918
## bWS -30.63570 44.79267 -102.22304 40.95165
## sigma 176.46409 NaN NaN NaN
coeftab(m_dummy, m_index)
## Caution, model may not have converged.
## Code 1: Maximum iterations reached.
## Caution, model may not have converged.
## Code 1: Maximum iterations reached.
## Warning in sqrt(diag(vcov(model))): NaNs produced
## m_dummy m_index
## a 99.36 NA
## bW 75.12 88.11
## bS -41.23 -49.61
## bWS -52.15 -30.64
## bBb 42.41 NA
## bBc 47.03 NA
## sigma 39.19 176.46
## a[1] NA 121.3
## a[2] NA 160.64
## a[3] NA 124.58
## nobs 27 27
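#The m_index fit above fails to converge: with no start values supplied, sigma drifts to the upper bound of its dunif(0, 200) prior, the curvature there is degenerate, and precis reports NaN standard deviations. A hedged re-fit that supplies explicit start values, mirroring m_dummy (the particular starting values are arbitrary choices):
# Re-fit the index model, starting the optimizer at plausible values:
m_index2 <- map(
  alist(
    blooms ~ dnorm(mu, sigma),
    mu <- a[bedx] + bW*water.c + bS*shade.c + bWS*water.c*shade.c,
    a[bedx] ~ dnorm(130, 100),
    bW ~ dnorm(0, 100),
    bS ~ dnorm(0, 100),
    bWS ~ dnorm(0, 100),
    sigma ~ dunif(0, 200)
  ),
  data = d,
  start = list(a = rep(mean(d$blooms), 3), bW = 0, bS = 0, bWS = 0, sigma = sd(d$blooms))
)
precis(m_index2, depth = 2)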