This chapter introduced interactions, which allow for the association between a predictor and an outcome to depend upon the value of another predictor. While you can’t see them in a DAG, interactions can be important for making accurate inferences. Interactions can be difficult to interpret, and so the chapter also introduced triptych plots that help in visualizing the effect of an interaction. No new coding skills were introduced, but the statistical models considered were among the most complicated so far in the book.
Place each answer inside the code chunk (grey box). Each code chunk should contain a text response or code that completes or answers the question or activity requested. Problems are labeled Easy (E), Medium (M), and Hard (H).
Finally, upon completion, name your final output .html file as: YourName_ANLY505-Year-Semester.html, publish the assignment to your RPubs account, and submit the link to Canvas. Each question is worth 5 points.
8E1. For each of the causal relationships below, name a hypothetical third variable that would lead to an interaction effect:
#1. Time. Yeast needs time to reproduce and to exert its effect on the rise of the bread dough.
#2. Industry. Different industries have different income levels, so the effect of education on income can differ by industry.
#3. The type of gasoline. The higher the quality of the gasoline, the better it is for the car, and therefore the better the performance.
8E2. Which of the following explanations invokes an interaction?
#1. It invokes an interaction between heat and dryness.
#2. No specific interaction is invoked; the two causes act independently.
#3. This invokes an interaction: the effect that parents have on one's political beliefs depends on one's friends, and vice versa.
#4. There is a relationship between intelligence and the social behavior or manipulative appendages of an animal species: intelligent species tend to be highly social or to have manipulative appendages, but not both.
8E3. For each of the explanations in 8E2, write a linear model that expresses the stated relationship.
#1. μi = α + βH*Hi + βD*Di + βHD*Hi*Di
#2. μi = α + βC*Ci + βF*Fi
#3. μi = α + βP*Pi + βPF*Pi*Fi
#4. μi = α + βS*Si + βA*Ai + βSA*Si*Ai
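#As a cross-check, the first of these models can be written directly in quap() syntax.
#This is a minimal sketch only: it assumes the rethinking package is loaded and a
#hypothetical data frame `dat` with standardized columns y (outcome), H (heat), and D (dryness).
m_heat_dry <- quap(
  alist(
    y ~ dnorm(mu, sigma),
    mu <- a + bH*H + bD*D + bHD*H*D,   # heat-by-dryness interaction
    a ~ dnorm(0, 1),
    bH ~ dnorm(0, 1),
    bD ~ dnorm(0, 1),
    bHD ~ dnorm(0, 1),
    sigma ~ dexp(1)
  ),
  data = dat  # hypothetical data frame; not part of this assignment's data
)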
8M1. Recall the tulips example from the chapter. Suppose another set of treatments adjusted the temperature in the greenhouse over two levels: cold and hot. The data in the chapter were collected at the cold temperature. You find none of the plants grown under the hot temperature developed any blooms at all, regardless of the water and shade levels. Can you explain this result in terms of interactions between water, shade, and temperature?
#The plants grown under the hot temperature did not develop any blooms because the influence of temperature on blooms depends on its interactions with water and shade: the three-way interaction between temperature, water, and shade, or the two-way interactions between temperature and water, between water and shade, or between temperature and shade. Any of these interactions could result in plants not blooming at all under the hot temperature.
8M2. Can you invent a regression equation that would make the bloom size zero, whenever the temperature is hot?
#μi = α + βt*xt + βw*xw + βs*xs + βtw*xt*xw + βts*xt*xs + βws*xw*xs + βtws*xt*xw*xs
#where μi is the bloom size and xt is an indicator for temperature (1 = hot, 0 = cold).
#For μi = 0 whenever the temperature is hot, setting xt = 1 gives
#(α + βt) + (βw + βtw)*xw + (βs + βts)*xs + (βws + βtws)*xw*xs = 0,
#which must hold for every value of xw and xs, so βt = -α, βtw = -βw, βts = -βs, and βtws = -βws.
#Equivalently, the equation can be written as μi = (1 - xt)*(α + βw*xw + βs*xs + βws*xw*xs), which is zero whenever the temperature is hot.
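#A minimal sketch of how the (1 - xt) form could be fit with quap(). It assumes the tulips
#data plus a hypothetical 0/1 indicator column temp_hot (1 = hot, 0 = cold) that is not in
#the actual data set, so the random indicator below is a placeholder only.
library(rethinking)
data(tulips)
d_hot <- tulips
d_hot$blooms_std <- d_hot$blooms / max(d_hot$blooms)
d_hot$water_cent <- d_hot$water - mean(d_hot$water)
d_hot$shade_cent <- d_hot$shade - mean(d_hot$shade)
d_hot$temp_hot <- rbinom(nrow(d_hot), size = 1, prob = 0.5)  # placeholder indicator
m_hot <- quap(
  alist(
    blooms_std ~ dnorm(mu, sigma),
    # multiplying by (1 - temp_hot) forces the mean bloom size to zero whenever it is hot
    mu <- (1 - temp_hot) * (a + bw*water_cent + bs*shade_cent + bws*water_cent*shade_cent),
    a ~ dnorm(0.5, 0.25),
    bw ~ dnorm(0, 0.25),
    bs ~ dnorm(0, 0.25),
    bws ~ dnorm(0, 0.25),
    sigma ~ dexp(1)
  ),
  data = d_hot
)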
8M3. In parts of North America, ravens depend upon wolves for their food. This is because ravens are carnivorous but cannot usually kill or open carcasses of prey. Wolves however can and do kill and tear open animals, and they tolerate ravens co-feeding at their kills. This species relationship is generally described as a “species interaction.” Can you invent a hypothetical set of data on raven population size in which this relationship would manifest as a statistical interaction? Do you think the biological interaction could be linear? Why or why not?
library(rethinking)  # provides quap(), precis(), link(), and the plotting helpers used below
set.seed(123)
# We consider a linear model
N <- 500 # simulation size
rPW <- 0.6 # correlation between prey and wolf
bP <- 0.3 # regression coefficient for prey
bW <- 0.1 # regression coefficient for wolf
bPW <- 0.5 # regression coefficient for prey-by-wolf interaction
# Simulate data
prey <- rnorm(
n = N,
mean = 0,
sd = 1
)
wolf <- rnorm(
n = N,
mean = rPW * prey,
sd = sqrt(1 - rPW^2)
)
raven <- rnorm(
n = N,
mean = bP*prey + bW*wolf + bPW*prey*wolf,
sd = 1
)
d <- data.frame(raven, prey, wolf)
str(d)
## 'data.frame': 500 obs. of 3 variables:
## $ raven: num -1.017 -1.095 1.994 -0.024 -2.697 ...
## $ prey : num -0.5605 -0.2302 1.5587 0.0705 0.1293 ...
## $ wolf : num -0.818 -0.933 1.757 0.643 -1.13 ...
m <- quap(
alist(
raven ~ dnorm(mu, sigma),
mu <- a + bP*prey + bW*wolf + bPW*prey*wolf,
a ~ dnorm(0, 1),
bW ~ dnorm(0, 1),
bP ~ dnorm(0, 1),
bPW ~ dnorm(0, 1),
sigma ~ dunif(0, 5)
),
data = d,
start = list(a = 0, bP = 0, bW = 0, bPW = 0, sigma = 1)
)
precis(m)
## mean sd 5.5% 94.5%
## a 0.03740266 0.04855042 -0.04019029 0.1149956
## bP 0.32213161 0.05466076 0.23477316 0.4094901
## bW 0.16898993 0.05480987 0.08139316 0.2565867
## bPW 0.47507099 0.03840570 0.41369126 0.5364507
## sigma 0.98839478 0.03126046 0.93843452 1.0383550
plot(m)
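#A triptych-style check of the fitted interaction (a sketch, reusing the model m above):
#predicted raven population across prey density at three fixed wolf densities.
par(mfrow = c(1, 3))
prey_seq <- seq(from = -2, to = 2, length.out = 30)
for (w in -1:1) {
  mu <- link(m, data = data.frame(prey = prey_seq, wolf = w))
  mu_mean <- apply(mu, 2, mean)
  mu_PI <- apply(mu, 2, PI, prob = 0.89)
  plot(prey_seq, mu_mean, type = "l", ylim = c(-3, 3),
       xlab = "prey", ylab = "raven", main = paste("wolf =", w))
  shade(mu_PI, prey_seq)
}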
#Conclusion: The fitted coefficients recover the simulated values, including the positive prey-by-wolf interaction, so the species relationship manifests as a statistical interaction. Because raven numbers in these data rise steadily with prey whenever wolves are present, it seems the biological interaction could be at least approximately linear over this range.
8M4. Repeat the tulips analysis, but this time use priors that constrain the effect of water to be positive and the effect of shade to be negative. Use prior predictive simulation. What do these prior assumptions mean for the interaction prior, if anything?
data(tulips)
d <- tulips
d$blooms_std <- d$blooms / max(d$blooms)
d$water_cent <- d$water - mean(d$water)
d$shade_cent <- d$shade - mean(d$shade)
# Prior draws with sign constraints: take absolute values so the water effect is
# positive only and the shade effect is negative only.
bw_d <- abs(rnorm(nrow(d), 0, 0.25))
bs_d <- -abs(rnorm(nrow(d), 0, 0.25))
# Fraction of intercept prior draws that fall outside the valid outcome range [0, 1]
a <- rnorm(1e4, 0.5, 0.25); sum(a < 0 | a > 1) / length(a)
## [1] 0.0457
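#The prior draws above imply regression lines for blooms over water; a minimal prior
#predictive sketch, holding shade at its mean (shade_cent = 0) so only a and bw matter.
n_lines <- 50
a_prior <- rnorm(n_lines, 0.5, 0.25)
bw_prior <- abs(rnorm(n_lines, 0, 0.25))  # water effect constrained to be positive
plot(NULL, xlim = c(-1, 1), ylim = c(-0.5, 1.5),
     xlab = "water (centered)", ylab = "blooms (scaled)")
abline(h = c(0, 1), lty = 2)  # valid range of the standardized outcome
for (i in 1:n_lines) abline(a = a_prior[i], b = bw_prior[i], col = col.alpha("black", 0.3))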
# Note the minus sign on bs: the shade effect enters the linear model as -bs*shade_cent.
m8.4 <- quap(
  alist(
    blooms_std ~ dnorm(mu, sigma),
    mu <- a + bw*water_cent - bs*shade_cent,
    a ~ dnorm(0.5, 0.25),
    bw ~ dnorm(0, 0.25),
    bs ~ dnorm(0, 0.25),
    sigma ~ dexp(1)
  ),
  data = d
)
precis(m8.4)
## mean sd 5.5% 94.5%
## a 0.3587758 0.03021898 0.31048005 0.4070716
## bw 0.2050361 0.03688957 0.14607948 0.2639928
## bs 0.1125399 0.03687568 0.05360543 0.1714744
## sigma 0.1581545 0.02144380 0.12388316 0.1924258
# Triptych plot: blooms vs. water at each level of shade (-1, 0, 1)
par(mfrow = c(1, 3))
for (s in -1:1) {
  idx <- which(d$shade_cent == s)
  plot(d$water_cent[idx], d$blooms_std[idx], xlim = c(-1, 1), ylim = c(0, 1),
       xlab = "water", ylab = "blooms", pch = 16, col = rangi2)
  mu <- link(m8.4, data = data.frame(shade_cent = s, water_cent = -1:1))
  for (i in 1:20) lines(-1:1, mu[i, ], col = col.alpha("black", 0.3))
}
#Conclusion: Both water and shade affect the tulip blooms. Water has a larger effect when there is plenty of light, because the tulips need more water to carry out photosynthesis; water has a smaller effect when there is not enough light, because the tulips are less active and do not need as much water.
8H1. Return to the data(tulips) example in the chapter. Now include the bed variable as a predictor in the interaction model. Don’t interact bed with the other predictors; just include it as a main effect. Note that bed is categorical. So to use it properly, you will need to either construct dummy variables or rather an index variable, as explained in Chapter 5.
data(tulips)
d <- tulips
d$shade.c <- d$shade - mean(d$shade)
d$water.c <- d$water - mean(d$water)
# Dummy variables
d$bedb <- d$bed == "b"
d$bedc <- d$bed == "c"
# Index variable
d$bedx <- coerce_index(d$bed)
m_dummy <- map(
  alist(
    blooms ~ dnorm(mu, sigma),
    mu <- a + bW*water.c + bS*shade.c + bWS*water.c*shade.c + bBb*bedb + bBc*bedc,
    a ~ dnorm(130, 100),
    bW ~ dnorm(0, 100),
    bS ~ dnorm(0, 100),
    bWS ~ dnorm(0, 100),
    bBb ~ dnorm(0, 100),
    bBc ~ dnorm(0, 100),
    sigma ~ dunif(0, 100)
  ),
  data = d,
  start = list(a = mean(d$blooms), bW = 0, bS = 0, bWS = 0, bBb = 0, bBc = 0, sigma = sd(d$blooms))
)
precis(m_dummy)
## mean sd 5.5% 94.5%
## a 99.36131 12.757521 78.97233 119.75029
## bW 75.12433 9.199747 60.42136 89.82730
## bS -41.23103 9.198481 -55.93198 -26.53008
## bWS -52.15060 11.242951 -70.11901 -34.18219
## bBb 42.41139 18.039255 13.58118 71.24160
## bBc 47.03141 18.040136 18.19979 75.86303
## sigma 39.18964 5.337920 30.65862 47.72067
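#Since the index variable bedx was constructed above but not used, the same model could
#alternatively be fit with index coding for bed; a minimal sketch using quap() (the newer
#name for map()), with one intercept per bed:
m_index <- quap(
  alist(
    blooms ~ dnorm(mu, sigma),
    mu <- a[bedx] + bW*water.c + bS*shade.c + bWS*water.c*shade.c,
    a[bedx] ~ dnorm(130, 100),
    bW ~ dnorm(0, 100),
    bS ~ dnorm(0, 100),
    bWS ~ dnorm(0, 100),
    sigma ~ dunif(0, 100)
  ),
  data = d
)
precis(m_index, depth = 2)  # depth = 2 shows the three bed intercepts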