Chapter 13 - Models with Memory

This chapter has been an introduction to the motivation, implementation, and interpretation of basic multilevel models. It focused on varying intercepts, which achieve better estimates of baseline differences among clusters in the data. They achieve better estimates, because they simultaneously model the population of clusters and use inferences about the population to pool information among parameters. From another perspective, varying intercepts are adaptively regularized parameters, relying upon a prior that is itself learned from the data.
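As a minimal sketch of that pooling machinery (made-up numbers, using the Gaussian approximation in which the multilevel estimate of a cluster's intercept is a precision-weighted average of the cluster's raw mean and the learned grand mean):

tau <- 1.6         # assumed standard deviation among clusters (learned from the data)
se <- 0.8          # assumed standard error of one cluster's raw mean
grand_mean <- 1.3  # assumed population mean
cluster_mean <- 3.0
w <- (1 / se^2) / (1 / se^2 + 1 / tau^2)  # weight placed on the cluster's own data
w * cluster_mean + (1 - w) * grand_mean   # pooled estimate, shrunk toward the grand mean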

Place each answer inside a code chunk (grey box). Each code chunk should contain a text response or the code that completes or answers the question or activity requested. Make sure to include plots if the question requests them.

Finally, upon completion, name your final output .html file as YourName_ANLY505-Year-Semester.html, publish the assignment to your RPubs account, and submit the link on Canvas. Each question is worth 5 points.

Questions

13-1. Revisit the Reed frog survival data, data(reedfrogs), and add the predation and size treatment variables to the varying intercepts model. Consider models with either main effect alone, both main effects, as well as a model including both and their interaction. Instead of focusing on inferences about these two predictor variables, focus on the inferred variation across tanks. Explain why it changes as it does across models. Plot the sigma estimates.

data(reedfrogs)
d <- reedfrogs

# Make the tank cluster variable
d$tank <- 1:nrow(d)

dat <- list(
  S = d$surv,                                # number surviving
  N = d$density,                             # initial number of tadpoles
  tank = d$tank,                             # tank index
  pred = ifelse(d$pred == "no", 0L, 1L),     # predation indicator: 0 = absent, 1 = present
  size_ = ifelse(d$size == "small", 1L, 2L)  # size index: 1 = small, 2 = big
)

# Models
# 1. Tank model
m_tank <- ulam(
  alist(
    S ~ dbinom(N, p),
    logit(p) <- a[tank],
    a[tank] ~ dnorm(a_bar, sigma),
    a_bar ~ dnorm(0, 1.5),
    sigma ~ dexp(1)
  ),
  data=dat, chains=4, cores=4, log_lik=TRUE, iter=2e3, refresh = 0
)
## Running MCMC with 4 parallel chains, with 1 thread(s) per chain...
## 
## Chain 1 finished in 0.8 seconds.
## Chain 2 finished in 0.8 seconds.
## Chain 3 finished in 0.7 seconds.
## Chain 4 finished in 0.7 seconds.
## 
## All 4 chains finished successfully.
## Mean chain execution time: 0.7 seconds.
## Total execution time: 1.6 seconds.
precis(m_tank, depth=2)
##               mean        sd         5.5%        94.5%     n_eff     Rhat4
## a[1]   2.139354834 0.8823653  0.871043290  3.619953000  7379.554 0.9996857
## a[2]   3.084814025 1.0935045  1.483523000  4.937012950  6708.561 0.9996576
## a[3]   1.000942348 0.6667264  0.005122918  2.096738950  9224.656 0.9998518
## a[4]   3.079345609 1.1238920  1.453487600  5.027847050  6017.353 0.9991748
## a[5]   2.157091906 0.8789173  0.867632990  3.631496200  5690.514 0.9993616
## a[6]   2.134207621 0.8615963  0.848129745  3.612910700  6521.471 0.9996282
## a[7]   3.083822051 1.1435743  1.424876750  5.070246550  6873.499 0.9993750
## a[8]   2.141559143 0.8701659  0.853247720  3.618810400  6472.437 0.9995777
## a[9]  -0.177557776 0.6114889 -1.156354800  0.798898370 10462.078 0.9991931
## a[10]  2.145916426 0.8882575  0.840712570  3.688301900  6729.197 0.9993763
## a[11]  1.013704583 0.6916705 -0.030651935  2.156214200  8859.879 0.9992533
## a[12]  0.600707647 0.6601271 -0.424156750  1.662426250  9547.846 0.9995631
## a[13]  1.004097005 0.6519578  0.001450124  2.052522750  7389.139 0.9993979
## a[14]  0.189345207 0.6001455 -0.774082985  1.164776750 11125.885 0.9995551
## a[15]  2.157851279 0.9193853  0.837230500  3.749583350  6953.532 0.9999258
## a[16]  2.140630082 0.9072878  0.791392465  3.667595850  7376.025 0.9995521
## a[17]  2.903471798 0.7911194  1.761193250  4.288429450  7007.165 0.9996930
## a[18]  2.402487420 0.6548708  1.432288450  3.480237450  7130.431 0.9996759
## a[19]  2.010664662 0.5771706  1.154738350  2.974783300  8409.983 0.9992478
## a[20]  3.682373169 1.0305541  2.238263900  5.459993500  5397.769 1.0001108
## a[21]  2.396460731 0.6543938  1.434248250  3.502462750  6190.240 0.9997756
## a[22]  2.398305898 0.6754160  1.402551950  3.538588450  6661.342 0.9999029
## a[23]  2.388661949 0.6564930  1.410695800  3.515689350  7241.542 0.9994386
## a[24]  1.702992566 0.5328270  0.904643645  2.594155100  7644.225 0.9992667
## a[25] -1.000104737 0.4574747 -1.735817700 -0.279195665  9108.035 0.9996249
## a[26]  0.164228390 0.4064819 -0.488361715  0.812009175  9910.590 0.9996430
## a[27] -1.432637122 0.5012234 -2.255679700 -0.675993630  8875.743 1.0004828
## a[28] -0.472589141 0.4171144 -1.140355300  0.175482970  8044.146 0.9997209
## a[29]  0.152906267 0.4075423 -0.518870520  0.821573620  9727.513 0.9993605
## a[30]  1.444378601 0.4896660  0.707391320  2.262673000  8645.939 0.9993011
## a[31] -0.636337161 0.4140141 -1.317683100  0.005393385 11013.202 0.9992950
## a[32] -0.317426070 0.4094462 -0.973506775  0.331302355  9428.253 0.9995003
## a[33]  3.187090057 0.7680708  2.047662400  4.510023200  6413.721 0.9994159
## a[34]  2.707872042 0.6523994  1.752082500  3.822713000  8568.188 1.0000723
## a[35]  2.719533110 0.6546339  1.753967800  3.809695500  7530.784 0.9996663
## a[36]  2.059305227 0.5011596  1.300764300  2.888627150  8626.266 0.9994014
## a[37]  2.062416772 0.5146745  1.305797250  2.925972850  7218.482 0.9995200
## a[38]  3.902869523 0.9882327  2.496115050  5.641424550  5621.559 0.9998076
## a[39]  2.710892664 0.6340914  1.792347400  3.771429150  7277.108 0.9993147
## a[40]  2.356661108 0.5650646  1.521443700  3.294630600  7248.143 0.9993540
## a[41] -1.813311805 0.4704951 -2.608813550 -1.095541400  7338.367 0.9992789
## a[42] -0.571293239 0.3504809 -1.125182650 -0.031849432 10930.095 0.9991455
## a[43] -0.451827900 0.3582790 -1.023416050  0.105739905 10243.616 0.9993279
## a[44] -0.341301613 0.3354641 -0.884829110  0.186984475  9442.206 0.9993225
## a[45]  0.582357301 0.3368615  0.048021687  1.136956950  9781.126 0.9994608
## a[46] -0.568792054 0.3495822 -1.144957700 -0.015890552 10896.329 0.9992557
## a[47]  2.063938301 0.5232524  1.266191400  2.964530300  7871.560 0.9994709
## a[48]  0.001933224 0.3304639 -0.529622000  0.534829720  9403.407 0.9990636
## a_bar  1.347664997 0.2582329  0.944302575  1.760952650  6193.019 0.9992462
## sigma  1.620685760 0.2176298  1.304200100  1.988069150  3588.455 1.0000226
# 2. Predation model
m_pred <- ulam(
  alist(
    S ~ dbinom(N, p),
    logit(p) <- a[tank] + bp * pred,
    a[tank] ~ dnorm(a_bar, sigma),
    bp ~ dnorm(-0.5, 1),
    a_bar ~ dnorm(0, 1.5),
    sigma ~ dexp(1)
  ),
  data=dat, chains=4, cores=4, log_lik=TRUE, iter=2e3, refresh = 0
)
## Running MCMC with 4 parallel chains, with 1 thread(s) per chain...
## 
## Chain 2 finished in 0.8 seconds.
## Chain 4 finished in 0.8 seconds.
## Chain 1 finished in 0.9 seconds.
## Chain 3 finished in 0.8 seconds.
## 
## All 4 chains finished successfully.
## Mean chain execution time: 0.8 seconds.
## Total execution time: 1.7 seconds.
precis(m_pred, depth=2)
##             mean        sd        5.5%     94.5%     n_eff     Rhat4
## a[1]   2.4791486 0.6705576  1.45414855  3.574087 3129.3981 1.0002227
## a[2]   2.9676340 0.7343736  1.83426180  4.200750 3022.9972 0.9998312
## a[3]   1.6916150 0.6465639  0.67164537  2.764741 3230.9614 1.0001688
## a[4]   2.9473278 0.7417523  1.81685625  4.166801 2853.9381 0.9992097
## a[5]   2.4755062 0.6836259  1.41951140  3.591347 3145.3364 1.0009150
## a[6]   2.4863461 0.6808629  1.43741560  3.596244 2862.2679 1.0009241
## a[7]   2.9402141 0.7264170  1.80753025  4.140335 2358.5853 1.0022218
## a[8]   2.4725862 0.6812276  1.42056425  3.591529 3082.8683 0.9994176
## a[9]   2.2107249 0.5854798  1.25144250  3.103419 1733.0903 1.0000508
## a[10]  3.5724709 0.6305934  2.62700785  4.623500 1805.5930 1.0003763
## a[11]  2.9972508 0.5832561  2.09257500  3.944989 1589.8101 1.0002783
## a[12]  2.7401811 0.5835230  1.81570505  3.660194 1277.4342 1.0017289
## a[13]  2.9970576 0.5800685  2.08089865  3.936798 1672.9354 1.0000014
## a[14]  2.4814042 0.5746208  1.55845300  3.395428 1481.1540 0.9996288
## a[15]  3.5744852 0.6170712  2.64017680  4.599357 1637.4149 0.9998360
## a[16]  3.5782290 0.6160022  2.62153450  4.608189 1684.6172 1.0012356
## a[17]  2.8844770 0.6181265  1.97091920  3.907294 3527.3237 0.9993137
## a[18]  2.5457022 0.5638000  1.69295890  3.454837 3731.1522 1.0002131
## a[19]  2.2651536 0.5445483  1.43217710  3.159438 2852.2428 1.0006000
## a[20]  3.2959311 0.7037282  2.23370200  4.481422 2137.1273 1.0006289
## a[21]  2.5630053 0.5674189  1.68786480  3.501524 3451.0731 1.0012735
## a[22]  2.5552162 0.5764343  1.68930735  3.520831 3521.0023 0.9997476
## a[23]  2.5598343 0.5674617  1.70886230  3.505797 3760.7845 1.0016901
## a[24]  2.0072078 0.5001499  1.21397900  2.809456 3533.4344 1.0000206
## a[25]  1.5506075 0.4820053  0.76886030  2.306667  926.5170 0.9999283
## a[26]  2.5237149 0.4619467  1.78156030  3.248608 1032.2436 1.0002991
## a[27]  1.2191388 0.5164968  0.37511416  2.015608 1051.0492 0.9993559
## a[28]  1.9872133 0.4575958  1.27250220  2.712725  860.2296 1.0011491
## a[29]  2.5070957 0.4589991  1.78570670  3.240450  880.7434 1.0002461
## a[30]  3.5140685 0.5050216  2.72000230  4.322436 1203.1538 1.0011681
## a[31]  1.8473256 0.4718289  1.10524170  2.593928  990.5469 1.0009301
## a[32]  2.1266151 0.4667218  1.37195910  2.867313  930.6830 1.0006575
## a[33]  3.0738497 0.6058877  2.18463975  4.099257 2860.5127 0.9998669
## a[34]  2.7552769 0.5273956  1.95642920  3.621037 3756.8092 1.0003566
## a[35]  2.7714043 0.5577782  1.93421645  3.710248 4287.3647 1.0000191
## a[36]  2.2646683 0.4777316  1.52783565  3.037059 3675.3954 0.9997729
## a[37]  2.2587056 0.4906736  1.51959295  3.072364 3908.2420 0.9997119
## a[38]  3.4186965 0.6585820  2.47140790  4.541787 2794.6627 0.9996473
## a[39]  2.7575299 0.5552162  1.90513005  3.705749 2983.1733 1.0005011
## a[40]  2.4882502 0.4958351  1.72608205  3.318156 3900.9311 1.0004701
## a[41]  0.9101623 0.5147428  0.06198517  1.692725  877.8186 0.9999942
## a[42]  1.8947620 0.4292650  1.20141920  2.560449  807.3446 1.0010667
## a[43]  1.9889309 0.4366077  1.29562535  2.668651  863.4504 1.0003016
## a[44]  2.0989107 0.4305924  1.41179755  2.796795  758.1813 1.0004927
## a[45]  2.8894775 0.4364680  2.18313770  3.590931  843.8052 1.0005995
## a[46]  1.8891043 0.4253602  1.21010615  2.552536  716.3377 1.0013833
## a[47]  4.0057890 0.4873429  3.25254455  4.798610 1007.5303 1.0012222
## a[48]  2.3941671 0.4289836  1.70081310  3.075431  836.7002 1.0003920
## bp    -2.4323869 0.3014156 -2.91233110 -1.947431  440.1681 1.0022013
## a_bar  2.5325073 0.2351159  2.15516930  2.912444  522.6205 1.0036157
## sigma  0.8299011 0.1454788  0.61746976  1.072961 1060.2321 1.0007267
# 3. Size model
m_size <- ulam(
  alist(
    S ~ dbinom(N, p),
    logit(p) <- a[tank] + s[size_],
    a[tank] ~ dnorm(a_bar, sigma),
    s[size_] ~ dnorm(0, 0.5),
    a_bar ~ dnorm(0, 1.5),
    sigma ~ dexp(1)
  ),
  data=dat, chains=4, cores=4, log_lik=TRUE, iter=2e3, refresh = 0
)
## Running MCMC with 4 parallel chains, with 1 thread(s) per chain...
## 
## Chain 2 finished in 0.7 seconds.
## Chain 3 finished in 0.7 seconds.
## Chain 4 finished in 0.7 seconds.
## Chain 1 finished in 0.8 seconds.
## 
## All 4 chains finished successfully.
## Mean chain execution time: 0.7 seconds.
## Total execution time: 0.9 seconds.
precis(m_size, depth=2)
##              mean        sd       5.5%         94.5%     n_eff    Rhat4
## a[1]   2.13871895 0.9664519  0.6957404  3.7150677000 1188.5113 1.003672
## a[2]   3.07326933 1.1558365  1.3661089  5.0393987500 1712.0719 1.003833
## a[3]   1.01729454 0.7777428 -0.1858488  2.3043695000 1067.6693 1.004998
## a[4]   3.05595944 1.1225634  1.3942208  4.9677930000 1462.8243 1.003707
## a[5]   1.92708239 0.9396639  0.5079957  3.4845159500 1554.0736 1.002653
## a[6]   1.94037704 0.9639290  0.4631655  3.5857853500 1120.0183 1.002781
## a[7]   2.87772853 1.1753546  1.1795360  4.8952860500 1752.3922 1.003652
## a[8]   1.91265962 0.9242226  0.4958658  3.4640934000 1444.8979 1.004281
## a[9]  -0.16772905 0.7142895 -1.3233951  0.9483253600  962.4824 1.004058
## a[10]  2.12110236 0.9463656  0.6760815  3.7114410500 1468.4316 1.002140
## a[11]  1.02736023 0.7580583 -0.1321935  2.2711509000  944.1017 1.005344
## a[12]  0.59589300 0.7184553 -0.5436850  1.7619834000  865.3956 1.006972
## a[13]  0.76460332 0.7900571 -0.4677159  2.0396384000  972.6874 1.002138
## a[14] -0.04098394 0.7018104 -1.1646610  1.0892531000 1005.0833 1.003332
## a[15]  1.95468421 0.9852688  0.4801773  3.6116559000 1550.5391 1.003096
## a[16]  1.93597513 0.9613181  0.5327916  3.5805275000 1475.6520 1.000518
## a[17]  2.92657921 0.8531336  1.6228532  4.2839720000 1186.3181 1.004046
## a[18]  2.40447374 0.7401597  1.2603110  3.5945123500 1039.3495 1.003803
## a[19]  2.03254781 0.6847116  0.9470947  3.1299314500  826.3857 1.003495
## a[20]  3.66519300 1.0794004  2.1406307  5.5343624000 1308.4485 1.002573
## a[21]  2.15290940 0.7696560  1.0018731  3.4465130500  985.1068 1.003078
## a[22]  2.14654182 0.7582287  0.9808578  3.3973879000 1041.0170 1.002919
## a[23]  2.14906233 0.7798855  0.9792222  3.4540054500  999.3593 1.003458
## a[24]  1.45930540 0.6416004  0.4583113  2.5186039000  696.7931 1.005527
## a[25] -0.97124013 0.5782580 -1.8829093 -0.0941422510  670.3475 1.008858
## a[26]  0.19129743 0.5411308 -0.6779357  1.0607378500  589.1535 1.008793
## a[27] -1.40269064 0.6203233 -2.4289390 -0.4435482150  758.0664 1.006091
## a[28] -0.43812890 0.5595889 -1.3727810  0.4151165050  654.2046 1.006398
## a[29] -0.09214126 0.5640496 -0.9840284  0.8068297450  598.0219 1.006266
## a[30]  1.20433706 0.6234549  0.2023853  2.2218615500  691.0793 1.004743
## a[31] -0.89589140 0.5792696 -1.8509525  0.0174221360  615.0356 1.005461
## a[32] -0.57164710 0.5615494 -1.4609837  0.2988551650  610.6222 1.002989
## a[33]  3.19020190 0.8346883  1.9530172  4.6083149000 1274.9040 1.001752
## a[34]  2.73561278 0.7295572  1.6237476  3.9347947500  918.5318 1.004667
## a[35]  2.74317470 0.7491106  1.5947961  3.9832167000  856.2882 1.006104
## a[36]  2.09405502 0.6457442  1.0927913  3.1660808000  744.2593 1.005467
## a[37]  1.81480479 0.6435218  0.8082782  2.8725056500  638.8955 1.006261
## a[38]  3.67604786 1.0438822  2.1664343  5.4511272500 1264.4709 1.002725
## a[39]  2.47465243 0.7548790  1.3384336  3.7321910500  975.3977 1.001894
## a[40]  2.10442414 0.7026898  1.0154967  3.2533738500  749.9590 1.006279
## a[41] -1.79467061 0.6140446 -2.8009531 -0.8217640050  759.6048 1.007831
## a[42] -0.53850722 0.5105812 -1.3581363  0.2628526100  530.1575 1.010579
## a[43] -0.42414374 0.5068648 -1.2475417  0.3832738400  526.0678 1.010255
## a[44] -0.29574667 0.5116993 -1.0932443  0.5354598450  536.8741 1.008637
## a[45]  0.32690393 0.5177243 -0.5075836  1.1561443500  477.3476 1.006417
## a[46] -0.82863107 0.5261353 -1.6642964 -0.0003771641  528.9382 1.006674
## a[47]  1.79957035 0.6325645  0.8359137  2.8256098000  786.6236 1.007319
## a[48] -0.25656223 0.5053317 -1.0561424  0.5498575350  520.1632 1.004886
## s[1]   0.26166972 0.3969989 -0.3807660  0.8993232250  334.3653 1.010841
## s[2]  -0.04486392 0.3878607 -0.6634651  0.5956324100  334.2078 1.016296
## a_bar  1.23515427 0.4059043  0.5753816  1.8799693500  354.2800 1.014439
## sigma  1.62149011 0.2130324  1.3106234  1.9883278500 1661.1697 1.002792
# 4. Predation and Size model
m_pred_size <- ulam(
  alist(
    S ~ dbinom(N, p),
    logit(p) <- a[tank] + bp * pred + s[size_],
    a[tank] ~ dnorm(a_bar, sigma),
    bp ~ dnorm(-0.5, 1),
    s[size_] ~ dnorm(0, 0.5),
    a_bar ~ dnorm(0, 1.5),
    sigma ~ dexp(1)
  ),
  data=dat, chains=4, cores=4, log_lik=TRUE, iter=2e3, refresh = 0
)
## Running MCMC with 4 parallel chains, with 1 thread(s) per chain...
## 
## Chain 1 finished in 1.1 seconds.
## Chain 2 finished in 1.0 seconds.
## Chain 4 finished in 1.0 seconds.
## Chain 3 finished in 1.1 seconds.
## 
## All 4 chains finished successfully.
## Mean chain execution time: 1.1 seconds.
## Total execution time: 1.2 seconds.
precis(m_pred_size, depth=2)
##             mean        sd       5.5%      94.5%     n_eff     Rhat4
## a[1]   2.3650220 0.7249233  1.2140099  3.5305104  966.4380 1.0024975
## a[2]   2.7947649 0.7799638  1.6158260  4.0976832  981.0937 1.0018963
## a[3]   1.6690456 0.6946150  0.5647598  2.7702058  889.2890 1.0005506
## a[4]   2.7842391 0.7763395  1.5795455  4.0376370  979.4478 1.0010416
## a[5]   2.2369357 0.7571953  1.0685400  3.4706384 1013.8634 1.0005750
## a[6]   2.2366165 0.7531595  1.0431165  3.4349924 1226.8359 1.0002779
## a[7]   2.6920044 0.7968012  1.5033028  4.0056414  939.3443 1.0008832
## a[8]   2.2395325 0.7419009  1.0609011  3.4350137  992.0101 1.0011112
## a[9]   2.1760421 0.6678335  1.1011212  3.2421915  703.9734 1.0005733
## a[10]  3.4471147 0.6993758  2.3626235  4.6218121  726.2267 1.0008135
## a[11]  2.9144343 0.6646853  1.8843352  3.9901271  707.9641 1.0010464
## a[12]  2.6649401 0.6480791  1.6270163  3.7141312  715.3615 1.0009180
## a[13]  2.6723592 0.6685155  1.6320196  3.7596268  713.0160 1.0024952
## a[14]  2.1743606 0.6462721  1.1364481  3.2319534  637.9466 1.0014213
## a[15]  3.2385429 0.7012878  2.1766007  4.3776536  710.7884 1.0022433
## a[16]  3.2317314 0.6951678  2.1775188  4.3677114  776.1769 1.0021712
## a[17]  2.7949899 0.6667117  1.7726400  3.8618312  796.9413 1.0021410
## a[18]  2.5036263 0.6543352  1.4958651  3.5545379  799.0855 1.0006649
## a[19]  2.2332900 0.6190082  1.2577406  3.2185163  766.4475 1.0013800
## a[20]  3.1266430 0.7221298  2.0230753  4.3243643  841.9956 0.9997995
## a[21]  2.2801027 0.6522370  1.2663723  3.3348649  813.9316 1.0018751
## a[22]  2.2895869 0.6600029  1.2643051  3.3892631  801.4775 1.0023355
## a[23]  2.2849838 0.6814935  1.2097083  3.3947283  834.8690 1.0003985
## a[24]  1.7309176 0.6155290  0.7658927  2.7215502  786.5770 1.0017803
## a[25]  1.5735572 0.5933826  0.6095256  2.4863902  552.7301 1.0011617
## a[26]  2.5093843 0.5795819  1.5735990  3.4440949  539.5724 1.0013539
## a[27]  1.2485553 0.6361805  0.2164768  2.2732209  592.7061 1.0019651
## a[28]  1.9851820 0.5758161  1.0505857  2.9164222  543.3111 1.0002431
## a[29]  2.1835304 0.5638864  1.2947110  3.0939143  497.7881 1.0021129
## a[30]  3.1631572 0.5832097  2.2380308  4.0877861  574.0641 1.0019575
## a[31]  1.5290546 0.5803873  0.5865966  2.4591470  537.0308 1.0032454
## a[32]  1.8046969 0.5756750  0.8872703  2.7371756  561.6295 1.0009858
## a[33]  2.9720667 0.6810216  1.9287128  4.0979975  849.4387 1.0011059
## a[34]  2.6899979 0.6308222  1.7150502  3.7246109  770.0205 1.0012664
## a[35]  2.6948666 0.6229762  1.7388234  3.7116233  809.6635 1.0006035
## a[36]  2.2453550 0.5823749  1.3469350  3.2157588  756.1859 1.0005579
## a[37]  1.9748006 0.5833316  1.0684308  2.9365203  800.6572 1.0025257
## a[38]  3.1146221 0.7414964  2.0014693  4.3479534  905.8687 1.0016967
## a[39]  2.4805966 0.6462939  1.4904129  3.5467300  810.7675 1.0020356
## a[40]  2.2007632 0.6075683  1.2434747  3.1790689  845.0417 1.0013170
## a[41]  0.9336001 0.6301408 -0.0823995  1.9012614  615.5299 1.0009445
## a[42]  1.9169885 0.5601446  1.0235017  2.8039514  482.3035 1.0006307
## a[43]  2.0108857 0.5505443  1.1380546  2.8833890  493.1400 1.0009228
## a[44]  2.1055405 0.5515198  1.2231376  2.9918605  475.1545 1.0013119
## a[45]  2.5503584 0.5461231  1.6997374  3.4637265  482.7504 1.0032827
## a[46]  1.5613107 0.5597759  0.6821783  2.4694387  497.9577 1.0006805
## a[47]  3.6368509 0.6001049  2.6988517  4.6140635  559.5970 1.0021509
## a[48]  2.0522233 0.5439392  1.1944986  2.9221185  465.5214 1.0029070
## bp    -2.4240730 0.2870552 -2.8611537 -1.9559846 1183.0723 1.0003551
## s[1]   0.3706942 0.3741063 -0.2418982  0.9615911  409.7419 1.0026962
## s[2]  -0.0520606 0.3754427 -0.6565463  0.5346124  407.8104 1.0021139
## a_bar  2.3602883 0.4087873  1.7165329  3.0622115  321.2622 1.0039686
## sigma  0.7815911 0.1457756  0.5699464  1.0312031 1130.8071 1.0007541
# 5. Predation and Size interaction model
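# Note: unlike the models above, this one uses a non-centered parameterization
# (a_bar + a[tank] * sigma, with standardized a[tank]), and it creates the
# interaction by letting the predation effect vary by size class (bp[size_]).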
m_pred_size_int <- ulam(
  alist(
    S ~ dbinom(N, p),
    logit(p) <- a_bar + a[tank] * sigma + bp[size_] * pred + s[size_], 
    a[tank] ~ dnorm(0, 1),
    bp[size_] ~ dnorm(-0.5, 1),
    s[size_] ~ dnorm(0, 0.5),
    a_bar ~ dnorm(0, 1.5),
    sigma ~ dexp(1)
  ),
  data=dat, chains=4, cores=4, log_lik=TRUE, iter=2e3, refresh = 0
)
## Running MCMC with 4 parallel chains, with 1 thread(s) per chain...
## 
## Chain 2 finished in 1.6 seconds.
## Chain 1 finished in 1.6 seconds.
## Chain 3 finished in 1.6 seconds.
## Chain 4 finished in 1.6 seconds.
## 
## All 4 chains finished successfully.
## Mean chain execution time: 1.6 seconds.
## Total execution time: 1.8 seconds.
precis(m_pred_size_int, depth=2)
##              mean        sd        5.5%       94.5%    n_eff     Rhat4
## a[1]  -0.04558235 0.8237533 -1.31891085  1.28994985 6178.977 0.9996670
## a[2]   0.48951657 0.8871586 -0.89070173  1.94387630 6552.004 0.9994374
## a[3]  -0.97843911 0.7773999 -2.20780470  0.23627257 6108.141 0.9996988
## a[4]   0.50570361 0.8696797 -0.82870051  1.90197165 6156.430 0.9995706
## a[5]  -0.00866625 0.8554489 -1.30325455  1.40522075 6471.590 0.9999319
## a[6]  -0.03437801 0.8444074 -1.35146340  1.34904835 5436.755 0.9994911
## a[7]   0.51074894 0.8804115 -0.86041861  1.92937230 6256.050 0.9996697
## a[8]  -0.01825059 0.8271777 -1.32424855  1.32287285 5488.405 0.9995268
## a[9]  -0.10000057 0.6882378 -1.19042645  1.00552160 4577.043 0.9997530
## a[10]  1.52286431 0.7065178  0.40641643  2.64922055 4340.393 1.0002487
## a[11]  0.83733240 0.6956738 -0.25532473  1.95616545 5259.736 1.0003246
## a[12]  0.52885363 0.6864105 -0.54773274  1.61974495 5181.334 0.9996782
## a[13]  0.21722699 0.6713585 -0.83272071  1.31054090 4739.898 1.0002308
## a[14] -0.43048475 0.6822490 -1.49968150  0.67018624 5215.244 1.0000676
## a[15]  0.97282761 0.7421896 -0.19091590  2.19295065 5166.873 0.9995139
## a[16]  0.94577711 0.7279197 -0.20643406  2.13155685 5752.307 0.9992813
## a[17]  0.46053066 0.7757848 -0.75608616  1.72144380 6139.864 0.9991536
## a[18]  0.05814546 0.7485632 -1.10445280  1.28041500 5193.698 0.9994731
## a[19] -0.26927902 0.7160079 -1.40556735  0.89060887 4267.085 1.0001304
## a[20]  0.89514010 0.8142778 -0.35430160  2.24300715 5222.992 0.9995078
## a[21]  0.10422183 0.7177335 -1.00701125  1.26543010 5942.295 0.9994395
## a[22]  0.09139712 0.7403871 -1.06665815  1.27286460 5701.706 0.9992438
## a[23]  0.09259331 0.7498289 -1.08724585  1.30114070 5250.367 0.9992217
## a[24] -0.56477945 0.6640575 -1.60723860  0.51133511 4595.503 1.0001760
## a[25] -0.87366501 0.5739705 -1.78789345  0.02194784 3853.776 0.9999412
## a[26]  0.37804920 0.5601662 -0.48507243  1.28082620 3289.867 1.0011574
## a[27] -1.29013740 0.6008579 -2.25895085 -0.34915096 3749.294 1.0010282
## a[28] -0.31582025 0.5499574 -1.21808080  0.54864854 3425.679 1.0004495
## a[29] -0.49333816 0.5448982 -1.36655205  0.39055512 4368.731 1.0005684
## a[30]  0.81602688 0.5955228 -0.12200869  1.77391580 3836.072 1.0002884
## a[31] -1.33395240 0.5645813 -2.24086730 -0.46915376 3430.018 1.0001464
## a[32] -0.99732276 0.5645287 -1.91329125 -0.07789645 4178.562 1.0011787
## a[33]  0.66783919 0.7598633 -0.49028828  1.91713920 5017.965 1.0006775
## a[34]  0.31886177 0.7133135 -0.77779171  1.49384805 5501.437 0.9997602
## a[35]  0.29509530 0.7106208 -0.82807937  1.49546825 4891.041 0.9999197
## a[36] -0.28675253 0.6496736 -1.33867685  0.76238493 3989.667 1.0000202
## a[37] -0.25112670 0.6754266 -1.32262270  0.82261674 4905.506 0.9996743
## a[38]  1.09401977 0.7606293 -0.09949578  2.33264805 6184.410 0.9992919
## a[39]  0.35232052 0.7035999 -0.74888745  1.51492460 5171.585 0.9997316
## a[40]  0.04742504 0.6810682 -1.01361075  1.16157635 4791.855 0.9996490
## a[41] -1.70505993 0.5920081 -2.71384500 -0.79763334 4379.503 0.9997439
## a[42] -0.41391513 0.5207465 -1.26228720  0.42411476 3390.141 1.0012806
## a[43] -0.26793528 0.5082112 -1.06298825  0.55641658 3479.123 1.0011509
## a[44] -0.13502259 0.5077844 -0.93356485  0.67947515 3236.010 1.0012053
## a[45] -0.02406770 0.5239939 -0.84975646  0.79900769 3368.124 1.0001111
## a[46] -1.33669075 0.5163852 -2.17413255 -0.50980444 3447.698 1.0007067
## a[47]  1.43988549 0.6068612  0.49217176  2.43587355 3831.279 1.0000103
## a[48] -0.69291974 0.5101369 -1.51485665  0.11401142 3385.198 1.0010003
## bp[1] -1.84926202 0.3624804 -2.42036005 -1.26856965 2045.936 1.0007934
## bp[2] -2.74266349 0.3774810 -3.33444600 -2.13874300 2005.630 1.0039729
## s[1]   0.10539723 0.3977294 -0.52667576  0.73419752 2440.762 1.0004834
## s[2]   0.14809578 0.3975712 -0.48490450  0.77038805 2416.686 0.9996869
## a_bar  2.31268918 0.4194121  1.65581175  2.97497200 1987.408 0.9997796
## sigma  0.75506322 0.1486131  0.53622742  1.00842880 1774.730 1.0013283
# Plot sigma
plot(coeftab(m_tank, m_pred, m_size, m_pred_size, m_pred_size_int),
  pars = "sigma",
  labels = c("Tank", "Predation", "Size", "Predation + Size", "Predation + Size + Interaction")
)

# The inferred variation across tanks, sigma, is essentially unchanged when the size treatment is added, but it drops sharply in every model that includes predation. From this we can conclude that the predation variable explains much of the variation across tanks: when predation is left out of the model, that variation has nowhere to go but into the tank intercepts, hence the larger sigma in the models without predation.
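# The full posterior distributions of sigma can also be overlaid directly
# (a sketch; colors are arbitrary choices):
post_sigma_tank <- extract.samples(m_tank)$sigma
post_sigma_pred <- extract.samples(m_pred)$sigma
post_sigma_size <- extract.samples(m_size)$sigma
post_sigma_ps <- extract.samples(m_pred_size)$sigma
post_sigma_psi <- extract.samples(m_pred_size_int)$sigma
dens(post_sigma_tank, xlim=c(0, 2.5), xlab="sigma (variation across tanks)")
dens(post_sigma_pred, col="red", add=TRUE)
dens(post_sigma_size, col="blue", add=TRUE)
dens(post_sigma_ps, col="purple", add=TRUE)
dens(post_sigma_psi, col="orange", add=TRUE)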

13-2. Compare the models you fit just above, using WAIC. Can you reconcile the differences in WAIC with the posterior distributions of the models? Show the WAIC table.

compare(m_tank, m_pred, m_size, m_pred_size, m_pred_size_int)
##                     WAIC       SE     dWAIC      dSE    pWAIC    weight
## m_pred          199.0088 9.004974 0.0000000       NA 19.20719 0.2981636
## m_pred_size_int 199.4305 8.975110 0.4217785 3.026553 19.01499 0.2414719
## m_pred_size     200.0171 8.676370 1.0083498 1.800900 19.15356 0.1800919
## m_size          200.0466 7.127134 1.0378739 5.650678 20.86252 0.1774529
## m_tank          201.1381 7.266008 2.1293316 5.469458 21.38318 0.1028197
# The WAIC values of the different models are quite similar, and the differences are small relative to their standard errors, so WAIC alone barely distinguishes them. The posteriors (see the results in Question 13-1) reconcile this: the estimates for s are close to 0 while bp is strongly negative, so the models containing bp move variation out of the tank intercepts and into the predation effect, substantially reducing sigma, yet overall predictive accuracy, and therefore WAIC, stays about the same.
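# A quick check of this claim from the combined model's posterior: bp sits far
# below zero, while both size effects straddle zero (a sketch):
post <- extract.samples(m_pred_size)
dens(post$bp, xlim=c(-3.5, 1.5), xlab="posterior treatment effects")
dens(post$s[, 1], col="blue", add=TRUE)   # small
dens(post$s[, 2], col="red", add=TRUE)    # big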

13-3. Re-estimate the basic Reed frog varying intercept model, but now using a Cauchy distribution in place of the Gaussian distribution for the varying intercepts. That is, fit this model: \[\begin{align} s_i &\sim \text{Binomial}(n_i, p_i) \\ \text{logit}(p_i) &= \alpha_{tank[i]} \\ \alpha_{tank} &\sim \text{Cauchy}(\alpha, \sigma) \\ \alpha &\sim \text{Normal}(0, 1) \\ \sigma &\sim \text{Exponential}(1) \end{align}\]

(You are likely to see many divergent transitions for this model. Can you figure out why? Can you fix them?) Plot and compare the posterior means of the intercepts, α_tank, to the posterior means produced in the chapter, using the customary Gaussian prior. Can you explain the pattern of differences? Take note of any change in the mean α as well.

# Model Tank - Cauchy
m_tank_cauchy <- ulam(
  alist(
    S ~ dbinom(N, p),
    logit(p) <- a[tank],
    a[tank] ~ dcauchy(a_bar, sigma),
    a_bar ~ dnorm(0, 1.5),
    sigma ~ dexp(1)
  ),
  data=dat, chains=4, cores=4, log_lik=TRUE, iter=2e3, refresh = 0
)
## Running MCMC with 4 parallel chains, with 1 thread(s) per chain...
## 
## Chain 3 finished in 1.5 seconds.
## Chain 2 finished in 1.9 seconds.
## Chain 1 finished in 6.5 seconds.
## Chain 4 finished in 8.5 seconds.
## 
## All 4 chains finished successfully.
## Mean chain execution time: 4.6 seconds.
## Total execution time: 8.6 seconds.
precis(m_tank_cauchy, depth=2)
##              mean         sd       5.5%        94.5%     n_eff     Rhat4
## a[1]   2.02568656  0.8378825  0.8838668  3.526301550 2565.1777 1.0003583
## a[2]   6.10235553  7.2729070  1.5032074 20.719784000  179.4416 1.0172887
## a[3]   1.09644311  0.6054865  0.1248227  2.061132200 3534.0681 0.9999260
## a[4]   6.22217018 14.0302003  1.5262245 15.904816500  163.1774 1.0270941
## a[5]   2.02572828  0.8374322  0.9183321  3.502612800 2403.6957 0.9994277
## a[6]   2.00726458  0.8212668  0.8912029  3.400098000 2441.9602 1.0022250
## a[7]   6.01685710  9.2609949  1.5307707 17.258604500  107.6371 1.0373114
## a[8]   2.05640903  0.9043436  0.9040191  3.618700900 1512.2795 1.0004204
## a[9]  -0.08263406  0.6720609 -1.1817087  0.952750900 4577.0730 0.9991966
## a[10]  2.04562378  0.9021646  0.8805592  3.584407450 2127.3854 1.0000050
## a[11]  1.11796444  0.6192624  0.1232217  2.111774100 4969.2758 1.0000928
## a[12]  0.74864887  0.6325190 -0.2714654  1.756584550 4210.3969 0.9999640
## a[13]  1.12796630  0.6154092  0.1521952  2.100475550 3280.8015 1.0000894
## a[14]  0.33125904  0.6360324 -0.7279250  1.322073450 4364.6488 1.0002470
## a[15]  2.04053138  0.8495517  0.8945870  3.486999700 2925.9397 1.0003648
## a[16]  2.03109618  0.8814902  0.8461728  3.543476600 2393.8040 0.9994296
## a[17]  2.86667122  0.9401627  1.6243573  4.528971150 1965.1210 0.9996627
## a[18]  2.24811172  0.6255684  1.3747095  3.339585400 3635.4770 0.9997852
## a[19]  1.92651197  0.5435114  1.1428343  2.854831650 3235.4602 1.0006850
## a[20] 11.57091404 25.3327700  2.3973349 40.180899000  196.9126 1.0196939
## a[21]  2.23562452  0.6406369  1.3316350  3.328059550 2434.3586 1.0001544
## a[22]  2.25283472  0.6495868  1.3636845  3.376808250 3407.6663 1.0014742
## a[23]  2.26080812  0.6477639  1.3675586  3.327975200 2656.9716 1.0012332
## a[24]  1.66857726  0.4911464  0.9121136  2.498675950 3138.3467 0.9995201
## a[25] -1.05566563  0.4754767 -1.8593488 -0.331787235 3479.5123 0.9992921
## a[26]  0.23078488  0.3939441 -0.3889121  0.871576635 4390.8571 0.9998855
## a[27] -1.57089235  0.5524814 -2.4591149 -0.745154395 4441.7189 1.0004450
## a[28] -0.44934884  0.4242030 -1.1368448  0.218748940 3915.5602 0.9996697
## a[29]  0.22816558  0.4150760 -0.4285624  0.880927170 4211.9333 0.9994374
## a[30]  1.45608074  0.4570415  0.7704953  2.204585150 3251.4561 0.9995238
## a[31] -0.64443836  0.4434930 -1.3552727  0.032807657 5174.2196 0.9995139
## a[32] -0.28655262  0.4160372 -0.9498992  0.371641100 4458.8939 0.9991527
## a[33]  3.23657521  0.9981780  1.9510471  5.035933500 2276.9565 1.0017811
## a[34]  2.58204060  0.6598603  1.6610506  3.731071300 3626.4598 0.9996555
## a[35]  2.60392701  0.6776408  1.6969992  3.772967200 2878.0198 1.0005419
## a[36]  1.98326417  0.4896585  1.2566502  2.831133800 4036.8226 0.9999973
## a[37]  1.98671282  0.4798693  1.2782791  2.784532600 2807.4103 1.0003826
## a[38] 14.06621667 26.2718033  2.8774306 49.688509000  123.3647 1.0637737
## a[39]  2.59799914  0.6763839  1.6647233  3.739364750 2469.8706 1.0018644
## a[40]  2.23928133  0.5390860  1.4753054  3.162879150 3458.9111 1.0010259
## a[41] -2.00261546  0.5504244 -2.9534316 -1.173267600 3910.1130 0.9993222
## a[42] -0.57288130  0.3641540 -1.1804720 -0.009068704 3653.3058 0.9996335
## a[43] -0.44263677  0.3656951 -1.0206369  0.143262730 5501.1792 0.9992620
## a[44] -0.31463624  0.3429372 -0.8667499  0.221470125 4289.7080 0.9998741
## a[45]  0.65230444  0.3581451  0.1012972  1.223588450 3523.3577 0.9997113
## a[46] -0.55992025  0.3627525 -1.1460667  0.025235987 4506.2855 1.0000742
## a[47]  1.98007895  0.4808581  1.2840590  2.816681550 4035.1786 0.9997970
## a[48]  0.04945931  0.3537247 -0.5202884  0.598482120 4513.8991 0.9999397
## a_bar  1.47641371  0.2993269  0.9765018  1.940064400 2210.1930 0.9999947
## sigma  1.01490540  0.2246976  0.6955057  1.389634850 2428.2744 0.9999611
# Extract results
a_tank <- apply(extract.samples(m_tank)$a, 2, mean)
a_tank_cauchy <- apply(extract.samples(m_tank_cauchy)$a, 2, mean)

# Plot
plot(a_tank, a_tank_cauchy, pch=15, col=rangi2,
     xlab="Intercept (Gaussian Prior)", ylab="Intercept (Cauchy Prior)")
abline(a=0, b=1, lty=4)

# At high values of α_tank the two sets of estimates diverge, with the Cauchy-prior means far exceeding the Gaussian-prior means. These are the tanks in which every tadpole survived, so the data place no upper bound on the intercept; the Cauchy prior's very thick tails then permit extreme values (and are also the source of the many divergent transitions), while the thinner-tailed Gaussian prior shrinks these intercepts back toward the mean. At moderate values of α_tank the points fall on the identity line, meaning the two priors produce essentially the same estimates. Note also that the mean α (a_bar) rises slightly under the Cauchy prior. One way to reduce the divergent transitions is sketched below.
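# A possible fix (a sketch, not re-run here): put the heavy-tailed distribution
# on standardized deviates and shift/scale them inside the linear model, and
# raise adapt_delta so Stan takes smaller steps.
m_tank_cauchy_nc <- ulam(
  alist(
    S ~ dbinom(N, p),
    logit(p) <- a_bar + sigma * z[tank],
    z[tank] ~ dcauchy(0, 1),
    a_bar ~ dnorm(0, 1.5),
    sigma ~ dexp(1)
  ),
  data=dat, chains=4, cores=4, log_lik=TRUE, iter=2e3,
  control=list(adapt_delta=0.99), refresh=0
)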

13-4. Now use a Student-t distribution with ν = 2 for the intercepts: \[\begin{align} \alpha_{tank} &\sim \text{Student}(2, \alpha, \sigma) \end{align}\]

Refer back to the Student-t example in Chapter 7 (page 234), if necessary. Plot and compare the resulting posterior to both the original model and the Cauchy model in 13-3. Can you explain the differences and similarities in shrinkage in terms of the properties of these distributions?

# Model Tank - Student t
m_tank_t <- ulam(
  alist(
    S ~ dbinom(N, p),
    logit(p) <- a[tank],
    a[tank] ~ dstudent(2, a_bar, sigma),
    a_bar ~ dnorm(0, 1.5),
    sigma ~ dexp(1)
  ),
  data=dat, chains=4, cores=4, log_lik=TRUE, iter=2e3, refresh = 0
)
## Running MCMC with 4 parallel chains, with 1 thread(s) per chain...
## 
## Chain 1 finished in 1.2 seconds.
## Chain 4 finished in 1.5 seconds.
## Chain 3 finished in 1.9 seconds.
## Chain 2 finished in 2.0 seconds.
## 
## All 4 chains finished successfully.
## Mean chain execution time: 1.6 seconds.
## Total execution time: 2.2 seconds.
precis(m_tank_t, depth=2)
##              mean         sd         5.5%         94.5%      n_eff     Rhat4
## a[1]   2.09150853  0.9345374  0.856243205  3.6845833500 2228.64724 1.0013849
## a[2]   5.76822527 14.8210696  1.458777650  9.3626861000   68.12363 1.0584479
## a[3]   1.05232466  0.6318862  0.057910918  2.0734477000 7241.45814 0.9993305
## a[4]   4.58154940  5.4822759  1.496399850 10.4523550000  124.43248 1.0367923
## a[5]   2.08126751  0.8713159  0.850826380  3.6031043000 3499.73095 1.0000731
## a[6]   2.07767009  0.8774940  0.861342590  3.5870944000 4394.03370 1.0001579
## a[7]   3.78451554  2.4985720  1.439870100  8.2143411500  833.27222 1.0016964
## a[8]   2.06272354  0.8549841  0.886407250  3.4961419000 4178.22355 0.9992976
## a[9]  -0.11632287  0.6142491 -1.151102650  0.8427226000 6666.68790 0.9991479
## a[10]  2.07079363  0.8824222  0.826818765  3.5997955000 4369.57804 0.9997792
## a[11]  1.06810263  0.6486338  0.070873546  2.1248262500 6685.00363 0.9991119
## a[12]  0.67819980  0.6184712 -0.300067995  1.6602384500 7186.09653 0.9995956
## a[13]  1.07321993  0.6729731  0.006994275  2.1524767500 6176.49906 0.9996142
## a[14]  0.29375161  0.6240368 -0.728827755  1.2769200000 7120.92468 0.9996826
## a[15]  2.06537701  0.8495329  0.871700880  3.5368504500 4478.09040 0.9994473
## a[16]  2.10551641  0.9279282  0.821194690  3.6658224000 4326.03023 0.9999295
## a[17]  2.89516138  0.8994235  1.691480800  4.4621820000 3625.17091 1.0000426
## a[18]  2.32515846  0.6564663  1.402969750  3.4606031000 5192.20329 0.9998618
## a[19]  1.96092314  0.5576250  1.148756300  2.8984570500 5352.16719 0.9997376
## a[20]  5.30874364  3.8091180  2.326102850 11.4720605000  660.61185 1.0080098
## a[21]  2.32241244  0.6560918  1.382704650  3.4404330000 4291.85796 1.0000099
## a[22]  2.32298393  0.6741721  1.370756750  3.5117694000 4725.97662 0.9997389
## a[23]  2.33223780  0.6868904  1.381556350  3.5694238500 3695.96317 0.9997204
## a[24]  1.68690934  0.5117059  0.936142160  2.5518297000 4750.08144 0.9996304
## a[25] -1.03272396  0.4714164 -1.793590800 -0.3113737600 6170.54916 0.9992295
## a[26]  0.20692026  0.3989433 -0.441416885  0.8452835500 8303.69159 0.9998988
## a[27] -1.54739045  0.5516760 -2.476240400 -0.7388287850 5230.36747 0.9996999
## a[28] -0.44748933  0.4284915 -1.144265400  0.2253696600 6025.29928 0.9997091
## a[29]  0.21444706  0.4044909 -0.435484140  0.8716280700 5613.81992 1.0000236
## a[30]  1.44429409  0.4642951  0.746425350  2.2017596000 6626.39109 0.9996551
## a[31] -0.62206994  0.4229769 -1.329731250  0.0403708870 6432.95130 0.9994177
## a[32] -0.27842035  0.4095502 -0.947020850  0.3634964800 6924.13275 0.9997188
## a[33]  3.23057111  0.8749765  2.051485250  4.7322072500 3105.83774 1.0001723
## a[34]  2.64323820  0.6549529  1.714496850  3.7844145500 4904.90549 0.9998367
## a[35]  2.65898453  0.6769686  1.704746500  3.8497241500 4270.26833 0.9995628
## a[36]  2.01674306  0.5113093  1.243564700  2.8770284000 5394.67303 0.9999821
## a[37]  2.02058867  0.5027472  1.278523500  2.8571582500 5845.15384 0.9993444
## a[38]  5.76185203  3.8203207  2.645407150 12.1541375000  682.77181 1.0021268
## a[39]  2.66021164  0.6629003  1.725248650  3.8188357500 4405.86766 0.9996356
## a[40]  2.28423857  0.5543928  1.486026450  3.1862087500 5139.01988 0.9997224
## a[41] -1.95417463  0.5157233 -2.811182550 -1.1807433500 5746.21779 1.0007032
## a[42] -0.56087501  0.3627200 -1.146465200  0.0003907062 6057.28897 0.9996460
## a[43] -0.43727383  0.3510402 -0.999155915  0.1130930250 7457.20334 0.9997770
## a[44] -0.31681665  0.3484161 -0.881773250  0.2308227400 5922.13268 0.9999481
## a[45]  0.61756342  0.3482917  0.057650770  1.1732161500 6713.71559 0.9992683
## a[46] -0.56627727  0.3585904 -1.138170350  0.0005000623 5484.20026 0.9992776
## a[47]  2.01548067  0.4908443  1.286207600  2.8295366000 6453.60893 0.9991163
## a[48]  0.03235764  0.3362393 -0.509593300  0.5587337300 5967.83942 0.9996527
## a_bar  1.40045324  0.2841777  0.942461390  1.8561069500 4938.55566 0.9999837
## sigma  1.26471294  0.2186012  0.953704930  1.6375625500 3119.19933 1.0021393
# Extract results
a_tank_t <- apply(extract.samples(m_tank_t)$a, 2, mean)

# Compare results
compare(m_tank, m_tank_cauchy, m_tank_t)
##                   WAIC       SE    dWAIC      dSE    pWAIC    weight
## m_tank        201.1381 7.266008 0.000000       NA 21.38318 0.5295347
## m_tank_t      202.3646 7.855638 1.226490 1.456044 22.45489 0.2867911
## m_tank_cauchy 203.2558 8.518931 2.117671 2.525595 23.05350 0.1836742
# Plot
plot(a_tank, a_tank_t, pch=15, col=rangi2,
     xlab="Intercept (Gaussian Prior)", ylab="Intercept (Student-t Prior)")
abline(a=0, b=1, lty=4)

plot(a_tank_cauchy, a_tank_t, pch=15, col=rangi2,
     xlab="Intercept (Cauchy Prior)", ylab="Intercept (Student-t Prior)")
abline(a=0, b=1, lty=4)

# Comparing the α_tank values between the Gaussian and Student-t priors shows a pattern similar to the Gaussian-versus-Cauchy comparison: the estimates drift apart at high values of α_tank, because the Student-t distribution has thicker tails than the Gaussian and therefore shrinks extreme intercepts less. Comparing the Cauchy and Student-t priors, however, the Student-t estimates are lower at high α_tank: the Student-t tail, while thicker than the Gaussian's, is thinner than the Cauchy's, so it pulls the extreme tanks in more. Overall, the WAIC values of the three models are quite similar, with the Gaussian-prior model having the lowest WAIC, and hence being (weakly) preferred.
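# The tail claims can be seen directly by overlaying the three densities
# (a quick sketch):
curve(dnorm(x, 0, 1), from=-6, to=6, ylab="density")
curve(dstudent(x, 2, 0, 1), add=TRUE, lty=2)
curve(dcauchy(x, 0, 1), add=TRUE, lty=3)
legend("topright", lty=1:3,
       legend=c("Normal(0, 1)", "Student-t(2, 0, 1)", "Cauchy(0, 1)"))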

13-5. Sometimes the prior and the data (through the likelihood) are in conflict, because they concentrate around different regions of parameter space. What happens in these cases depends a lot upon the shape of the tails of the distributions. Likewise, the tails of the distributions strongly influence whether outliers are shrunk towards the mean or not. I want you to consider four different models to fit to one observation at y = 0; this is your data, so do not use any other data set. The models differ only in the distributions assigned to the likelihood and prior. Here are the four models:

\[\begin{align} \text{Model NN:} \quad y &\sim \text{Normal}(\mu, 1) & \text{Model TN:} \quad y &\sim \text{Student}(2, \mu, 1) \\ \mu &\sim \text{Normal}(10, 1) & \mu &\sim \text{Normal}(10, 1) \\ \text{Model NT:} \quad y &\sim \text{Normal}(\mu, 1) & \text{Model TT:} \quad y &\sim \text{Student}(2, \mu, 1) \\ \mu &\sim \text{Student}(2, 10, 1) & \mu &\sim \text{Student}(2, 10, 1) \end{align}\]

Estimate and plot the posterior distributions against the likelihoods for these models and compare them. Can you explain the results, using the properties of the distributions?

# Model NN
m_nn <- ulam(
  alist(
    y ~ normal(mu, 1),
    mu ~ normal(10, 1)
  ), data=list(y=0), chains=4, refresh=0  # fit to the single observation y = 0
)
## Running MCMC with 4 sequential chains, with 1 thread(s) per chain...
## 
## Chain 1 finished in 0.0 seconds.
## Chain 2 finished in 0.0 seconds.
## Chain 3 finished in 0.0 seconds.
## Chain 4 finished in 0.0 seconds.
## 
## All 4 chains finished successfully.
## Mean chain execution time: 0.0 seconds.
## Total execution time: 0.5 seconds.
precis(m_nn)
##        mean       sd     5.5%    94.5%    n_eff     Rhat4
## y  9.970932 1.345790 7.700927 12.10745 832.0895 0.9992954
## mu 9.973366 0.952106 8.492834 11.48423 780.0382 1.0002327
# Model NT
m_nt <- ulam(
  alist(
    y ~ normal(mu, 1),
    mu ~ dstudent(2, 10, 1)
  ), data=list(y=0), chains=4, refresh=0
)
## Running MCMC with 4 sequential chains, with 1 thread(s) per chain...
## 
## Chain 1 finished in 0.0 seconds.
## Chain 2 finished in 0.0 seconds.
## Chain 3 finished in 0.0 seconds.
## Chain 4 finished in 0.0 seconds.
## 
## All 4 chains finished successfully.
## Mean chain execution time: 0.0 seconds.
## Total execution time: 0.5 seconds.
precis(m_nt)
##        mean       sd     5.5%    94.5%    n_eff    Rhat4
## y  10.06446 2.547552 6.821473 13.19516 336.3347 1.009329
## mu 10.05594 2.368941 7.314577 12.78223 313.4941 1.012113
# Model TN
m_tn <- ulam(
  alist(
    y ~ dstudent(2, mu, 1),
    mu ~ normal(10, 1)
  ), data=list(y=0), chains=4, refresh=0
)
## Running MCMC with 4 sequential chains, with 1 thread(s) per chain...
## 
## Chain 1 finished in 0.0 seconds.
## Chain 2 finished in 0.0 seconds.
## Chain 3 finished in 0.0 seconds.
## Chain 4 finished in 0.0 seconds.
## 
## All 4 chains finished successfully.
## Mean chain execution time: 0.0 seconds.
## Total execution time: 0.5 seconds.
precis(m_tn)
##        mean       sd     5.5%    94.5%    n_eff    Rhat4
## y  10.15171 2.447250 7.119308 13.25176 341.4774 1.010469
## mu 10.03503 0.984235 8.552783 11.71107 644.8556 1.001575
# Model TT
m_tt <- ulam(
  alist(
    y ~ dstudent(2, mu, 1),
    mu ~ dstudent(2, 10, 1)
  ), data=list(y=0), chains=4, refresh=0
)
## Running MCMC with 4 sequential chains, with 1 thread(s) per chain...
## 
## Chain 1 finished in 0.0 seconds.
## Chain 2 finished in 0.0 seconds.
## Chain 3 finished in 0.0 seconds.
## Chain 4 finished in 0.0 seconds.
## 
## All 4 chains finished successfully.
## Mean chain execution time: 0.0 seconds.
## Total execution time: 0.5 seconds.
precis(m_tt)
##        mean       sd     5.5%    94.5%    n_eff    Rhat4
## y  9.776656 3.487327 4.834618 14.48930 489.4093 1.006145
## mu 9.829930 2.513559 6.957895 12.47837 375.0271 1.008504
# Extract results
mu_nn <- extract.samples(m_nn)$mu
mu_nt <- extract.samples(m_nt)$mu
mu_tn <- extract.samples(m_tn)$mu
mu_tt <- extract.samples(m_tt)$mu

# Plot (shared axis limits so the data region near 0 and the prior region near 10 are both visible)
dens(mu_nn, col="blue", xlim=c(-5, 15), xlab="mu")
dens(mu_nt, col="purple", add=TRUE)
dens(mu_tn, col="red", add=TRUE)
dens(mu_tt, col="orange", add=TRUE)
legend("topright",
       col=c("blue", "purple", "red", "orange"),
       lty=1,
       legend=c("Model NN", "Model NT", "Model TN", "Model TT"))

# These four posteriors illustrate prior-data conflict: the single observation sits at y = 0 while the priors concentrate near 10, and the tails decide which side wins. In model NN both distributions are thin-tailed, so neither can be ignored and the posterior compromises, concentrating roughly halfway between 0 and 10. In model NT the thin-tailed Normal likelihood dominates the thick-tailed Student-t prior, and the posterior sits near the data at 0, because the prior's thick tails make a mu far from 10 only mildly surprising. Model TN is the mirror image: the thick-tailed Student-t likelihood treats y = 0 as an unsurprising outlier, so the thin-tailed Normal prior dominates and the posterior stays near 10. In model TT both distributions are thick-tailed, neither dominates, and the posterior can even be bimodal, with mass near both 0 and 10.