ld_stan_comp.R

shige — Dec 11, 2013, 2:38 PM

library(LaplacesDemon)
Loading required package: parallel

##############################  Demon Data  ###############################
data(demonsnacks)
y <- log(demonsnacks$Calories)
X <- cbind(1, as.matrix(log(demonsnacks[,c(1,4,10)]+1)))
J <- ncol(X)
for (j in 2:J) X[,j] <- CenterScale(X[,j])
mon.names <- "LP"
parm.names <- as.parm.names(list(beta=rep(0,J), sigma=0))
pos.beta <- grep("beta", parm.names)
pos.sigma <- grep("sigma", parm.names)
PGF <- function(Data) return(c(rnormv(Data$J,0,10), rhalfcauchy(1,5)))
MyData <- list(J=J, PGF=PGF, X=X, mon.names=mon.names,
               parm.names=parm.names, pos.beta=pos.beta, pos.sigma=pos.sigma, y=y)
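
# A quick structural check (a sketch, not part of the original run): the
# parameter-generating function (PGF) should return one value per entry of
# parm.names, i.e. J regression coefficients plus sigma.
stopifnot(length(MyData$PGF(MyData)) == length(MyData$parm.names))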

##########################  Model Specification  ##########################
Model <- function(parm, Data)
{
  ### Parameters
  beta <- parm[Data$pos.beta]
  sigma <- interval(parm[Data$pos.sigma], 1e-100, Inf)
  parm[Data$pos.sigma] <- sigma
  ### Log of Prior Densities (dnormv takes a variance, dhalfcauchy a scale)
  beta.prior <- sum(dnormv(beta, 0, 1000, log=TRUE))
  sigma.prior <- dhalfcauchy(sigma, 25, log=TRUE)
  ### Log-Likelihood
  mu <- tcrossprod(Data$X, t(beta))
  LL <- sum(dnorm(Data$y, mu, sigma, log=TRUE))
  ### Log-Posterior
  LP <- LL + beta.prior + sigma.prior
  Modelout <- list(LP=LP, Dev=-2*LL, Monitor=LP,
                   yhat=rnorm(length(mu), mu, sigma), parm=parm)
  return(Modelout)
}
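
# A minimal sanity check of the model specification (a sketch, not part of the
# original run): Model() returns a list with LP, Dev, Monitor, yhat, and parm,
# and the log-posterior should be finite at a simple parameter vector such as
# beta = 0, sigma = 1.
stopifnot(is.finite(Model(c(rep(0, J), 1), MyData)$LP))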

set.seed(666)

############################  Initial Values  #############################
Initial.Values <- GIV(Model, MyData, PGF=TRUE)

########################  Hit-And-Run Metropolis  #########################
system.time(Fit <- LaplacesDemon(Model, Data=MyData, Initial.Values,
                     Covar=NULL, Iterations=32500, Status=5000, Thinning=100,
                     Algorithm="HARM", Specs=NULL))

Laplace's Demon was called on Wed Dec 11 14:38:06 2013

Performing initial checks...
Algorithm: Hit-And-Run Metropolis 

Laplace's Demon is beginning to update...
Iteration: 5000,   Proposal: Multivariate
Iteration: 10000,   Proposal: Multivariate
Iteration: 15000,   Proposal: Multivariate
Iteration: 20000,   Proposal: Multivariate
Iteration: 25000,   Proposal: Multivariate
Iteration: 30000,   Proposal: Multivariate

Assessing Stationarity
Assessing Thinning and ESS
Creating Summaries
Estimating Log of the Marginal Likelihood
Creating Output

Laplace's Demon has finished.
   user  system elapsed 
  2.624   0.008   2.639 
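
# Optional diagnostics on the HARM fit (a sketch, not run in this transcript):
# the demonoid object stores the thinned samples in Fit$Posterior1 and the
# summaries printed by Consort() below in Fit$Summary1; diagnostic plots for
# each parameter are available via the plot method, e.g.
# plot(Fit, BurnIn=0, MyData, PDF=FALSE, Parms=NULL)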

#######################  Elliptical Slice Sampling  #######################
# Fit <- LaplacesDemon(Model, Data=MyData, Initial.Values,
#                      Covar=NULL, Iterations=19000, Status=100, Thinning=1,
#                      Algorithm="ESS", Specs=list(B=NULL))

Consort(Fit)

#############################################################
# Consort with Laplace's Demon                              #
#############################################################
Call:
LaplacesDemon(Model = Model, Data = MyData, Initial.Values = Initial.Values, 
    Covar = NULL, Iterations = 32500, Status = 5000, Thinning = 100, 
    Algorithm = "HARM", Specs = NULL)

Acceptance Rate: 0.2388
Adaptive: 32501
Algorithm: Hit-And-Run Metropolis
Covariance Matrix: (NOT SHOWN HERE; diagonal shown instead)
beta[1] beta[2] beta[3] beta[4]   sigma 
0.03317 0.21115 0.13650 0.21313 0.01703 

Covariance (Diagonal) History: (NOT SHOWN HERE)
Deviance Information Criterion (DIC):
       All Stationary
Dbar 83.17      83.17
pD   16.21      16.21
DIC  99.39      99.39

Delayed Rejection (DR): 0
Initial Values:
[1]  2.3822  6.3699 -1.1230  6.4136  0.1046

Iterations: 32500
Log(Marginal Likelihood): -43.04
Minutes of run-time: 0.04
Model: (NOT SHOWN HERE)
Monitor: (NOT SHOWN HERE)
Parameters (Number of): 5
Periodicity: 32501
Posterior1: (NOT SHOWN HERE)
Posterior2: (NOT SHOWN HERE)
Recommended Burn-In of Thinned Samples: 0
Recommended Burn-In of Un-thinned Samples: 0
Recommended Thinning: 200
Status is displayed every 5000 iterations
Summary1: (SHOWN BELOW)
Summary2: (SHOWN BELOW)
Thinned Samples: 325
Thinning: 100


Summary of All Samples
             Mean     SD     MCSE   ESS        LB   Median       UB
beta[1]    5.0422 0.1072 0.006308 325.0   4.83897   5.0475   5.2510
beta[2]    0.5658 0.3289 0.023678 263.6   0.04243   0.5517   1.1065
beta[3]    1.1857 0.3472 0.020746 325.0   0.59719   1.1673   1.8663
beta[4]    0.8915 0.3464 0.023124 291.8   0.22450   0.8818   1.5506
sigma      0.7194 0.1262 0.007116 325.0   0.56602   0.7036   0.9397
Deviance  83.1746 5.6940 0.427007 248.4  77.99458  82.0994  92.0249
LP       -62.7640 2.8478 0.213554 248.5 -67.18934 -62.2264 -60.1735


Summary of Stationary Samples
             Mean     SD     MCSE   ESS        LB   Median       UB
beta[1]    5.0422 0.1072 0.006308 325.0   4.83897   5.0475   5.2510
beta[2]    0.5658 0.3289 0.023678 263.6   0.04243   0.5517   1.1065
beta[3]    1.1857 0.3472 0.020746 325.0   0.59719   1.1673   1.8663
beta[4]    0.8915 0.3464 0.023124 291.8   0.22450   0.8818   1.5506
sigma      0.7194 0.1262 0.007116 325.0   0.56602   0.7036   0.9397
Deviance  83.1746 5.6940 0.427007 248.4  77.99458  82.0994  92.0249
LP       -62.7640 2.8478 0.213554 248.5 -67.18934 -62.2264 -60.1735

Demonic Suggestion

Due to the combination of the following conditions,

1. Hit-And-Run Metropolis
2. The acceptance rate (0.2388) is within the interval [0.15,0.5].
3. At least one target MCSE is >= 6.27% of its marginal posterior
   standard deviation.
4. Each target distribution has an effective sample size (ESS)
   of at least 100.
5. Each target distribution became stationary by
   1 iteration.

Laplace's Demon has not been appeased, and suggests
copy/pasting the following R code into the R console,
and running it.

Initial.Values <- as.initial.values(Fit)
Fit <- LaplacesDemon(Model, Data=MyData, Initial.Values,
     Covar=NULL, Iterations=65000, Status=254, Thinning=200,
     Algorithm="HARM", Specs=list(alpha.star=0.234, B=NULL))

Laplace's Demon is finished consorting.

########################  RStan  #########################
library(rstan)
Loading required package: Rcpp
Loading required package: inline

Attaching package: 'inline'

The following object is masked from 'package:Rcpp':

    registerPlugin

rstan (Version 2.0.1, packaged: 2013-10-25 13:14:25 UTC, GitRev: 1a89615fac00)

N <- length(y)

stan_data <- list(N = N, J = J, y = y, x = X)
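
# A minimal consistency check (a sketch, not part of the original run): the
# data list must match the dimensions declared in the data block of the Stan
# program below.
stopifnot(nrow(X) == N, ncol(X) == J, length(y) == N)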

rs_code <- '
  data {
    int<lower=1> N;
    int<lower=1> J;
    matrix[N,J] x;
    vector[N] y;
  }
  parameters {
    vector[J] beta;
    real<lower=0> sigma;
  }
  model {
    //sigma ~ cauchy(0, 2.5);
    //beta ~ normal(0, 1000);
    y ~ normal(x * beta, sigma);
  }
'
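
# A variant of the Stan program (a sketch, stored under the new name
# rs_code_priors and not used in the runs below) with explicit priors closer
# to the LaplacesDemon specification above: dnormv() is parameterized by
# variance, so normal(0, 1000) there corresponds to normal(0, sqrt(1000)) in
# Stan, and the <lower=0> bound on sigma makes cauchy(0, 25) a half-Cauchy.
rs_code_priors <- '
  data {
    int<lower=1> N;
    int<lower=1> J;
    matrix[N,J] x;
    vector[N] y;
  }
  parameters {
    vector[J] beta;
    real<lower=0> sigma;
  }
  model {
    beta ~ normal(0, sqrt(1000.0));
    sigma ~ cauchy(0, 25);
    y ~ normal(x * beta, sigma);
  }
'
# stan(model_code = rs_code_priors, data = stan_data, chains = 1) would
# compile and sample from this variant.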

stan_mod <- stan(model_code = rs_code, data = stan_data, chains = 1)

TRANSLATING MODEL 'rs_code' FROM Stan CODE TO C++ CODE NOW.
COMPILING THE C++ CODE FOR MODEL 'rs_code' NOW.
SAMPLING FOR MODEL 'rs_code' NOW (CHAIN 1).

Iteration:    1 / 2000 [  0%]  (Warmup)
Iteration:  200 / 2000 [ 10%]  (Warmup)
Iteration:  400 / 2000 [ 20%]  (Warmup)
Iteration:  600 / 2000 [ 30%]  (Warmup)
Iteration:  800 / 2000 [ 40%]  (Warmup)
Iteration: 1000 / 2000 [ 50%]  (Warmup)
Iteration: 1200 / 2000 [ 60%]  (Sampling)
Iteration: 1400 / 2000 [ 70%]  (Sampling)
Iteration: 1600 / 2000 [ 80%]  (Sampling)
Iteration: 1800 / 2000 [ 90%]  (Sampling)
Iteration: 2000 / 2000 [100%]  (Sampling)
Elapsed Time: 0.05 seconds (Warm-up)
              0.03 seconds (Sampling)
              0.08 seconds (Total)

system.time(stan_fit <- stan(fit = stan_mod, data = stan_data, iter = 2000, verbose = FALSE))
SAMPLING FOR MODEL 'rs_code' NOW (CHAIN 1).

Iteration:    1 / 2000 [  0%]  (Warmup)
Iteration:  200 / 2000 [ 10%]  (Warmup)
Iteration:  400 / 2000 [ 20%]  (Warmup)
Iteration:  600 / 2000 [ 30%]  (Warmup)
Iteration:  800 / 2000 [ 40%]  (Warmup)
Iteration: 1000 / 2000 [ 50%]  (Warmup)
Iteration: 1200 / 2000 [ 60%]  (Sampling)
Iteration: 1400 / 2000 [ 70%]  (Sampling)
Iteration: 1600 / 2000 [ 80%]  (Sampling)
Iteration: 1800 / 2000 [ 90%]  (Sampling)
Iteration: 2000 / 2000 [100%]  (Sampling)
Elapsed Time: 0.04 seconds (Warm-up)
              0.04 seconds (Sampling)
              0.08 seconds (Total)

SAMPLING FOR MODEL 'rs_code' NOW (CHAIN 2).

Iteration:    1 / 2000 [  0%]  (Warmup)
Iteration:  200 / 2000 [ 10%]  (Warmup)
Iteration:  400 / 2000 [ 20%]  (Warmup)
Iteration:  600 / 2000 [ 30%]  (Warmup)
Iteration:  800 / 2000 [ 40%]  (Warmup)
Iteration: 1000 / 2000 [ 50%]  (Warmup)
Iteration: 1200 / 2000 [ 60%]  (Sampling)
Iteration: 1400 / 2000 [ 70%]  (Sampling)
Iteration: 1600 / 2000 [ 80%]  (Sampling)
Iteration: 1800 / 2000 [ 90%]  (Sampling)
Iteration: 2000 / 2000 [100%]  (Sampling)
Elapsed Time: 0.04 seconds (Warm-up)
              0.03 seconds (Sampling)
              0.07 seconds (Total)

SAMPLING FOR MODEL 'rs_code' NOW (CHAIN 3).

Iteration:    1 / 2000 [  0%]  (Warmup)
Iteration:  200 / 2000 [ 10%]  (Warmup)
Iteration:  400 / 2000 [ 20%]  (Warmup)
Iteration:  600 / 2000 [ 30%]  (Warmup)
Iteration:  800 / 2000 [ 40%]  (Warmup)
Iteration: 1000 / 2000 [ 50%]  (Warmup)
Iteration: 1200 / 2000 [ 60%]  (Sampling)
Iteration: 1400 / 2000 [ 70%]  (Sampling)
Iteration: 1600 / 2000 [ 80%]  (Sampling)
Iteration: 1800 / 2000 [ 90%]  (Sampling)
Iteration: 2000 / 2000 [100%]  (Sampling)
Elapsed Time: 0.04 seconds (Warm-up)
              0.03 seconds (Sampling)
              0.07 seconds (Total)

SAMPLING FOR MODEL 'rs_code' NOW (CHAIN 4).

Iteration:    1 / 2000 [  0%]  (Warmup)
Iteration:  200 / 2000 [ 10%]  (Warmup)
Iteration:  400 / 2000 [ 20%]  (Warmup)
Iteration:  600 / 2000 [ 30%]  (Warmup)
Iteration:  800 / 2000 [ 40%]  (Warmup)
Iteration: 1000 / 2000 [ 50%]  (Warmup)
Iteration: 1200 / 2000 [ 60%]  (Sampling)
Iteration: 1400 / 2000 [ 70%]  (Sampling)
Iteration: 1600 / 2000 [ 80%]  (Sampling)
Iteration: 1800 / 2000 [ 90%]  (Sampling)
Iteration: 2000 / 2000 [100%]  (Sampling)
Elapsed Time: 0.04 seconds (Warm-up)
              0.04 seconds (Sampling)
              0.08 seconds (Total)
   user  system elapsed 
  0.312   0.000   0.313 

print(stan_fit, digits=3)
Inference for Stan model: rs_code.
4 chains, each with iter=2000; warmup=1000; thin=1; 
post-warmup draws per chain=1000, total post-warmup draws=4000.

          mean se_mean    sd   2.5%    25%    50%    75%  97.5% n_eff
beta[1]  5.041   0.002 0.109  4.826  4.970  5.042  5.114  5.254  3523
beta[2]  0.584   0.005 0.251  0.085  0.419  0.590  0.752  1.079  3059
beta[3]  1.187   0.005 0.277  0.634  1.006  1.183  1.373  1.735  2832
beta[4]  0.894   0.005 0.293  0.317  0.706  0.892  1.079  1.496  2887
sigma    0.706   0.002 0.087  0.563  0.644  0.699  0.757  0.899  2163
lp__    -5.614   0.043 1.551 -9.416 -6.437 -5.274 -4.462 -3.601  1287
         Rhat
beta[1] 1.001
beta[2] 1.000
beta[3] 1.000
beta[4] 1.000
sigma   1.000
lp__    1.002

Samples were drawn using NUTS(diag_e) at Wed Dec 11 14:38:33 2013.
For each parameter, n_eff is a crude measure of effective sample size,
and Rhat is the potential scale reduction factor on split chains (at 
convergence, Rhat=1).
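
# A side-by-side comparison of the two fits (a sketch, not run in this
# transcript; harm.means, stan.draws, and stan.means are introduced here for
# illustration): posterior means from the thinned HARM samples and the
# post-warmup Stan draws should agree closely, as the summaries above suggest.
# harm.means <- colMeans(Fit$Posterior1)
# stan.draws <- extract(stan_fit)
# stan.means <- c(colMeans(stan.draws$beta), mean(stan.draws$sigma))
# round(cbind(HARM = harm.means, Stan = stan.means), 3)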