Assignment 7

STA6543

Author

Stephen Garcia (wqr974)

Published

July 7, 2025

Chapter 8

Tree-Based Methods

Conceptual Exercises:

Problem 2

Consider the Gini index, classification error, and entropy in a simple classification setting with two classes. Create a single plot that displays each of these quantities as a function of pm1. The x-axis should display pm1, ranging from 0 to 1, and the y-axis should display the value of the Gini index, classification error, and entropy. Hint: In a setting with two classes, pm1 = 1 - pm2. You could make this plot by hand, but it will be much easier to make in R.
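Since the plot itself is the whole answer, a minimal base-R sketch is shown below (grid spacing and colors are arbitrary choices); it draws all three impurity measures against pm1:

```r
# Two-class setting: p = p_hat_m1, so p_hat_m2 = 1 - p
p <- seq(0.001, 0.999, by = 0.001)          # avoid log(0) at the endpoints

gini      <- 2 * p * (1 - p)                        # Gini index
class.err <- 1 - pmax(p, 1 - p)                     # classification error
entropy   <- -(p * log(p) + (1 - p) * log(1 - p))   # (cross-)entropy

plot(p, entropy, type = "l", col = "blue", lwd = 2,
     xlab = expression(hat(p)[m1]), ylab = "Value of impurity measure")
lines(p, gini, col = "red", lwd = 2)
lines(p, class.err, col = "darkgreen", lwd = 2)
legend("topright", legend = c("Entropy", "Gini index", "Classification error"),
       col = c("blue", "red", "darkgreen"), lwd = 2)
```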

Applied Exercises:

Problem 8

In the lab, a classification tree was applied to the Carseats data set after converting Sales into a qualitative response variable. Now we will seek to predict Sales using regression trees and related approaches, treating the response as a quantitative variable.

(a)
Split the data set into a training set and a test set.

Warning: package 'caret' was built under R version 4.4.2
Loading required package: lattice
Warning: package 'rpart.plot' was built under R version 4.4.3
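The splitting code is not echoed in the report; a sketch consistent with the packages loaded above (the seed and the 70/30 proportion are assumptions) might look like:

```r
library(ISLR2)   # Carseats data
library(caret)

set.seed(1)                                               # assumed seed
train.idx <- createDataPartition(Carseats$Sales,          # stratified on the response
                                 p = 0.7, list = FALSE)   # assumed 70/30 split
Carseats.train <- Carseats[train.idx, ]
Carseats.test  <- Carseats[-train.idx, ]
```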

(b)
Fit a regression tree to the training set. Plot the tree, and interpret the results. What test MSE do you obtain?

[1] 4.651189

Response:
The regression tree splits first on ShelveLoc, indicating that shelf location is the most influential predictor of sales. The left side of the tree mostly reflects lower prices and less advertising, while the right side reflects higher-priced, well-placed products with higher predicted sales. The unpruned tree makes fine-grained predictions but may be overfitting. The test MSE is 4.651, which will be compared to the cross-validated pruned tree in the next step to assess the impact of pruning.
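The fitting code is not shown; a minimal sketch with rpart and rpart.plot (reusing the training/test objects from the sketch in part (a), default tuning parameters assumed) is:

```r
library(rpart)
library(rpart.plot)

tree.carseats <- rpart(Sales ~ ., data = Carseats.train)  # regression tree (anova splits)
rpart.plot(tree.carseats)                                  # plot the fitted tree

tree.pred <- predict(tree.carseats, newdata = Carseats.test)
mean((tree.pred - Carseats.test$Sales)^2)                  # test MSE
```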

(c)
Use cross-validation in order to determine the optimal level of tree complexity. Does pruning the tree improve the test MSE?

Warning in nominalTrainWorkflow(x = x, y = y, wts = weights, info = trainInfo,
: There were missing values in resampled performance measures.
          cp
2 0.01739551

[1] 4.168602

Response:
Using 10-fold cross-validation with the caret package, we identified the optimal tree complexity that minimizes RMSE. The resulting pruned tree used fewer splits than the fully grown tree and prioritized key variables such as ShelveLoc, Price, and CompPrice. The pruned model achieved a lower test MSE (4.169) compared to the unpruned model (4.651), confirming that pruning improved predictive accuracy and reduced overfitting.
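A sketch of the caret cross-validation described above (fold count from the text; the size of the cp grid is an assumption):

```r
set.seed(1)
cv.tree <- train(Sales ~ ., data = Carseats.train,
                 method = "rpart",
                 trControl = trainControl(method = "cv", number = 10),
                 tuneLength = 20)                  # grid of candidate cp values

cv.tree$bestTune                                   # optimal cp (about 0.0174 in the report)

pruned.pred <- predict(cv.tree, newdata = Carseats.test)
mean((pruned.pred - Carseats.test$Sales)^2)        # pruned-tree test MSE
```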

(d)
Use the bagging approach in order to analyze this data. What test MSE do you obtain? Use the importance() function to determine which variables are most important.

Warning: package 'randomForest' was built under R version 4.4.2
[1] 2.673099
               %IncMSE IncNodePurity
CompPrice   38.6569275    283.940716
Income      11.7051119    144.725894
Advertising 25.9804178    225.010347
Population   2.3945503     82.509377
Price       68.5950357    731.439777
ShelveLoc   83.7960138    782.662580
Age         19.8915172    193.729475
Education    0.4388766     61.084137
Urban        2.0329572     11.384805
US           1.6476353      9.296592

Response:
Using the bagging approach with all predictors (mtry = 10), the test set MSE was 2.673, which is a substantial improvement over the regression trees. Variable importance, based on the importance() output, shows that ShelveLoc and Price are the most influential predictors of Sales, followed by Advertising and CompPrice. Variables like Urban and US had negligible impact on model performance. Bagging’s strength here lies in its ability to reduce variance by averaging many decision trees trained on bootstrap samples.
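A sketch of the bagging fit (randomForest with mtry set to all 10 predictors; seed assumed):

```r
library(randomForest)

set.seed(1)
bag.carseats <- randomForest(Sales ~ ., data = Carseats.train,
                             mtry = 10,            # all predictors at every split = bagging
                             importance = TRUE)

bag.pred <- predict(bag.carseats, newdata = Carseats.test)
mean((bag.pred - Carseats.test$Sales)^2)           # test MSE (about 2.67 above)

importance(bag.carseats)                           # %IncMSE and IncNodePurity
varImpPlot(bag.carseats)
```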

(e)
Use random forests to analyze this data. What test MSE do you obtain? Use the importance() function to determine which variables are most important. Describe the effect of m, the number of variables considered at each split, on the error rate obtained.

  mtry
8    8

[1] 2.680154
rf variable importance

                Overall
ShelveLocGood   100.000
Price            94.756
CompPrice        46.017
ShelveLocMedium  36.427
Advertising      30.392
Age              28.085
Income           12.247
USYes             5.828
Education         4.581
UrbanYes          3.480
Population        0.000
                     %IncMSE IncNodePurity
CompPrice        1.363535639     276.42941
Income           0.188310157     150.28587
Advertising      0.675127776     218.28005
Population      -0.019047342      91.51165
Price            4.673152598     699.66591
ShelveLocGood    5.181775821     630.44206
ShelveLocMedium  0.702049352     110.75045
Age              0.569619309     210.85476
Education        0.035063923      70.66203
UrbanYes         0.009801894      12.79391
USYes            0.033154780      18.97139

Response:
A random forest was trained using caret with a grid search over mtry values from 1 to 10. The optimal value of mtry was found to be 8, where cross-validated RMSE was minimized. The model achieved a test MSE of 2.680, which is on par with the bagging model and significantly better than both unpruned and pruned regression trees.

Variable importance analysis revealed that ShelveLoc (especially the “Good” level) and Price were the dominant predictors of Sales. Lower-importance variables included Urban, US, and Population, which had minimal effect on model accuracy. The plot of error versus mtry showed that performance improved rapidly up to mtry ≈ 6–8, after which the benefit of additional predictors diminished.
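A sketch of the caret grid search over mtry described above (the 1–10 grid and 10-fold CV are taken from the text; other settings are assumptions):

```r
set.seed(1)
rf.fit <- train(Sales ~ ., data = Carseats.train,
                method = "rf",
                trControl = trainControl(method = "cv", number = 10),
                tuneGrid = expand.grid(mtry = 1:10),
                importance = TRUE)                 # passed through to randomForest()

rf.fit$bestTune                                    # mtry = 8 in the report
plot(rf.fit)                                       # cross-validated RMSE versus mtry

rf.pred <- predict(rf.fit, newdata = Carseats.test)
mean((rf.pred - Carseats.test$Sales)^2)            # test MSE (about 2.68 above)

varImp(rf.fit)                                     # caret-scaled importance
importance(rf.fit$finalModel)                      # raw randomForest importance
```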

(f)
Now analyze the data using BART, and report your results.

Warning: package 'bartMachine' was built under R version 4.4.3
Warning: package 'missForest' was built under R version 4.4.3
bartMachine initializing with 50 trees...
bartMachine vars checked...
bartMachine java init...
bartMachine factors created...
bartMachine before preprocess...
bartMachine after preprocess... 12 total features...
bartMachine sigsq estimated...
bartMachine training data finalized...
Now building bartMachine for regression...
evaluating in sample data...done
[1] 1.858556

Response:
Model Comparison Table

Model              Description                                    Test MSE
Unpruned Tree      Basic regression tree using rpart()               4.651
Pruned Tree (CV)   Tree with optimal cp via cross-validation         4.169
Bagging            Random forest with mtry = p (all predictors)      2.673
Random Forest      Tuned mtry via caret (best mtry = 8)              2.680
BART               Bayesian Additive Regression Trees                1.859

The BART model achieved the lowest test MSE of 1.859, outperforming all other methods. Variable importance analysis indicated that Price, CompPrice, and Advertising were the most influential predictors, while ShelveLoc had a lesser impact than in the other tree-based methods. BART’s Bayesian framework allows it to capture complex relationships and interactions effectively, leading to superior predictive performance.
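A sketch of a bartMachine fit consistent with the console log above (50 trees as in the log; the Java heap option, seed, and use of predict() for the test MSE are assumptions):

```r
options(java.parameters = "-Xmx4g")   # assumed heap size; must be set before loading the package
library(bartMachine)

set.seed(1)
x.train <- Carseats.train[, setdiff(names(Carseats.train), "Sales")]
x.test  <- Carseats.test[,  setdiff(names(Carseats.test),  "Sales")]

bart.fit <- bartMachine(X = x.train, y = Carseats.train$Sales,
                        num_trees = 50)            # matches "initializing with 50 trees"

bart.pred <- predict(bart.fit, x.test)
mean((bart.pred - Carseats.test$Sales)^2)          # test MSE

investigate_var_importance(bart.fit)               # variable-inclusion proportions
```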

Problem 9

This problem involves the OJ data set which is part of the ISLR2 package.
(a)
Create a training set containing a random sample of 800 observations, and a test set containing the remaining observations.

Warning: package 'ISLR2' was built under R version 4.4.3

Attaching package: 'ISLR2'
The following objects are masked from 'package:ISLR':

    Auto, Credit
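The split itself is not shown; a minimal sketch (seed assumed) is:

```r
library(ISLR2)

set.seed(1)
train <- sample(seq_len(nrow(OJ)), 800)  # 800 training observations
OJ.train <- OJ[train, ]
OJ.test  <- OJ[-train, ]                 # remaining 270 observations
```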

(b)
Fit a tree to the training data, with Purchase as the response and the other variables as predictors. Use the summary() function to produce summary statistics about the tree, and describe the results obtained. What is the training error rate? How many terminal nodes does the tree have?

[summary() output omitted: the tree was fit through caret, so the printed Call expands the full rpart() source code and the inlined training data instead of the usual split-by-split summary; the training error rate and terminal-node count are not captured here.]
0.2, 0.23, 0.24, 0.33, 0.33, -0.1, 0.32, 0.32, 0.3, -0.08, -0.38, 
0.23, 0.32, 0.32, 0.32, 0.03, 0.32, -0.08, 0.32, 0.32, 0.26, 
-0.27, -0.58, -0.58, -0.4, 0.19, 0.24, 0.24, 0.24, 0.03, -0.2, 
-0.16, -0.16, 0.03, 0.03, 0.64, 0, 0, 0.3, 0.1, 0.2, 0.2, 0.2, 
0.2, 0.32, -0.3, -0.3, -0.2, 0.32, -0.2, 0.3, 0, 0.3, 0.3, 0.44, 
0.23, 0.24, 0.24, 0.23, 0.24, 0.3, -0.1, 0.1, 0.1, 0.2, 0.64, 
0.27, 0.54, -0.2, -0.3, -0.3, -0.4, -0.2, 0.3, 0.3, -0.16, -0.16, 
0.23, 0.33, 0.32, 0.03, 0.54, 0.64, 0.64, 0.64, 0.27, 0.54, 0.27, 
-0.58, 0.3, -0.67, 0.33, 0.33, 0.24, 0.24, 0.24, 0.2, -0.4, -0.16, 
-0.16, -0.16, 0.3, 0.23, 0.32, 0.32, 0.03, 0.03, 0.27, 0.64, 
0.64, 0.27, 0.54, 0.27, 0.27, -0.4, -0.2, 0.32, 0.2, 0.32, -0.08, 
0.32, 0.26, 0.26, 0.19, 0.19, 0, 0.3, 0.32, 0.26, -0.2, 0.2, 
0, 0.3, 0.44, 0.44, 0.44, 0.24, 0.24, 0.24, 0.24, 0.3, 0.1, 0.2, 
0.2, 0.2, 0.26, -0.58, -0.2, 0, 0.3, 0, 0.3, -0.06, 0.1, 0.42, 
-0.28, 0.32, 0.2, 0.32, 0.26, 0.26, -0.58, 0.19, 0.19, 0.19, 
0.3, 0.44, 0.24, -0.1, 0.2, -0.2, -0.2, 0.1, 0.33, 0.2, 0.27, 
-0.17, 0.03, -0.2, -0.16, -0.16, 0.23, 0.32, -0.16, 0.33, 0.33, 
0.33, 0.03, 0.54, 0.64, 0.27, 0.27, 0.54, 0.64, 0.07, 0, -0.3, 
0, 0, 0.24, 0.2, 0.32, 0.03, 0.44, 0.23, 0.3, 0.64, 0.32, -0.07, 
0.2, 0, 0.24, 0.42, 0.1, 0.33, 0.32, -0.38, 0, 0, 0, 0.3, -0.16, 
0.24, 0.23, 0.32, 0.32, 0.26, 0.2, -0.4, 0.24, -0.67, 0.32, 0.03, 
-0.16, 0.3, 0.3, 0.32, 0.3, 0.1, 0.3, 0.24, -0.2, 0.1, 0.64, 
0.64, 0.2, 0.2, 0.27, -0.16, -0.1, 0.23, 0.13, -0.57, -0.67, 
0.33, 0.42, 0.42, 0.23, 0.03, -0.57, -0.16, 0.32, 0.1, -0.28, 
0.32, -0.08, 0.32, -0.2, 0.26, 0.2, -0.58, -0.2, 0.2, 0, 0.64, 
0.27, 0.27, 0.2, -0.58, -0.4, 0.64, 0.23, 0.33, 0.64, 0.64, 0.27, 
0.54, 0.33, 0.3, 0.3, 0.24, 0.44, -0.06, 0.1, 0.24, 0.24, 0.2, 
-0.28, -0.08, -0.4, 0, 0.24, 0.24, 0.24, 0.3, -0.2, 0.2, 0.3, 
0.32, -0.08, -0.58, -0.17, 0.64, -0.17, -0.17, 0.24, 0.2, 0.2, 
-0.16, 0.24, -0.16, 0.3, -0.06, 0.32, 0.64, 0.2, 0.32, 0.2, 0.2, 
0.32, 0, 0.3, 0.3, 0.3, 0.24, -0.16, 0.24, -0.06, 0.2, 0.33, 
-0.57, 0.14, -0.16, 0.54, 0.2, 0, 0.64, 0, 0.64, 0.33, 0.3, 0.3, 
0.44, 0.3, 0.3, 0.27, 0.32, 0.32, 0.32, 0.54, 0.64, 0.64, 0.1, 
-0.16, 0.33, 0.33, 0.32, 0.32, 0.32, 0.32, 0.32, 0.03, 0.03, 
0.03, 0.27, 0.27, 0.64, 0.64, 0.03, 0.27, 0.54, 0.32, 0.03, 0.27, 
0.27, -0.17, 0.13, 0.27, 0.54, 0.32, 0.32, 0.32, -0.27, 0.2, 
0.1, -0.38, 0.2, -0.17, 0.44, 0.44, 0.24, -0.28, 0.3, 0.24, 0.24, 
0.24, 0.44, -0.06, 0.44, 0.24, 0.24, 0.2, 0.33, 0.33, 0.32, 0.32, 
0.32, 0.26, 0.64, 0.64, 0.27, -0.67, 0.42, -0.08, 0.32, 0.3, 
0.03, -0.07, -0.07, 0.32, 0.32, -0.57, -0.57, 0.38, -0.16, 0.24, 
0.03, 0.32, 0.32, 0.07, 0, 0, 0, 0, 0, 0.3, 0.3, 0.3, 0.3, 0.44, 
0.24, 0.24, 0.24, 0.24, 0.3, 0.3, 0.3, -0.1, 0.3, 0.1, 0.1, 0.2, 
0.2, 0.2, -0.2, -0.3, -0.3, -0.4, 0.54, 0.54, -0.57, -0.3, -0.4, 
-0.2, -0.17, 0.23, 0, 0.24, 0.3, 0.3, 0.3, -0.16, -0.17, 0.3, 
-0.16, -0.2, 0.3, 0.3, 0.03, -0.17, 0.2, 0.42, -0.08, 0.24, 0.24, 
0.2, 0.2, 0.33, -0.38, 0.54, 0.07, 0, 0.3, 0.2, 0.2, -0.1, -0.08, 
0.19, 0.23, 0.23, 0, 0.3, 0.3, 0.2, -0.58, 0.44, 0.24, 0.42, 
0.03, 0.03, -0.4, 0.64, 0.44, 0.42, 0.32, 0.32, 0.27, 0.64, 0.64, 
0.64, 0.64, 0.27, 0.27, 0.32, 0.54, 0.27, -0.27, 0.2, 0.64, 0, 
0, 0.3, 0.44, 0.24, 0.23, 0.24, 0.2, -0.3, 0, 0.3, 0.44, 0.24, 
0, 0.24, -0.67, -0.28, -0.07, 0.32, 0.2, -0.58, -0.58, -0.58, 
-0.58, -0.16, 0.54, 0.3, 0.3, 0.24, 0.23, -0.17, 0.23, 0.33, 
0.33, 0.2, -0.16, 0.32, 0, -0.16, -0.16, 0.1, 0.27, 0.54, 0.2, 
-0.4, -0.4, 0.24, 0.24, 0.3, 0.3, 0.1, 0.1, -0.2, 0.64, 0.54, 
0.2, 0.33, 0.32, 0.32, 0.32, 0.32, 0.03, 0.27, 0.54, 0.64, 0.24, 
0.32, -0.57, 0.19, 0.24, 0.26, 0.2, 0.2, 0.2, 0.38, 0.32, 0.26, 
-0.4, -0.4, 0, 0, 0, 0.3, 0.3, 0.24, 0, 0, 0, 0, 0.2, 0.2, 0.42, 
0.32, 0.32, 0.1, -0.28, 0.32, 0.2, 0.32, -0.2, -0.3, -0.58, 0.19, 
0.3, 0, -0.2, 0.3, 0.24, 0.24, -0.17, 0.23, -0.57, 0.14, 0.38, 
0.38, -0.16, -0.17, 0.64, -0.57, 0.2, 0.2, -0.2, -0.58, -0.2, 
0.24, 0.32, -0.2, 0.32, -0.4, 0, 0.32, 0.3, 0.3, 0.1, -0.3, -0.3, 
0.3, 0.3, -0.1, -0.16, -0.16, 0.23, 0.23, 0.33, 0.42, 0.3, 0.03, 
0.03, 0.54, 0.64, 0.64, 0.2, 0.54, -0.3, 0, 0.3, 0.3, -0.16, 
0.24, 0.24, -0.06, -0.06, -0.17, 0.33, 0.33, 0.42, 0.42, 0, 0.3, 
0.3, -0.16, -0.16, 0.23, 0, -0.16, 0.3, 0.24, -0.17, 0.03, 0.33, 
0.33, 0.33, 0.32, 0.03, 0.32), c(0, 0, 0, 0, 1, 1, 1, 1, 1, 0, 
0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 1, 1, 1, 1, 1, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 1, 0, 1, 0, 0, 1, 
0, 0, 0, 0, 0, 0, 1, 1, 1, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 
0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 1, 1, 1, 1, 1, 
1, 0, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 
1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 1, 1, 1, 0, 
1, 1, 0, 0, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 1, 
1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 
1, 0, 0, 0, 1, 1, 1, 1, 1, 0, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 1, 1, 0, 0, 
0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 1, 0, 1, 0, 0, 0, 0, 
0, 0, 1, 1, 1, 1, 1, 1, 1, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 0, 1, 1, 1, 1, 1, 1, 0, 0, 1, 1, 0, 0, 0, 1, 1, 0, 
0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 
1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 
0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 
1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 1, 
1, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 1, 1, 1, 1, 1, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 
1, 1, 0, 0, 0, 1, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 0, 1, 1, 1, 1, 
1, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 1, 1, 1, 
0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 1, 0, 1, 1, 
1, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 
1, 0, 0, 1, 0, 0, 0, 1, 0, 1, 1, 1, 1, 0), c(0, 0.150754, 0, 
0, 0, 0, 0.253521, 0, 0, 0.150754, 0, 0, 0, 0, 0, 0, 0, 0, 0.253521, 
0, 0, 0, 0.366972, 0, 0.191388, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.191388, 0.191388, 0.191388, 
0.191388, 0.191388, 0, 0, 0, 0, 0.191388, 0, 0.191388, 0, 0.191388, 
0, 0, 0, 0.095694, 0, 0, 0, 0, 0, 0, 0.150754, 0.150754, 0, 0, 
0, 0, 0, 0, 0, 0, 0.174672, 0, 0, 0, 0.183486, 0.321101, 0, 0, 
0, 0, 0.050251, 0, 0.183486, 0, 0, 0.027523, 0.253521, 0.366972, 
0.366972, 0.253521, 0, 0, 0, 0, 0.112676, 0.191388, 0.201005, 
0.201005, 0.095694, 0.095694, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.191388, 
0.191388, 0.191388, 0, 0.118343, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 0.174672, 0, 0, 0, 0, 0, 0, 0.191388, 0.191388, 0.191388, 
0.253521, 0.118343, 0, 0, 0.201005, 0.201005, 0, 0, 0, 0.112676, 
0, 0, 0, 0, 0, 0, 0, 0.366972, 0, 0.40201, 0, 0, 0, 0, 0, 0.253521, 
0.253521, 0.201005, 0.201005, 0.201005, 0, 0, 0, 0, 0.112676, 
0.112676, 0, 0, 0, 0, 0, 0, 0, 0.253521, 0.118343, 0, 0, 0, 0.183486, 
0, 0.027523, 0.027523, 0, 0, 0, 0, 0, 0.027523, 0.191388, 0, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.027523, 0.366972, 
0.191388, 0, 0, 0, 0, 0.150754, 0.150754, 0, 0.275229, 0, 0, 
0, 0.027523, 0.027523, 0.366972, 0, 0, 0, 0, 0, 0, 0.174672, 
0, 0.191388, 0.118343, 0.118343, 0, 0, 0, 0.201005, 0.050251, 
0.118343, 0.201005, 0.201005, 0, 0, 0.201005, 0, 0, 0, 0.112676, 
0, 0, 0, 0, 0, 0, 0, 0, 0.191388, 0, 0, 0, 0, 0, 0.112676, 0, 
0, 0, 0, 0, 0.100503, 0.253521, 0, 0, 0, 0.150754, 0, 0, 0.321101, 
0, 0, 0, 0, 0.201005, 0, 0, 0, 0, 0.027523, 0.253521, 0.253521, 
0, 0.40201, 0, 0.112676, 0.201005, 0, 0, 0, 0, 0, 0, 0, 0.191388, 
0, 0, 0, 0, 0, 0, 0.201005, 0.174672, 0, 0.050251, 0.347418, 
0.40201, 0, 0, 0, 0, 0.050251, 0.347418, 0.201005, 0, 0, 0.275229, 
0, 0.183486, 0, 0.191388, 0.027523, 0.253521, 0.366972, 0.191388, 
0, 0, 0, 0, 0, 0.253521, 0.366972, 0.253521, 0, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0.150754, 0.150754, 0, 0, 0, 0.275229, 0.183486, 
0.253521, 0, 0, 0, 0, 0, 0.191388, 0, 0, 0, 0.183486, 0.366972, 
0.150754, 0, 0.201005, 0.201005, 0, 0, 0, 0.201005, 0, 0.201005, 
0, 0.150754, 0, 0, 0, 0, 0.253521, 0.253521, 0, 0, 0, 0, 0, 0, 
0.201005, 0, 0.150754, 0, 0, 0.347418, 0, 0.201005, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.118343, 0.201005, 
0, 0, 0, 0, 0, 0, 0, 0.112676, 0.112676, 0.112676, 0, 0, 0, 0, 
0.050251, 0, 0, 0, 0.112676, 0, 0, 0.201005, 0.050251, 0, 0, 
0, 0, 0, 0.253521, 0.253521, 0.150754, 0.321101, 0.253521, 0.150754, 
0, 0, 0, 0.275229, 0, 0, 0, 0, 0, 0.150754, 0, 0, 0, 0, 0, 0, 
0, 0, 0, 0.027523, 0, 0, 0, 0.40201, 0, 0.183486, 0, 0, 0.100503, 
0.100503, 0.100503, 0, 0, 0.347418, 0.347418, 0, 0.201005, 0, 
0.095694, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 0, 0.174672, 0, 0, 0, 0, 0, 0, 0.191388, 0.191388, 0.191388, 
0.191388, 0, 0, 0.347418, 0.191388, 0.191388, 0.191388, 0.150754, 
0, 0, 0, 0, 0, 0, 0.201005, 0.201005, 0, 0.201005, 0.118343, 
0, 0, 0.050251, 0.150754, 0, 0, 0.183486, 0, 0, 0, 0, 0, 0.321101, 
0, 0, 0, 0, 0, 0, 0.201005, 0.183486, 0, 0, 0, 0, 0, 0, 0.253521, 
0.366972, 0, 0, 0, 0.100503, 0.095694, 0.253521, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.253521, 0.253521, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0.191388, 0, 0, 0, 0, 0, 0, 0.40201, 0.275229, 
0.100503, 0, 0.253521, 0.366972, 0.366972, 0.366972, 0.366972, 
0.201005, 0, 0, 0, 0, 0, 0.150754, 0, 0, 0, 0.253521, 0.201005, 
0, 0, 0.201005, 0.201005, 0, 0, 0, 0.253521, 0.253521, 0.253521, 
0, 0, 0, 0, 0, 0, 0.191388, 0, 0, 0.253521, 0, 0, 0, 0, 0, 0.112676, 
0, 0, 0, 0, 0, 0.347418, 0, 0, 0.027523, 0, 0, 0, 0, 0, 0.027523, 
0.253521, 0.253521, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 0.275229, 0, 0, 0, 0.191388, 0.191388, 0.366972, 0, 0, 
0, 0.118343, 0, 0, 0, 0.150754, 0, 0.347418, 0, 0, 0, 0.201005, 
0.201005, 0, 0.347418, 0.253521, 0, 0.191388, 0.366972, 0.118343, 
0, 0, 0.118343, 0, 0.253521, 0, 0, 0, 0, 0, 0.191388, 0.191388, 
0, 0, 0.201005, 0.201005, 0.201005, 0, 0, 0, 0, 0, 0.112676, 
0.100503, 0, 0, 0, 0, 0, 0.191388, 0, 0, 0, 0.201005, 0, 0, 0.150754, 
0.150754, 0.150754, 0, 0, 0, 0, 0, 0, 0, 0.201005, 0.201005, 
0, 0, 0.201005, 0, 0, 0.150754, 0.095694, 0, 0, 0, 0, 0.112676, 
0), c(0, 0, 0.091398, 0, 0, 0.145161, 0, 0, 0, 0, 0, 0, 0, 0, 
0.198925, 0.145161, 0, 0, 0.252688, 0, 0, 0.198925, 0, 0, 0.050251, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.050251, 
0.050251, 0.050251, 0.050251, 0.050251, 0.050251, 0, 0, 0, 0, 
0.095694, 0.095694, 0, 0.068783, 0, 0, 0, 0.095694, 0, 0.095694, 
0, 0.050251, 0, 0, 0, 0.251256, 0, 0, 0, 0, 0, 0, 0, 0.091398, 
0, 0, 0, 0, 0.068783, 0.068783, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.095694, 0, 0, 
0, 0, 0.251256, 0, 0, 0, 0, 0.050251, 0.050251, 0.050251, 0.050251, 
0, 0, 0, 0.095694, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 0.050251, 0.198925, 0, 0.145161, 0.050251, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0.053763, 0, 0, 0.145161, 0.198925, 0.198925, 
0.198925, 0, 0.145161, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.252688, 0, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.198925, 0.198925, 0, 0.145161, 
0, 0, 0, 0, 0, 0.050251, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.095694, 
0.095694, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.050251, 0.050251, 
0.050251, 0, 0, 0.095694, 0, 0.177515, 0, 0, 0, 0.091429, 0.068783, 
0, 0, 0.050251, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.050251, 0.050251, 
0, 0.177515, 0, 0.050251, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.053763, 
0.053763, 0, 0, 0.145161, 0.198925, 0, 0, 0.145161, 0.251256, 
0, 0, 0, 0, 0, 0, 0.050251, 0, 0, 0, 0, 0, 0.198925, 0, 0, 0.252688, 
0, 0, 0, 0.091429, 0.068783, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 0.252688, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.095694, 
0, 0.198925, 0.198925, 0.050251, 0.050251, 0, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.050251, 0, 0.252688, 
0, 0.095694, 0, 0, 0.198925, 0, 0, 0.252688, 0, 0, 0.251256, 
0, 0.053763, 0.198925, 0.198925, 0, 0.145161, 0.068783, 0, 0, 
0, 0, 0, 0.091429, 0, 0, 0.050251, 0, 0, 0, 0, 0, 0, 0, 0, 0.095694, 
0.095694, 0.177515, 0, 0, 0, 0, 0.251256, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 0.198925, 0.050251, 0, 0.252688, 0.252688, 0, 0, 
0.177515, 0, 0, 0, 0, 0, 0, 0, 0.068783, 0, 0, 0, 0.145161, 0, 
0, 0.198925, 0, 0.251256, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.145161, 
0.198925, 0.198925, 0.177515, 0, 0.053763, 0.053763, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 0.198925, 0.198925, 0, 0, 0.145161, 0, 0, 
0, 0, 0, 0, 0, 0.145161, 0, 0, 0, 0, 0.252688, 0.091429, 0, 0.252688, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.068783, 0.068783, 
0, 0, 0, 0, 0.198925, 0.198925, 0, 0, 0.068783, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0.120603, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.050251, 0.050251, 
0.050251, 0.050251, 0, 0, 0, 0.145161, 0.145161, 0, 0, 0, 0.095694, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.068783, 
0, 0, 0, 0.095694, 0, 0.068783, 0, 0.145161, 0, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 0.252688, 0, 0, 0, 0, 0, 0, 0, 0.251256, 
0, 0, 0, 0, 0, 0.198925, 0.198925, 0.198925, 0.198925, 0, 0, 
0, 0.145161, 0, 0, 0.252688, 0.251256, 0, 0, 0, 0, 0, 0, 0, 0.050251, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.252688, 0, 0, 0, 0, 0, 0.145161, 
0, 0, 0, 0, 0, 0, 0.053763, 0.053763, 0.252688, 0, 0, 0, 0, 0, 
0, 0, 0.145161, 0.252688, 0, 0, 0, 0, 0, 0, 0, 0, 0.095694, 0.198925, 
0.145161, 0.252688, 0.053763, 0, 0, 0, 0, 0, 0, 0.145161, 0.251256, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0.120603, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 0, 0.068783, 0, 0, 0, 0, 0, 0.050251, 0, 
0.050251, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.120603, 0.120603, 
0, 0, 0.198925, 0, 0.252688, 0.050251, 0.050251, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.068783, 
0.068783, 0, 0, 0, 0.145161, 0.198925, 0.198925, 0.050251, 0.145161, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0.053763, 0.053763, 0, 0, 0), c(0.24, 0.24, 
0.23, 0, 0.23, 0.27, 0.14, 0.07, 0.07, 0.24, 0.32, 0.32, 0.32, 
0.32, 0.27, 0.27, 0.27, 0.27, 0.27, 0.07, 0.2, 0.27, 0.22, 0.3, 
0.1, 0, 0.3, 0.3, 0.3, 0.24, 0.24, 0.24, 0.24, 0.24, 0.24, 0.3, 
0.3, 0.3, 0.3, 0.3, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 
0.1, 0.1, 0, 0, 0, 0.3, 0.2, 0.3, 0, 0.19, 0, 0.24, 0, 0.3, 0.1, 
0.24, 0.23, 0.19, 0.14, 0, 0, 0.3, 0.24, 0.24, 0.24, 0.2, 0.23, 
0.2, 0.2, 0.23, 0.24, 0.2, 0.2, 0.3, 0.32, 0.32, 0.3, 0.32, 0.32, 
0.23, 0.32, 0.32, 0.32, 0.13, 0.32, 0.32, 0.32, 0.32, 0.32, 0.27, 
0.22, 0.22, 0.14, 0.19, 0.24, 0.24, 0.24, 0.27, 0, 0.24, 0.24, 
0.23, 0.23, 0.14, 0, 0, 0.3, 0.1, 0.1, 0.1, 0.1, 0.1, 0.32, 0.1, 
0.1, 0, 0.32, 0, 0.3, 0, 0.3, 0.3, 0.44, 0.23, 0.24, 0.24, 0.23, 
0.24, 0.3, 0.3, 0.1, 0.1, 0.1, 0.27, 0.27, 0.27, 0.1, 0.1, 0.1, 
0.14, 0, 0.3, 0.3, 0.24, 0.24, 0.23, 0.23, 0.32, 0.27, 0.27, 
0.27, 0.27, 0.27, 0.27, 0.27, 0.27, 0.22, 0.3, 0.13, 0.33, 0.33, 
0.24, 0.24, 0.24, 0.27, 0.14, 0.24, 0.24, 0.24, 0.3, 0.23, 0.32, 
0.32, 0.27, 0.27, 0.27, 0.27, 0.27, 0.27, 0.27, 0.27, 0.27, 0.14, 
0, 0.32, 0.1, 0.32, 0.32, 0.32, 0.32, 0.32, 0.19, 0.19, 0, 0.3, 
0.32, 0.32, 0, 0, 0, 0.3, 0.44, 0.44, 0.44, 0.24, 0.24, 0.24, 
0.24, 0.3, 0.1, 0.1, 0.1, 0.1, 0.32, 0.22, 0, 0, 0, 0, 0.3, 0.24, 
0.24, 0.29, 0.32, 0.32, 0.1, 0.32, 0.32, 0.32, 0.22, 0.19, 0.19, 
0.19, 0.3, 0.44, 0.24, 0.3, 0.1, 0.1, 0, 0, 0.33, 0.1, 0.27, 
0.23, 0.13, 0, 0.24, 0.24, 0.23, 0.32, 0.24, 0.23, 0.23, 0.33, 
0.27, 0.27, 0.27, 0.27, 0.27, 0.27, 0.14, 0.07, 0, 0.1, 0, 0, 
0.24, 0.1, 0.32, 0.27, 0.44, 0.23, 0.3, 0.27, 0.32, 0.13, 0.27, 
0, 0.24, 0.42, 0.24, 0.2, 0.32, 0.32, 0, 0, 0, 0.3, 0.24, 0.24, 
0.23, 0.32, 0.32, 0.32, 0.27, 0.14, 0.24, 0.13, 0.32, 0.27, 0.24, 
0.3, 0.3, 0.32, 0.3, 0.1, 0.3, 0.24, 0, 0.1, 0.27, 0.27, 0.1, 
0.1, 0.27, 0.24, 0.3, 0.23, 0.23, 0.17, 0.13, 0.33, 0.42, 0.42, 
0.23, 0.13, 0.17, 0.24, 0.32, 0.1, 0.32, 0.32, 0.32, 0.32, 0.1, 
0.32, 0.27, 0.22, 0, 0.2, 0, 0.27, 0.27, 0.27, 0.27, 0.22, 0.14, 
0.14, 0.23, 0.23, 0.27, 0.27, 0.27, 0.27, 0.2, 0.3, 0.3, 0.24, 
0.44, 0.24, 0.24, 0.24, 0.24, 0.1, 0.32, 0.32, 0.14, 0, 0.24, 
0.24, 0.24, 0.3, 0, 0, 0, 0.32, 0.32, 0.22, 0.13, 0.14, 0.23, 
0.23, 0.24, 0.2, 0.2, 0.24, 0.24, 0.24, 0.3, 0.24, 0.32, 0.27, 
0.1, 0.32, 0.27, 0.27, 0.32, 0, 0, 0.3, 0.3, 0.24, 0.24, 0.24, 
0.24, 0.2, 0.2, 0.17, 0.14, 0.24, 0.27, 0.2, 0, 0.27, 0, 0.14, 
0.33, 0.3, 0.3, 0.44, 0.3, 0.3, 0.27, 0.32, 0.32, 0.32, 0.27, 
0.27, 0.27, 0, 0.24, 0.23, 0.23, 0.32, 0.32, 0.32, 0.32, 0.32, 
0.27, 0.27, 0.27, 0.27, 0.27, 0.27, 0.27, 0.13, 0.27, 0.27, 0.32, 
0.27, 0.27, 0.27, 0.23, 0.23, 0.27, 0.27, 0.32, 0.32, 0.32, 0.27, 
0.27, 0.24, 0.32, 0.27, 0.13, 0.44, 0.44, 0.24, 0.32, 0.3, 0.24, 
0.24, 0.24, 0.44, 0.24, 0.44, 0.24, 0.24, 0.2, 0.2, 0.2, 0.32, 
0.32, 0.32, 0.32, 0.27, 0.27, 0.27, 0.13, 0.29, 0.32, 0.32, 0.3, 
0.23, 0.13, 0.13, 0.32, 0.32, 0.17, 0.17, 0.14, 0.24, 0.24, 0.23, 
0.32, 0.32, 0.07, 0, 0, 0, 0, 0, 0.3, 0.3, 0.3, 0.3, 0.44, 0.24, 
0.24, 0.24, 0.24, 0.3, 0.3, 0.3, 0.3, 0.3, 0.1, 0.1, 0.1, 0.1, 
0.1, 0.1, 0.1, 0.1, 0, 0.27, 0.27, 0.17, 0.1, 0, 0, 0.13, 0.23, 
0, 0.24, 0.3, 0.3, 0.3, 0.24, 0.23, 0.3, 0.24, 0, 0.3, 0.3, 0.13, 
0.13, 0.2, 0.29, 0.32, 0.24, 0.24, 0, 0.2, 0.2, 0.32, 0.27, 0.07, 
0, 0.3, 0.2, 0.2, 0.3, 0.32, 0.19, 0.23, 0.23, 0, 0.3, 0.3, 0.27, 
0.22, 0.44, 0.24, 0.42, 0.23, 0.23, 0.14, 0.14, 0.44, 0.42, 0.32, 
0.32, 0.27, 0.27, 0.27, 0.27, 0.27, 0.27, 0.27, 0.32, 0.27, 0.27, 
0.27, 0.27, 0.14, 0, 0, 0.3, 0.44, 0.24, 0.23, 0.24, 0.1, 0.1, 
0, 0.3, 0.44, 0.24, 0, 0.24, 0.13, 0.32, 0.13, 0.32, 0.27, 0.22, 
0.22, 0.22, 0.22, 0.24, 0.27, 0.3, 0.3, 0.24, 0.23, 0.13, 0.23, 
0.23, 0.23, 0.27, 0.24, 0.32, 0, 0.24, 0.24, 0.1, 0.27, 0.27, 
0.27, 0.14, 0.14, 0.24, 0.24, 0.3, 0.3, 0.1, 0.1, 0, 0.27, 0.27, 
0.27, 0.23, 0.32, 0.32, 0.32, 0.32, 0.27, 0.27, 0.27, 0.14, 0.24, 
0.32, 0.17, 0.19, 0.24, 0.32, 0.2, 0.2, 0.2, 0.14, 0.32, 0.32, 
0.14, 0.14, 0, 0, 0, 0.3, 0.3, 0.24, 0, 0, 0, 0, 0.2, 0.2, 0.29, 
0.32, 0.32, 0.1, 0.32, 0.32, 0.1, 0.32, 0.1, 0.1, 0.22, 0.19, 
0.3, 0, 0, 0.3, 0.24, 0.24, 0.13, 0.23, 0.17, 0.14, 0.14, 0.14, 
0.24, 0.23, 0.27, 0.17, 0.27, 0.1, 0.1, 0.22, 0, 0.24, 0.32, 
0, 0.32, 0.14, 0, 0.32, 0.3, 0.3, 0.1, 0.1, 0.1, 0.3, 0.3, 0.3, 
0.24, 0.24, 0.23, 0.23, 0.2, 0.29, 0.3, 0.27, 0.23, 0.27, 0.27, 
0.27, 0.1, 0.27, 0.1, 0, 0.3, 0.3, 0.24, 0.24, 0.24, 0.24, 0.24, 
0.13, 0.33, 0.33, 0.42, 0.42, 0, 0.3, 0.3, 0.24, 0.24, 0.23, 
0, 0.24, 0.3, 0.24, 0.13, 0.23, 0.33, 0.23, 0.23, 0.32, 0.27, 
0.32), c(1, 1, 1, 1, 0, 0, 0, 0, 0, 1, 2, 2, 0, 0, 0, 0, 0, 0, 
0, 0, 2, 0, 2, 3, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 
4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 0, 2, 4, 4, 2, 4, 
4, 3, 4, 4, 4, 0, 2, 0, 2, 2, 2, 2, 2, 2, 2, 1, 2, 2, 1, 3, 2, 
2, 3, 2, 2, 3, 2, 2, 1, 2, 2, 2, 1, 2, 2, 2, 1, 2, 0, 2, 2, 0, 
2, 3, 1, 4, 0, 4, 0, 0, 0, 0, 0, 3, 3, 1, 4, 4, 4, 4, 4, 1, 4, 
4, 4, 2, 0, 0, 4, 4, 0, 4, 0, 4, 4, 0, 4, 4, 4, 4, 4, 4, 0, 0, 
0, 4, 4, 4, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
2, 1, 1, 1, 1, 3, 3, 3, 0, 0, 0, 0, 0, 3, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 2, 3, 2, 2, 2, 2, 2, 2, 2, 3, 3, 2, 2, 3, 
3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 2, 2, 3, 2, 2, 2, 
3, 2, 2, 2, 2, 2, 3, 2, 2, 2, 2, 2, 2, 2, 3, 3, 3, 3, 3, 3, 1, 
1, 1, 4, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 
0, 0, 4, 4, 4, 4, 4, 4, 0, 0, 4, 1, 3, 0, 2, 1, 0, 1, 2, 1, 2, 
2, 2, 2, 0, 2, 2, 0, 0, 2, 0, 0, 0, 2, 0, 0, 1, 1, 0, 0, 0, 3, 
3, 2, 3, 3, 2, 3, 3, 4, 0, 0, 4, 4, 0, 0, 4, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 0, 0, 3, 2, 2, 2, 2, 3, 2, 0, 2, 3, 2, 0, 0, 0, 0, 
0, 2, 0, 0, 0, 0, 0, 0, 0, 0, 2, 3, 3, 2, 3, 2, 2, 3, 3, 3, 2, 
2, 0, 3, 3, 3, 3, 3, 3, 3, 2, 2, 2, 2, 1, 0, 1, 1, 3, 2, 2, 0, 
2, 0, 4, 2, 0, 0, 4, 0, 0, 0, 0, 3, 2, 2, 2, 2, 0, 2, 2, 2, 2, 
1, 1, 0, 0, 2, 4, 0, 4, 0, 1, 3, 3, 3, 3, 3, 0, 0, 0, 0, 0, 0, 
0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 
0, 0, 0, 1, 1, 0, 0, 1, 1, 1, 0, 0, 2, 2, 0, 1, 3, 3, 3, 2, 3, 
2, 2, 2, 3, 2, 3, 3, 3, 2, 2, 2, 2, 2, 2, 2, 0, 0, 0, 1, 2, 2, 
2, 4, 1, 1, 1, 1, 1, 1, 1, 1, 0, 3, 0, 0, 0, 0, 3, 3, 3, 3, 3, 
3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 
3, 3, 0, 0, 1, 4, 4, 4, 1, 0, 3, 3, 3, 3, 3, 0, 1, 3, 0, 0, 4, 
1, 1, 1, 2, 2, 2, 2, 3, 3, 2, 2, 2, 0, 0, 2, 2, 2, 2, 0, 2, 2, 
1, 1, 3, 3, 3, 0, 2, 3, 1, 1, 1, 0, 0, 0, 3, 1, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 4, 4, 4, 4, 4, 1, 4, 4, 4, 2, 3, 
3, 3, 4, 4, 1, 2, 1, 2, 0, 2, 2, 2, 2, 0, 0, 1, 1, 1, 0, 1, 1, 
0, 0, 0, 0, 0, 0, 0, 0, 4, 0, 0, 0, 0, 0, 4, 4, 4, 4, 4, 4, 4, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 2, 2, 2, 2, 2, 2, 
1, 2, 2, 0, 0, 3, 3, 3, 3, 3, 3, 4, 2, 2, 2, 2, 2, 2, 2, 0, 3, 
2, 2, 3, 2, 3, 3, 2, 2, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 
1, 0, 1, 0, 3, 3, 2, 0, 4, 0, 0, 0, 0, 2, 2, 3, 3, 3, 3, 3, 0, 
0, 0, 0, 0, 0, 0, 2, 2, 4, 0, 1, 0, 0, 0, 4, 0, 4, 1, 1, 1, 0, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 2, 1, 0, 1, 1, 1, 0, 
1, 0, 0, 0, 0, 1), c(1, 1, 1, 2, 1, 1, 1, 1, 1, 2, 2, 2, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 2, 1, 1, 1, 1, 1, 2, 1, 1, 1, 1, 2, 1, 
2, 1, 1, 1, 1, 1, 1, 2, 1, 1, 1, 2, 2, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 1, 1, 1, 
1, 1, 2, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 2, 1, 2, 2, 2, 2, 2, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 2, 1, 2, 2, 2, 2, 1, 2, 1, 2, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 1, 1, 2, 2, 2, 1, 1, 1, 1, 1, 2, 
2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 
2, 2, 2, 2, 1, 2, 2, 2, 2, 2, 1, 1, 1, 1, 1, 1, 1, 1, 2, 1, 1, 
1, 1, 2, 2, 1, 1, 2, 1, 2, 2, 2, 2, 2, 2, 2, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 1, 2, 2, 2, 2, 2, 1, 1, 2, 1, 
2, 2, 1, 2, 1, 1, 2, 1, 2, 2, 2, 1, 2, 2, 1, 2, 2, 2, 2, 2, 1, 
2, 2, 2, 2, 2, 2, 2, 2, 1, 1, 1, 1, 1, 2, 2, 2, 2, 1, 2, 2, 1, 
2, 2, 1, 1, 1, 1, 1, 2, 1, 1, 2, 2, 2, 1, 1, 2, 1, 1, 2, 2, 2, 
1, 1, 1, 1, 1, 2, 2, 1, 2, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 2, 2, 2, 2, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 
1, 1, 1, 1, 2, 1, 1, 1, 1, 1, 1, 1, 2, 1, 1, 2, 2, 2, 2, 2, 2, 
2, 2, 2, 1, 2, 2, 1, 1, 2, 1, 1, 2, 1, 2, 2, 1, 2, 1, 2, 1, 1, 
1, 1, 1, 1, 1, 1, 2, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 1, 2, 1, 
1, 2, 2, 1, 2, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 1, 1, 
1, 2, 2, 2, 2, 1, 1, 2, 2, 1, 1, 1, 2, 1, 2, 2, 2, 2, 2, 2, 2, 
2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 
2, 2, 2, 2, 2, 2, 1, 1, 1, 2, 2, 2, 2, 1, 2, 1, 1, 1, 1, 1, 2, 
2, 2, 1, 1, 2, 1, 2, 1, 2, 2, 2, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 
2, 1, 2, 1, 1, 1, 2, 2, 2, 1, 2, 2, 2, 2, 2, 2, 2, 2, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 2, 2, 2, 2, 2, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 2, 1, 1, 1, 1, 2, 1, 1, 1, 2, 2, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 1, 
2, 2, 2, 1, 2, 1, 2, 2, 2, 2, 2, 2, 2, 2, 1, 1, 2, 2, 2, 2, 2, 
2, 2, 2, 2, 2, 2, 2, 1, 2, 2, 2, 2, 2, 1, 2, 1, 2, 2, 1, 2, 2, 
2, 2, 2, 1, 2, 1, 2, 1, 1, 2, 2, 2, 2, 2, 1, 1, 2, 2, 2, 2, 2, 
2, 1, 2, 1, 1, 1, 1, 1, 1, 2, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
2, 2, 2, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 1, 1, 1, 1, 1, 2, 1, 
1, 2, 2, 2, 1, 1, 1, 1, 1, 1)), control = list(20, 7, 0, 4, 5, 
    2, 0, 30, 0))
  n= 801 

          CP nsplit rel error
1 0.50641026      0 1.0000000
2 0.03205128      1 0.4935897
3 0.01000000      3 0.4294872

Variable importance
       LoyalCH      PriceDiff    SalePriceMM WeekofPurchase         DiscMM 
            69             10              6              3              3 
     PctDiscMM        PriceMM        PriceCH        StoreID         DiscCH 
             3              3              1              1              1 

Node number 1: 801 observations,    complexity param=0.5064103
  predicted class=CH  expected loss=0.3895131  P(node) =1
    class counts:   489   312
   probabilities: 0.610 0.390 
  left son=2 (507 obs) right son=3 (294 obs)
  Primary splits:
      LoyalCH   < 0.482304  to the right, improve=133.57510, (0 missing)
      StoreID   < 3.5       to the right, improve= 42.77419, (0 missing)
      Store7Yes < 0.5       to the right, improve= 24.13460, (0 missing)
      STORE     < 0.5       to the left,  improve= 24.13460, (0 missing)
      PriceDiff < 0.015     to the right, improve= 22.51990, (0 missing)
  Surrogate splits:
      PriceMM        < 1.89      to the right, agree=0.640, adj=0.020, (0 split)
      WeekofPurchase < 228.5     to the right, agree=0.638, adj=0.014, (0 split)
      StoreID        < 3.5       to the right, agree=0.638, adj=0.014, (0 split)
      DiscCH         < 0.485     to the left,  agree=0.637, adj=0.010, (0 split)
      SalePriceMM    < 1.285     to the right, agree=0.637, adj=0.010, (0 split)

Node number 2: 507 observations,    complexity param=0.03205128
  predicted class=CH  expected loss=0.1696252  P(node) =0.6329588
    class counts:   421    86
   probabilities: 0.830 0.170 
  left son=4 (264 obs) right son=5 (243 obs)
  Primary splits:
      LoyalCH       < 0.74912   to the right, improve=16.985330, (0 missing)
      PriceDiff     < 0.015     to the right, improve=14.000760, (0 missing)
      SalePriceMM   < 1.84      to the right, improve=11.621730, (0 missing)
      ListPriceDiff < 0.255     to the right, improve=10.194770, (0 missing)
      DiscMM        < 0.03      to the left,  improve= 7.969729, (0 missing)
  Surrogate splits:
      PriceMM        < 2.04      to the right, agree=0.617, adj=0.202, (0 split)
      WeekofPurchase < 239.5     to the right, agree=0.613, adj=0.193, (0 split)
      SalePriceMM    < 2.04      to the right, agree=0.613, adj=0.193, (0 split)
      PriceCH        < 1.755     to the right, agree=0.611, adj=0.189, (0 split)
      PriceDiff      < 0.015     to the right, agree=0.592, adj=0.148, (0 split)

Node number 3: 294 observations
  predicted class=MM  expected loss=0.2312925  P(node) =0.3670412
    class counts:    68   226
   probabilities: 0.231 0.769 

Node number 4: 264 observations
  predicted class=CH  expected loss=0.04545455  P(node) =0.329588
    class counts:   252    12
   probabilities: 0.955 0.045 

Node number 5: 243 observations,    complexity param=0.03205128
  predicted class=CH  expected loss=0.3045267  P(node) =0.3033708
    class counts:   169    74
   probabilities: 0.695 0.305 
  left son=10 (205 obs) right son=11 (38 obs)
  Primary splits:
      PriceDiff     < -0.165    to the right, improve=18.949300, (0 missing)
      ListPriceDiff < 0.235     to the right, improve=13.454280, (0 missing)
      SalePriceMM   < 1.84      to the right, improve=11.286660, (0 missing)
      DiscMM        < 0.03      to the left,  improve= 8.212141, (0 missing)
      PctDiscMM     < 0.0137615 to the left,  improve= 8.212141, (0 missing)
  Surrogate splits:
      SalePriceMM    < 1.585     to the right, agree=0.914, adj=0.447, (0 split)
      DiscMM         < 0.57      to the left,  agree=0.897, adj=0.342, (0 split)
      PctDiscMM      < 0.264375  to the left,  agree=0.897, adj=0.342, (0 split)
      WeekofPurchase < 274.5     to the left,  agree=0.864, adj=0.132, (0 split)
      SpecialMM      < 0.5       to the left,  agree=0.848, adj=0.026, (0 split)

Node number 10: 205 observations
  predicted class=CH  expected loss=0.2195122  P(node) =0.2559301
    class counts:   160    45
   probabilities: 0.780 0.220 

Node number 11: 38 observations
  predicted class=MM  expected loss=0.2368421  P(node) =0.0474407
    class counts:     9    29
   probabilities: 0.237 0.763 

Response:
A classification tree was fit using Purchase as the response and all other variables as predictors. The summary shows that the tree splits first on LoyalCH, which is by far the most important variable. The training error rate is approximately 16.7%, and the final tree has 4 terminal nodes.
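The code chunk for this step is not shown in the rendered output. Below is a minimal sketch of one way the fit summarized above could be reproduced; the object names, the seed, and the use of the ISLR2 package are assumptions, not the original code.

```r
# Sketch only: fit a classification tree to the OJ training data and summarize it.
library(ISLR2)   # provides the OJ data (also available in the ISLR package)
library(rpart)

set.seed(1)
train_idx <- sample(nrow(OJ), 801)      # 801 training rows, matching n = 801 above
oj_train  <- OJ[train_idx, ]
oj_test   <- OJ[-train_idx, ]

oj_rpart <- rpart(Purchase ~ ., data = oj_train, method = "class")
summary(oj_rpart)   # CP table, variable importance, and node-by-node details
```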

(c)
Type in the name of the tree object in order to get a detailed text output. Pick one of the terminal nodes, and interpret the information displayed.

n= 801 

node), split, n, loss, yval, (yprob)
      * denotes terminal node

 1) root 801 312 CH (0.61048689 0.38951311)  
   2) LoyalCH>=0.482304 507  86 CH (0.83037475 0.16962525)  
     4) LoyalCH>=0.74912 264  12 CH (0.95454545 0.04545455) *
     5) LoyalCH< 0.74912 243  74 CH (0.69547325 0.30452675)  
      10) PriceDiff>=-0.165 205  45 CH (0.78048780 0.21951220) *
      11) PriceDiff< -0.165 38   9 MM (0.23684211 0.76315789) *
   3) LoyalCH< 0.482304 294  68 MM (0.23129252 0.76870748) *

Response:
We examine terminal node 4:

This is a terminal node, meaning no further splits occur beyond this point. It is reached by customers with high loyalty to Citrus Hill (LoyalCH ≥ 0.75, after the earlier split at LoyalCH ≥ 0.48). Of the 264 training observations in this node, 252 (95.5%) purchased CH and only 12 purchased MM, so the predicted class is CH and the node is highly pure. This node captures a large segment of strongly loyal customers whose purchases the tree predicts almost perfectly.

(d)
Create a plot of the tree, and interpret the results.

Response:
The tree splits first on LoyalCH, showing that brand loyalty is the strongest predictor of purchase behavior.

  • When LoyalCH < 0.48, the predicted class is MM (Minute Maid).
  • When LoyalCH ≥ 0.75, the predicted class is CH (Citrus Hill), with high confidence.
  • For values between these thresholds, the decision is further refined by PriceDiff.

This structure indicates that loyal customers are more likely to stay with CH, while price differences play a greater role for less loyal customers.
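The plot itself is not reproduced in this text version. A minimal sketch of how it could be drawn with rpart.plot, assuming the oj_rpart object from the sketch under part (b):

```r
# Sketch only: visualize the fitted classification tree.
library(rpart.plot)
rpart.plot(oj_rpart, type = 2, extra = 104)  # class, per-class probabilities, node percentage
```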

(e)
Predict the response on the test data, and produce a confusion matrix comparing the test labels to the predicted test labels. What is the test error rate?

    test_predictions
      CH  MM
  CH 135  29
  MM  20  85
[1] 0.1821561

Response:
|           | Predicted CH | Predicted MM |
|-----------|--------------|--------------|
| Actual CH | 135          | 29           |
| Actual MM | 20           | 85           |

The confusion matrix shows that the model correctly predicted 135 CH purchases and 85 MM purchases, while misclassifying 29 CH buyers as MM and 20 MM buyers as CH. The test error rate is (29 + 20)/269 ≈ 0.182, so about 18.2% of test predictions were incorrect.
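A sketch of how the confusion matrix and error rate could be computed, assuming oj_rpart and oj_test from the earlier sketches:

```r
# Sketch only: test-set predictions, confusion matrix, and error rate.
test_predictions <- predict(oj_rpart, newdata = oj_test, type = "class")
conf_mat <- table(oj_test$Purchase, test_predictions)
conf_mat
1 - sum(diag(conf_mat)) / sum(conf_mat)   # compare with the 0.182 reported above
```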

(f)
Apply the cv.tree() function to the training set in order to determine the optimal tree size.

Warning: package 'tree' was built under R version 4.4.3

Response:
Cross-validation indicates that the optimal tree size is 2 terminal nodes, suggesting that a simpler tree provides the lowest estimated test error and that further splits do not improve performance.
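cv.tree() expects an object from the tree package (whose load warning appears above), so the model is refit with tree() before cross-validating. A sketch under that assumption, with object names of my choosing:

```r
# Sketch only: refit with the tree package and cross-validate by misclassification error.
library(tree)
oj_tree <- tree(Purchase ~ ., data = oj_train)
set.seed(2)
cv_oj <- cv.tree(oj_tree, FUN = prune.misclass)
cv_oj   # $size and $dev give the candidate tree sizes and their CV error counts
```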

(g)
Produce a plot with tree size on the x-axis and cross-validated classification error rate on the y-axis.
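The figure is not reproduced here; a sketch of the plot, assuming cv_oj from the sketch under part (f):

```r
# Sketch only: cross-validated classification error versus tree size.
plot(cv_oj$size, cv_oj$dev, type = "b",
     xlab = "Tree size (terminal nodes)",
     ylab = "CV classification errors")
```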

(h)
Which tree size corresponds to the lowest cross-validated classification error rate?

Response:
The tree size with 2 terminal nodes corresponds to the lowest cross-validated classification error rate, as shown by the lowest point on the CV error curve.

(i)
Produce a pruned tree corresponding to the optimal tree size obtained using cross-validation. If cross-validation does not lead to selection of a pruned tree, then create a pruned tree with five terminal nodes.
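No code appears for this step in the rendered output. A sketch of the pruning, assuming oj_tree and cv_oj from the earlier sketches:

```r
# Sketch only: prune to the CV-selected size, falling back to five terminal
# nodes if cross-validation had kept the full tree.
best_size <- cv_oj$size[which.min(cv_oj$dev)]
if (best_size == max(cv_oj$size)) best_size <- 5
oj_pruned <- prune.misclass(oj_tree, best = best_size)
plot(oj_pruned); text(oj_pruned, pretty = 0)
```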

(j)
Compare the training error rates between the pruned and unpruned trees. Which is higher?

[1] 0.1672909
[1] 0.1997503

Response:
The training error rate for the unpruned tree is about 16.7%, while for the pruned tree it is about 20.0%. As expected, the pruned tree has the higher training error: pruning reduces the tree's flexibility, so it fits the training data less closely even though it helps guard against overfitting.
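A sketch of the training-error comparison, assuming oj_tree, oj_pruned, and oj_train from the earlier sketches:

```r
# Sketch only: training misclassification rates for the unpruned and pruned trees.
mean(predict(oj_tree,   oj_train, type = "class") != oj_train$Purchase)  # unpruned; compare with 0.167 above
mean(predict(oj_pruned, oj_train, type = "class") != oj_train$Purchase)  # pruned;   compare with 0.200 above
```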

(k)
Compare the test error rates between the pruned and unpruned trees. Which is higher?

[1] 0.1821561
[1] 0.2007435

Response:
The unpruned tree achieved a test error rate of 18.2%, while the pruned tree had a slightly higher test error rate of 20.1%. Although pruning simplifies the model, in this case it cost a small amount of predictive accuracy: the unpruned tree performed slightly better on the test data.
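And the same comparison on the test set, assuming oj_tree, oj_pruned, and oj_test from the earlier sketches:

```r
# Sketch only: test misclassification rates for the unpruned and pruned trees.
mean(predict(oj_tree,   oj_test, type = "class") != oj_test$Purchase)  # unpruned; compare with 0.182 above
mean(predict(oj_pruned, oj_test, type = "class") != oj_test$Purchase)  # pruned;   compare with 0.201 above
```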