As part of a larger experiment, Dale (1992) looked at six samples of a wetland soil undergoing a simulated snowmelt. Three were randomly selected for treatment with a neutral pH snowmelt; the other three got a reduced pH snowmelt. The observed response was the number of Copepoda removed from each microcosm during the first 14 days of snowmelt.
Reduced pH \((256,159,149)\)
Neutral pH \((54,123,247)\)
Using randomization methods, test the null hypothesis that the two treatments have equal average numbers of Copepoda versus a two-sided alternative. Under the null hypothesis the treatment labels are exchangeable, so we compare the observed difference in group totals against its distribution over all \(\binom{6}{3}=20\) equally likely assignments of three observations to each group.
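As an aside (not part of the original solution), the same exact test can also be run with the coin package, assuming it is installed; with distribution = "exact" it enumerates all assignments:
# Hedged cross-check via coin (assumed available; not in the original analysis)
library(coin)
copepoda <- data.frame(count = c(256, 159, 149, 54, 123, 247),
                       trt = factor(rep(c("reduced", "neutral"), each = 3)))
oneway_test(count ~ trt, data = copepoda, distribution = "exact")
# the exact two-sided p-value should agree with the 0.4 obtained below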
# Permutation test
# carried out with equal sample sizes in the two treatment groups
# compute the observed statistic from the original data
A <- c(256,159,149); A  # reduced pH group
## [1] 256 159 149
B <- c(54,123,247); B   # neutral pH group
## [1] 54 123 247
# Observed difference of group sums (reduced minus neutral)
D_sum <- sum(A) - sum(B)
# Difference of group sums for every possible assignment of three
# observations to the reduced-pH group
W <- c(A, B)
cb_A_sum <- combn(W, 3, sum)   # sums of all choose(6,3) = 20 possible "A" groups
cb_B_sum <- sum(W) - cb_A_sum  # the complementary three observations form "B"
all_d_sum <- cb_A_sum - cb_B_sum
# List all permutation differences in sorted order
all_d_sum <- matrix(sort(all_d_sum), ncol = 5, byrow = TRUE)
print(all_d_sum)
## [,1] [,2] [,3] [,4] [,5]
## [1,] -336 -316 -264 -140 -126
## [2,] -122 -88 -70 -68 -50
## [3,] 50 68 70 88 122
## [4,] 126 140 264 316 336
# Histogram of the exact permutation distribution
hist(all_d_sum, breaks = 2000)
# Two-sided p-value: proportion of assignments whose absolute difference
# is at least as large as the observed absolute difference
p_value <- mean(abs(all_d_sum) >= abs(D_sum))
p_value
## [1] 0.4
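As a quick sanity check (a minimal added sketch, not in the original), the same p-value can be reproduced by enumerating index combinations, which also remains safe if two counts happen to tie:
# Enumerate assignments by index and recompute the two-sided p-value
perm_d <- combn(seq_along(W), 3,
                function(idx) sum(W[idx]) - sum(W[-idx]))
mean(abs(perm_d) >= abs(D_sum))  # should again give 0.4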
# Check with a Monte Carlo simulation
W
## [1] 256 159 149  54 123 247
# Repeat the Monte Carlo estimate 1000 times to see how much it varies
P_hat_value <- NULL
for (j in 1:1000) {
  s_d <- NULL
  for (i in 1:1000) {
    x <- sample(W, 3)                # random assignment of three obs. to group A
    d <- sum(x) - (sum(W) - sum(x))  # difference of group sums
    s_d <- c(s_d, d)
  }
  # two-sided p-value estimate from these 1000 random assignments
  p_hat_value <- mean(abs(s_d) >= abs(D_sum))
  P_hat_value <- c(P_hat_value, p_hat_value)
}
sort(P_hat_value)
## [1] 0.334 0.336 0.336 0.338 0.338 0.338 0.340 0.342 0.342 0.342 0.342
## [12] 0.344 0.344 0.344 0.346 0.348 0.348 0.348 0.348 0.348 0.350 0.350
## [23] 0.352 0.352 0.352 0.352 0.352 0.354 0.354 0.354 0.356 0.356 0.356
## [34] 0.356 0.356 0.356 0.356 0.356 0.358 0.358 0.358 0.358 0.358 0.358
## [45] 0.358 0.358 0.358 0.360 0.360 0.360 0.360 0.360 0.360 0.360 0.360
## [56] 0.360 0.360 0.360 0.360 0.362 0.362 0.362 0.362 0.362 0.362 0.364
## [67] 0.364 0.364 0.364 0.364 0.364 0.364 0.364 0.364 0.364 0.366 0.366
## [78] 0.366 0.366 0.366 0.366 0.366 0.366 0.366 0.366 0.366 0.366 0.366
## [89] 0.366 0.368 0.368 0.368 0.368 0.368 0.368 0.368 0.368 0.368 0.368
## [100] 0.370 0.370 0.370 0.370 0.370 0.370 0.370 0.370 0.370 0.370 0.370
## [111] 0.370 0.370 0.370 0.370 0.370 0.370 0.372 0.372 0.372 0.372 0.372
## [122] 0.372 0.372 0.372 0.372 0.372 0.372 0.372 0.372 0.372 0.372 0.372
## [133] 0.372 0.374 0.374 0.374 0.374 0.374 0.374 0.374 0.374 0.374 0.374
## [144] 0.374 0.376 0.376 0.376 0.376 0.376 0.376 0.376 0.376 0.376 0.376
## [155] 0.376 0.376 0.376 0.376 0.376 0.376 0.376 0.376 0.376 0.376 0.376
## [166] 0.376 0.376 0.376 0.376 0.376 0.376 0.376 0.376 0.376 0.376 0.378
## [177] 0.378 0.378 0.378 0.378 0.378 0.378 0.378 0.378 0.378 0.378 0.378
## [188] 0.378 0.378 0.378 0.378 0.378 0.378 0.378 0.378 0.378 0.378 0.378
## [199] 0.378 0.380 0.380 0.380 0.380 0.380 0.380 0.380 0.380 0.380 0.380
## [210] 0.380 0.380 0.380 0.380 0.380 0.380 0.380 0.380 0.380 0.380 0.380
## [221] 0.380 0.380 0.380 0.380 0.382 0.382 0.382 0.382 0.382 0.382 0.382
## [232] 0.382 0.382 0.382 0.382 0.382 0.382 0.382 0.382 0.382 0.382 0.382
## [243] 0.382 0.382 0.382 0.382 0.382 0.382 0.382 0.382 0.382 0.382 0.384
## [254] 0.384 0.384 0.384 0.384 0.384 0.384 0.384 0.384 0.384 0.384 0.384
## [265] 0.384 0.384 0.384 0.384 0.384 0.384 0.384 0.384 0.384 0.384 0.386
## [276] 0.386 0.386 0.386 0.386 0.386 0.386 0.386 0.386 0.386 0.386 0.386
## [287] 0.386 0.386 0.386 0.386 0.386 0.386 0.386 0.386 0.386 0.386 0.386
## [298] 0.386 0.386 0.386 0.388 0.388 0.388 0.388 0.388 0.388 0.388 0.388
## [309] 0.388 0.388 0.388 0.388 0.388 0.388 0.388 0.388 0.388 0.388 0.388
## [320] 0.388 0.388 0.388 0.388 0.388 0.388 0.388 0.390 0.390 0.390 0.390
## [331] 0.390 0.390 0.390 0.390 0.390 0.390 0.390 0.390 0.390 0.390 0.390
## [342] 0.390 0.390 0.390 0.390 0.390 0.390 0.390 0.390 0.390 0.390 0.390
## [353] 0.390 0.390 0.390 0.392 0.392 0.392 0.392 0.392 0.392 0.392 0.392
## [364] 0.392 0.392 0.392 0.392 0.392 0.392 0.392 0.392 0.392 0.392 0.392
## [375] 0.392 0.392 0.392 0.392 0.392 0.392 0.392 0.392 0.394 0.394 0.394
## [386] 0.394 0.394 0.394 0.394 0.394 0.394 0.394 0.394 0.394 0.394 0.394
## [397] 0.394 0.394 0.394 0.394 0.394 0.394 0.394 0.394 0.394 0.394 0.394
## [408] 0.394 0.394 0.394 0.394 0.394 0.394 0.394 0.394 0.396 0.396 0.396
## [419] 0.396 0.396 0.396 0.396 0.396 0.396 0.396 0.396 0.396 0.396 0.396
## [430] 0.396 0.396 0.396 0.396 0.396 0.396 0.396 0.396 0.396 0.396 0.396
## [441] 0.396 0.396 0.396 0.396 0.396 0.396 0.396 0.396 0.398 0.398 0.398
## [452] 0.398 0.398 0.398 0.398 0.398 0.398 0.398 0.398 0.398 0.398 0.398
## [463] 0.398 0.398 0.398 0.398 0.398 0.398 0.398 0.398 0.398 0.398 0.398
## [474] 0.398 0.398 0.398 0.398 0.398 0.398 0.398 0.398 0.398 0.398 0.400
## [485] 0.400 0.400 0.400 0.400 0.400 0.400 0.400 0.400 0.400 0.400 0.400
## [496] 0.400 0.400 0.400 0.400 0.400 0.400 0.400 0.400 0.400 0.400 0.400
## [507] 0.400 0.400 0.400 0.400 0.400 0.400 0.400 0.400 0.400 0.400 0.400
## [518] 0.400 0.400 0.400 0.400 0.400 0.400 0.402 0.402 0.402 0.402 0.402
## [529] 0.402 0.402 0.402 0.402 0.402 0.402 0.402 0.402 0.402 0.402 0.402
## [540] 0.402 0.402 0.402 0.402 0.402 0.402 0.402 0.402 0.402 0.402 0.402
## [551] 0.402 0.402 0.402 0.402 0.402 0.402 0.402 0.402 0.402 0.402 0.402
## [562] 0.402 0.402 0.402 0.404 0.404 0.404 0.404 0.404 0.404 0.404 0.404
## [573] 0.404 0.404 0.404 0.404 0.404 0.404 0.404 0.404 0.404 0.404 0.404
## [584] 0.404 0.404 0.404 0.404 0.404 0.404 0.404 0.404 0.404 0.404 0.406
## [595] 0.406 0.406 0.406 0.406 0.406 0.406 0.406 0.406 0.406 0.406 0.406
## [606] 0.406 0.406 0.406 0.406 0.406 0.406 0.406 0.406 0.406 0.406 0.406
## [617] 0.406 0.408 0.408 0.408 0.408 0.408 0.408 0.408 0.408 0.408 0.408
## [628] 0.408 0.408 0.408 0.408 0.408 0.408 0.408 0.408 0.408 0.408 0.410
## [639] 0.410 0.410 0.410 0.410 0.410 0.410 0.410 0.410 0.410 0.410 0.410
## [650] 0.410 0.410 0.410 0.410 0.410 0.410 0.410 0.410 0.410 0.410 0.410
## [661] 0.410 0.410 0.410 0.410 0.410 0.410 0.410 0.412 0.412 0.412 0.412
## [672] 0.412 0.412 0.412 0.412 0.412 0.412 0.412 0.412 0.412 0.412 0.412
## [683] 0.412 0.412 0.412 0.412 0.412 0.412 0.412 0.412 0.412 0.412 0.412
## [694] 0.414 0.414 0.414 0.414 0.414 0.414 0.414 0.414 0.414 0.414 0.414
## [705] 0.416 0.416 0.416 0.416 0.416 0.416 0.416 0.416 0.416 0.416 0.416
## [716] 0.416 0.416 0.416 0.416 0.416 0.416 0.416 0.416 0.416 0.416 0.416
## [727] 0.416 0.416 0.416 0.416 0.416 0.416 0.416 0.418 0.418 0.418 0.418
## [738] 0.418 0.418 0.418 0.418 0.418 0.418 0.418 0.418 0.418 0.418 0.418
## [749] 0.418 0.418 0.418 0.418 0.418 0.418 0.418 0.418 0.420 0.420 0.420
## [760] 0.420 0.420 0.420 0.420 0.420 0.420 0.420 0.420 0.420 0.420 0.420
## [771] 0.420 0.420 0.420 0.420 0.420 0.422 0.422 0.422 0.422 0.422 0.422
## [782] 0.422 0.422 0.422 0.422 0.422 0.422 0.422 0.422 0.422 0.422 0.422
## [793] 0.422 0.422 0.424 0.424 0.424 0.424 0.424 0.424 0.424 0.424 0.424
## [804] 0.424 0.424 0.424 0.424 0.424 0.424 0.426 0.426 0.426 0.426 0.426
## [815] 0.426 0.426 0.426 0.426 0.426 0.426 0.426 0.426 0.426 0.426 0.426
## [826] 0.426 0.426 0.426 0.426 0.426 0.426 0.428 0.428 0.428 0.428 0.428
## [837] 0.428 0.428 0.428 0.428 0.428 0.428 0.428 0.428 0.428 0.428 0.428
## [848] 0.428 0.428 0.428 0.428 0.430 0.430 0.430 0.430 0.430 0.430 0.430
## [859] 0.430 0.430 0.430 0.430 0.430 0.430 0.430 0.430 0.430 0.430 0.430
## [870] 0.432 0.432 0.432 0.432 0.432 0.432 0.432 0.432 0.432 0.432 0.432
## [881] 0.432 0.432 0.432 0.432 0.434 0.434 0.434 0.434 0.434 0.434 0.434
## [892] 0.434 0.434 0.434 0.436 0.436 0.436 0.436 0.436 0.436 0.436 0.436
## [903] 0.436 0.436 0.436 0.436 0.438 0.438 0.438 0.438 0.438 0.438 0.438
## [914] 0.438 0.438 0.438 0.438 0.438 0.438 0.438 0.438 0.438 0.438 0.438
## [925] 0.438 0.438 0.438 0.440 0.440 0.440 0.440 0.440 0.440 0.440 0.440
## [936] 0.440 0.442 0.442 0.442 0.442 0.442 0.442 0.444 0.444 0.444 0.444
## [947] 0.444 0.444 0.444 0.444 0.444 0.444 0.444 0.444 0.444 0.446 0.446
## [958] 0.446 0.446 0.446 0.446 0.446 0.446 0.448 0.448 0.448 0.448 0.448
## [969] 0.448 0.448 0.448 0.448 0.450 0.450 0.450 0.450 0.452 0.452 0.452
## [980] 0.452 0.452 0.454 0.456 0.456 0.456 0.456 0.456 0.458 0.458 0.460
## [991] 0.460 0.460 0.464 0.464 0.466 0.466 0.468 0.468 0.472 0.474
# Average of the 1000 simulated p-values
P_hat_value <- mean(P_hat_value)
# Binomial standard error of a single estimate based on 1000 assignments
sd_p_hat <- sqrt(P_hat_value * (1 - P_hat_value) / 1000)
print(paste('P hat value :', P_hat_value))
## [1] "P hat value : 0.400934"
print(paste('standard error of p hat :', round(sd_p_hat, 5)))
## [1] "standard error of p hat : 0.0155"
# Histogram of the simulated differences of group sums
hist(s_d, breaks = 50, main = "Histogram of simulated differences",
     xlab = "difference of group sums")
We know that, under the normal one-way model,
\(Y_{ij}\sim N(\mu_i,\sigma^{2})\) , \(E(Y_{ij}^2)=\mu_i^2+\sigma^2\)
\(\bar{Y}_{i.}\sim N(\mu_i,\frac{\sigma^{2}}{n_i})\) , \(E(\bar{Y}_{i.}^2)=\mu_i^2+\frac{\sigma^2}{n_i}\)
One remaining parameter to estimate is
\(Var({\epsilon}_{ij})={\sigma}^2 , j=1,2,...,n_i , i=1,2,...,g\)
Consider
\(\hat{{\sigma}^2}=MS_E=\frac{SS_E}{N-g}=\frac{\sum_{i=1}^{g}\sum_{j=1}^{n_i}(Y_{ij}-\bar{Y}_{i.})^2}{N-g}\)
In both the separate-means and single-mean models, \(\hat{\sigma}^2\) is unbiased for \(\sigma^2\), i.e.
\(E(\hat{\sigma}^2)=E(\frac{\sum_{i=1}^{g}\sum_{j=1}^{n_i}(Y_{ij}-\bar{Y}_{i.})^2}{N-g})\)
\(=E(\frac{\sum_{i=1}^{g}\sum_{j=1}^{n_i}(Y_{ij}^2-2Y_{ij}\bar{Y}_{i.}+\bar{Y}_{i.}^2)}{N-g})\)
\(=E(\frac{\sum_{i=1}^{g}(\sum_{j=1}^{n_i}Y_{ij}^2-n_i\bar{Y}_{i.}^2)}{N-g})\)
\(=\frac{\sum_{i=1}^{g}n_i(\mu_i^2+\sigma^2)-\sum_{i=1}^{g}n_i(\mu_i^2+\frac{\sigma^2}{n_i})}{N-g}\)
\(=\frac{N\sigma^2-g\sigma^2}{N-g}\)
\(=\sigma^2\)
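To complement the algebra, here is a small simulation sketch (added for illustration; the group sizes, means, and \(\sigma\) below are arbitrary, not from the text) checking that \(MS_E\) is unbiased for \(\sigma^2\):
# Monte Carlo check that MS_E is unbiased for sigma^2 (illustrative values only)
set.seed(1)
g <- 3; n_i <- c(3, 5, 4); N <- sum(n_i)   # arbitrary group sizes
mu <- c(10, 20, 30); sigma <- 2            # arbitrary means and error sd
mse <- replicate(10000, {
  y <- rnorm(N, mean = rep(mu, n_i), sd = sigma)
  grp <- rep(seq_len(g), n_i)
  ybar <- tapply(y, grp, mean)             # group means
  sum((y - ybar[grp])^2) / (N - g)         # MS_E for this replicate
})
mean(mse)  # should come out close to sigma^2 = 4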