Brian B. Avants et al.
2017-07-30
This is a compilable document with source code located here:
https://github.com/stnava/ANTsTutorial
To get this source, do:
git clone http://github.com/stnava/ANTsTutorial.git
It is expected that you will compile and run this from within the cloned ANTsTutorial directory. The document needs the ANTs tutorial data discussed below and depends primarily on R, rmarkdown and ANTsR.
The Pediatric Template of Brain Perfusion (PTBP) is available at figshare.
git clone http://github.com/jeffduda/NeuroBattery.git
This will give you both raw and processed output for a single multiple modality subject.
We test (occasionally) against this reference output to monitor stability of our processing.
Install ANTsR dependencies
pkgmin=c("magrittr","Rcpp") # core dependencies
pkgxtra=c("igraph","randomForest","misc3d","rgl",
"mFilter","fastICA","fpc","knitr","rmarkdown",
"pixmap","png","signal","visreg")
install.packages( c( pkgmin, pkgxtra ) )
Obtain ANTsR as a filename.tar.gz package. Install via command line (or RStudio tools \(\rightarrow\) install):
R CMD INSTALL filename.tar.gz
Test by calling (in R):
library(ANTsR)
?antsRegistration
Done!
library(ANTsR)
## Loading required package: ANTsRCore
##
## Attaching package: 'ANTsRCore'
## The following object is masked from 'package:stats':
##
## var
## The following objects are masked from 'package:base':
##
## all, any, apply, max, min, prod, range, sum
csvlink="https://ndownloader.figshare.com/files/1699436"
tfn=tempfile(fileext='.csv')
if ( ! exists( "demog" ) )
{
  # download.file returns 0 on success; only then read the csv into demog
  dlstatus = download.file( csvlink, tfn )
  if ( dlstatus == 0 ) demog = read.csv( tfn )
}
str(demog[,1:10])
## 'data.frame': 183 obs. of 10 variables:
## $ SubID : Factor w/ 120 levels "PEDS007","PEDS008",..: 1 1 1 1 2 2 3 4 4 5 ...
## $ ScanDate : int 20100709 20100801 20110903 20120908 20100711 20101120 20100713 20100722 20101008 20100812 ...
## $ AgeAtScan : num 9.25 9.25 10.5 11.5 10 ...
## $ Sex : Factor w/ 2 levels "F","M": 1 1 1 1 1 1 1 1 1 2 ...
## $ Handedness : Factor w/ 3 levels "Left","N/A","Right": 3 3 3 3 3 3 3 3 3 3 ...
## $ FullScaleIQ : int 99 99 100 101 119 NA 90 111 NA 101 ...
## $ Verbal.IQ : int 96 96 106 111 111 NA 95 112 NA 104 ...
## $ Performance.IQ : int 100 100 95 90 124 NA 89 106 NA 98 ...
## $ Teen.Ladder.SES.score : int 5 5 3 2 6 NA 5 4 NA 8 ...
## $ Teen.Ladder.Community.Score: int 1 1 2 2 1 NA 4 2 NA 2 ...
str(demog[,11:20])
## 'data.frame': 183 obs. of 10 variables:
## $ Income : int 210 210 210 210 NA NA 70 NA NA 90 ...
## $ Father.Education : int 20 20 20 20 16 NA 16 20 NA 18 ...
## $ BV : num 2632317 2614991 2620778 2644788 2148076 ...
## $ CSF : num 402388 387787 395315 412062 333891 ...
## $ Cortex : num 1115803 1109720 1106519 1100008 921221 ...
## $ WM : num 725281 732096 732143 734156 563098 ...
## $ DGM : num 79423 80384 80793 86355 67999 ...
## $ BStem : num 33413 32478 33758 35530 28868 ...
## $ Cerebellum : num 276009 272526 272251 276678 232999 ...
## $ ThickMeanPrecentral_L: num 2.49 2.5 2.57 2.58 2.85 ...
if ( ! exists( "ch2" ) )
  ch2 = antsImageRead( getANTsRData( "ch2" ) )
# bd is the base directory for the tutorial data (defined elsewhere in the source document)
t1fn = paste( bd,
  "data/antsExamples/ANTsTutorial/data/template/PTBP_T1_BrainCerebellum.nii.gz",
  sep='' )
t1tem = antsImageRead( t1fn )
# choose one example subject and locate its images
subnum=11
id=demog$SubID[subnum]
dt=demog$ScanDate[subnum]
exts=c( "mprage_t1.nii.gz" ,
"fa_anatomical.nii.gz",
"MeanCBFWarpedToT1.nii.gz",
"CorticalThickness.nii.gz",
"BrainSegmentation.nii.gz",
"AAL.nii.gz" )
pre=paste( bd,"data/antsExamples/ANTsTutorial/data/Subjects/",id,"/",dt,"/*",sep='')
fns=Sys.glob( paste( pre, exts , sep='') )
# check which subjects have all six images on disk
havecompletesubject=rep(FALSE,nrow(demog))
for ( x in 1:nrow(demog) )
{
id = demog$SubID[x]
dt = demog$ScanDate[x]
pre = paste( bd,"data/antsExamples/ANTsTutorial/data/Subjects/",
id,"/",dt,"/*",sep='')
fns = Sys.glob( paste( pre, exts , sep='') )
if ( length( fns ) == 6 ) havecompletesubject[x]=TRUE
}
nsub=sum( havecompletesubject )
We have 1 multiple modality subject to process and to analyze.
We will use them to:
Build a template with priors via joint label fusion
Normalize, segment, compute thickness
Map other modalities to structural reference
Investigate Eigenanatomy and SCCAN for relating demographic and psychometric measurements to the imaging, and for relating imaging modalities to each other.
Finally, put all this together to create a reproducible analysis document for the PTBP.
Suppose we have a dataset that already contains segmentations.
Transform these segmentations to the template.
Perform a weighted voting to customize the segmentations for the template.
# t1tem read before
ilist = list()    # subject T1s warped to the template
seglist = list()  # corresponding warped segmentations
for ( x in which( havecompletesubject ) )
{
  id = demog$SubID[x]
  dt = demog$ScanDate[x]
  pre = paste( bd, "data/antsExamples/ANTsTutorial/data/Subjects/",
    id, "/", dt, "/*", sep='' )
  fns = Sys.glob( paste( pre, exts, sep='' ) )
  t1 = antsImageRead( fns[1] )
  # deformable registration of the subject T1 to the template
  mytx = antsRegistration( t1tem, t1, typeofTransform = "SyN" )
  seg = antsImageRead( fns[5] )
  # map the subject segmentation to template space, preserving labels
  segw = antsApplyTransforms( t1tem, seg, mytx$fwdtransforms,
    interpolator = "nearestNeighbor" )
  ilist[[ length( ilist ) + 1 ]] = mytx$warpedmovout
  seglist[[ length( seglist ) + 1 ]] = segw
}
Joint label fusion is a multi-atlas segmentation method.
It performed well in several recent competitions (SATA 2012, SATA 2013).
We use it regularly in our studies to build template priors and to label cortical or deep structures in the brain.
At each voxel \(x\), a matrix \(M_x\) is defined whose dimension equals the number of atlas segmentations one has.
\(M_x(i,j)\) measures the joint error of atlases \(i\) and \(j\) with respect to a target segmentation at that voxel.
Entries in \(M_x\) thus relate to the likelihood that two atlases make the same error.
The key difference between joint label fusion and other label fusion methods is that it explicitly incorporates correlations among atlases, i.e., the dependence matrix \(M_x\), into the voting weight assignment to reduce bias in the atlas set.
The expected label difference between the consensus solution obtained from weighted voting and the target segmentation is \(w_x^T M_x w_x\).
Find atlas weights, \(w_x\), for each of the \(n\) atlases \(A^i\), such that \[ w_x^T ( M_x + \alpha \text{Id} ) w_x \] is minimized subject to \(\sum_{i=1}^n w_x(i)=1\).
Define \[ K_m = \langle |~A^{i,m}_N - T^{m}_N~|, |~A^{j,m}_N - T^{m}_N~| \rangle \] then \[ M_x(i,j) = ( \sum_m K_m )^\beta \] with \(N\) representing a neighborhood calculation, \(A^{i,m}\) representing the \(i^\text{th}\) atlas and the \(m^\text{th}\) modality. Lagrange multipliers yield: \[w_x=\frac{M_x^{-1} 1_n}{1_n^t M_x^{-1} 1_n}\]
Finally, a local patch search is used to refine the neighborhood correspondences.
Suppose that a pair of atlases \(A_1\) and \(A_2\) produce statistically independent label errors for a given target image. If \(A_1\) produces a wrong label 50% of the time and \(A_2\) produces a wrong label 20% of the time, we have \[ M_x = \begin{bmatrix} 0.5 & 0.1 \\ 0.1 & 0.2 \end{bmatrix} \] The optimal voting weights are then \(w_x = [0.2, 0.8]^t.\)
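To make the closed-form weights concrete, here is a minimal R sketch (not part of the original tutorial code) that evaluates \(w_x=\frac{M_x^{-1} 1_n}{1_n^t M_x^{-1} 1_n}\) for the two-atlas example above:
# dependence matrix for the two-atlas example above
Mx = matrix( c( 0.5, 0.1,
                0.1, 0.2 ), nrow = 2, byrow = TRUE )
onesn = rep( 1, nrow( Mx ) )
# closed-form weights from the Lagrange multiplier solution
Minv1 = solve( Mx, onesn )
wx = Minv1 / sum( Minv1 )
wx # 0.2 0.8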
# fall back to precomputed warped images and segmentations if the loop above was not run
if ( !exists( "ilist" ) )
  ilist = imageFileNames2ImageList(
    Sys.glob( paste( bd, "data/antsExamples/ANTsTutorial/data/JLF/ilist*.nii.gz", sep='' ) ) )
if ( !exists( "seglist" ) )
  seglist = imageFileNames2ImageList(
    Sys.glob( paste( bd, "data/antsExamples/ANTsTutorial/data/JLF/slist*.nii.gz", sep='' ) ) )
mk = getMask( t1tem )
# joint label fusion in the template space
jlf = jointIntensityFusion( t1tem, mk, ilist,
  labelList = seglist, rad = rep( 3, 3 ), rSearch = 2,
  computeProbs = TRUE, nonnegative = TRUE )
# write out results
if ( ! exists( "tf" ) ) tf = tempfile()
antsImageWrite( jlf$segimg, paste( tf, 'seg.nii.gz', sep='' ) )
antsImageWrite( jlf$predimg, paste( tf, 'pred.nii.gz', sep='' ) )
for ( i in 1:length( jlf$probimgs ) )
  antsImageWrite( jlf$probimgs[[i]],
    paste( tf, 'prob', i, '.nii.gz', sep='' ) )
Standard JLF doesn't capture CSF very well.
Let’s fix that.
if ( !exists( "mk" ) ) mk = getMask( t1tem )
# k-means segmentation of the template gives an independent CSF probability
ktem = kmeansSegmentation( t1tem, 3, mk )
jlcsf = jlf$probimgs[[1]][ mk == 1 ]            # JLF CSF probability within the mask
kmcsf = ktem$probabilityimages[[1]][ mk == 1 ]  # k-means CSF probability within the mask
csfmat = rbind( jlcsf, kmcsf )
# take the voxelwise maximum of the two CSF estimates
newcsfvec = apply( csfmat, MARGIN = 2, FUN = max )
newcsf = makeImage( mk, newcsfvec )
plot( jlf$probimgs[[1]],dorot=1,window.img=c(0,4), slices=10 )
plot( newcsf,dorot=1,window.img=c(0,4), slices=10 )
csfind = 1                        # CSF is the first probability image
ncsf = 2:length( jlf$probimgs )   # indices of the non-CSF classes
jlfmat = imageListToMatrix( jlf$probimgs, mk )
prsums = colSums( jlfmat ) # i=26 for testing
for ( i in 1:ncol( jlfmat ) ) {
  colvec = jlfmat[ , i ]          # class probabilities at voxel i
  ncsfsum = sum( colvec[ ncsf ] )
  csfval = newcsfvec[ i ]
  colvec[ csfind ] = csfval       # the fix is in!
  if ( ncsfsum > 0 )              # rescale non-CSF classes to sum to 1 - csfval
    colvec[ ncsf ] = colvec[ ncsf ] / ncsfsum * ( 1.0 - csfval )
  else colvec[ csfind ] = 1
  jlfmat[ , i ] = colvec
}
newprobimgs = matrixToImages( jlfmat, mk )
jlf$probimgs[[ 1 ]] = newcsf
newprobimgs2 = renormalizeProbabilityImages(
  jlf$probimgs, mk, 1 )
We can use the template itself as an example.
Obviously, we skip the registration step.
# prior-based Atropos segmentation of the template, using the corrected JLF probabilities as priors
segnew <- atropos( d = 3, a = t1tem, m = '[0.05,1x1x1]',
  c = '[2,0]', i = newprobimgs2, x = mk )
A more complete process is available in antsAtroposN4.sh, which is what we use for production.
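As a rough illustration only (the file names, class count, and output prefix below are assumptions rather than tutorial outputs; see antsAtroposN4.sh -h for the authoritative flags), such a call might be issued from R along these lines:
# illustrative sketch only -- inputs and flag values are assumed, not produced by this tutorial
system( paste(
  "antsAtroposN4.sh -d 3",     # image dimensionality
  "-a template_t1.nii.gz",     # anatomical image (hypothetical file name)
  "-x template_mask.nii.gz",   # brain mask (hypothetical file name)
  "-p prior%d.nii.gz",         # prior probability images as a C-style format string
  "-c 6",                      # number of tissue classes (assumed)
  "-o atroposN4_" ) )          # output prefix (assumed)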
We might go further and force the posteriors to be zero where the priors are zero …
Let’s do this as an example later …
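Meanwhile, a minimal sketch of that idea, assuming the Atropos posteriors are available as segnew$probabilityimages and using the renormalized priors in newprobimgs2 (this block is an added illustration, not tutorial output):
# zero the posteriors wherever the corresponding prior has no support, then renormalize
postmat  = imageListToMatrix( segnew$probabilityimages, mk )
priormat = imageListToMatrix( newprobimgs2, mk )
postmat[ priormat == 0 ] = 0
postimgs = matrixToImages( postmat, mk )
postimgs = renormalizeProbabilityImages( postimgs, mk, 1 )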
We briefly reviewed PTBP and the modalities therein.
We showed a quick registration option.
We showed how to get the template “into shape” such that it can be used for processing new data.
JLF is key to this process.
We will discuss modality-specific processing and statistical testing procedures elsewhere.