Commit

first commit
jdkent committed Apr 26, 2019
0 parents commit 33791e0
Showing 4 changed files with 279 additions and 0 deletions.
102 changes: 102 additions & 0 deletions README.md
@@ -0,0 +1,102 @@
# xcpEngine on Argon

This guide is based on [xcpEngine's own tutorial](https://xcpengine.readthedocs.io/config/tutorial.html).

This is a guide for running [xcpEngine](https://xcpengine.readthedocs.io/index.html) on [Argon](https://wiki.uiowa.edu/display/hpcdocs/Argon+Cluster).
xcpEngine is a tool that takes the output from [fmriprep](https://fmriprep.readthedocs.io/en/stable/)
and produces a variety of analytical outputs (e.g. ReHo, ALFF) as both NIfTI images and tabular data (i.e. CSVs).
The tabular data summarize each measure on a per-region/parcel basis according to an atlas.
xcpEngine supports several atlases, so you can get output for several atlases in one run.

The first step is to log into Argon through a terminal:
```
# example ssh [email protected]
ssh <hawkid>@argon.hpc.uiowa.edu
```
If you're off campus, connect on port 40 instead:
```
ssh <hawkid>@argon.hpc.uiowa.edu -p 40
```

On Argon we will use a [singularity](https://www.sylabs.io/guides/2.6/user-guide/) image of xcpEngine,
so we can run it without having to install all the necessary software ourselves.
Although xcpEngine does not have an image on [singularityhub](https://singularity-hub.org/),
we can build a singularity image from xcpEngine's docker image on [dockerhub](https://hub.docker.com/r/pennbbl/xcpengine).

It's better to build from a tagged version of an image (e.g. `1.0`) instead of `latest`, because `latest` always points
to the most current version of the image: every time the developers update the image, `latest` points to something different.
If we want to be reproducible (it's all the rage these days), the tag `1.0` should always point to the same
image and should therefore always give us the same result.

```
# make a place to keep our singularity images
mkdir -p ${HOME}/simgs
# make our singularity image
singularity build ~/simgs/xcpEngine_v1.0.simg docker://pennbbl/xcpengine:1.0
```
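
If you want to double-check the build, you can confirm the image file exists and ask Singularity about it. This step is optional, and the metadata printed by `singularity inspect` depends on how the upstream docker image was labeled:
```
# confirm the image file was created
ls -lh ${HOME}/simgs/xcpEngine_v1.0.simg
# print whatever metadata labels the image carries (output varies by image)
singularity inspect ${HOME}/simgs/xcpEngine_v1.0.simg
```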

To run xcpEngine, we need data, specifically data that has already been processed by fmriprep.
Thankfully, the developers of xcpEngine have provided an example dataset:
```
# download the example fmriprep output
curl -o fmriprep.tar.gz -SL https://ndownloader.figshare.com/files/13598186
# untar (extract) the data
tar xvf fmriprep.tar.gz
```
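
Before moving on, it's worth checking that the preprocessed BOLD image the cohort file (shown below) points to actually exists in the extracted data:
```
ls fmriprep/sub-1/func/sub-1_task-rest_space-T1w_desc-preproc_bold.nii.gz
```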

Next, we need two files: 1) a design file that defines what processing steps to run on our data,
and 2) a cohort file that specifies which participants (and which images) to run them on.

```
# download the design file (if necessary)
curl -O https://raw.githubusercontent.com/PennBBL/xcpEngine/master/designs/fc-36p.dsn
```

Here are the contents of the cohort file (`func_cohort.csv`, included in this repository):
```
id0,img
sub-1,fmriprep/sub-1/func/sub-1_task-rest_space-T1w_desc-preproc_bold.nii.gz
```
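
This example cohort has a single subject, but a cohort file can list as many rows as you like. If you later want to build one from a full fmriprep output directory, a small bash loop along these lines works; this is only a sketch, and it assumes every subject has a preprocessed resting-state run in T1w space named with the same pattern as above:
```
# write the header, then one row per subject directory
echo "id0,img" > func_cohort.csv
for sub_dir in fmriprep/sub-*/; do
    sub=$(basename ${sub_dir})
    # adjust the file name pattern to match your own fmriprep outputs
    echo "${sub},fmriprep/${sub}/func/${sub}_task-rest_space-T1w_desc-preproc_bold.nii.gz" >> func_cohort.csv
done
```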

All we need now is a directory to place the outputs in, and then we can run xcpEngine.
```
mkdir -p ./xcp_output
```

Now we are ready to run xcpEngine!
I made a little script, `create_job.sh` (included in this repository), that writes a "job" file we can submit to the cluster.
It takes our email address as input:
```
./create_job.sh [email protected]
```

`create_job.sh` should create a "job" file named `sample_xcpengine.job` that looks like this (the paths come from your home and current working directories, so they will differ):
```
#!/bin/bash
#$ -pe smp 6
#$ -q UI
#$ -m bea
#$ -M [email protected]
#$ -e /Users/jdkent/xcpEngine/fc.err
#$ -o /Users/jdkent/xcpEngine/fc.out
singularity run -H /Users/jdkent/singularity_home \
/Users/jdkent/simgs/xcpEngine_v1.0.simg \
-d /Users/jdkent/xcpEngine/fc-36p.dsn \
-c /Users/jdkent/xcpEngine/func_cohort.csv \
-o /Users/jdkent/xcpEngine/xcp_output \
-t 1 -r /Users/jdkent/xcpEngine
```
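
One thing the job relies on that we have not created yet is the directory passed to `-H`, which Singularity uses as the container's home directory. If it does not already exist, it is safest to create it (along with the output directory, if you skipped that step) before submitting:
```
mkdir -p ${HOME}/singularity_home
```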

We can then submit `sample_xcpengine.job` to the cluster using:
```
qsub sample_xcpengine.job
```

We will get an email when the job starts and finishes.
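
We can also keep an eye on the job from the command line: `qstat` lists our queued and running jobs, and the `.out`/`.err` files named in the job script collect xcpEngine's logs while it runs (the exact log contents depend on the modules in the design file):
```
# list your queued/running jobs (SGE)
qstat -u "$USER"
# follow the logs; these files appear once the job starts running
tail -F fc.out fc.err
```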

And you're done running it!

TODO: analyze the outputs
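
Until that section is written, a quick way to see what xcpEngine produced is to list the tabular outputs under the output directory; the exact file names and layout depend on which modules and atlases ran, so treat this as a rough starting point:
```
# list all tabular outputs xcpEngine wrote
find ./xcp_output \( -name "*.csv" -o -name "*.tsv" \)
```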

21 changes: 21 additions & 0 deletions create_job.sh
@@ -0,0 +1,21 @@
#!/bin/bash
# Usage: ./create_job.sh <email address>
# Writes an SGE job file (sample_xcpengine.job) that runs the xcpEngine
# singularity image on the design, cohort, and output paths in the
# current working directory.

email=$1

if [ -z "${email}" ]; then
    echo "Usage: $0 <email address>" >&2
    exit 1
fi

# Note: the doubled backslashes write literal line continuations into the
# job file; a single backslash would be consumed by the double quotes.
echo "\
#!/bin/bash
#$ -pe smp 6
#$ -q UI
#$ -m bea
#$ -M ${email}
#$ -e ${PWD}/fc.err
#$ -o ${PWD}/fc.out
singularity run -H ${HOME}/singularity_home \\
${HOME}/simgs/xcpEngine_v1.0.simg \\
-d ${PWD}/fc-36p.dsn \\
-c ${PWD}/func_cohort.csv \\
-o ${PWD}/xcp_output \\
-t 1 -r ${PWD}
" > sample_xcpengine.job
154 changes: 154 additions & 0 deletions fc-36p.dsn
@@ -0,0 +1,154 @@
#!/usr/bin/env bash

###################################################################
# ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ #
###################################################################


###################################################################
# This design file stores the values of all variables required to
# execute a complete neuroimage processing pipeline. You may
# execute the analysis specified in this design file by calling
# (in any v4 or higher bash terminal):
#
# xcpEngine -d <design> -c <cohort> -o <output> <options>
#
# Variables fall into four general categories:
# * ANALYSIS VARIABLES are used at all stages of this analysis.
# * PIPELINE specifies the modules that comprise the analysis.
# * MODULE VARIABLES are used during one stage of the analysis.
# These are typically array variables with array
# indices equal to the index of the module that
# calls them.
# * OUTPUT VARIABLES may be used at all stages of the analysis.
# These are typically array variables with array
# indices equal to the value of the primary
# subject identifier. They will appear only in
# localised design files.
###################################################################


###################################################################
# ANALYSIS VARIABLES
###################################################################

analysis=fc_$(whoami)
design=/Users/jdkent/xcpEngine/fc-36p.dsn
sequence=anatomical
standard=MNI%2x2x2

###################################################################
# PIPELINE
###################################################################

pipeline=prestats,confound2,regress,fcon,reho,alff,roiquant,norm,qcfc

###################################################################
# 1 PRESTATS
###################################################################

prestats_rerun[1]=1
prestats_cleanup[1]=1
prestats_process[1]=FMP


###################################################################
# 2 CONFOUND2
###################################################################

confound2_rps[2]=1
confound2_rms[2]=0
confound2_wm[2]=1
confound2_csf[2]=1
confound2_gsr[2]=1
confound2_acompcor[2]=0
confound2_tcompcor[2]=0
confound2_aroma[2]=0
confound2_past[2]=0
confound2_dx[2]=1
confound2_sq[2]=2
confound2_custom[2]=
confound2_censor[2]=0
confound2_censor_contig[2]=0
confound2_framewise[2]=rmss:0.083,fds:0.167,dv:2
confound2_rerun[2]=0
confound2_cleanup[2]=1

###################################################################
# 3 REGRESS
###################################################################

regress_tmpf[3]=butterworth
regress_hipass[3]=0.01
regress_lopass[3]=0.08
regress_tmpf_order[3]=1
regress_tmpf_pass[3]=2
regress_tmpf_ripple[3]=0.5
regress_tmpf_ripple2[3]=20
regress_dmdt[3]=2
regress_1ddt[3]=1
regress_smo[3]=6
regress_sptf[3]=susan
regress_usan[3]=default
regress_usan_space[3]=
regress_rerun[3]=0
regress_cleanup[3]=1
regress_process[3]=DMT-TMP-REG

###################################################################
# 4 FCON
###################################################################

fcon_atlas[4]=all
fcon_metric[4]=corrcoef
fcon_thr[4]=N
fcon_pad[4]=FALSE
fcon_rerun[4]=0
fcon_cleanup[4]=1

###################################################################
# 5 REHO
###################################################################

reho_nhood[5]=vertices
reho_roikw[5]=0 # does nothing at the moment
reho_sptf[5]=susan
reho_smo[5]=6
reho_rerun[5]=0
reho_cleanup[5]=1

###################################################################
# 6 ALFF
###################################################################

alff_hipass[6]=0.01
alff_lopass[6]=0.08
alff_sptf[6]=susan
alff_smo[6]=6
alff_rerun[6]=0
alff_cleanup[6]=1

###################################################################
# 7 ROIQUANT
###################################################################

roiquant_atlas[7]=all
roiquant_globals[7]=1
roiquant_vol[7]=0
roiquant_rerun[7]=0
roiquant_cleanup[7]=1

###################################################################
# 8 NORM
###################################################################
norm_primary[8]=1
norm_rerun[8]=0
norm_cleanup[8]=1

###################################################################
# 9 QCFC
###################################################################
qcfc_atlas[9]=power264
qcfc_sig[9]=fdr
qcfc_rerun[9]=0
qcfc_cleanup[9]=1
2 changes: 2 additions & 0 deletions func_cohort.csv
@@ -0,0 +1,2 @@
id0,img
sub-1,fmriprep/sub-1/func/sub-1_task-rest_space-T1w_desc-preproc_bold.nii.gz
