MEG analysis on Biowulf
!!Under Construction!!
Biowulf brief intro
Biowulf (biowulf.nih.gov) is the head node of the Biowulf cluster at NIH - see the user guide at https://hpc.nih.gov/docs/userguide.html
Helix is the storage server attached to the Biowulf cluster.
Data analysis should not be performed on the Biowulf head node; instead, run it in an sinteractive session or submit it as a swarm job.
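For example, a modest interactive session can be requested as shown below (the memory and CPU values are only placeholders - size them to your analysis):
sinteractive --mem=16g --cpus-per-task=4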
To start with, only a limited set of programs is available on the system. To make additional programs available, use module load; to search for available modules, use module spider.
e.g. module load afni
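To see which versions of a package are installed, give module spider the package name, e.g.:
module spider afni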
SAM MEG Data Analysis
module load afni
module load ctf
module load samsrcv3/20180713-c5e1042
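These modules can also be loaded automatically for a batch job; for example, swarm's --module option loads the listed modules for every command in the swarm file (the swarm file name below is just a placeholder):
swarm -f ./sam_analysis.swarm --module afni,ctf,samsrcv3/20180713-c5e1042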
MNE python data analysis
Making your own python module
Build the python conda environment
It is recommended to write the installation command into a script so that it can be submitted as a swarm (Slurm) job:
# Load conda - if set up according to the HPC page, this should work
source /data/${USER}/conda/etc/profile.d/conda.sh; conda activate base
# echo mamba create -p ${PATH_TO_OUTPUT} condaPackage1 condaPackage2 conda-forge::condaForgePackage1 -y > installFile.sh
# Make sure to include the -y or the job will hang waiting for user response
# Also make sure you have an active conda prompt when submitting the swarm, or else it will fail
echo mamba create -p /data/ML_MEG/python_modules/mne0.24.1 jupyter ipython conda-forge::mne -y > python_install.sh
swarm -f ./python_install.sh -g 4 -t 4
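Once the swarm job finishes, the new environment can be activated directly by its path; a minimal check, assuming the example path used above:
source /data/${USER}/conda/etc/profile.d/conda.sh
conda activate /data/ML_MEG/python_modules/mne0.24.1
python -c "import mne; print(mne.__version__)"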
Make a module file
To display most of the contents of a module file, run:
module display python #For the python module
Output:
----------------------------------------------------------------------------------
   /usr/local/lmod/modulefiles/python/3.8.lua:
----------------------------------------------------------------------------------
family("python")
prepend_path("PATH","/usr/local/Anaconda/envs/py3.8/bin")
pushenv("OMP_NUM_THREADS","1")
Copy the template to your module folder
# MyModule is the family name of the code; the module file itself is named ${Version}.lua
cp /usr/local/lmod/modulefiles/python/3.8.lua ${myModuleFilesDir}/${MyModule}/0.1.lua
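After copying, edit the .lua file so that prepend_path points to the bin directory of your new conda environment, then make your personal modulefiles directory visible to Lmod with module use. A minimal sketch, assuming the example environment and the ${myModuleFilesDir}/${MyModule} names used above:
cat > ${myModuleFilesDir}/${MyModule}/0.1.lua <<'EOF'
family("python")
prepend_path("PATH","/data/ML_MEG/python_modules/mne0.24.1/bin")
pushenv("OMP_NUM_THREADS","1")
EOF
# Register your personal modulefiles directory and load the new module
module use ${myModuleFilesDir}
module load ${MyModule}/0.1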