MEG analysis on Biowulf
!!Under Construction!!
Biowulf brief intro
Biowulf (biowulf.nih.gov) is the head node of the Biowulf cluster at NIH; see the user guide at https://hpc.nih.gov/docs/userguide.html
Helix is the storage server attached to the Biowulf cluster.
Data analysis should not be performed on the Biowulf head node; instead, run it in an sinteractive session or as a swarm job.
Only a limited number of commands are available by default. To access additional programs, use module load; to search for available modules, use module spider.
e.g. module load afni
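A typical interactive workflow might look like the sketch below; the memory and CPU values are placeholders, not required settings:

sinteractive --mem=8g --cpus-per-task=4   # request an interactive compute node
module spider afni                        # search for available afni modules/versions
module load afni                          # load afni into the session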
SAM MEG Data Analysis
module load afni
module load ctf
module load samsrcv3/20180713-c5e1042
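These modules are normally loaded inside a batch or swarm job rather than on the head node. A minimal sketch of a swarm file is shown below; run_sam_subject.sh and the subject directories are hypothetical placeholders for your own SAM processing commands:

# sam_jobs.swarm - one line per subject/dataset
module load afni ctf samsrcv3/20180713-c5e1042 && ./run_sam_subject.sh /data/ML_MEG/subj01_ds
module load afni ctf samsrcv3/20180713-c5e1042 && ./run_sam_subject.sh /data/ML_MEG/subj02_ds

Submit it with, for example:

swarm -f ./sam_jobs.swarm -g 8 -t 4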
MNE python data analysis
Making your own python module
It is recommended to create an install script so that the installation can be submitted as a Slurm job (e.g. via swarm).
# echo mamba create -p ${PATH_TO_OUTPUT} condaPackage1 condaPackage2 conda-forge::condaForgePackage1 -y > installFile.sh
echo mamba create -p /data/ML_MEG/python_modules/mne0.24.1 jupyter ipython conda-forge::mne -y > python_install.sh
swarm -f ./python_install.sh -g 4 -t 4
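Once the swarm job has finished, the new environment can be used by activating it by its path. This is a sketch and assumes conda/mamba is already available in your shell (e.g. via the relevant module):

conda activate /data/ML_MEG/python_modules/mne0.24.1       # activate the prefix environment
# or, equivalently, put its executables on PATH directly:
export PATH=/data/ML_MEG/python_modules/mne0.24.1/bin:$PATH
ipython                                                     # mne, jupyter, etc. are now available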