MNE Python Tutorial 2021

From MEG Core

Latest revision as of 10:28, 11 January 2022

=== Recording now available for download! ===

[https://megcore.nih.gov/MEG/NIMH-MEG-Core_MNE-Tutorial_10-01-21.mp4 MNE Python Tutorial Download]

=Steps to prepare prior to tutorial=

Prepare the Python environment. For 3D rendering and interactive plots, this should be installed on your local computer.

 !!IF YOU DO NOT HAVE MINICONDA/ANACONDA INSTALLED -  have IT install miniconda under your user account!!
 !!If you already have an mne environment, you can use another name for the environment and adjust accordingly!!
 
 conda activate base 
 conda install -n base mamba -c conda-forge -y  
 mamba create -n mne conda-forge::mne main::pip main::jupyter -y
 conda activate mne
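
To confirm the new environment works before the tutorial, a quick check can be run from the terminal (a minimal sketch, assuming the mne environment created above is active):
 # Optional sanity check - prints the installed MNE version and those of its dependencies
 python -c "import mne; mne.sys_info()"
 jupyter --version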

=Download the sample data and scripts.=

The data is available from the resources below (approximately 1 GB compressed, 1.8 GB uncompressed). A final Jupyter notebook may be sent by email prior to the tutorial; it will replace the V1 version.

=== NIMH Users: Download here ===

 scp <USERNAME>@helix.nih.gov:/data/NIMH_scratch/mne_tutorial/mne_tutorial.tar.gz ./
 tar -xvf mne_tutorial.tar.gz

=== Non-NIMH MEG Users: Download from MEG Data server ===

 The data is located on the MEG data server and is accessible by all users with accounts
 An email will be sent with the path to the data

=Running tutorial script - starting Jupyter Notebook=

 cd mne_tutorial
 conda activate mne
 jupyter notebook "MNE Python Tutorial-v1.ipynb"

=Prep Non-tutorial Data=

==MRI Prep==

===Freesurfer reconstruction===

 # Run Freesurfer on Dataset (takes 8+ hrs): 
 # Does not run on BRIK/HEAD - convert to .nii first, e.g. (module load afni; 3dAFNItoNIFTI anat+orig.)
 
 # (OPTION 1) - Install and run Freesurfer locally on your computer: https://surfer.nmr.mgh.harvard.edu/fswiki/rel7downloads
 recon-all -all -i <MRI Used w/ MEG> -s <SUBJECT>
 
 # (OPTION2) - INSTRUCTIONS FOR BIOWULF - Run line by line on the bash terminal
 subjid=       #Set Freesurfer ID
 mri=          #Set MRI Name - must be a nifti file NOT BRIK/HEAD
 export SUBJECTS_DIR=      #Set output folder; make sure this directory exists
 module load freesurfer
 echo -e '#!/bin/bash\nrecon-all -all  -i ' ${mri} -s ${subjid}    #VERIFY THIS LOOKS CORRECT
 
 echo -e '#!/bin/bash\nrecon-all -all  -i ' ${mri} -s ${subjid} | sbatch  --mem=3g --time=24:00:00     #SUBMITS JOB TO SBATCH

===Watershed Processing for Boundary Element Model===

 #Run in a bash terminal (takes 5-10 minutes)
 #Requires freesurfer to be installed
 subjid=<SUBJECT>
 subjects_dir=<SUBJECTS_DIR>
 conda activate mne
 python -c "import mne; mne.bem.make_watershed_bem(subject='${subjid}', subjects_dir='${subjects_dir}')"

==MEG Prep==

 # The events of interest must be denoted in the MarkerFile.mrk
 https://megcore.nih.gov/index.php?title=Pyctf
 # The marker file can also be processed with the hv_proc code that is distributed with the data
 # A brief demonstration of processing the data triggers into the MarkerFile will be given on the day of the tutorial.
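
Once the events are written to the MarkerFile.mrk, they load into MNE as annotations on the raw object and can be converted to an events array (see the hv_proc notes below). A minimal sketch, with the dataset name as a placeholder:
 import mne
 # Marks in MarkerFile.mrk become raw.annotations when the CTF dataset is read
 raw = mne.io.read_raw_ctf('<MEGDATASET.ds>')
 print(raw.annotations)
 # Convert the annotations to an MNE events array plus an event_id mapping
 events, event_ids = mne.events_from_annotations(raw)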

==Copy Data To Tutorial Folder==

 #Copy the subject's freesurfer reconstruction into the mne_tutorial folder. If needed, rename the freesurfer subject folder to the 8-digit MEG ID found in the MEG data
 cp -R ${SUBJECTS_DIR}/${subjid} mne_tutorial/SUBJECTS_DIR/    
 
 #Copy meg data to top level of mne_tutorial folder
 cp -R <MEGDATASET.ds> mne_tutorial/
 
 #Copy dataset used for the coregistration of the MRI
 cp {BRIK/HEAD w/ fiducials marked OR Brainsight .txt file } mne_tutorial/

==General Guidelines for processing==

 https://mne.tools/stable/overview/cookbook.html
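
The cookbook covers the standard MNE workflow: filtering, epoching, averaging, then forward and inverse modelling. Very roughly, and with placeholder filenames, filter band, epoch window, and source spacing chosen only for illustration, that flow looks like this sketch:
 import mne
 # Sensor-level processing
 raw = mne.io.read_raw_ctf('<MEGDATASET.ds>', preload=True)
 raw.filter(1., 40.)                                    # band-pass filter
 events, event_ids = mne.events_from_annotations(raw)   # events from MarkerFile.mrk
 epochs = mne.Epochs(raw, events, event_id=event_ids, tmin=-0.2, tmax=0.5)
 evoked = epochs.average()
 # Source-level processing: source space, BEM, and coregistration transform
 src = mne.setup_source_space('<SUBJECT>', spacing='oct6', subjects_dir='<SUBJECTS_DIR>')
 bem = '<SUBJECT>-bem-sol.fif'                          # BEM solution from the MRI prep step
 fwd = mne.make_forward_solution(evoked.info, trans='<SUBJECT>-trans.fif', src=src, bem=bem)
 cov = mne.compute_covariance(epochs, tmax=0.)
 inv = mne.minimum_norm.make_inverse_operator(evoked.info, fwd, cov)
 stc = mne.minimum_norm.apply_inverse(evoked, inv, lambda2=1. / 9.)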

==Additional Code (included w/tutorial data - automatically installed during jupyter notebook processing)==

 #Manual installation
 #conda activate ${CONDA ENV Name}
 pip install git+https://github.com/nih-megcore/pyctf-lite
 pip install git+https://github.com/nih-megcore/hv_proc 
 pip install git+https://github.com/nih-megcore/nih_to_mne

===pyctf-lite===

 More info: https://github.com/nih-megcore/pyctf-lite
 Utility functions for reading in CTF datasets

===hv_proc===

 More info:  https://github.com/nih-megcore/hv_proc
 hv_proc: contains functions to evaluate triggers, generate events, write them to a dataframe, and write them to the CTF MarkerFile.mrk.
 MarkerFile.mrk is read into MNE as raw.annotations and can be converted to events with: events, event_ids = mne.events_from_annotations(raw)

===nih_to_mne===

 More info:  https://github.com/nih-megcore/nih_to_mne
 nih_to_mne:
   calc_mnetrans.py -h  -- creates the MNE transformation (trans) matrix
   bstags.py is installed on the command line for use with orthohull
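
After calc_mnetrans.py has produced the transform, it can be loaded and visually checked against the MEG sensors in MNE. A small sketch, assuming the tool writes a standard MNE -trans.fif file (the filenames here are placeholders):
 import mne
 raw = mne.io.read_raw_ctf('<MEGDATASET.ds>')
 trans = mne.read_trans('<SUBJECT>-trans.fif')
 # Overlay the head surface and MEG sensors to confirm the coregistration looks reasonable
 mne.viz.plot_alignment(raw.info, trans=trans, subject='<SUBJECT>',
                        subjects_dir='<SUBJECTS_DIR>', surfaces=['head'], meg=True)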

==This section can be ignored==

===NOTES to reproduce===

 Backup of tutorial:  MegServer ...../mne_tutorial/mne_tutorial.tar.gz
 Inside the backup tarball: environment/pip_freeze.txt and environment/mne_tutorial_env.yml (conda)
 Also a backup of the full environment (linux:ubuntu) made with conda-pack: envs_BAK/mne_tutorial1_env.tar.gz