ENIGMA MEG Working Group
UNDER CONSTRUCTION
The ENIGMA project is a large-scale neuroimaging project that leverages data across multiple institutes to identify neuroimaging findings that are generally not possible at a single institute.
http://enigma.ini.usc.edu/
MEG Working Group
LIST OF PARTICIPATING INSTITUTES
Data Analysis Format
Data analysis will be available in two flavors: 1) upload coregistered, anonymized data to the NIH, or 2) perform the analysis at the acquisition site using the EnigmaMeg scripts.
Upload Data to be analyzed on NIH Biowulf cluster
Data anonymization will be performed using
Upload data to the NIH using the Windows/Mac/Linux-compatible Globus application. This software is designed to handle "sensitive" data and supports resumable sessions, so an interrupted upload can be continued where it left off.
https://docs.globus.org/
https://docs.globus.org/how-to/share-files/
Data Preparation Prior to Upload
Deface the MRI prior to upload - the suggested software is pydeface. Custom scripts will be provided to confirm defacing.
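For reference, a typical pydeface call looks like the following (the filename is illustrative; by default pydeface writes the defaced image next to the input):
e.g.) pydeface sub-01_T1w.nii.gz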
Potential MEG Subanalyses
Healthy Volunteers
Epilepsy
Alzheimer's and dementia
Motor Analysis
Language Processing
Anxiety Disorders
Schizophrenia and related disorders
Developmental disorders
Traumatic Brain Injury
Stroke
Singularity Container
Singularity is a container technology (similar to Docker). We are using containers to allow for easy distribution of the analysis pipeline and to ensure analysis consistency. Singularity was chosen because it does not require administrative privileges during runtime and can be run on an HPC system or on local computing resources. Because the pipeline is containerized, the analysis can be run on any platform (linux, mac, windows, ...) with singularity installed.
For more information visit: https://sylabs.io/singularity/
The singularity def file can be found at:
github.com/........ upload
To build from scratch: sudo singularity build enigma_meg.sif enigma_meg.def
The pre-built singularity container (recommended/easier) can be downloaded from:
Under Construction
Calling commands:
The singularity container will be provided in a folder that also includes a ./bin folder. The commands within the bin folder are links to functions in the container.
Commands can be called from the full path:
e.g.) /home/jstout/enigma/bin/enigma_rel_power -i /data/my_meg_data/subj1_resting_state.ds
Commands can also be added to the path and run using the command name. For BASH, add this line to the /home/$USER/.bashrc file and save:
export PATH=$PATH:/this/is/the/path/to/enigma/bin
enigma_rel_power -i /data/my_meg_data/subj1_resting_state.ds
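The same programs can also be invoked directly through the container image with singularity exec; the bin links above are roughly a convenience around this (image name taken from the build step above):
e.g.) singularity exec enigma_meg.sif enigma_rel_power -i /data/my_meg_data/subj1_resting_state.ds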
Freesurfer-related operations require a license file (download from https://surfer.nmr.mgh.harvard.edu/fswiki/License). You will need to put your license file in the enigma folder and name it fs_license.txt
Resting State Analysis
The analysis routine has been implemented in MNE python (https://mne.tools/stable/index.html) and packaged into a singularity container, which removes differences in software dependencies and operating system configuration across sites. The analysis has been broken into steps to make processing straightforward.
v1
1) anatomical_proc.py
2) data_cleanup.py
3) source_analysis.py
4) summary_statistics.py
v2
1) process_anatomical.py
2) process_meg.py
Anatomical Preprocessing:
Surface models: scalp, outer skull, inner skull, pial surface
Coregistration of the MRI and MEG data
Parcel extraction (freesurfer autorecon3)
Subparcel calculation (mne ....)
Source space
BEM - single shell
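The source space and single-shell BEM steps can be approximated in MNE python as sketched below; the subject name, spacing, and paths are assumptions, not the pipeline defaults.

    import mne

    subjects_dir = '/data/freesurfer'   # assumed FreeSurfer SUBJECTS_DIR
    subject = 'subj1'                   # assumed subject name

    # Cortical source space on the FreeSurfer surfaces
    src = mne.setup_source_space(subject, spacing='oct6', subjects_dir=subjects_dir)

    # Single-shell (inner skull) BEM, sufficient for MEG
    model = mne.make_bem_model(subject, ico=4, conductivity=(0.3,), subjects_dir=subjects_dir)
    bem = mne.make_bem_solution(model)

    mne.write_source_spaces(subject + '-oct6-src.fif', src)
    mne.write_bem_solution(subject + '-bem-sol.fif', bem)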
Data Cleanup (based on the MNE pipeline; goal: remove bad channels and bad segments, correct EOG and ECG):
Perform ICA analysis to extract a heartbeat signal
Extract heartbeats into epochs
Autobad runs on evoked data: run autobad on ECG epoched data to identify bad channels and bad epochs
Perform ICA analysis after excluding bad data
Use a template to identify and exclude heartbeats and eyeblinks
Remove bad ICA components
**Save pre- and post-cleanup noise RMS for use in summary statistics
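A minimal MNE python sketch of the ICA-based cleanup is shown below. The component count, filenames, and the use of find_bads_ecg/find_bads_eog in place of the template matching described above are assumptions for illustration only.

    import mne
    from mne.preprocessing import ICA, create_ecg_epochs, create_eog_epochs

    raw = mne.io.read_raw_ctf('/data/my_meg_data/subj1_resting_state.ds', preload=True)
    raw.filter(1.0, 100.0)                       # band-pass before ICA

    ica = ICA(n_components=25, random_state=97)  # component count is an assumption
    ica.fit(raw)

    # Identify heartbeat and blink components from their epoched signatures
    ecg_inds, _ = ica.find_bads_ecg(create_ecg_epochs(raw))
    eog_inds, _ = ica.find_bads_eog(create_eog_epochs(raw))  # assumes an EOG channel exists
    ica.exclude = ecg_inds + eog_inds

    raw_clean = ica.apply(raw.copy())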
Source Analysis (and spectral analysis):
Calculate covariance matrix
Load anatomical information (source space, bem, ...)
Project data to cortical surface (MNE)
Extract signal from ROIs
Calculate spectrum from each ROI and save to csv
Calculate relative power by normalizing the spectrum by all subject/frequency power
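As a rough sketch of the source projection and ROI spectrum steps in MNE python (the forward solution file, inverse settings, and the aparc parcellation are assumptions; raw_clean comes from the cleanup step above):

    import mne

    subjects_dir = '/data/freesurfer'            # assumed FreeSurfer SUBJECTS_DIR
    fwd = mne.read_forward_solution('subj1-fwd.fif')
    cov = mne.compute_raw_covariance(raw_clean)

    inv = mne.minimum_norm.make_inverse_operator(raw_clean.info, fwd, cov)
    stc = mne.minimum_norm.apply_inverse_raw(raw_clean, inv, lambda2=1.0 / 9.0, method='MNE')

    # One time course per parcel (aparc is an assumed parcellation)
    labels = mne.read_labels_from_annot('subj1', parc='aparc', subjects_dir=subjects_dir)
    roi_ts = mne.extract_label_time_course(stc, labels, inv['src'], mode='mean_flip')

    # Power spectrum per ROI and relative power across frequencies
    psds, freqs = mne.time_frequency.psd_array_welch(roi_ts, sfreq=raw_clean.info['sfreq'], fmin=1, fmax=45)
    rel_power = psds / psds.sum(axis=1, keepdims=True)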
Summary Statistics:
*Evaluate for outliers
Compile the single subject CSV files to create a mean and standard deviation at each parcel
Save the subject number N and the noise level of the data
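A minimal pandas sketch of the compilation step, assuming each per-subject csv holds one row per parcel and frequency band with a power column (column names and file paths are illustrative):

    import glob
    import pandas as pd

    # Each per-subject csv is assumed to have columns: parcel, band, power
    frames = [pd.read_csv(f).assign(subject=f) for f in glob.glob('output/*_rel_power.csv')]
    all_subjects = pd.concat(frames, ignore_index=True)

    summary = (all_subjects
               .groupby(['parcel', 'band'])['power']
               .agg(['mean', 'std', 'count'])
               .reset_index())
    summary.to_csv('group_summary.csv', index=False)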
Outputs:
The analysis outputs csv files: a csv file for each subject will be created in a subfolder of the singularity directory, and a final command can be run to calculate the summary statistics.
Meta-Analysis
Submission of Results:
After calculating the local institute's summary statistics, the group csv file will be uploaded to the NIMH. The group csv will have the mean and standard deviation for each parcel and frequency band. A separate demographic csv will also be created with the demographic summary statistics.
Meta-Analysis:
Statistical results will be produced using hierarchical meta-analysis to adjust for site-specific variance.
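For orientation only, a random-effects (DerSimonian-Laird) pooling of per-site parcel means is one common way to account for site-specific variance; the final hierarchical model may differ, and the inputs below are illustrative:

    import numpy as np

    def random_effects_pool(means, sds, ns):
        """DerSimonian-Laird random-effects pooled estimate across sites."""
        means, sds, ns = map(np.asarray, (means, sds, ns))
        v = sds ** 2 / ns                              # variance of each site mean
        w = 1.0 / v                                    # fixed-effect weights
        fe_mean = np.sum(w * means) / np.sum(w)
        q = np.sum(w * (means - fe_mean) ** 2)         # heterogeneity statistic
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - (len(means) - 1)) / c)    # between-site variance
        w_re = 1.0 / (v + tau2)                        # random-effects weights
        pooled = np.sum(w_re * means) / np.sum(w_re)
        pooled_se = np.sqrt(1.0 / np.sum(w_re))
        return pooled, pooled_se

    # e.g. pooled alpha-band relative power for one parcel across three sites
    print(random_effects_pool([0.21, 0.24, 0.19], [0.05, 0.06, 0.04], [40, 55, 32]))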
Data Harmonization (examples in MRI):
COMBAT - http://dx.doi.org/10.1016/j.neuroimage.2017.08.047
Original paper (genetics): Adjusting batch effects in microarray expression data using empirical Bayes methods. W.E. Johnson, C. Li and A. Rabinovic. Biostatistics, 8 (2007), pp. 118-127
COMBAT Normalization in ENIGMA - https://doi.org/10.1016/j.neuroimage.2020.116956
COVBAT - https://www.biorxiv.org/content/10.1101/858415v2.full.pdf
DeepHarmony (deep learning harmonization) - https://doi.org/10.1016/j.mri.2019.05.041
https://www.sciencedirect.com/science/article/pii/S1053811919310419
https://www.biorxiv.org/content/10.1101/2020.04.14.041582v1.abstract