Healthy Volunteer Protocol Upload Process

From MEG Core
Revision as of 08:49, 12 May 2020 by Jstout (talk | contribs)

HV BIDS processing requires:

 pyctf
 General utilities for interfacing with CTF data from Python; also provides BIDS processing utilities.
 https://megcore.nih.gov/index.php/Pyctf  -  a more recent update will be available soon
 hv_proc
 Python scripts to extract and mark HV specific stimuli and validate trigger/response timing and data QA.
 Open-access release in development.
 NIH MEG Bids processing
 Routines to convert the CTF MEG data into BIDS format using mne_bids and the bids-validator.
 https://github.com/nih-fmrif/meg_bids/blob/master/1_mne_bids_extractor.ipynb
 mne_bids
 https://mne.tools/mne-bids/stable/index.html
 pip install -U mne
 pip install -U mne-bids
 AFNI
 Required for extracting HPI coil locations.
 https://afni.nimh.nih.gov/pub/dist/doc/htmldoc/background_install/install_instructs/index.html

BIDS Validator

 https://github.com/bids-standard/bids-validator
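
The validator can be run from the command line against the converted dataset; a minimal session, assuming Node.js is available and using a placeholder dataset path, looks like:

```shell
# Install the validator globally via npm, then check the BIDS directory
npm install -g bids-validator
bids-validator /path/to/bids_dataset
```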

BIDS format / OpenNeuro

All data are converted to BIDS format and uploaded to OpenNeuro as an open-access dataset. Data triggers are cleaned using the routines listed above, which realign stimulus triggers to the optical onset of the projector. For datasets with logfiles, the logfile has been merged with the trigger data to label triggers and responses.
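The realignment step can be pictured as snapping each stimulus trigger to the next optical (photodiode) onset within a short window. This is a hypothetical sketch of that idea, not the actual hv_proc API; the function name and the 100 ms window are assumptions:

```python
# Hypothetical trigger realignment: move each stimulus trigger to the
# first optical onset within max_delay seconds, so trigger times reflect
# the true projector display time. Not the hv_proc implementation.

def realign_triggers(trigger_times, optical_times, max_delay=0.1):
    """Snap each trigger to the nearest following optical onset."""
    aligned = []
    for t in trigger_times:
        candidates = [o for o in optical_times if t <= o <= t + max_delay]
        # Keep the original time if no optical onset falls in the window
        aligned.append(candidates[0] if candidates else t)
    return aligned
```

For instance, `realign_triggers([1.0, 2.0], [1.02, 2.03])` returns `[1.02, 2.03]`, shifting each trigger forward to the measured display onset.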


Processing on Biowulf

 To run the scripts on Biowulf:
 pyctf & hv_proc must be on your Python path (if necessary, add their filepaths to a .pth file in the conda site-packages folder)
 module load ctf
 module load afni
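
Put together, a typical Biowulf session setup might look like the following. The pyctf and hv_proc paths are placeholders for wherever the packages are checked out:

```shell
# Load the CTF tools and AFNI modules on Biowulf
module load ctf
module load afni

# Make pyctf and hv_proc importable by listing their locations in a
# .pth file inside the active conda environment's site-packages folder
SITEPKG=$(python -c 'import site; print(site.getsitepackages()[0])')
printf '%s\n' /path/to/pyctf /path/to/hv_proc >> "$SITEPKG/hv_proc.pth"
```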