
Brain Imaging Data Structure

Presented at the INCF NeuroInformatics Congress 2015.


  1. Brain Imaging Data Structure. Chris Gorgolewski, Stanford University.
  2. Getting lost in your data
  3. Getting lost in your data • MRI has been used to study the human brain for over 20 years. • Despite similarities in experimental designs and data types, each researcher tends to organize and describe their data in their own way. http://www.nature.com/news/brain-imaging-fmri-2-0-1.10365
  4. Getting lost in your data. Heterogeneity in data description practices causes: • problems in sharing data (even within the same lab), • unnecessary manual metadata input when running processing pipelines, • no way to automatically validate the completeness of a given dataset.
  5. Brain Imaging Data Structure. The Brain Imaging Data Structure (BIDS) is a new standard for organizing the results of a human neuroimaging experiment.
  6. Who is it for? 1. Lab PIs. It will make handing a dataset from one student/postdoc to another easy. 2. Workflow developers. It’s easier to write pipelines expecting a particular file organization. 3. Database curators. Accepting one dataset format will make curation easier.
  7. Principles behind BIDS 1. Adoption is crucial. 2. Don’t reinvent the wheel. 3. Some metadata is better than no metadata. 4. Don’t rely on external software (databases) or complicated file formats (RDF). 5. Aim to capture 80% of experiments, but give the remaining 20% space to extend the standard.
  8. Implementation 1. Some metadata is encoded in the folder structure. 2. Some metadata is replicated in the file name for simplicity. 3. Use of tab-separated files for tabular data. 4. Use of compressed NIfTI files for imaging data. 5. Use of JSON files for dictionary-type metadata. 6. Use of legacy text file formats for b-vectors/values and physiological data. 7. Make certain folder hierarchy levels optional for simplicity. 8. Allow arbitrary files not covered by the spec to be included in any way the researchers deem appropriate.
  9. Features 1. Handles multiple sessions and runs. 2. Supports sparse acquisition (via slice timing). 3. Supports continuous acquisition covariates (breathing, cardiac, etc.). 4. Supports multiple field map formats. 5. Supports multiple types of anatomical scans. 6. Supports functional MRI: both task-based and resting state. 7. Supports diffusion data (together with corresponding bvec/bval files). 8. Supports behavioral variables on the level of subjects (demographics), sessions, and runs.
  10. Folder organization (simplified)
      sub-control01/
          anat/
              sub-control01_T1w.nii.gz
              sub-control01_T1w.json
              sub-control01_T2w.nii.gz
              sub-control01_T2w.json
          func/
              sub-control01_task-nback_bold.nii.gz
              sub-control01_task-nback_bold.json
              sub-control01_task-nback_events.tsv
              sub-control01_task-nback_cont-physio.tsv
              sub-control01_task-nback_cont-physio.json
              sub-control01_task-nback_sbref.nii.gz
          dwi/
              sub-control01_dwi.nii.gz
              sub-control01_dwi.bval
              sub-control01_dwi.bvec
          fmap/
              sub-control01_phasediff.nii.gz
              sub-control01_phasediff.json
              sub-control01_magnitude1.nii.gz
          sub-control01_scans.tsv
      tasks.json
      participants.tsv
      dataset_description.json
      README
      CHANGES
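The file names in this layout encode metadata as underscore-separated key-value pairs (e.g. `sub-control01`, `task-nback`) followed by a suffix and extension. A minimal Python sketch of parsing such a name, assuming only this simple pattern (the official BIDS tooling handles many more rules):

```python
def parse_bids_filename(filename):
    """Split a BIDS-style filename into entity key-value pairs, a suffix,
    and an extension. A simplified sketch, not the full specification."""
    name, ext = filename, ""
    while "." in name:  # peel off multi-part extensions such as ".nii.gz"
        name, _, tail = name.rpartition(".")
        ext = "." + tail + ext
    parts = name.split("_")
    # each part before the suffix is a "key-value" pair
    entities = dict(part.partition("-")[::2] for part in parts[:-1])
    return entities, parts[-1], ext

entities, suffix, ext = parse_bids_filename("sub-control01_task-nback_bold.nii.gz")
assert entities == {"sub": "control01", "task": "nback"}
assert (suffix, ext) == ("bold", ".nii.gz")
```

This is why replicating metadata in file names (point 2 of the implementation slide) pays off: a pipeline can identify a file's subject and task without opening it.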
  11. Example events file
      onset   duration   trial_type   ResponseTime
      1.2     0.6        go           1.435
      5.6     0.6        stop         1.739
      …
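Because events files are plain tab-separated text, they need no special library. A sketch reading the slide's table with the Python standard library (the string stands in for a `*_events.tsv` file on disk):

```python
import csv
import io

# The events table from the slide as a tab-separated string; in a real
# dataset this text would be read from a *_events.tsv file.
tsv = (
    "onset\tduration\ttrial_type\tResponseTime\n"
    "1.2\t0.6\tgo\t1.435\n"
    "5.6\t0.6\tstop\t1.739\n"
)

events = list(csv.DictReader(io.StringIO(tsv), delimiter="\t"))
onsets = [float(row["onset"]) for row in events]
assert events[0]["trial_type"] == "go"
assert onsets == [1.2, 5.6]
```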
  12. Example metadata file
      {
          "RepetitionTime": 3.0,
          "EchoTime": 0.03,
          "FlipAngle": 78,
          "SliceTiming": [0.0, 0.2, 0.4, 0.6, 0.8, 1.0, 1.2, 1.4,
                          1.6, 1.8, 2.0, 2.2, 2.4, 2.6, 2.8],
          "MultibandAccelerationFactor": 4,
          "ParallelReductionFactorInPlane": 2,
          "InPlanePhaseEncodingDirection": "AP"
      }
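Because the sidecar is plain JSON, any language can consume it without a database or RDF tooling (principle 4 above). A Python sketch loading the slide's sidecar and running one illustrative sanity check (the check itself is an assumption for demonstration, not part of the standard):

```python
import json

# The metadata sidecar from the slide; keys are case-sensitive JSON strings.
sidecar = json.loads("""{
    "RepetitionTime": 3.0,
    "EchoTime": 0.03,
    "FlipAngle": 78,
    "SliceTiming": [0.0, 0.2, 0.4, 0.6, 0.8, 1.0, 1.2, 1.4,
                    1.6, 1.8, 2.0, 2.2, 2.4, 2.6, 2.8],
    "MultibandAccelerationFactor": 4,
    "ParallelReductionFactorInPlane": 2,
    "InPlanePhaseEncodingDirection": "AP"
}""")

# Illustrative sanity check: every slice onset should fall within one TR.
tr = sidecar["RepetitionTime"]
assert all(0.0 <= t < tr for t in sidecar["SliceTiming"])
```

Checks like this are what makes automatic validation of a dataset's completeness possible in the first place.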
  13. Example demographics file
      participant_id   age   sex
      sub-001          34    M
      sub-002          12    F
      sub-003          33    F
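The demographics file uses the same tab-separated format, so subject-level covariates can be pulled into an analysis with a few lines. A sketch using the slide's table (inlined here as a string standing in for `participants.tsv`):

```python
import csv
import io

# participants.tsv from the slide, with the IDs written consistently
# in lowercase ("sub-...").
tsv = (
    "participant_id\tage\tsex\n"
    "sub-001\t34\tM\n"
    "sub-002\t12\tF\n"
    "sub-003\t33\tF\n"
)

participants = list(csv.DictReader(io.StringIO(tsv), delimiter="\t"))
mean_age = sum(int(p["age"]) for p in participants) / len(participants)
assert participants[1]["sex"] == "F"
```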
  14. Keys to success 1. Involve the community in the design process. 2. Provide a good validation tool (browser-based!). 3. Build tools/workflows/pipelines that make adopting BIDS worthwhile (AA, Nipype, C-PAC, etc.). 4. Get support from databases (LORIS, COINS, SciTran, OpenfMRI, XNAT, etc.).
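The fixed layout is what makes a validation tool feasible: a checker only has to walk the tree and compare it against the spec. A toy sketch of that idea, checking for the top-level files shown in the example layout earlier in the deck (this is an illustration, not the browser-based BIDS validator the slide refers to):

```python
import os
import tempfile

# Top-level files from the deck's example layout; a toy completeness
# check, not the official BIDS validator.
EXPECTED_TOP_LEVEL = ["dataset_description.json", "participants.tsv",
                      "README", "CHANGES"]

def missing_top_level_files(dataset_path):
    """Return which expected top-level files are absent from dataset_path."""
    return [name for name in EXPECTED_TOP_LEVEL
            if not os.path.exists(os.path.join(dataset_path, name))]

# Demo: an empty directory is missing everything.
with tempfile.TemporaryDirectory() as d:
    assert missing_top_level_files(d) == EXPECTED_TOP_LEVEL
```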
  15. Stanford | Center for Reproducible Neuroscience. Analyzing for reproducibility. reproducibility.stanford.edu
  16. Acknowledgments: The Poldrack Lab @ Stanford; Data Sharing Task Force
  17. bids.neuroimaging.io
