
FMRIPREP & MRIQC Focus: MRIQC


Slides from the FMRIPREP & MRIQC Focus group held Jan 13, 2017 at Stanford University.

MRIQC provides a set of image-processing workflows to extract and compute no-reference (NR) image quality metrics (IQMs) to be used in quality assessment protocols (QAPs) for magnetic resonance imaging (MRI). http://mriqc.readthedocs.org

  1. Focus group: MRIQC. Quality Control of structural and functional MRI. Oscar Esteban <oesteban@stanford.edu>, Poldrack Lab, Stanford University. January 13th, 2017
  2. “Have no fear of perfection – you’ll never reach it.” Salvador Dalí
  3. But sometimes, we are way too far from perfection (MRIQC mosaic, courtesy of Joke Durnez)
  4. Usual suspects in structural MRI: motion. Figure sources: http://mriquestions.com/choosing-pefe-direction.html and https://www.cis.rit.edu/htbooks/mri/chap-11/k6-12.htm
  5. Usual suspects in structural MRI: others, such as Intensity Non-Uniformity (INU) and noise
  6. And expect the unexpected
  7. Artifacts in functional MRI: N/2 ghost, spikes
  8. Manual assessment on the ABIDE dataset (N=1102)
  9. Close-up
  10. Manual assessment on the ABIDE dataset (N=1102):
     • Time consuming
     • Intra-rater bias
     • Inter-rater bias
     • Rater 1: 15% rejected
  11. Objectives of Quality Control:
     • Exclusion criteria, as objective as possible.
     • Quality badge: deciding on using a public dataset (is it appropriate for my design/study?).
     • Diagnosing fixable problems with the data acquisition process: types of sequences, scanner malfunctions, head padding, participant instructions.
  12. Image Quality Metrics (IQMs):
     • Physical phantoms (Price et al., 1990)
     • No-reference Image Quality Metrics (IQMs) (Woodard and Carley-Spencer, 2006)
     • Metrics that target artifacts and analyze the noise distribution (Mortamet et al., 2009)
     • Combined general volumetric and artifact-targeted IQMs (Pizarro et al., 2016)
  16. IQMs: Structural MRI (definitions of some of these metrics are sketched below)
     • Noise measurement: Signal-to-Noise Ratio (SNR), higher is better; Contrast-to-Noise Ratio (CNR), higher is better; sharpness (full-width half maximum, FWHM, estimations), smaller FWHM is better; goodness of fit of a noise model to the noise in the background (QI2), lower is better (Mortamet et al., 2009); Coefficient of Joint Variation (CJV), lower is better.
     • Information theory: Foreground-Background Energy Ratio (FBER), higher is better; Entropy Focus Criterion (EFC), lower is better.
     • Artifacts: segmentation using mathematical morphology (QI1), lower is better; measurements on the estimated INU (intensity non-uniformity) field, values around 1.0 are better; Partial Volume Errors (PVE), lower is better.
     • Other: summary statistics, intracranial volume fractions (ICV).
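As a rough guide to how some of these structural metrics are computed, the block below sketches commonly used definitions; MRIQC's exact estimators and normalizations may differ from these simplified forms.

    % Simplified, commonly used definitions (a sketch, not MRIQC's exact code).
    % \mu_X, \sigma_X: mean and standard deviation of intensities in region X;
    % F = head foreground, B = air background, GM/WM = gray and white matter.
    \mathrm{SNR} = \frac{\mu_F}{\sigma_B} \qquad
    \mathrm{CNR} = \frac{\lvert \mu_{WM} - \mu_{GM} \rvert}{\sigma_B} \qquad
    \mathrm{CJV} = \frac{\sigma_{WM} + \sigma_{GM}}{\lvert \mu_{WM} - \mu_{GM} \rvert} \qquad
    \mathrm{FBER} = \frac{\operatorname{E}\left[ S_F^2 \right]}{\operatorname{E}\left[ S_B^2 \right]}

These directions match the list above: larger SNR, CNR and FBER indicate cleaner signal, while a smaller CJV indicates better separability of gray and white matter (less noise and less INU).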
  17. IQMs: Functional MRI (the temporal metrics are sketched below)
     • Noise measurement: SNR, temporal SNR (tSNR), temporal standard deviation.
     • Information theory: EFC, FBER.
     • Confounds and artifacts: Framewise Displacement (FD), lower is better; (standardized) DVARS (D referring to the temporal derivative of timecourses, VARS to the RMS variance over voxels), lower is better; Ghost-to-Signal Ratio (GSR), lower is better; Global Correlation (GCOR), lower is better; spikes (high frequency and global intensity); AFNI's outlier detection and quality indexes.
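The temporal metrics above have fairly standard formulations; the sketch below follows the common definitions (the 50 mm head radius used to convert rotations into displacements is the usual assumption, and the standardized DVARS adds a normalization not shown here).

    % Sketch of common formulations, not necessarily MRIQC's exact implementation.
    % S_v(t): BOLD signal of voxel v at time t (N_v voxels in the brain mask);
    % (d_x, d_y, d_z): translations; (\alpha, \beta, \gamma): rotations in radians;
    % r = 50\,\mathrm{mm}: assumed head radius.
    \mathrm{tSNR}_v = \frac{\overline{S_v}}{\operatorname{sd}_t\!\left(S_v\right)} \qquad
    \mathrm{DVARS}(t) = \sqrt{\tfrac{1}{N_v} \sum_{v} \left[ S_v(t) - S_v(t-1) \right]^2} \qquad
    \mathrm{FD}(t) = \lvert\Delta d_x\rvert + \lvert\Delta d_y\rvert + \lvert\Delta d_z\rvert
                   + r \left( \lvert\Delta \alpha\rvert + \lvert\Delta \beta\rvert + \lvert\Delta \gamma\rvert \right)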
  18. Design of MRIQC:
     • Inputs: BIDS (Gorgolewski et al., 2016b)
     • Command-line interface: BIDS-Apps (Gorgolewski et al., 2016a)
     • The simplest possible pipeline
     • The fastest possible pipeline
     • Robust: works on all data
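Because the inputs follow BIDS, a minimal input dataset looks roughly like the sketch below; subject, task, and file names are placeholders, and the full naming rules are in the BIDS specification (Gorgolewski et al., 2016b).

    <bids_dir>/
        dataset_description.json
        participants.tsv
        sub-01/
            anat/
                sub-01_T1w.nii.gz
            func/
                sub-01_task-rest_bold.nii.gz
                sub-01_task-rest_bold.json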
  19. MRIQC Features
     What can be expected from MRIQC:
     • A table of IQMs per subject
     • The group visual report
     • An individual visual report per subject
     • A first-round exercise for the data
     What is not expected from MRIQC:
     • The triage of participants (work in progress)
     • The derivatives of processing
     • Non-standard morphologies: developing brains, pathology, etc.
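The per-subject table of IQMs can be pictured as one row per input image and one column per metric from the previous slides; the schematic below only illustrates that shape, and the column headers and file names MRIQC actually writes may differ.

    subject_id   snr    cnr    cjv    efc    fber   fwhm_avg   ...
    sub-01       ...    ...    ...    ...    ...    ...
    sub-02       ...    ...    ...    ...    ...    ...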
  21. Group Reports
  22. Anatomical Reports
  23. Functional Reports
  24. Running MRIQC
  25. Option 3: “bare-metal”
     mriqc <bids_dir>/ out/ participant
     Requires a functional Python environment and installation through PyPI or setuptools.
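A minimal bare-metal sequence might look like the sketch below; it assumes a working Python environment with pip, and the non-Python tools MRIQC calls (for example AFNI, whose outlier and quality indexes appear among the functional IQMs) have to be installed separately and be on the PATH.

    # install MRIQC from PyPI into the current Python environment
    pip install mriqc
    # run the participant level on a BIDS dataset (paths are placeholders)
    mriqc <bids_dir>/ out/ participant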
  26. Option 2: Singularity
     poldracklab_mriqc_0.9.0-rc1-2017-01-12-9d72afa28286.img <bids_dir>/ out/ participant
     Image available on Sherlock: /share/PI/russpold/singularity_images/poldracklab_mriqc_0.9.0-rc1-2017-01-12-9d72afa28286.img
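On Sherlock the call might look like the sketch below; the image path is the one given on the slide, the data and output paths are placeholders, and bind mounts (-B) may be needed depending on how the cluster exposes your directories.

    singularity run \
        /share/PI/russpold/singularity_images/poldracklab_mriqc_0.9.0-rc1-2017-01-12-9d72afa28286.img \
        <bids_dir>/ out/ participant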
  27. Option 1: Docker
     docker run -v <bids_dir>:/data -v <scratch_dir>:/scratch -w /scratch \
         poldracklab/mriqc:latest /data /scratch/out participant
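Two common variations on the call above, sketched under the assumption that the flags follow the BIDS-Apps convention (confirm with mriqc --help): restricting the run to selected participants, and aggregating finished participant runs at the group level.

    # participant level for selected subjects only (labels are placeholders)
    docker run -v <bids_dir>:/data -v <scratch_dir>:/scratch -w /scratch \
        poldracklab/mriqc:latest /data /scratch/out participant --participant_label 01 02
    # group level: aggregate per-subject results into the group report
    docker run -v <bids_dir>:/data -v <scratch_dir>:/scratch -w /scratch \
        poldracklab/mriqc:latest /data /scratch/out group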
  28. Get involved
     • Documentation: http://mriqc.readthedocs.io
     • Example reports: http://mriqc.org
     • Q&A and support: https://neurostars.org/tags/mriqc
     • Development: https://github.com/poldracklab/mriqc
  29. Questions
  30. Questions to be discussed in the focus group:
     • Are there any additional quality metrics that you would like to be added?
     • Are there any additional plots that you would like to be added?
     • Would you like to have diffusion MRI IQMs and reports?
     • Would you like to participate in manual triage/rating sessions of structural/functional/diffusion MRI?
  31. Acknowledgments: The PoldrackLab
  32. Thanks!
  33. References
     • Gorgolewski, Krzysztof J. et al. (2016a). “BIDS Apps: Improving ease of use, accessibility and reproducibility of neuroimaging data analysis methods”. In: bioRxiv, p. 079145. DOI: 10.1101/079145. URL: http://biorxiv.org/content/early/2016/10/05/079145.
     • Gorgolewski, Krzysztof J. et al. (2016b). “The brain imaging data structure, a format for organizing and describing outputs of neuroimaging experiments”. In: Scientific Data 3, p. 160044. ISSN: 2052-4463. DOI: 10.1038/sdata.2016.44. URL: http://www.nature.com/articles/sdata201644.
     • Mortamet, Bénédicte et al. (2009). “Automatic quality assessment in structural brain magnetic resonance imaging”. In: Magnetic Resonance in Medicine 62.2, pp. 365–372. ISSN: 1522-2594. DOI: 10.1002/mrm.21992. URL: http://onlinelibrary.wiley.com/doi/10.1002/mrm.21992/abstract.
     • Pizarro, Ricardo A. et al. (2016). “Automated Quality Assessment of Structural Magnetic Resonance Brain Images Based on a Supervised Machine Learning Algorithm”. In: Frontiers in Neuroinformatics 10. ISSN: 1662-5196. DOI: 10.3389/fninf.2016.00052. URL: http://journal.frontiersin.org/article/10.3389/fninf.2016.00052/abstract.
     • Price, Ronald R. et al. (1990). “Quality assurance methods and phantoms for magnetic resonance imaging: Report of AAPM nuclear magnetic resonance Task Group No. 1”. In: Medical Physics 17.2, pp. 287–295. ISSN: 0094-2405. DOI: 10.1118/1.596566. URL: http://scitation.aip.org/content/aapm/journal/medphys/17/2/10.1118/1.596566.
     • Woodard, Jeffrey P. and Monica P. Carley-Spencer (2006). “No-Reference image quality metrics for structural MRI”. In: Neuroinformatics 4.3, pp. 243–262. ISSN: 1539-2791, 1559-0089. DOI: 10.1385/NI:4:3:243. URL: http://link.springer.com/article/10.1385/NI:4:3:243.
