Advanced network modelling 2: connectivity measures, group analysis

The 2013 edition of my contribution to the connectome course


Presentation transcript

  • Advanced network modelling II: connectivity measures, group analysis
    Gaël Varoquaux, INRIA, Parietal, Neurospin
    Learning objectives:
    - Extraction of the network structure from the observations
    - Statistics for comparing correlation structures
    - Interpretation of network structures
    Varoquaux & Craddock, NeuroImage 2013
  • Problem setting and vocabulary
    Given regions, infer and compare connections.
    Graph: a set of nodes and connections. Weighted or not. Directed or not. Can be represented by an adjacency matrix.
  • Functional network analysis: an outline
    1 Signal extraction
    2 Connectivity graphs
    3 Comparing connections
    4 Network-level summary
  • 1 Signal extraction
    Enforcing specificity to neural signal [Fox 2005]
  • 1 Choice of regions
    Too many regions gives a harder statistical problem ⇒ ∼ 30 ROIs for group-difference analysis.
    Nearly-overlapping regions will mix signals.
    Avoid too-small regions ⇒ ∼ 10 mm radius.
    Capture the different functional networks.
  • 1 Time-series extraction
    Extract the ROI-average signal: a weighted mean with weights given by grey-matter probability.
    Optional low-pass filter (≈ 0.1 Hz – 0.3 Hz).
    Regress out confounds:
    - movement parameters
    - CSF and white-matter signals
    - CompCor: data-driven noise identification [Behzadi 2007]
    - global mean? ... an overhyped discussion (see later)
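The confound-regression step above can be sketched with plain least squares; a minimal NumPy illustration, where the signal and confound matrices are synthetic stand-ins rather than the course's data:

```python
import numpy as np

rng = np.random.default_rng(0)
n_timepoints, n_rois, n_confounds = 200, 5, 3

# Synthetic stand-ins: ROI signals contaminated by confounds
# (movement parameters, CSF/white-matter signals, ...)
confounds = rng.standard_normal((n_timepoints, n_confounds))
clean = rng.standard_normal((n_timepoints, n_rois))
signals = clean + confounds @ rng.standard_normal((n_confounds, n_rois))

# Regress the confounds out of every ROI signal (ordinary least squares)
beta, *_ = np.linalg.lstsq(confounds, signals, rcond=None)
residuals = signals - confounds @ beta
```

By construction, the residuals are orthogonal to the confound regressors, which is exactly what "regressing out" means here.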
  • 2 Connectivity graphs
    From correlations to connections.
    Functional connectivity: correlation-based statistics.
  • 2 Correlation, covariance
    For x and y centered:
    covariance: cov(x, y) = (1/n) Σ_i x_i y_i
    correlation: cor(x, y) = cov(x, y) / (std(x) std(y))
    Correlation is normalized: cor(x, y) ∈ [−1, 1]. It quantifies the linear dependence between x and y.
    Correlation matrix → functional connectivity graphs [Bullmore 1996, ..., Eguiluz 2005, Achard 2006, ...]
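The two formulas can be checked numerically; a minimal NumPy sketch, assuming the signals are stored as a (time points × regions) array:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 300, 4                    # n time points, p regions
x = rng.standard_normal((n, p))
x -= x.mean(axis=0)              # center the signals, as the slide assumes

cov = x.T @ x / n                # cov(x_i, x_j) = (1/n) sum_t x_i(t) x_j(t)
std = np.sqrt(np.diag(cov))
cor = cov / np.outer(std, std)   # cor = cov / (std_i * std_j)
```

The normalization cancels the 1/n factor, so the result matches `np.corrcoef` exactly.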
  • 2 Partial correlation
    Remove the effect of z by regressing it out: x/z = residuals of the regression of x on z.
    In a set of p signals:
    partial correlation: cor(x_i/Z, x_j/Z), with Z = {x_k, k ≠ i, j}
    partial variance: var(x_i/Z), with Z = {x_k, k ≠ i}
    Partial correlation matrix [Marrelec 2006, Fransson 2008, ...]
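The regression-based definition can be coded directly; `partial_corr` below is a hypothetical helper written for illustration, not code from the course:

```python
import numpy as np

def partial_corr(x, i, j):
    """cor(x_i/Z, x_j/Z): correlation of the residuals of x_i and x_j
    after regressing out all the other signals Z = {x_k, k != i, j}."""
    n = x.shape[0]
    z = np.delete(x, [i, j], axis=1)
    z = np.column_stack([np.ones(n), z])       # intercept handles centering
    def residual(y):
        beta, *_ = np.linalg.lstsq(z, y, rcond=None)
        return y - z @ beta
    r_i, r_j = residual(x[:, i]), residual(x[:, j])
    return r_i @ r_j / np.sqrt((r_i @ r_i) * (r_j @ r_j))

rng = np.random.default_rng(0)
x = rng.standard_normal((500, 4))
rho = partial_corr(x, 0, 1)
```

In practice the full partial-correlation matrix is computed from the inverse covariance (next slide) rather than by p² regressions.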
  • 2 Inverse covariance
    K = matrix inverse of the covariance matrix.
    On the diagonal: partial variance.
    Off-diagonal: scaled partial correlation:
    K_ij = −cor(x_i/Z, x_j/Z) / (std(x_i/Z) std(x_j/Z))
    Inverse covariance matrix [Smith 2010, Varoquaux NIPS 2010, ...]
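The relation between the precision matrix K and partial correlations is commonly used in the normalized form par_cor_ij = −K_ij / √(K_ii K_jj); a NumPy sketch on synthetic correlated signals, assuming a well-conditioned empirical covariance:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 2000, 4
mixing = rng.standard_normal((p, p))
x = rng.standard_normal((n, p)) @ mixing    # correlated signals
x -= x.mean(axis=0)

cov = x.T @ x / n
K = np.linalg.inv(cov)            # inverse covariance (precision) matrix

# Partial correlations read off K: par_cor_ij = -K_ij / sqrt(K_ii * K_jj)
d = np.sqrt(np.diag(K))
par_cor = -K / np.outer(d, d)
np.fill_diagonal(par_cor, 1.0)
```

Because K is positive definite, its 2 × 2 principal minors are positive, which guarantees every off-diagonal entry of `par_cor` lies in [−1, 1].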
  • 2 Summary: observations and indirect effects
    Observations: correlation, plus variance (the amount of observed signal).
    Direct connections: partial correlation, plus partial variance (the innovation term).
    [Fransson 2008]: partial correlations highlight the backbone of the default mode.
    Global signal regression matters less on partial correlations: with good confounds (e.g. CompCor), regressing out the global signal makes little difference, but it is unspecific and can make the covariance matrix ill-conditioned.
  • 2 Inverse covariance and graphical model
    Gaussian graphical models: zeros in the inverse covariance give conditional independence:
    Σ⁻¹_ij = 0 ⇔ x_i, x_j independent conditionally on {x_k, k ≠ i, j}
    Robust to the Gaussian assumption.
  • 2 Inverse covariance matrix estimation
    p nodes, n observations (e.g. fMRI volumes). If not n ≫ p², the estimation is ambiguous.
    Thresholding partial correlations does not recover the ground-truth independence structure.
  • 2 Inverse covariance matrix estimation
    Sparse inverse covariance estimators: independence between nodes makes the estimation of partial correlations easier. Jointly estimate the independence structure and the connectivity values.
    Group-sparse inverse covariance: learn different connectomes with the same independence structure [Varoquaux NIPS 2010].
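A widely used off-the-shelf sparse inverse covariance estimator is the graphical lasso (ℓ1-penalized maximum likelihood); a sketch using scikit-learn's `GraphicalLasso` on data drawn from a known sparse precision matrix (the group-sparse variant from the slide is not part of scikit-learn):

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)

# Ground-truth sparse precision matrix: a chain graph 0-1-2-3-4
p = 5
true_prec = np.eye(p)
for i in range(p - 1):
    true_prec[i, i + 1] = true_prec[i + 1, i] = 0.4
true_cov = np.linalg.inv(true_prec)

x = rng.multivariate_normal(np.zeros(p), true_cov, size=500)

# l1-penalized maximum-likelihood estimate of the inverse covariance
model = GraphicalLasso(alpha=0.05).fit(x)
est_prec = model.precision_
```

The penalty `alpha` trades off sparsity against fit; in practice it is chosen by cross-validation (e.g. `GraphicalLassoCV`).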
  • 3 Comparing connections
    Detecting and localizing differences.
    Example: learning sculpts the spontaneous activity of the resting human brain [Lewis 2009]: correlation matrices before and after learning are compared to localize differences.
  • 3 Pair-wise tests on correlations
    Correlations ∈ [−1, 1] ⇒ cannot apply Gaussian statistics, e.g. T tests.
    Z-transform: Z = arctanh(cor) = (1/2) ln((1 + cor) / (1 − cor))
    Z(cor) is normally distributed: for n observations, the estimated Z(cor) ∼ N(Z(cor), 1/√n)
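The Z-transform and its approximate normality can be illustrated in a few lines; a NumPy sketch with simulated bivariate data:

```python
import numpy as np

def fisher_z(cor):
    """Fisher z-transform: arctanh(cor) = 0.5 * ln((1 + cor) / (1 - cor))."""
    return np.arctanh(cor)

# The two forms on the slide agree
assert np.isclose(fisher_z(0.6), 0.5 * np.log((1 + 0.6) / (1 - 0.6)))

# Sampling check: z-transformed sample correlations are approximately
# normal, with a standard deviation close to 1 / sqrt(n)
rng = np.random.default_rng(0)
n, true_cor = 200, 0.5
c = np.array([[1.0, true_cor], [true_cor, 1.0]])
zs = np.array([
    fisher_z(np.corrcoef(rng.multivariate_normal([0, 0], c, size=n).T)[0, 1])
    for _ in range(2000)
])
```

After this transform, standard Gaussian tools (T tests, confidence intervals) can be applied to the Z values.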
  • 3 Indirect effects: to partial or not to partial?
    [Figure: correlation matrices and partial-correlation matrices for three controls and one patient with a large lesion]
    Spread-out variability in correlation matrices; noise in partial correlations; strong dependence between coefficients. [Varoquaux MICCAI 2010]
  • 3 Indirect effects versus noise: a trade-off
    [Figure: correlation matrices, partial-correlation matrices, and tangent-space residuals for three controls and one large-lesion patient]
    [Varoquaux MICCAI 2010]
  • 4 Network-level summary
    Comparing distributed network structure.
  • 4 Graph-theoretical analysis
    Summarize a graph by a few key metrics expressing its transport properties [Bullmore & Sporns 2009, Eguiluz 2005].
    Detect differences on the metrics with a permutation test.
    Use a good graph (sparse inverse covariance) [Varoquaux NIPS 2010].
    Correlations are small-world by construction [Zalesky 2012].
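The permutation test on a graph metric can be sketched generically; the metric values below are synthetic placeholders, not real subject data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-subject values of a graph metric (e.g. efficiency)
# for two groups of 20 subjects; purely illustrative numbers
group_a = rng.normal(0.30, 0.05, size=20)
group_b = rng.normal(0.35, 0.05, size=20)

observed = group_b.mean() - group_a.mean()
pooled = np.concatenate([group_a, group_b])

# Permutation test: reshuffle the group labels and compare the
# permuted group differences to the observed one
n_perm = 10_000
count = 0
for _ in range(n_perm):
    perm = rng.permutation(pooled)
    diff = perm[20:].mean() - perm[:20].mean()
    if abs(diff) >= abs(observed):
        count += 1
p_value = (count + 1) / (n_perm + 1)   # two-sided, with add-one smoothing
```

Permutation tests make no distributional assumption on the metric, which matters because graph metrics are rarely Gaussian.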
  • 4 Integration, within and across networks
    Network-wide activity: the amount of signal in Σ_network. Determinant |Σ_network| = generalized variance.
    Network integration: log |Σ_A|
    Cross-talk between networks A and B: mutual information = log |Σ_A| + log |Σ_B| − log |Σ_AB|
    [Marrelec 2008, Varoquaux NIPS 2010]
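Under a Gaussian model, the mutual information between two sets of nodes A and B is (1/2)(log |Σ_A| + log |Σ_B| − log |Σ_AB|); a NumPy sketch using log-determinants, on synthetic networks with injected cross-talk:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic signals: network A = nodes {0, 1, 2}, network B = {3, 4},
# with one node of B driven by a node of A (cross-talk)
x = rng.standard_normal((1000, 5))
x[:, 3] += 0.5 * x[:, 0]
cov = np.cov(x, rowvar=False)

def log_det(m):
    """log |Sigma|: log of the generalized variance."""
    _, ld = np.linalg.slogdet(m)
    return ld

A, B = [0, 1, 2], [3, 4]
integration_A = log_det(cov[np.ix_(A, A)])
mutual_info = 0.5 * (log_det(cov[np.ix_(A, A)])
                     + log_det(cov[np.ix_(B, B)])
                     - log_det(cov))
```

`slogdet` is preferred over `log(det(...))` for numerical stability; the Hadamard-Fischer inequality guarantees the mutual information is non-negative for any covariance.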
  • Wrapping up: pitfalls
    Missing nodes.
    Very-correlated nodes, e.g. nearly-overlapping regions.
    Hub nodes give noisier partial correlations.
  • Wrapping up: take-home messages
    Regress confounds out from the signals.
    Use the inverse covariance to capture only direct effects.
    Correlations co-fluctuate ⇒ localization of differences is difficult.
    Networks are interesting units for comparison.
    Slides online: http://gael-varoquaux.info
  • References (not exhaustive)
    [Achard 2006] A resilient, low-frequency, small-world human brain functional network with highly connected association cortical hubs, J Neurosci
    [Behzadi 2007] A component based noise correction method (CompCor) for BOLD and perfusion based fMRI, NeuroImage
    [Bullmore 2009] Complex brain networks: graph theoretical analysis of structural and functional systems, Nat Rev Neurosci
    [Eguiluz 2005] Scale-free brain functional networks, Phys Rev E
    [Fransson 2008] The precuneus/posterior cingulate cortex plays a pivotal role in the default mode network: evidence from a partial correlation network analysis, NeuroImage
    [Fox 2005] The human brain is intrinsically organized into dynamic, anticorrelated functional networks, PNAS
    [Lewis 2009] Learning sculpts the spontaneous activity of the resting human brain, PNAS
    [Marrelec 2006] Partial correlation for functional brain interactivity investigation in functional MRI, NeuroImage
    [Marrelec 2007] Using partial correlation to enhance structural equation modeling of functional MRI data, Magn Res Im
    [Marrelec 2008] Regions, systems, and the brain: hierarchical measures of functional integration in fMRI, Med Im Analys
    [Smith 2010] Network modelling methods for fMRI, NeuroImage
    [Tononi 1994] A measure for brain complexity: relating functional segregation and integration in the nervous system, PNAS
    [Varoquaux MICCAI 2010] Detection of brain functional-connectivity difference in post-stroke patients using group-level covariance modeling, Med Imag Proc Comp Aided Intervention
    [Varoquaux NIPS 2010] Brain covariance selection: better individual functional connectivity models using population prior, Neural Inf Proc Sys
    [Varoquaux 2012] Markov models for fMRI correlation structure: is brain functional connectivity small world, or decomposable into networks?, J Physio Paris
    [Varoquaux 2013] Learning and comparing functional connectomes across subjects, NeuroImage
    [Zalesky 2012] On the use of correlation as a measure of network connectivity, NeuroImage