Kyle Myers, PhD

  • Major emphasis of our work
  • Nonlinear in most general case
    More on Fourier
  • This approach is routinely used at the University of Arizona. A radioactive source is systematically stepped through the object domain; each source location yields approximately 100,000 measurements. That is, 100,000 measurements for each of about 375,000 source locations, if 2mm object voxels are used! The system matrix is sparse, though, so only about 2% of the elements must be stored.
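The storage arithmetic above can be sketched in a few lines. This is a toy illustration only: the sizes are scaled down from the roughly 100,000 × 375,000 system described in the note, and the 2% density is taken from the text; everything else is invented for the example.

```python
# Toy sketch of sparse system-matrix storage (sizes scaled down ~100x from
# the FASTSPECT-style geometry described above; 2% density from the text).
import numpy as np
from scipy.sparse import random as sparse_random

rng = np.random.default_rng(0)
n_meas, n_vox = 1000, 3750          # stand-ins for ~1e5 x ~3.75e5
H = sparse_random(n_meas, n_vox, density=0.02, random_state=0, format="csr")

dense_bytes = n_meas * n_vox * 8    # float64 dense storage
sparse_bytes = H.data.nbytes + H.indices.nbytes + H.indptr.nbytes
print(f"dense: {dense_bytes/1e6:.1f} MB, sparse: {sparse_bytes/1e6:.1f} MB")

f = rng.random(n_vox)               # toy object
g = H @ f                           # forward projection stays cheap
```

At the full scale in the note, dense float64 storage would be roughly 300 GB, while 2% sparsity brings it down by about a factor of 40, which is what makes a measured H practical to store and apply.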
  • Lost information – need to know if this is crucial to task performance
    Image reconstruction: many algorithms try to put lost information back by using informative priors
  • By performance, we mean how well the system does at giving information to the observer to perform the task! For hardware evaluation and optimization we’d like to assess the information content in the raw data in terms of its ability to allow the best possible observer to perform the task. Thus we make use of the ideal or Bayesian observer.
  • Listed in order of increasing cost and increasing realism. Obviously, the SKE and SKEV paradigms are relevant only to classification tasks.
    As we shall see, the relationship between SKS and SKEV is known for the ideal observer.
    Interestingly, it has been shown that SKEV can be used to predict SKS in experiments involving human observers as well. Much more remains to be done on this topic.
  • Images from pinhole imagers of various Gaussian blur (single aperture). Estimation task: determine signal width and amplitude. Signal is Gaussian blob, background is lumpy and there is additional photon noise. Usual noise vs. resolution trade-off.
    Note: optimal system is task-dependent!!!
  • Don’t assume that the object is discrete and pixelated; it is a vector in a Hilbert space.
    Noise may be object-dependent.
    We observe g and use it to classify image or make inferences about underlying parameters of the object.
    Framework comes from statistical decision theory

    1. Knowledge of the forward problem is critical for solving an inverse problem
       • Forward problem: description of the data for a known object
       • Inverse problem: inference about an underlying object from an image
    2. Knowledge of the forward problem: description of the data for a known object
       • Provides a description of images/data: noise, resolution, artifacts, …
       • Optimal classification and estimation depend on the likelihood of the data given the underlying object
    3. Object property being imaged (with example modalities)
       • Acoustic reflectance: medical ultrasound
       • Concentration: nuclear medicine, MRI (spin density), MRS
       • Field strength: biomagnetic imaging
       • Attenuation: film densitometry, transmission x-ray
       • Scattering properties: medical ultrasound
       • Electric, magnetic properties: impedance tomography, MRI (magnetization), MRI (spin relaxation)
       • Source strength: fluorescence microscopy
       • Index of refraction: phase-contrast microscopy
       • Gene expression: DNA chips, microarrays
    4. Image acquisition: a mapping from object space to data space
       • g = H f = the data (image)
       • H = the imaging process (mapping)
       • f = tumor/object/patient (what we want to know)
       • Which H is best? What more can we do with possible improvements in H?
    5. Need models/measures of H to characterize the data
    6. Singular value decomposition (SVD): a tool for understanding the forward problem
       • Basis functions are found by eigenanalysis of HᵀH
       • Continuous-continuous (CC), linear shift-invariant (LSIV) system: Fourier theory applies; basis functions are wavefunctions; the MTF describes resolution; the NPS describes noise
       • Continuous-discrete (CD) system: H is shift-variant; resolution and noise depend on location; basis functions may be "natural pixels"
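The SVD viewpoint on the slide above can be made concrete with a toy shift-variant blur. Everything here (matrix size, blur widths) is invented for illustration; the point is that the singular values of H, i.e., the square roots of the eigenvalues of HᵀH, expose how strongly the system attenuates fine detail, location by location, where Fourier analysis would no longer apply.

```python
# Toy shift-variant Gaussian blur: the width grows across the field of view,
# so H is not LSIV and its natural basis comes from the SVD, not Fourier.
import numpy as np

n = 64
x = np.arange(n)
sigma = 1.0 + 2.0 * x / n                     # blur width varies with row
H = np.exp(-((x[None, :] - x[:, None]) ** 2) / (2 * sigma[:, None] ** 2))
H /= H.sum(axis=1, keepdims=True)             # normalize each row

U, s, Vt = np.linalg.svd(H)                   # s: square roots of eig(H^T H)
print("largest singular value:", s[0])
print("smallest singular value:", s[-1])      # fine detail is nearly nulled
```

The rows of `Vt` associated with the small singular values are the object components the system can barely see, which leads directly to the null-space discussion that follows.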
    7. Measure the mapping: when the object is a point source, f(r) = δ(r − r₀), the image is the detector sensitivity function, a component of the mapping H.
    8. Measuring H on FASTSPECT II at the University of Arizona
    9. Eigenfunctions of an octagonal SPECT system. Barrett et al., IPMI (1991).
    10. Null space: H f_null = 0
       • If f1 and f2 differ by a null function, then H f1 − H f2 = 0: no difference in the image
       • CC system: null functions arise where the MTF has zeros
       • CD system examples: finite sampling, limited-angle tomography, temporal sampling, spatial sampling (pixel binning)
       • All digital systems have null functions
       • The object cannot be recovered uniquely from the image
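A minimal sketch of a null function, using the pixel-binning example from the slide above. The 8-pixel object and the binning operator are invented for the example; the point is that two different objects can produce identical data.

```python
# Toy CD system: 2x pixel binning. Any object component that is
# equal-and-opposite within a bin is a null function: H f_null = 0.
import numpy as np

n = 8
H = np.zeros((n // 2, n))
for i in range(n // 2):
    H[i, 2 * i] = H[i, 2 * i + 1] = 0.5       # average adjacent pixel pairs

f1 = np.array([1., 1., 2., 2., 3., 3., 4., 4.])
f_null = np.array([1., -1., 0., 0., 0., 0., 0., 0.])   # invisible to H
f2 = f1 + f_null                               # a genuinely different object

print(H @ f1)   # identical data...
print(H @ f2)   # ...so f1 and f2 cannot be distinguished from the image
```

This is the mechanism behind the artifact and reconstruction slides that follow: no algorithm can decide between f1 and f2 from the data alone.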
    11. Null functions cause artifacts. Reconstruction of a brain phantom by filtered backprojection. (Courtesy of C.K. Abbey.)
    12. Image reconstruction
       • Regularization can reduce objectionable artifacts
       • It cannot put back what is lost to null functions
       • It makes noise nonlocal: contributions come from the entire image
       Sequence of reconstructions of a brain phantom by the MLEM algorithm after 10, 20, 50, 100, 200, and 400 iterations. (Courtesy of D.W. Wilson.)
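A minimal 1-D MLEM sketch, for readers unfamiliar with the algorithm named on the slide above. The phantom, blur, and sizes are invented toys (this is not the brain-phantom study cited there); it shows the multiplicative update f ← f · Hᵀ(g / Hf) / Hᵀ1 applied to Poisson data.

```python
# Toy 1-D MLEM reconstruction with Poisson noise and a Gaussian blur model.
import numpy as np

rng = np.random.default_rng(5)
n = 32
x = np.arange(n)
H = np.exp(-0.5 * ((x[None, :] - x[:, None]) / 2.0) ** 2)
H /= H.sum(axis=0, keepdims=True)            # each column sums to 1

f_true = np.zeros(n)
f_true[10:14] = 50.0                         # toy "lesions"
f_true[22] = 80.0
g = rng.poisson(H @ f_true).astype(float)    # noisy projection data

f = np.ones(n)                               # positive initial estimate
sens = H.sum(axis=0)                         # sensitivity = H^T 1
for _ in range(100):
    ratio = g / np.clip(H @ f, 1e-12, None)
    f *= (H.T @ ratio) / sens                # multiplicative MLEM update
print(np.round(f, 1))
```

Running more iterations sharpens estimable structure but also amplifies noise, which is the trade-off the 10-to-400-iteration sequence on the slide illustrates.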
    13. Knowing the forward problem means knowing the null space. Barrett et al., IPMI (1991).
    14. Classification tasks: the ideal (Bayesian) observer
       • The optimal classifier is based on the likelihood ratio Λ(g) = pr(g | H₂) / pr(g | H₁), where H₂ = disease present and H₁ = disease absent
       • Performance is determined by the statistics of the likelihood ratio: ROC analysis
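For the special case of a signal known exactly in white Gaussian noise, the log-likelihood ratio reduces to a matched filter, and ideal-observer performance has a closed form. The following sketch uses invented signal parameters and trial counts purely to illustrate this textbook case; it is not a model of any system discussed in the talk.

```python
# SKE detection in white Gaussian noise: the ideal observer's test
# statistic is t(g) = s.g (matched filter); AUC = Phi(d'/sqrt(2)).
import numpy as np
from math import erf

rng = np.random.default_rng(1)
n, sigma = 100, 1.0
s = 0.3 * np.exp(-0.5 * ((np.arange(n) - n / 2) / 5.0) ** 2)  # toy blob

n_trials = 5000
g1 = sigma * rng.standard_normal((n_trials, n))        # signal absent
g2 = s + sigma * rng.standard_normal((n_trials, n))    # signal present
t1, t2 = g1 @ s, g2 @ s                                # matched-filter outputs

auc_emp = (t2[:, None] > t1[None, :]).mean()           # Mann-Whitney AUC
d_prime = np.linalg.norm(s) / sigma
auc_theory = 0.5 * (1 + erf(d_prime / 2))              # Phi(d'/sqrt(2))
print(f"empirical AUC {auc_emp:.3f} vs theory {auc_theory:.3f}")
```

The empirical ROC area converges to the analytic value, which is why the statistics of the likelihood ratio, rather than pixel-level image quality, are the right figure of merit here.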
    15. Estimation tasks
       • Tumor volume: requires delineation of the border
       • Tracer uptake: total or specific activity
       • Angiogenesis: vessel tortuosity. Bullitt et al., IEEE TMI (2003).
    16. Estimation: basic concepts
       • θ is a P-dimensional vector of object parameters
       • pr(θ) is the prior probability density; it describes the underlying randomness in the parameters
       • pr(g | θ) is the mapping from parameters to data: the likelihood of the data given θ
       • θ̂(g) is the estimate of the parameter vector
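A minimal worked example of these concepts, under assumptions invented for illustration: a scalar amplitude θ scales a known signal shape in Gaussian noise, for which maximizing pr(g | θ) gives a matched-filter estimate.

```python
# Toy scalar estimation: g = theta * s + noise, known shape s.
# For Gaussian noise the MLE is theta_hat = (s.g) / (s.s).
import numpy as np

rng = np.random.default_rng(2)
n, sigma, theta_true = 50, 0.5, 2.0
s = np.exp(-0.5 * ((np.arange(n) - 25) / 4.0) ** 2)   # known signal shape

g = theta_true * s + sigma * rng.standard_normal(n)   # one noisy image
theta_hat = (s @ g) / (s @ s)                         # maximizes pr(g|theta)
print(f"theta_true = {theta_true},  theta_hat = {theta_hat:.3f}")
```

With a prior pr(θ) one would instead maximize the posterior, which for a Gaussian prior simply shrinks this estimate toward the prior mean.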
    17. Estimability
       • A parameter is estimable if pr(g | θ₁) = pr(g | θ₂) implies θ₁ = θ₂
       • Closely linked to null functions
       • Estimates of pixel values run into problems of estimability
       • See Barrett and Myers (2004)
    18. Figures of merit
       • Bias, variance
       • Mean-square error (MSE): the overall fluctuation in the estimate for a particular θ
       • Requires a gold standard, i.e., the true value of the parameter
       • Only meaningful for estimable parameters
       • Limited by measurement noise, anatomical variation, and the form of the estimator
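These figures of merit can be sketched by Monte Carlo, again with invented toy parameters: repeat the estimation over many noise realizations of the same θ (the gold standard) and tabulate bias, variance, and MSE, which decomposes as bias² + variance.

```python
# Monte-Carlo bias/variance/MSE of the matched-filter amplitude estimator
# for a fixed true parameter (the "gold standard").
import numpy as np

rng = np.random.default_rng(3)
n, sigma, theta_true, n_trials = 50, 0.5, 2.0, 20000
s = np.exp(-0.5 * ((np.arange(n) - 25) / 4.0) ** 2)

g = theta_true * s + sigma * rng.standard_normal((n_trials, n))
theta_hat = g @ s / (s @ s)                 # one estimate per realization

bias = theta_hat.mean() - theta_true
var = theta_hat.var()
mse = np.mean((theta_hat - theta_true) ** 2)
print(f"bias {bias:+.4f}  variance {var:.4f}  MSE {mse:.4f}")
```

The ensemble MSE of the next slide is this same quantity averaged once more, over draws of θ from the prior pr(θ).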
    19. Figures of merit, cont'd
       • Ensemble MSE (EMSE): requires knowing the prior on θ
       • Prior information can be statistical or model-based
       • A prior makes the problem well-posed
    20. Family of possible tumors
       • Tumor = t(θ_t); parameters include location, size, shape, density
       • Some unknowns are nuisance parameters: estimate them or marginalize over them
       • The key to tractability is knowledge of pr(θ_t)
       (Courtesy of Miguel Eckstein, UCSB.)
    21. Inhomogeneous backgrounds can mask the tumor and its margins
       • An additional source of variability in the data
       • Degrades tumor detectability and estimation of tumor parameters
       • Reduced noise and increased resolution may not improve task performance
       • Many models for pr(θ_b) exist to describe random backgrounds
    22. No-gold-standard estimation
       • Use at least two modalities to estimate θ, or use at least two estimators on the same data
       • Regress the estimates from all sources against one another
       • Requires a model for the parameter θ and knowledge of pr(g | θ)
       Hoppin et al., IEEE TMI (2002).
    23. Estimation results and detection results (Kupinski et al., SPIE 2003): the optimal acquisition system is task-dependent.
    24. Drug response studies using clinical (human) readers: beware of reader variability
       TPF vs. FPF for 108 US radiologists in a study by Beam et al. (1996).
    25. Drug response studies using clinical (human) readers, cont'd
       • Reader variability adds to the sources of variability in the study
       • More cases are needed to power the study
       • Analyzed via random-effects or multivariate ROC analysis
       • Multi-reader multi-case (MRMC) ROC methodology is commonly used at CDRH to determine the contributions of variability due to the range of reader skill, reader threshold, and case difficulty
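To see why reader skill alone spreads the empirical ROC area, here is a small simulation with invented numbers (it does not reproduce the Beam et al. data): readers with different detectabilities d' rate the same-sized case sets, and their per-reader AUCs vary accordingly.

```python
# Simulated reader variability: per-reader empirical AUC for a range of
# reader skill (d'), estimated from rating data via the Mann-Whitney form.
import numpy as np

rng = np.random.default_rng(4)
n_cases = 200
aucs = []
for d_prime in (0.5, 1.0, 1.5, 2.0):          # invented range of skill
    neg = rng.standard_normal(n_cases)        # ratings, disease absent
    pos = rng.standard_normal(n_cases) + d_prime  # ratings, disease present
    aucs.append((pos[:, None] > neg[None, :]).mean())
print([round(a, 3) for a in aucs])
```

MRMC analysis exists precisely to separate this reader-skill component from case-difficulty and threshold effects when powering a study.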
    26. Why consider display image quality? The diagnostic imaging chain is only as effective as its weakest component.
       • Poor display quality can reduce the effectiveness of a diagnostic or screening test, lead to misdiagnosis, and cause inconsistent clinical decisions
       Diagram: image acquisition (x-ray generation, filtration, object, indirect digital detector), then image processing, PACS, and display processing. (Courtesy of Aldo Badano, CDRH.)
    27. The choice of image acquisition system and settings depends on the answers to these questions:
       • What information about the object is desired from the image?
       • How will that information be extracted?
       • What objects/patients will be imaged?
       • What measure of performance will be used?
    28. Summary
       • The future: knowledge of the forward problem will enable well-characterized, patient-specific image-acquisition choices and processing/estimation methods
       • For now:
         • Make sure the problem is well-posed and the parameters are estimable
         • Avoid pixel-based techniques; use model-based (low-dimensional) methods
         • Try to keep the human out of the loop
         • Validate, validate, validate!
