Towards second generation expert systems in telepathology for aid in diagnosis


Slides of my invited plenary talk at 10th European Congress on Telepathology and 4th International Congress on Virtual Microscopy, in Vilnius, Lithuania, 1-3 July 2010.


  1. Towards second generation expert systems in telepathology for aid in diagnosis
     Prof. Dr. Touradj Ebrahimi [email_address]
  2. Outline
     - Background and motivations
     - Towards a new paradigm in aid in diagnosis
     - Example of an existing attempt in deep space imaging
     - Outlook in telepathology
  3. Background
     - Image and signal processing have played important roles in modern telepathology and virtual microscopy.
  4. Example applications of image processing in telepathology
     - Image and video compression
     - Image segmentation
     - Feature extraction
     - Visualization
     - Acquisition
     - Search and retrieval
     - …
  5. Current paradigm in the use of image processing
     - Let machines perform various low-level but possibly complex image/signal processing operations on patient data and images, to help specialists in their diagnostic tasks:
       - All actions are initiated by specialists/operators.
       - All operations are applied exclusively to patient data/images.
       - Any feedback from the specialist/operator to the machine is explicit.
       - It is hard to code or record the specialists' experience and expertise in this process.
  6. Trends in the use of image/signal processing for telepathology
     - Improve image/signal processing algorithms and rely on Moore's law:
       - Higher resolution, more components, better quality, …
       - More advanced and more complex processing
       - Better human-machine interfaces
       - Larger storage capacity and transmission bandwidth
       - …
  7. Is there another paradigm?
     - The current trend is of course a good path, full of potential for future improvements.
     - But is there a different and/or complementary paradigm one could also explore?
  8. Another paradigm?
     - Apply image/signal processing not only to patient data/images, to help specialists better perform their tasks, but also to the specialists themselves while they make a diagnosis, to further help the whole process.
  9. Relationship with Human-Computer Interaction
     - There are several similarities between such a paradigm and advanced human-machine interaction concepts.
  10. Explicit versus implicit interaction
  11. Same paradigm in another application
     - Curiosity Cloning: neural analysis of scientific interest
     - Focus of application: deep space exploration imaging
       - Several similarities with telepathology
     - Collaborative project between:
       - Advanced Concepts Team of the European Space Agency
       - Multimedia Signal Processing Group, EPFL
     - See for further information
  12. Deep space exploration application under consideration
     - Autonomy in space exploration is necessary.
     - The bandwidth of an exploration mission is very limited.
     - The rover must select autonomously what is scientifically relevant.
     - The software upgrade of the NASA MER rovers (Opportunity and Spirit) was a step in this direction:
       - Automatic dust-devil and cloud detection
       - Pattern matching
  13. Curiosity cloning in deep space exploration
     - Pure pattern matching, the Scientific Richness Index, and other classifiers are programmed to find what we already know: the expected.
     - Q: Can we code the interest in the unexpected?
     - (Scientific) curiosity?
  14. Curiosity cloning in deep space exploration
     - An alternative to an explicit and specific definition of what we are looking for (e.g., dust-devils):
     - Present many images to experts (e.g., experts on Mars geology) and have them rate the images.
     - Image/rating pairs can form a training set for a classifier.
     - The classifier could be programmed on a rover.
     - The robot's brain would be a "clone" of the scientist's interest, curiosity, and expertise.
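The training step described on this slide can be illustrated with a small sketch. The example below is hypothetical: it uses synthetic feature vectors and synthetic expert ratings in place of real image descriptors, and a plain logistic-regression classifier trained by gradient descent stands in for whatever classifier the project actually deployed.

```python
import numpy as np

# Hypothetical sketch: learn an "interest" score from expert image ratings.
# Features are synthetic stand-ins for real image descriptors.
rng = np.random.default_rng(0)

n, d = 200, 5
X = rng.normal(size=(n, d))                # image feature vectors
w_true = np.array([1.5, -2.0, 0.5, 0.0, 1.0])
ratings = (X @ w_true > 0).astype(float)   # expert "interesting?" labels

# Logistic regression by gradient descent: the "cloned curiosity" classifier
w = np.zeros(d)
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))     # predicted interest probability
    w -= 0.1 * X.T @ (p - ratings) / n     # gradient step on log-loss

pred = (X @ w > 0).astype(float)
accuracy = (pred == ratings).mean()
print(f"training accuracy: {accuracy:.2f}")
```

Once trained, such a classifier is just a weight vector and a dot product per image, which is why it could plausibly run on board a rover with very limited compute.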
  15. Curiosity cloning in deep space exploration
     - So how can we do it?
       - Why not a "push-the-button" technique?
       - Why not an "interrogation" technique?
     - Because:
       - We can go much faster.
       - We can escape conscious filtering.
     - Find a biometric signature of a subject's curiosity and expertise, which should arguably be embedded in the brain waves.
  16. Three image sets in three different experiments
  17. Image presentation based on the P300 detection approach
     - CCViewer software was developed (freely available for download).
     - The software was controlled via Matlab (MEX interface).
     - Inputs and outputs:
       - Experiment configuration file
       - Log file of each experiment
     (Timing diagram: presentation sequence from the start point, with intervals T1 and T2, stimuli S, and target T.)
  18. Overview of the experiment environment
     (Diagram: Matlab code starting at T*, EEG acquisition device, eye tracker, and image presentation software.)
  19. Subjects
     - Most naïve (non-specialist) subjects in the experiments were PhD students at EPFL.
     - In experiment 1, using trivial images, different values of T1 and T2 were applied and their impact was studied. No specialists took part.
     - In experiment 2, using graphs of physics phenomena, a specialist was added to the set of subjects.
     - In experiment 3, using deep space images, a specialist was also added to the subjects.
  20. Operator performing the experiments
  21. EEG signal acquisition
     - The average signal from the T7 and T8 electrodes was used for referencing, to achieve a full 80 dB CMRR.
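The referencing step on this slide can be sketched numerically. This is an illustrative example on synthetic data, not the authors' acquisition code: it re-references a multichannel recording to the average of the T7 and T8 electrodes by subtracting that average from every channel.

```python
import numpy as np

# Illustrative sketch: re-reference multichannel EEG to the mean of T7 and T8.
rng = np.random.default_rng(1)
channels = ["PZ", "P3", "P4", "T7", "T8"]
eeg = rng.normal(size=(len(channels), 1000))   # channels x samples (synthetic)

t7 = eeg[channels.index("T7")]
t8 = eeg[channels.index("T8")]
reference = (t7 + t8) / 2.0

# Subtract the reference signal from every channel
eeg_ref = eeg - reference

# After re-referencing, T7 and T8 are exact mirror images around zero
mirror_ok = np.allclose(eeg_ref[channels.index("T7")],
                        -eeg_ref[channels.index("T8")])
print("T7/T8 mirror:", mirror_ok)
```

The hardware CMRR figure comes from the acquisition front end; the software subtraction shown here only determines which potential the recorded channels are measured against.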
  22. Experimental protocol
     - Every subject completed two recording sessions. For all subjects, the time between the first and the last session was less than one week.
     - Subjects were asked to perform a covert task, namely silently counting how often a target appeared (P300 detection).
     - At the start of every session, a gray image was displayed on the screen, followed by a countdown from five to zero.
     - As soon as the countdown finished, a random sequence of images was presented and EEG was recorded, along with eye-tracking data (which were not used in the analysis of the results).
     - After each run, subjects were asked what their counting result was.
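The averaged target and non-target traces shown on the following slides come from standard ERP processing: cut the continuous EEG into epochs locked to each stimulus onset, then average target and non-target epochs separately. A minimal sketch, with an assumed sampling rate and synthetic single-channel EEG, might look like:

```python
import numpy as np

# Sketch of oddball-paradigm ERP averaging on synthetic data.
rng = np.random.default_rng(2)
fs = 256                                   # sampling rate in Hz (assumed)
n_samples = fs * 60                        # one minute of "EEG"
eeg = rng.normal(size=n_samples)

# Stimulus onsets every 500 ms; roughly 20% of stimuli are targets
onsets = np.arange(fs, n_samples - fs, fs // 2)
is_target = rng.random(len(onsets)) < 0.2

# Cut an 800 ms epoch after each onset
epoch_len = int(0.8 * fs)
epochs = np.stack([eeg[o:o + epoch_len] for o in onsets])

# Average within each stimulus class to reveal the event-related potential
erp_target = epochs[is_target].mean(axis=0)
erp_nontarget = epochs[~is_target].mean(axis=0)
print(erp_target.shape, erp_nontarget.shape)
```

On real data, the target average would show the P300 deflection around 300 ms after onset, while the non-target average stays close to baseline, which is exactly the contrast plotted in red versus blue on the next slides.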
  23. Influence of the rate (experiment 1)
     Averaged signals from the PZ electrode for different image display rates (top left: 500 ms, to bottom right: 50 ms). The horizontal axis is the time after stimulus onset and the vertical axis is the amplitude of the P300 signal. Target (red) and non-target (blue) stimuli.
  24. Influence of the position of the EEG sensor (experiment 1)
     Averaged target (red) and non-target (blue) signals taken from electrodes P3, PZ, PO3, PO4, P4, and CZ (top left to bottom right). The horizontal axis corresponds to the time after stimulus onset and the vertical axis shows the amplitude of the P300 signal.
  25. Specialist versus naïve subjects (experiment 3)
     Averaged PZ-electrode signals per subject. Top left: specialist; others: naïve subjects. The horizontal axis is the time after stimulus onset and the vertical axis is the amplitude of the P300 signal. Scientifically interesting target (red), non-target (blue), and non-obvious target (dashed black) stimuli.
  26. Conclusions of this study
     - In this study, the possibility of selecting scientifically interesting images using EEG signals was assessed in a deep space imaging application.
     - It has been shown that relatively good results can be obtained when the inter-stimulus interval (ISI) is long enough.
     - As the image display rate increases, the amplitude of the P300 signal decreases.
     - It has been observed that there is a clear difference in the P300 signals of the specialist subject, compared to the naïve subjects, when observing non-obvious target stimuli.
     - The P300 patterns of different subjects differ in amplitude and in the dominant sensors; hence, developing a general classifier is a challenging but solvable problem.
  27. Can these results be extended to telepathology?
     - The only way to answer is to test it!
     - But if they can, there will be very interesting educational, research, and clinical impacts in telepathology.
     - Ingredients needed for a first set of experiments to check this:
       - An adequate dataset of images exhibiting various labeled pathologies
       - An experimental setup (the current protocol in place should be enough for now)
       - Pathology specialists
     - So, if you are a specialist in pathology, or know a specialist interested in taking part in these experiments, please contact me.
  28. Acknowledgements
     - COST IC0604 EURO-TELEPATH
     - The State Secretariat for Education and Research in Switzerland
     - The European Space Agency and its Advanced Concepts Team
     - Ashkan Yazdani, my PhD student, who carried out the various tests and analyses presented
     - The naïve and expert subjects who spent time and effort making these results possible
  29. Thank you for your attention
     - Questions?
     - Remarks?
     - Reactions?