Augmented reality in spine surgery


A presentation about the use of augmented reality in spine surgery


  1. Augmented Reality in Spine Surgery
     By Esam Elkhatib, MD, Assistant Professor of Neurosurgery, Suez Canal Medical School, Egypt
  2. Definition of Augmented Reality
     • Virtual Environments (VE): completely replace the real world
     • Augmented Reality (AR): the user sees the real environment, with virtual content combined with the real
     • Supplements reality instead of completely replacing it
     • Photorealism is not necessarily a goal
  3. Definition of AR, continued
     • Blends real and virtual in the real environment
     • Real-time and interactive
     • Registered in 3-D
     • Includes the idea of removing part of the real environment (a.k.a. mediated or diminished reality)
  4. Why Use AR?
     • Enhances perception of, and interaction with, the real world
     • Potential for productivity improvements in real-world tasks
     • A relatively new field with many open problems, but much recent progress
  5. History of AR
     • 1960s: Sutherland and Sproull build the first head-mounted display (HMD) system
     • Early 1990s: Boeing coins the term "AR"
     • Early to mid 1990s: UNC ultrasound visualization project
     • 1994: Motion-stabilized display [Azuma]
     • 1994: Fiducial tracking in video see-through [Bajura/Neumann]
     • 1996: UNC hybrid magnetic-vision tracker
  6. Growth of the field: projects
     • 2000: Custom see-through HMDs, Mixed Reality Systems Laboratory (Japan)
     • 2003: Project ARVIKA (Germany)
  7. Applications: medical
     • "X-ray vision" for surgeons
     • Aids visualization in minimally invasive operations (UNC Chapel Hill)
  8. Optical see-through HMD
  9. HMD
  10. Video see-through HMD
  11. Video see-through HMD
  12. Video monitor AR
  13. Optical Strengths
     • Simpler (cheaper)
     • Direct view of the real world
     • Full resolution and no time delay (for the real world)
     • Safety
     • Lower distortion
     • No eye displacement
  14. Video Strengths
     • Digitized image of the real world
     • Matchable time delays
     • More registration and calibration strategies
     • Medical: video is superior for calibration strategies
  15. Registration and Tracking
     • Tracking is the basic technology for AR: without accurate tracking you cannot generate a merged real-virtual environment
     • Tracking is significantly more difficult in AR than in virtual environments
     • "Tracking is the stepchild that nobody talks about." - Henry Sowizral, Scientific American, Dec 1994
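Why accurate tracking matters can be sketched numerically: a virtual overlay is drawn by projecting anatomy points through the tracked camera pose, so any pose error shows up directly as on-screen misalignment. The pinhole model, focal length, and 5 mm error below are illustrative assumptions, not values from the presentation.

```python
import numpy as np

def project(point_world, T_cam_world, f=800.0, cx=320.0, cy=240.0):
    """Pinhole projection of a world-space point given a 4x4 camera pose."""
    p = T_cam_world @ np.append(point_world, 1.0)   # world -> camera frame
    u = f * p[0] / p[2] + cx
    v = f * p[1] / p[2] + cy
    return np.array([u, v])

# Identity pose: camera at the origin looking down +Z.
T = np.eye(4)
target = np.array([0.0, 0.0, 2.0])          # virtual landmark 2 m ahead
print(project(target, T))                    # lands at the principal point

# A small tracking error (5 mm lateral) shifts the whole overlay on screen.
T_err = np.eye(4)
T_err[0, 3] = 0.005
err_px = np.linalg.norm(project(target, T) - project(target, T_err))
print(f"overlay error: {err_px:.1f} px")
```

Even this toy setup shows the sensitivity: a millimeter-scale pose error becomes a visible pixel offset, which is why AR tracking demands far tighter accuracy than immersive VE tracking.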
  16. The Registration Problem
     • Virtual and real must stay properly aligned
     • If not, the illusion that the two coexist is compromised, preventing acceptance of many applications
     • A pointed question: would you want a surgeon cutting into you if the virtual cut marks were misaligned?
  17. Vision-Based Approaches
     • Fiducials in the environment (LEDs, colored dots)
     • Template matching
     • Restricted environments with known objects
     • More sensors (e.g., laser rangefinders)
     • Keeping the user in the loop (manual identification)
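The fiducial idea in the first bullet can be sketched very simply: a bright LED or colored dot is segmented by thresholding and localized by its centroid, which real trackers then refine to sub-pixel accuracy. The frame size, threshold, and blob below are made-up test data.

```python
import numpy as np

def find_fiducial(frame, threshold=200):
    """Centroid of pixels brighter than `threshold` in a grayscale frame."""
    ys, xs = np.nonzero(frame > threshold)
    if xs.size == 0:
        return None                              # marker not visible this frame
    return float(xs.mean()), float(ys.mean())    # centroid in pixel coordinates

frame = np.zeros((480, 640), dtype=np.uint8)
frame[100:105, 200:205] = 255                    # synthetic 5x5 LED blob
print(find_fiducial(frame))                      # -> (202.0, 102.0)
```

Placing several such fiducials at known 3-D positions lets the system recover the camera pose from their detected image locations.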
  18. Research Directions in Tracking and Registration
     • Hybrid tracking systems: combine approaches to cover each other's weaknesses; systems built for greater input variety and bandwidth
     • Hybrid systems and techniques, e.g., using multiple registration techniques
  19. Real Time: Does It Exist?
     • True real-time systems must synchronize with the real world
     • Time becomes a first-class citizen: time-critical rendering
     • Goal: accurate tracking at long range in unstructured environments
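"Synchronizing with the real world" in practice means compensating tracker and render latency. One common trick, sketched here in one dimension with made-up numbers, is to extrapolate the tracked position forward over the render latency with a constant-velocity model.

```python
def predict(pos_prev, pos_now, dt_sample, dt_latency):
    """Linearly extrapolate a tracked position over the render latency."""
    velocity = (pos_now - pos_prev) / dt_sample
    return pos_now + velocity * dt_latency

# Head moving at 0.5 m/s, tracker sampled every 10 ms, 50 ms render latency:
p = predict(pos_prev=1.000, pos_now=1.005, dt_sample=0.010, dt_latency=0.050)
print(round(p, 3))   # -> 1.03, where the head will be when the frame appears
```

Without such prediction, the overlay is drawn where the head *was* 50 ms ago, producing the "swimming" misregistration that motivates time-critical rendering.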
  20. Tracking Technologies
     • Active sources: optical, magnetic, ultrasonic
     • Require a structured, controlled environment and have restricted range
     • Magnetic tracking is vulnerable to distortions; ultrasonic tracking to ambient temperature variations; optical tracking is often expensive
  21. Markerless Tracking
     • Tracking without adding special markers
     • Provides real-time operation
     • Enables outdoor operation
  22. Face Augmentation with Markerless Tracking
     • Lepetit et al., "Fully Automated and Stable Registration for Augmented Reality Applications", 2003 (Tokyo)
     • Uses one generic 3-D face model and grabs the image texture of the user's face
  23. Head orientation changes and backgrounds
  24. Intraoperative AR
     • AR for intraoperative visualization and navigation has been a subject of intensive research and development during the last decade
     • Besides the accuracy and speed of the system, the 3-D visualization itself must be improved
     • In-situ visualization offers the programmer more freedom than classical on-screen visualization
  25. Intraoperative AR, continued
     • Before surgery: a model of the patient's anatomy is derived from a CT scan
     • During surgery: the locations of the surgical instruments are tracked relative to the patient and the anatomy model
     • This helps the surgeon with both global and local navigation, providing a global map and 3-D information beyond the local 2-D view
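The tracking step above boils down to composing rigid transforms: the tracker reports the instrument and a patient-mounted reference in its own frame, and the instrument tip in CT (anatomy-model) coordinates follows by chaining those poses with the preoperative registration. The 4x4 matrices below are hypothetical pure translations chosen only to make the arithmetic easy to follow.

```python
import numpy as np

def translation(x, y, z):
    """4x4 homogeneous transform for a pure translation."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# Poses reported by the tracker (illustrative numbers):
T_tracker_instr   = translation(0.10, 0.00, 0.30)   # instrument in tracker frame
T_tracker_patient = translation(0.00, 0.00, 0.25)   # patient reference in tracker frame
T_ct_patient      = np.eye(4)                        # registration: patient ref -> CT

# Instrument pose in CT coordinates: CT <- patient <- tracker <- instrument.
T_ct_instr = T_ct_patient @ np.linalg.inv(T_tracker_patient) @ T_tracker_instr
tip_ct = T_ct_instr @ np.array([0.0, 0.0, 0.0, 1.0])
print(tip_ct[:3])    # tip at ~[0.1, 0.0, 0.05] in CT coordinates
```

Because the patient reference is tracked too, the chain stays valid even when the patient moves; only the preoperative registration `T_ct_patient` must be established once.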
  26. Pedicle Screw Placement under Video-Augmented Fluoroscopic Control
     • A camera-augmented mobile C-arm (CAMC) was set up and used to perform pedicle screw placement under video control
     • Uses a prototype Iso-C-3D fluoroscope (Siemens Medical Solutions, Erlangen) with an additionally attached CCD camera
  27. Steps
     • During the workflow, two markers are attached for position control
     • After taking a single x-ray shot, the image is merged with the video image
  28. Steps, continued
     • The entry point and the corresponding tool-tip placement are carried out under video control
     • After aligning the tool's axis with the optical axis, drilling "down the beam" can begin
     • If the markers do not coincide in the video and x-ray images, the patient has moved and a new x-ray image must be taken
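The marker-coincidence test in that last step amounts to a simple distance check: if either marker's position in the live video drifts beyond tolerance from its position in the x-ray overlay, the patient has moved and a fresh x-ray is required. The pixel positions and the 2-pixel tolerance below are invented for illustration.

```python
def needs_new_xray(markers_video, markers_xray, tol_px=2.0):
    """True if any video/x-ray marker pair is farther apart than tol_px."""
    return any(
        ((vx - xx) ** 2 + (vy - xy) ** 2) ** 0.5 > tol_px
        for (vx, vy), (xx, xy) in zip(markers_video, markers_xray)
    )

video = [(120.0, 80.0), (400.0, 310.0)]
xray  = [(120.5, 80.2), (400.1, 309.8)]
print(needs_new_xray(video, xray))          # False: markers still coincide

video_moved = [(126.0, 84.0), (405.0, 314.0)]
print(needs_new_xray(video_moved, xray))    # True: retake the x-ray
```

Running this check every frame is cheap, which is what makes single-shot x-ray guidance safe: the system can always detect when its overlay has gone stale.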
  29. Advantages
     • In the pedicle approach, a considerable reduction in fluoroscopy time and thus radiation dose
     • A single-shot x-ray image is sufficient once the pedicle axis is recognized
     • A new x-ray can be taken at any time to update the intervention (movement, implants)
     • Pedicle identification and needle insertion are possible, and tracking of the instrument enables visualization of its tip even after it enters the surface (skin/draping)