Augmented reality in spine surgery

A presentation about the use of augmented reality in spine surgery

Transcript

  • 1. Augmented Reality in Spine Surgery. By: Esam Elkhatib, MD, Assist. Prof. of Neurosurgery, Suez Canal Medical School, Egypt. ealkhatib@yahoo.com
  • 2. Definition of Augmented Reality
    • Virtual Environments (VE): Completely replaces the real world
    • Augmented Reality (AR): User sees real environment; combines virtual with real
    • Supplements reality, instead of completely replacing it
    • Photorealism not necessarily a goal
  • 3. Definition of AR continued
    • Blends real and virtual in a real environment.
    • Interactive in real time.
    • Registered in 3-D.
    • Includes idea of removing part of real environment (a.k.a. mediated or diminished reality)
  • 4. Why Use AR?
    • 1- Enhance perception of and interaction with the real world
    • 2- Potential for productivity improvements in real-world tasks
    • 3- Relatively new field with many problems, but much progress has occurred recently
  • 5. History of AR
    • 1960s: Sutherland and Sproull's first head-mounted display (HMD) system.
    • Early 1990s: Boeing coined the term "AR"
    • Early to mid-1990s: UNC ultrasound visualization project
    • 1994: Motion stabilized display [Azuma]
    • 1994: Fiducial tracking in video see-through [Bajura / Neumann]
    • 1996: UNC hybrid magnetic-vision tracker
  • 6. Growth of field: projects
    • 2000: Custom see-through HMDs
      • Mixed Reality Systems Laboratory (Japan)
      • http://www.mr-system.co.jp
    • 2003: Project ARVIKA (Germany)
      • http://www.arvika.de/
  • 7.  
  • 8. Applications: medical
    • "X-ray vision" for surgeons
    • Aids visualization in minimally invasive operations (UNC Chapel Hill).
  • 9. Optical see-through HMD
  • 10. HMD
  • 11. Video see-through HMD
  • 12. Video see-through HMD
  • 13. Video monitor AR
  • 14. Optical Strengths
    • Simpler (cheaper)
    • Direct view of real world
    • Full resolution, no time delay (for real world)
    • Safety
    • Lower distortion
    • No eye displacement.
  • 15. Video Strengths
    • Digitized image of real world
    • Matchable time delays
    • More registration, calibration strategies
    • Medical: video is superior for calibration strategies
  • 16. Registration and Tracking
    • Tracking is the basic technology for AR.
      • Without accurate tracking you can't generate the merged real-virtual environment
    • Tracking is significantly more difficult in AR than in Virtual Environments
    • "Tracking is the stepchild that nobody talks about." - Henry Sowizral, Dec 1994 Scientific American
  • 17. The Registration Problem
    • Virtual & Real must stay properly aligned
    • If not:
      • Compromises the illusion that the two coexist
      • Prevents acceptance of many applications
    • A pointed question: would you want a surgeon cutting into you if the virtual cut marks were misaligned? (A minimal projection sketch follows below.)
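
    A minimal sketch of the registration idea above, assuming a pinhole camera model and a tracked rigid pose. The intrinsics, pose, and entry point below are hypothetical values chosen only for illustration; they are not data from this presentation.

        # Project a point defined in CT (patient) coordinates into the camera image,
        # so the virtual overlay lands on the right pixels of the real view.
        import numpy as np

        # Hypothetical 3x3 camera intrinsics (focal lengths and principal point).
        K = np.array([[800.0,   0.0, 320.0],
                      [  0.0, 800.0, 240.0],
                      [  0.0,   0.0,   1.0]])

        # Hypothetical tracked pose: rotation R and translation t (mm) mapping
        # CT coordinates into camera coordinates.
        R = np.eye(3)
        t = np.array([0.0, 0.0, 500.0])

        def project(p_ct):
            """Rigidly transform a CT-space point into camera space, then project."""
            p_cam = R @ p_ct + t        # registration: CT -> camera
            u, v, w = K @ p_cam         # pinhole projection
            return u / w, v / w         # pixel coordinates for the overlay

        # A planned entry point expressed in CT coordinates (hypothetical).
        print(project(np.array([10.0, -5.0, 0.0])))
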
  • 18. Vision-based Approaches
    • Fiducials in environment (LEDs, colored dots)
    • Template matching (a minimal matching sketch follows this list)
    • Restricted environment with known objects
    • More sensors (e.g. laser rangefinder)
    • Keep user in the loop (manual identification)
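
    A minimal sketch of the template-matching approach listed above, using OpenCV's normalized cross-correlation. The file names and the 0.8 confidence threshold are assumptions for illustration.

        # Locate a known fiducial patch in the current camera frame.
        import cv2

        frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)        # current video frame
        fiducial = cv2.imread("fiducial.png", cv2.IMREAD_GRAYSCALE)  # known marker template

        # Slide the template over the frame and score every position.
        scores = cv2.matchTemplate(frame, fiducial, cv2.TM_CCOEFF_NORMED)
        _, best_score, _, best_xy = cv2.minMaxLoc(scores)

        if best_score > 0.8:   # confidence threshold (tunable assumption)
            print("fiducial at", best_xy, "score", best_score)
        else:
            print("not found; fall back to another tracker or manual identification")
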
  • 19. Research Directions in tracking & Registration
    • Hybrid tracking systems
      • Combine approaches, cover weaknesses
      • Systems built for greater input variety and bandwidth
    • Hybrid systems and techniques, e.g. using multiple registration techniques (a toy fusion sketch follows below)
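
    A toy sketch of the hybrid idea in the bullets above: fuse a fast but drifting gyroscope yaw with a slower, absolute vision-based yaw using a complementary filter, so each source covers the other's weakness. The signal values and the 0.98 weighting are assumptions, not from the presentation.

        def fuse_yaw(yaw_prev, gyro_rate, dt, vision_yaw, alpha=0.98):
            """Integrate the gyro for responsiveness, pull toward vision to cancel drift."""
            yaw_gyro = yaw_prev + gyro_rate * dt            # high-rate inertial prediction
            return alpha * yaw_gyro + (1 - alpha) * vision_yaw

        yaw = 0.0
        for gyro_rate, vision_yaw in [(0.10, 0.0), (0.12, 0.01), (0.09, 0.02)]:
            yaw = fuse_yaw(yaw, gyro_rate, dt=0.01, vision_yaw=vision_yaw)
        print(yaw)
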
  • 20. Real Time... Does It Exist?
    • True real-time systems
    • Must synchronize with the real world
    • Time becomes a first-class citizen: time-critical rendering
    • Goal: Accurate tracking at long ranges, in unstructured environments
  • 21. Tracking Technologies
    • Active sources
    • Optical, magnetic, ultrasonic
    • Requires structured, controlled environment
    • Restricted range
    • Magnetic: vulnerable to distortions
    • Ultrasonic: affected by ambient temperature variations
    • Optical: often expensive
  • 22. Markerless Tracking
    • Tracking without adding special markers
    • Provides: Real-time operation
    • Enables: Outdoor operation
  • 23. Face Augmentation Markerless Tracking
    • Lepetit et al.: "Fully Automated and Stable Registration for Augmented Reality Applications", 2003 (Tokyo).
    • Uses one generic 3D face model and grabs the image texture of the user's face.
  • 24. Head orientation changes and backgrounds
  • 25. Intraoperative AR
    • AR for intra-operative visualization and navigation has been a subject of intensive research and development during the last decade.
    • Besides the accuracy and speed of the system, the 3D visualization itself has to be improved
    • In-situ visualization offers more degrees of freedom for the programmer than classical on-screen visualization
  • 26. Intraoperative AR continued
    • Before surgery, a model of the patient's anatomy has to be derived from a CT scan
    • During surgery, the location of the surgical instruments is tracked in relation to the patient and the anatomy model
    • This helps the surgeon with both global and local navigation, providing a global map and 3D information beyond the local 2D view (a sketch of the transform chain follows below)
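
    A sketch of the navigation transform chain implied by this workflow: compose the rigid poses reported by the tracking system with the CT registration to express the instrument tip in the coordinate frame of the preoperative CT model. All frame names and numeric poses below are hypothetical.

        import numpy as np

        def rigid(R, t):
            """Build a 4x4 homogeneous transform from rotation R and translation t."""
            T = np.eye(4)
            T[:3, :3] = R
            T[:3, 3] = t
            return T

        # Hypothetical poses (mm); T_a_b maps b-coordinates into a-coordinates.
        T_tracker_instrument = rigid(np.eye(3), [100.0, 20.0, -300.0])  # instrument in tracker frame
        T_tracker_patient    = rigid(np.eye(3), [ 80.0,  0.0, -320.0])  # patient reference in tracker frame
        T_ct_patient         = rigid(np.eye(3), [  0.0, -10.0,  50.0])  # registration: patient ref -> CT

        # Instrument tip in CT coordinates: CT <- patient <- tracker <- instrument.
        T_ct_instrument = T_ct_patient @ np.linalg.inv(T_tracker_patient) @ T_tracker_instrument
        tip_local = np.array([0.0, 0.0, 150.0, 1.0])     # tip offset along the instrument axis
        print((T_ct_instrument @ tip_local)[:3])         # position to draw on the CT-derived model
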
  • 27. Pedicle Screw Placement under Video-Augmented Fluoroscopic Control
    • Set up a camera-augmented mobile C-arm (CAMC) and used it to perform pedicle screw placement under video control.
    • Used a prototype Iso-C-3D fluoroscope (Siemens Medical Solutions, Erlangen) with an additionally attached CCD camera.
  • 28. Steps
    • During the workflow, two markers for position control are attached.
    • After a single x-ray shot is taken, the image is merged with the video image
  • 29.
    • The entry point is identified and the tool tip is placed accordingly under video control.
    • After alignment of the tool’s axis with the optical axis, drilling “down the beam” can be started.
    • If the markers do not coincide in the video and x-ray images, the patient has moved and a new x-ray image has to be taken (see the overlay sketch below).
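
    A minimal sketch of the video/x-ray merge and the marker-coincidence check from the steps above, using simple alpha blending. File names, marker coordinates, and the 5-pixel tolerance are assumptions for illustration.

        import cv2
        import numpy as np

        video = cv2.imread("video_frame.png")   # CCD camera image (same size as the x-ray)
        xray  = cv2.imread("xray_shot.png")     # single-shot fluoroscopic image

        # Blend the x-ray over the live video so both are visible at once.
        overlay = cv2.addWeighted(video, 0.5, xray, 0.5, 0.0)
        cv2.imwrite("augmented_view.png", overlay)

        # Marker positions detected in each image (hypothetical pixel coordinates).
        markers_video = np.array([[120, 200], [400, 210]], dtype=float)
        markers_xray  = np.array([[121, 201], [403, 212]], dtype=float)

        # If the markers no longer coincide, the patient has moved: take a new x-ray.
        misalignment = np.linalg.norm(markers_video - markers_xray, axis=1).max()
        if misalignment > 5.0:
            print("markers diverge by %.1f px -> take a new x-ray image" % misalignment)
        else:
            print("markers coincide -> overlay still valid")
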
  • 30. Advantages
    • In the pedicle approach, a considerable reduction in fluoroscopy time and thus radiation dose.
    • A single-shot x-ray image is sufficient as soon as the pedicle axis is recognized.
    • A new x-ray can be taken at any time for updating the intervention (movement, implants).
    • Pedicle identification and needle insertion are possible, and tracking of the instrument enables visualization of its tip even after it has entered the surface (skin/draping).
  • 31.  
  • 32.  
