Augmented reality in spine surgery

A presentation about the use of augmented reality in spine surgery
    Presentation Transcript

    • Augmented Reality in Spine Surgery By: Esam Elkhatib, MD, Assistant Professor of Neurosurgery, Suez Canal Medical School, Egypt
    • Definition of Augmented Reality
      • Virtual Environments (VE): completely replace the real world
      • Augmented Reality (AR): the user sees the real environment, combined with virtual elements
      • AR supplements reality instead of completely replacing it
      • Photorealism is not necessarily a goal
    • Definition of AR continued
      • Blends real and virtual in a real environment
      • Interactive in real time
      • Registered in 3-D
      • Includes the idea of removing parts of the real environment (a.k.a. mediated or diminished reality)
    • Why Use AR?
      • 1- Enhance perception of and interaction with the real world
      • 2- Potential for productivity improvements in real-world tasks
      • 3- Relatively new field with many problems, but much progress has occurred recently
    • History of AR
      • 1960s: Sutherland and Sproull build the first head-mounted display (HMD) system
      • Early 1990's: Boeing coined the term "AR"
      • Early to mid 1990's: UNC ultrasound visualization project
      • 1994: Motion stabilized display [Azuma]
      • 1994: Fiducial tracking in video see-through [Bajura / Neumann]
      • 1996: UNC hybrid magnetic-vision tracker
    • Growth of field: projects
      • 2000: Custom see-through HMDs
        • Mixed Reality Systems Laboratory (Japan): http://www.mr-system.co.jp
      • 2003: Project ARVIKA (Germany): http://www.arvika.de/
    • Applications: medical
      • "X-ray vision" for surgeons
      • Aids visualization in minimally invasive operations (UNC Chapel Hill)
    • Optical see-through HMD
    • Video see-through HMD
    • Video monitor AR
    • Optical Strengths
      • Simpler (cheaper)
      • Direct view of real world
      • Full resolution, no time delay (for real world)
      • Safety
      • Lower distortion
      • No eye displacement.
    • Video Strengths
      • Digitized image of real world
      • Matchable time delays
      • More registration, calibration strategies
      • Medical applications: video is superior for calibration strategies
    • Registration and Tracking
      • Tracking is the basic technology for AR.
        • Without accurate tracking you can't generate the merged real-virtual environment
      • Tracking is significantly more difficult in AR than in Virtual Environments
      • "Tracking is the stepchild that nobody talks about." - Henry Sowizral, Dec 1994 Scientific American
    • The Registration Problem
      • Virtual & Real must stay properly aligned
      • If not:
        • Compromises illusion that the two coexist
      • Prevents acceptance of many applications
      • A pointed question: do you want a surgeon cutting into you if the virtual cut-marks are misaligned?
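A minimal sketch of what "properly aligned" means in practice, assuming paired virtual and real fiducial points and an invented 2 mm tolerance (all point values here are illustrative, not from the presentation):

```python
import math

# Hypothetical example: measure registration error between where virtual
# cut-marks are rendered and where the corresponding real fiducials are
# observed. Point coordinates and the 2 mm tolerance are invented.

def registration_error(virtual_pts, real_pts):
    """Mean Euclidean distance (mm) between paired 3-D points."""
    dists = [math.dist(v, r) for v, r in zip(virtual_pts, real_pts)]
    return sum(dists) / len(dists)

virtual = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (0.0, 10.0, 0.0)]
real    = [(0.5, 0.0, 0.0), (10.0, 0.5, 0.0), (0.0, 10.0, 0.5)]

err = registration_error(virtual, real)   # mean error in mm
aligned = err < 2.0  # below tolerance; misaligned overlays are unsafe to act on
```

In a real system the tolerance would come from the clinical requirements of the procedure, and the error would be monitored continuously rather than checked once.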
    • Vision-based Approaches
      • Fiducials in environment (LEDs, colored dots)
      • Template matching
      • Restricted environment with known objects
      • More sensors (e.g. laser rangefinder)
      • Keep user in the loop (manual identification)
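One of the approaches listed above, fiducial detection (LEDs, colored dots), can be sketched as a toy thresholding-and-centroid routine; the image values and threshold below are made up for illustration:

```python
# Illustrative sketch: locate a bright fiducial (e.g. an LED) in a grayscale
# image by thresholding and taking the centroid of the bright pixels.

def find_fiducial(image, threshold):
    """Return (row, col) centroid of pixels above threshold, or None."""
    hits = [(r, c) for r, row in enumerate(image)
                   for c, v in enumerate(row) if v > threshold]
    if not hits:
        return None
    n = len(hits)
    return (sum(r for r, _ in hits) / n, sum(c for _, c in hits) / n)

image = [
    [ 10,  12,  11,  10],
    [ 11, 250, 251,  12],
    [ 10, 249, 252,  11],
    [ 12,  10,  11,  10],
]
center = find_fiducial(image, threshold=200)  # → (1.5, 1.5)
```

Real trackers detect several such fiducials per frame and feed the 2D positions into a pose estimator to recover the camera-to-world transform.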
    • Research Directions in tracking & Registration
      • Hybrid tracking systems
        • Combine approaches, cover weaknesses
        • Systems built for greater input variety and bandwidth
      • Hybrid systems and techniques, e.g. using multiple registration techniques
    • Real Time... Does It Exist?
      • True real-time systems
      • Must synchronize with the real world
      • Time becomes a first-class citizen: time-critical rendering
      • Goal: accurate tracking at long ranges, in unstructured environments
    • Tracking Technologies
      • Active sources
      • Optical, magnetic, ultrasonic
      • Requires structured, controlled environment
      • Restricted range
      • Magnetic vulnerable to distortions
      • Ultrasonic: ambient temperature variations
      • Optical is often expensive
    • Markerless Tracking
      • Tracking without adding special markers
      • Provides: Real-time operation
      • Enables: Outdoor operation
    • Face Augmentation Markerless Tracking
      • Lepetit et al., "Fully Automated and Stable Registration for Augmented Reality Applications," 2003 (Tokyo)
      • Uses one generic 3D face model and grabs the image texture of the user's face
    • Head orientation changes and backgrounds
    • Intraoperative AR
      • AR for intra-operative visualization and navigation has been a subject of intensive research and development during the last decade
      • Beyond the accuracy and speed of the system, the 3D visualization systems also have to be improved
      • In-situ visualization offers a higher degree of freedom for the programmer than classical on-screen visualization
    • Intraoperative AR continued
      • Before surgery: a model of the patient's anatomy is derived from a CT scan
      • During surgery: the locations of the surgical instruments are tracked relative to the patient and the anatomy model
      • This helps the surgeon with both global and local navigation, providing a global map and 3D information beyond the local 2D view
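The navigation idea above can be sketched as a chain of coordinate transforms: the tracker reports instrument and patient poses, and composing them maps the instrument tip into the preoperative CT frame. For brevity the transforms below are pure translations with invented values; a real system composes full rigid 4x4 transforms obtained from tracking and CT-to-patient registration:

```python
# Hypothetical navigation sketch: map the tracked instrument tip into the
# coordinate frame of the preoperative CT model. All values are invented.

def add(a, b):
    return tuple(x + y for x, y in zip(a, b))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

tracker_to_instrument = (120.0, 40.0, 300.0)  # instrument tip in tracker frame
tracker_to_patient    = (100.0, 50.0, 280.0)  # patient reference in tracker frame
patient_to_ct         = (0.0, 0.0, 0.0)       # from registration; identity here

# tip position relative to the patient reference, then into CT coordinates
tip_in_patient = sub(tracker_to_instrument, tracker_to_patient)
tip_in_ct = add(tip_in_patient, patient_to_ct)   # → (20.0, -10.0, 20.0)
```

With the tip expressed in CT coordinates, it can be rendered against the anatomy model for both the global map and the local 3D view mentioned above.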
    • Pedicle Screw Placement under Video-Augmented Fluoroscopic Control
      • A camera-augmented mobile C-arm (CAMC) was set up and used to perform pedicle screw placement under video control
      • Uses a prototype Iso-C-3D fluoroscope (Siemens Medical Solutions, Erlangen) with an additionally attached CCD camera
    • Steps
      • During the workflow, two markers for position control are attached.
      • After taking a single x-ray shot, the image is merged with the video image
      • The entry point is identified and the tool tip placed accordingly under video control
      • After alignment of the tool’s axis with the optical axis, drilling “down the beam” can be started.
      • If markers do not coincide in video and x-ray images, the patient has moved and therefore a new x-ray image has to be taken.
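The motion check in the last step can be sketched as a simple tolerance test on the two marker positions; the pixel coordinates and the 3 px tolerance below are invented for illustration:

```python
import math

# Hypothetical sketch of the motion check: compare the marker positions seen
# in the video image against their positions in the x-ray image. If they no
# longer coincide within tolerance, the patient has moved and a new x-ray
# shot is required.

def patient_moved(video_markers, xray_markers, tol_px=3.0):
    """True if any marker pair differs by more than tol_px pixels."""
    return any(math.dist(v, x) > tol_px
               for v, x in zip(video_markers, xray_markers))

video = [(102.0, 200.0), (340.0, 212.0)]
xray  = [(101.0, 200.5), (352.0, 212.0)]   # second marker shifted 12 px

if patient_moved(video, xray):
    action = "acquire new x-ray image"
else:
    action = "continue drilling down the beam"
```

The check is cheap enough to run on every video frame, so drilling can be halted as soon as the markers diverge.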
    • Advantages
      • In the pedicle approach, a considerable reduction in fluoroscopy time and thus radiation dose
      • A single-shot x-ray image is sufficient once the pedicle axis is recognized
      • A new x-ray can be taken at any time to update the intervention (movement, implants)
      • Pedicle identification and needle insertion are possible, and tracking of the instrument enables visualization of its tip even after it passes beneath the surface (skin/draping)