Phase Recognition during Surgical Procedures using Embedded and Body-worn Sensors.

Presentation of a paper at the PerCom 2011 conference in Seattle.

  • Existing work on sensor-based activity recognition has focused on the activities of individual persons in their daily routines. A surgical procedure, in contrast, involves several people collaborating on the same task, and sensor-based recognition of collaborative activities in workplaces has not been widely studied.
  • In this work, we wanted to study how accurately we can recognize the phases of a surgery.
  • Anesthesia-related actions are done in l1, l2, and l3; operation-related actions are performed in l3 and l4. For example, preparation of anesthetic and anesthesia instruments is done near the anesthesia cabinet (l2), whereas preparation of operation instruments is carried out in l4. The team members move between these zones; for example, depending on the action, the anesthesia nurse switches between l1, l2, and l3. The frequency of movement between zones depends on the operation phase: during the preparation and ending phases, the clinicians move between zones more frequently than during the surgery itself.
  • Before the operation is scheduled to start, the anesthesia nurse prepares for surgery by checking the anesthesia devices and arranging medicine and the anesthesia instruments. Meanwhile, the surgical nurse and the circulating nurse (SOSU) prepare the surgical instruments and devices by placing them on the operating trolley next to the operating table. When the patient arrives, he or she is anesthetized. When the surgeon(s) enters the OR, the operation starts. During the operation, the anesthesia nurse constantly monitors the patient's condition and transfuses blood and medicine as needed. The surgical nurse assists the surgeon and hands him the instruments and material. When approaching the end of the operation, the anesthesia nurse starts waking the patient, and the surgical nurse starts collecting all surgical instruments. Finally, the patient is moved to the recovery department.
  • To track the location of people inside the OR, we used the Ubisense real-time location tracking system (RTLS). When a person is wearing a tag, this RTLS is able to track him with a theoretical accuracy of 10 cm in space (i.e., x, y, z coordinates). Moreover, the tracking system allows us to divide the OR into different zones, reflecting the four main areas identified in the previous section. This is illustrated in Figure 4. Because the Ubisense system is not completely accurate, we introduced a buffer in the order of 50 cm on each side of the zones. Furthermore, since a staff member will be positioned, for instance, in front of a table, each zone is extended in the direction where the person is most likely to stand. This makes the zones overlap, so it is possible for a staff member to be in more than one zone at the same time. In the OR, instruments are placed on the anesthesia table and the operating trolley. To track the instruments and objects on the tables, all instruments were tagged with passive RFID tags, and the tables had built-in RFID readers. Since the OR contains many metallic instruments and tables, ultra-high frequency (UHF) RFID technology was used. UHF is more robust in such an environment, but issues with reflection and shielding still exist. Figure 5 shows the anesthesia nurse preparing instruments at the anesthesia table, which has a built-in RFID reader. The third sensor is a wireless, palm-based RFID sensor that is able to detect which instruments and objects a clinician is holding in his or her hand. This sensor is composed of a micro controller board (Arduino Duemilanove1), an RFID reader module (ID-12 Innovations2), and a wireless unit composed of a shield (Arduino XBee Shield Empty) and a wireless module (XBee 1mW Chip Antenna3), which allows the micro controller board to communicate wirelessly over a modified ZigBee protocol with the server.
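The buffered, overlapping zones described above can be sketched as a small membership check. This is a minimal sketch, not the paper's implementation; the zone names and coordinates are illustrative assumptions.

```python
# Sketch of overlapping-zone membership, assuming axis-aligned
# rectangular zones (coordinates in meters) and the 50 cm buffer
# described above. Zone names and extents are illustrative only.

BUFFER = 0.5  # 50 cm tolerance for Ubisense positioning error


def in_zone(pos, zone):
    """pos = (x, y); zone = (x_min, y_min, x_max, y_max)."""
    x, y = pos
    x0, y0, x1, y1 = zone
    return (x0 - BUFFER <= x <= x1 + BUFFER and
            y0 - BUFFER <= y <= y1 + BUFFER)


zones = {
    "patient_zone": (0.0, 0.0, 2.0, 3.0),        # illustrative l1
    "anesthesia_cabinet": (1.5, 2.5, 3.5, 4.0),  # illustrative l2
}


def zones_of(pos):
    # Because buffered zones overlap, a person can be in several at once.
    return [name for name, z in zones.items() if in_zone(pos, z)]
```

With these illustrative coordinates, a position near the shared edge of the two zones is reported as being in both, which matches the paper's observation that a staff member can be in more than one zone at the same time.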
  • A central server collects, filters, time-stamps, synchronizes, and stores the sensor readings.
  • Since the sensors could not be used during real surgeries, we ran the experiment in a simulated operating room in a hospital; surgical simulation is a common method in medical practice.
  • We tagged real surgical instruments and performed the operations on a simulated patient. The scenarios were based on the video-recorded operations and designed in close collaboration with domain experts, i.e., surgeons, anesthesiologists, and nurses. The exact timing of the different steps was extracted from the original video recordings. The scenarios varied in the exact timing of the steps, e.g., when the surgical nurse starts the preparations. Moreover, some of the activities were performed with slight variations; for example, one scenario simulated a failed intubation that was completed with a laryngeal mask instead.
  • We used decision trees: the impact of each sensor can be established by inspecting the tree, and inference is fast enough for a real-time system. Because the sensor readings only provide information about the current state of the operation within a given second, it is difficult to distinguish two identical states; for instance, it is not really possible to know whether a surgeon is picking up the scalpel at the start or at the end of a surgery. We therefore added a so-called historical feature for each sensor feature, equal to the number of times {0, 1, 2, ...} that the sensor feature has been 1. Finally, we labeled the correct phase for each collected feature instance, using an application that can display the collected data alongside the video recordings.
  • As the sensor readings only provide information about the current state of the operation within a given second, it is difficult to distinguish two identical states. For instance, it is not really possible to know whether a surgeon is picking up the scalpel at the start or at the end of a surgery. One way to address this problem is to add wall-clock time to the feature set. However, the wall-clock duration of a phase can vary a lot between surgeries: it depends on how fast the staff is, as well as how difficult the patient is to operate on. Instead, we added a so-called historical feature for each sensor feature, equal to the number of times {0, 1, 2, ...} that the sensor feature has been 1. An example of a historical feature is the total number of seconds that the anesthesiologist has been in the anesthesia machine zone at the point in time the feature instance is logged. The sensor platform logs historical features simultaneously with the ordinary sensor features.
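The historical features described above amount to a running count per binary sensor feature. A minimal sketch, assuming one feature row per second; the feature name is illustrative:

```python
# Minimal sketch of historical features: for each binary sensor
# feature, a running count of how many seconds it has been 1 so far.
# The feature name below is illustrative, not from the paper.

def add_historical(rows, feature_names):
    """rows: list of dicts {feature: 0 or 1}, one per second, in order.
    Returns new rows with an extra '<feature>-acc' running count."""
    counts = {f: 0 for f in feature_names}
    out = []
    for row in rows:
        enriched = dict(row)
        for f in feature_names:
            counts[f] += row[f]           # accumulate seconds the feature was 1
            enriched[f + "-acc"] = counts[f]
        out.append(enriched)
    return out


features = ["laryngoscope_in_anesthesia_table_zone"]
rows = [{"laryngoscope_in_anesthesia_table_zone": v} for v in [0, 1, 1, 0, 1]]
hist = add_historical(rows, features)
# The accumulated column follows the running counts 0, 1, 2, 2, 3
```

The accumulated count distinguishes otherwise identical states: a "laryngoscope on the table" reading early in the operation has a low count, while the same reading late in the operation has a high one.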
  • We also evaluated the effect of the different sensors on achieving accurate phase recognition.

    1. Jakob Bardram, Afsaneh Doryab, Rune M. Jensen, Poul M. Lange, Kristian L. G. Nielsen, and Soeren T. Petersen
    2. Outline: Motivation; Field studies; Sensor platform; The experiment; The results; Discussion and Conclusion
    3. Motivation: Existing work has mainly focused on the daily routines of individual persons at home or outdoors
    4. Motivation: Hospital work is collaborative and involves constant coordination and communication
    5. Motivation: Automatic recognition of surgical procedures can be used for coordination, communication, and planning; patient safety; and context-aware information management and retrieval
    6-8. Motivation: How can we automatically recognize the phases of a surgery using sensors? What should be sensed? Which sensors provide more significant input?
    9. Field study: What do clinicians do in a surgery? How do they move around?
    10. 4 important activity zones
    11. Field study: Who is involved in a surgery?
    12. Field study: Which tools and instruments do they use? Where do they use them?
    13-14. Field study: 97% of tasks involved at least one physical instrument and 78% involved several instruments; there is a direct relation between physical instruments and physical tasks
    15. Temporal and sequential procedure
    16-19. Based on our detailed study, these parameters seemed important to track: the location of clinicians and the patient; the location of objects on different tables; the use of objects and instruments by clinicians
    20. Sensor platform - Hardware: architecture diagram showing Ubisense person tracking (Ubisense server), Alien and Icode short-range object tracking, Arduino-based arm-wrist object tracking connected through an Arduino hub (XBee), and the platform server, linked via TCP, serial, and USB
    21. Sensor platform - Software: synchronizes input from the different sensors; stores raw observations (one instance per second)
    22-23. Sensor platform - Feature vector
    24-27. The Experiment
    30-32. Analysis: Test the feasibility of the sensor platform; determine how accurate phase recognition can be using the sensed data; identify the impact of each sensor in achieving high accuracy
    33. Analysis - Phase recognition: 4 simulated operations (4 datasets); leave-one-out cross validation; decision trees for classification
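The leave-one-out scheme over the four recorded operations can be sketched as follows. This is a minimal sketch under stated assumptions: the classifier is a deliberately trivial majority-vote placeholder standing in for the paper's decision tree, and the per-operation data layout is illustrative.

```python
# Sketch of leave-one-operation-out evaluation: with four recorded
# operations, train on three and test on the held-out one, rotating
# through all four folds. The classifier is a majority-vote placeholder,
# NOT the decision tree used in the paper.

from collections import Counter


def majority_class(labels):
    return Counter(labels).most_common(1)[0][0]


def leave_one_out(datasets):
    """datasets: list of (features, labels) pairs, one per operation.
    Returns the per-fold accuracy of the placeholder classifier."""
    accuracies = []
    for i, (_, test_y) in enumerate(datasets):
        # Pool the phase labels of the three training operations.
        train_labels = [y for j, (_, ys) in enumerate(datasets)
                        if j != i for y in ys]
        predicted = majority_class(train_labels)  # stand-in for tree.predict
        correct = sum(1 for y in test_y if y == predicted)
        accuracies.append(correct / len(test_y))
    return accuracies


# Illustrative phase labels, one per logged second, for four operations.
ops = [
    (None, ["prep", "surgery", "surgery"]),
    (None, ["surgery", "surgery", "ending"]),
    (None, ["surgery", "prep", "surgery"]),
    (None, ["surgery", "surgery", "surgery"]),
]
fold_accuracies = leave_one_out(ops)
```

Swapping the placeholder for a real decision tree (e.g., training one per fold on the pooled feature instances) keeps the same fold structure; only the `predicted` line changes.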
    34. Analysis - Decision tree
    35. Analysis - Initial results
    36. Analysis - Feature set: feature instances at times t1-tn

        time | Laryngoscope in anesthesia table zone | Anesthesia nurse in anesthesia machine zone | Tube in anesthesia table zone | ...
        t1   | 0 | 1 | 0 | ...
        t2   | 0 | 1 | 1 | ...
        t3   | 0 | 1 | 1 | ...
        ...
        tn   | 1 | 1 | 0 | ...
    37. Results - Phase recognition: the feature instances from slide 36, extended with accumulated ("-acc") historical features

        time | Laryngoscope in anesthesia table zone | -acc | Anesthesia nurse in anesthesia machine zone | -acc | Tube in anesthesia table zone | -acc | ...
        t1   | 0 | 0  | 1 | 1  | 0 | 0 | ...
        t2   | 0 | 0  | 1 | 2  | 1 | 1 | ...
        t3   | 0 | 0  | 1 | 3  | 1 | 2 | ...
        ...
        tn   | 1 | 10 | 1 | 13 | 0 | 2 | ...
    38. Feature processing: sensor features as logged, one instance per second

        time | Laryngoscope in anesthesia table zone | Anesthesia nurse in anesthesia machine zone | Tube in anesthesia table zone | ...
        t1   | 0 | 1 | 0 | ...
        t2   | 0 | 1 | 1 | ...
        t3   | 0 | 1 | 1 | ...
        ...
        tn   | 1 | 1 | 0 | ...
    39. Feature processing: deriving the accumulated historical feature from a single sensor feature

        time | Laryngoscope in anesthesia table zone | Laryngoscope in anesthesia table zone - accumulated
        t1   | 0 | 0
        t2   | 1 | 1
        t3   | 1 | 2
        t4   | 0 | 2
        t5   | 0 | 2
        t6   | 1 | 3
    40. Results - Phase recognition: with historical features vs. without historical features
    41-44. Sensor significance: Ubisense only; tables + wristband; wristbands only
    45. Discussion: Is location important or not? Can palm-based sensors be used in the OR? Sensors or images? Is classical machine learning enough?
    46-48. Conclusion: It is possible to achieve relatively high classification accuracy in phase recognition during surgical procedures using machine learning techniques. The experiment helped to analyze the weight, and hence the importance, of the different sensors. Our study gives important input for further research into the design of suitable sensors for the OR.
    49. Thank you
