SIGGRAPH Asia 2019 Course Slides

The growing interest in Augmented Reality (AR), together with the renaissance of Virtual Reality (VR), has opened new approaches and techniques for how professionals interact with medical imagery; plan, train for, and perform surgeries; and help people with special needs in rehabilitation tasks. Indeed, many medical specialties already rely on 2D and 3D image data for diagnosis, surgical planning, surgical navigation, medical education, and patient-clinician communication.

However, the vast majority of current medical interfaces and interaction techniques remain unchanged, while the most innovative solutions have not unleashed the full potential of VR and AR. This is probably because extending conventional workstations to accommodate VR and AR interaction paradigms is not free of challenges. Notably, VR- and AR-based workstations, besides having to render complex anatomical data at interactive frame rates, must promote proper anatomical insight, boost visual memory through seamless visual collaboration between professionals, free interaction from the seated desktop setting (e.g., mouse and keyboard) so that users can adopt non-stationary postures and walk freely within a workspace, and support a fluid exchange of image data and 3D models, as this fosters the discussions that help solve clinical cases. Moreover, VR- and AR-based techniques must also be designed according to sound human-computer interaction principles, since medical professionals are known to be resistant to changes in their workflow.

In this course, we survey recent approaches to healthcare, including diagnosis, surgical training, planning, and follow-up, as well as AR/MR/VR tools for patient rehabilitation. We discuss challenges, techniques, and principles in applying Extended Reality in these contexts and outline opportunities for future research.

References: https://dl.acm.org/citation.cfm?id=3359418

This course was also presented in Toronto in two additional talks.

SIGGRAPH Asia 2019 Course Slides

  1. Approaches and Challenges to Virtual and Augmented Reality in Health Care and Rehabilitation. SIGGRAPH Asia 2019 Course Notes. Joaquim Jorge, Daniel Lopes, Pedro Campos. INESC-ID & IST / U Lisboa. November 2019, Brisbane, Australia
  2. Joaquim Jorge, Instituto Superior Técnico, Universidade de Lisboa. Visualization and Multimodal Interfaces @ INESC-ID Lisboa. http://web.ist.utl.pt/jorgej ACM/SIGGRAPH Special Conferences Committee Chair. Editor, Computers & Graphics. Research Interests: Calligraphic Interaction, Multimodal Interfaces, Graphical Modeling
  3. IEEE VR 2020, 27th IEEE Conference on Virtual Reality & 3D User Interfaces, March 22nd - 26th, 2020, Atlanta, Georgia, USA.
  4. Pedro F. Campos, University of Madeira, Associate Professor with Habil., Interactive Technologies Institute. www.experienceaugmentation.com PI, SUAVE project: Nudging users towards healthier lifestyles. PI, Augmented Crutches: AR for helping patients to learn how to walk with crutches. PI, SENSE:SEAT: Inducing Improved Mood and Cognition through Multisensorial Priming. Research Interests: Virtual Reality, Health and Well-being, Persuasive Computing, Behavior Change Technologies, Multimodal Interaction
  5. Daniel Simões Lopes, Instituto Superior Técnico, UT Austin / Portugal, Assistant Professor. Heads Biomedical Research Line. PI of project IT-MEDEX. http://web.ist.utl.pt/daniel.s.lopes Research Interests: Computational Geometry, Interactive Computer Graphics, Computer-Aided Healthcare, Interactive Visualization
  6. Introduction and Motivation
  7. Medical UI/UX
  8. Reality–Virtuality Continuum: Real Reality, Immersive VR, Semi-immersive VR, Diminished Reality, Spatial AR, See-Through AR, Augmented Virtuality, Mixed Reality, Augmented Reality, Extended Reality
  9. Challenges: Building a network of healthcare professionals, Identifying current practices, Linguistic Barrier, Big Picture but no Specs, “Automagic worshippers”
  10. Motivation
  11. Motivation
  12. Motivation: http://jnm.snmjournals.org/content/45/2/279/F2.expansion.html http://jnm.snmjournals.org/content/52/10/1501/F1.expansion.html http://jnm.snmjournals.org/content/51/8/1198/F3.expansion.html
  13. Extend the Reality: Learning
  14. Healthcare Scenarios and Collaboration
  15. Scenario: Radiodiagnostics
  16. VRRRRoom: Virtual Reality for Radiologists in the Reading Room (CHI 2017)
  17. VR for Radiologists in the Reading Room
  18. VR for Radiologists in the Reading Room
  19. VR for Radiologists in the Reading Room
  20. VR for Radiologists in the Reading Room: https://www.youtube.com/watch?v=bsNwV4BhJOw
  21. VR for Radiologists in the Reading Room: Evaluation. Medical specialties: Radiology, Neuroradiology, Gynecology, Surgery, Dental Implantology, Obstetrics. 5 had no previous experience in VR.
  22. Virtual Colonoscopy
  23. Problem
  24. Problematic
  25. Interaction Techniques for Immersive CT Colonography: A Professional Assessment (MICCAI / VCBM 2018)
  26. https://youtu.be/_9EMhjI3kYw
  27. Professional Assessment: 5M+2F Doctors. Diagnostic: Minutes << HOURS!
  28. Evaluation: + immersion, + freedom of movement, + World-in-Miniature navigation; - limited camera control, - improvements needed
  29. Augmented Dissection Table: Table for two … plus one
  30. Anatomy Studio: Augmented Virtual Dissection Sessions
  31. Anatomy Studio: Augmented Virtual Dissection Sessions
  32. Anatomy Studio: Augmented Virtual Dissection Sessions
  33. Healthcare Scenarios: UNESCO Chair of Digital Anatomy, Université Paris Descartes, Unité de recherche en Développement, Imagerie et Anatomie - Prof. Vincent Delmas and Dr. J.F. Uhl
  34. Forthcoming Springer book: Digital Anatomy
  35. VR in the Operating Room
  36. Contactless Surgical Navigation: Germ-free interfaces
  37. Conventional Systems: Dimensional Gap
  38. VOXEL Explorer: Asepsis (Source: O’Hara et al., 2014)
  39. VOXEL Explorer: Simple Gestures
  40. Novel Spatial Interaction Techniques for Exploring 3D Medical Images. VOXEL Explorer: Track your Body
  41. VOXEL Explorer: Game Boy Metaphor
  42. Augmented Surgery: More than meets the scalpel
  43. The Promise of an Augmented Reality
  44. Motivation
  45. Laparoscopic HUD: Augmented Laparoscopic Navigation
  46. Motivation
  47. Motivation
  48. Motivation
  49. Laparoscopic Head-Mounted Display
  50. Interactive Rehabilitation: Easing the patient’s patience
  51. Motivation
  52. Motivation: REPETITION – FEEDBACK – MOTIVATION
  53. Motivation: $ COST, WEARABLES
  54. Motivation: COMPENSATORY MOVEMENTS, COLLABORATION
  55. SleeveAR: Augmented Reality for Rehabilitation using Realtime Feedback. João Vieira, Maurício Sousa, Artur Arsénio and Joaquim Jorge
  56. Motivation
  57. Augmented Reality
  58. SleeveAR
  59. Sleeve
  60. Process: Recording, Movement Guidance, Performance Review
  61. LOCOMOTIVER: Exergames for Locomotion Rehabilitation in VR
  62. VR for Locomotion
  63. Augmented Crutches
  64. Augmented Crutches: Mobile-projection system aimed at helping people to learn how to walk with crutches
  65. Personal Experience 1
  66. Research 1, Personal Experience 2, Physical Therapist 2
  67. How it works: 1. Answer questionnaire; 2. Connect to projector; 3. Walk with projected visual cues
  68. How it works
  69. If you want to know more: Beatriz Peres, Pedro F. Campos, Aida Azadegan, A Persuasive Approach in Using Visual Cues to Facilitate Mobility Using Forearm Crutches. BCSS@PERSUASIVE 2019. Beatriz Peres, Pedro F. Campos, Aida Azadegan, Digitally Augmenting the Physical Ground Space with Timed Visual Cues for Crutch-Assisted Walking. CHI Extended Abstracts 2019. Beatriz Peres, Pedro F. Campos, Aida Azadegan, A Digitally-Augmented Ground Space with Timed Visual Cues for Facilitating Forearm Crutches’ Mobility. Full paper, Interact 2019.
  70. Challenges Ahead
  71. Nonstandard Equipment
  72. Funny User Interfaces
  73. Disorientation
  74. Fatigue
  75. Nausea
  76. Bright Future Ahead
  77. Raskar | MIT Media Lab: Multi-Mode, Multi-Format, IoT Perception, Ergo, Hygiene, Gestural Language
  78. THE NEED FOR NEW DESIGN DISCIPLINES
  79. Design is Hard: Society, Technology, People
  80. Design is Hard: Society, Technology, People; UI, UX, UE
  81. Challenges in New Modalities
  82. Speech Interfaces 50 years ago: 2001 – A Space Odyssey, 1969
  83. Speech Interfaces today
  84. Errors = Design Opportunities
  85. Fusing Multiple Modalities
  86. Synergistic Parsing
  87. Machine Learning to the Rescue
  88. Task Analysis → Learning UM
  89. Design Towards Simplicity
  90. Design Towards Simplicity
  91. The future?
  92. Interactive Tablets for Collaborative Scenarios Related to 3D Medical Image Exploration. jorgej@acm.org http://it-medex.inesc-id.pt/home
  93. Acknowledgments: ARCADE: Afonso Faria, André Domingues; LOCOMOTIVER: Alexandre Gordo; VRRRRoom: Maurício Sousa, Daniel Mendes, Soraia Paulo; CAVE COLON: Pedro Borges, Daniel Medeiros, Soraia Paulo; Voxel TIPs: Rita Mendes, Pedro Parreira, Soraia Paulo; Anatomy Studio: Maurício Sousa, Daniel Mendes, Rafael dos Anjos, Soraia Paulo; Voxel Explorer: Pedro Parreira, Soraia Paulo; TOOTHFAIRY: Filipe Relvas, Soraia Paulo; Laparoscopic HUD: Miguel Belo, Soraia Paulo; Minimally Invasive Surgery: José Miguel Gomes; ImplantAR: Tiago Jerónimo; Plausible Box Figures: Nuno Matias; Imperceptible Breathing: Filipe Marques
  94. Special Thanks to Mauricio de Sousa, Daniel Pires de Sá Medeiros, Daniel Tavares Mendes, Rafael Kuffner dos Anjos, Soraia Figueiredo Paulo, Beatriz Peres
  95. Questions?
  96. jorgej@acm.org
  97. References
  98. D.S. Lopes, D. Medeiros, S.F. Paulo, P.B. Borges, V. Nunes, V. Mascarenhas, M. Veiga, J.A. Jorge, Interaction Techniques for Immersive CT Colonography: A Professional Assessment. In: Frangi A., Schnabel J., Davatzikos C., Alberola-López C., Fichtinger G. (eds) Medical Image Computing and Computer Assisted Intervention – MICCAI 2018. Lecture Notes in Computer Science, vol 11071, pages 629–637, Springer, Cham, 2018. DOI: 10.1007/978-3-030-00934-2_70. M. Sousa, D. Mendes, S. Paulo, N. Matela, J. Jorge, D.S. Lopes, VRRRRoom: Virtual Reality for radiologists in the reading room. Proceedings of the 35th Annual ACM Conference on Human Factors in Computing Systems (CHI 2017), New York: ACM Press, 2017. DOI: 10.1145/3025453.3025566. D.S. Lopes, P.F. Parreira, S.F. Paulo, V. Nunes, P.A. Rego, M.C. Neves, P.S. Rodrigues, J.A. Jorge, On the utility of 3D hand cursors to explore medical volume datasets with a touchless interface. Journal of Biomedical Informatics, 72, pages 140–149, 2017. DOI: 10.1016/j.jbi.2017.07.009. Daniel Simões Lopes, Pedro F. Parreira, Ana R. Mendes, Vasco M. Pires, Soraia F. Paulo, Carlos Sousa, Joaquim A. Jorge, Explicit design of transfer functions for volume-rendered images by combining histograms, thumbnails, and sketch-based interaction. The Visual Computer, December 2018, Volume 34, Issue 12, pages 1713–1723. DOI: 10.1007/s00371-017-1448-8
  99. Soraia Figueiredo Paulo, Filipe Relvas, Hugo Nicolau, Yosra Rekik, Vanessa Machado, João Botelho, José João Mendes, Laurent Grisoni, Joaquim Jorge, Daniel Simões Lopes, Touchless interaction with medical images based on 3D hand cursors supported by single-foot input: A case study in dentistry. Journal of Biomedical Informatics, Volume 100, 2019, 103316. DOI: 10.1016/j.jbi.2019.103316. Daniel Simões Lopes, Filipe Relvas, Soraia Paulo, Yosra Rekik, Laurent Grisoni, Joaquim Jorge, FEETICHE: FEET Input for Contactless Hand gEsture Interaction. In The 17th International Conference on Virtual Reality Continuum and its Applications in Industry (VRCAI ’19), November 14–16, 2019, Brisbane, QLD, Australia. ACM, New York, NY, USA, 10 pages. DOI: 10.1145/3359997.3365704. Ezequiel R. Zorzal, Maurício Sousa, Daniel Mendes, Rafael Kuffner dos Anjos, Daniel Medeiros, Soraia Figueiredo Paulo, Pedro Rodrigues, José João Mendes, Vincent Delmas, Jean-Francois Uhl, José Mogorrón, Joaquim Armando Jorge, Daniel Simões Lopes, Anatomy Studio: a Tool for Virtual Dissection Through Augmented 3D Reconstruction. Computers & Graphics, 2019. DOI: 10.1016/j.cag.2019.09.006
