Field Testing - Keynote Speech - November 2011


This keynote speech was delivered by Dr Sanjoy Sanyal to the students, faculty and administration at the Caribbean Research Symposium in Fall (November) 2011. Two clinical systems developed by the author were also demonstrated: Automatic Staging of Breast Cancer and Determining the Risk of a Cardiovascular Event in 5 Years.


  • Speaker note: These are all Testing Processes. Before those come Testing Methods: Black Box Testing and White Box Testing, performed by the engineers themselves. Then there is Unit Testing, etc., done before release to the public.

    1. 1. Technological Systems in Medical Settings – Beta Testing and Demonstration. Dr. Sanjoy Sanyal, Associate Professor and Course Director of Neurosciences, MUA
    2. 2. Contents
       • Hitchhiker’s Guide to the Galaxy – Insanely great!
       • Technological Systems in Medical Settings
       • Systems Development Processes
       • Systems Testing Processes
       • Usability / Usefulness; Usability Testing Instruments
       • InQsit Online Examination System: CSUQ, ETC, Beta testing, Score grading, Preliminary results
       • InQsit for Faculty
       • Demonstration of 2 Clinical Systems (External)
       • Conclusion – What was that again?
    3. 5. SMART: A project should be
       • Simple
       • Measurable
       • Achievable
       • Realistic
       • Time-bound
       Three projects are described:
       • Beta testing of an online examination system
       • System for automated staging of breast cancer
       • System for giving the adjusted risk of a CV event
    4. 6. Systems defined
    5. 7. Technological systems
       • Medical care settings: Strategic (DSS, EIS); Tactical (MIS); Operational (PAS; Clinical EMR / EHR / EPR, CPOE, ETP, LIS, PACS)
    6. 8. Technological Systems
       • Telemedicine – telematics systems
       • Subject / task-based systems (OT, admission–discharge systems); simulation systems (Acute capability model, Pollution–Asthma project, Renal services simulation)
       • Medical education settings: many of the above; LMS (Moodle); online lecture systems; online exam systems (Questionmark, InQsit)
    7. 9. Systems Development
    8. 10. Systems Development
       • Multiple iterations until Alpha / Internal Acceptance Testing
    9. 11. Testing Processes
       • Alpha (α) testing: Internal Acceptance Testing
          • Simulated or actual operational testing by an independent team at the developers' site
          • A form of internal acceptance testing, done before the product goes to Beta (β) testing
       • Beta (β) testing:
          • Comes after Alpha (α) testing
          • β versions are released to a limited audience outside the programming team, or made available to the general public, to increase the amount of feedback
          • Ensures the product has few faults / ‘bugs’
    10. 12. Beta testing
    11. 13. Testing processes
       • (External) Acceptance testing:
          • Can be conducted by the end-user, customer or client
          • Decides whether or not to accept the product
       • Regression testing:
          • After modifying the software, re-run previously passing tests
          • Ensures the modifications have not caused regression of previous functionality
          • Sanity testing: quickly checking for bizarre behaviour
          • Smoke testing: testing for basic functionality
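The distinction between smoke and regression testing on the slide above can be sketched as a toy test suite. This is purely illustrative: the `exam_total` function and the test names are hypothetical stand-ins, not part of InQsit or anything demonstrated in the talk.

```python
import unittest

# Hypothetical function under test (illustrative only): totals the
# per-question marks of an online exam attempt.
def exam_total(marks):
    return sum(marks)

class SmokeTests(unittest.TestCase):
    """Smoke test: does the basic functionality work at all?"""
    def test_totals_a_simple_attempt(self):
        self.assertEqual(exam_total([1, 0, 1]), 2)

class RegressionTests(unittest.TestCase):
    """Regression tests: re-run after every modification to confirm
    previously passing behaviour has not been broken."""
    def test_empty_attempt_scores_zero(self):
        self.assertEqual(exam_total([]), 0)

    def test_negative_marking_is_preserved(self):
        self.assertEqual(exam_total([1, -0.25, 1]), 1.75)

if __name__ == "__main__":
    unittest.main()
```

A sanity or smoke pass would run only the quick `SmokeTests`; the full `RegressionTests` suite is re-run after each software modification, exactly as the slide describes.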
    12. 14. INQSIT online exam system
    13. 15. Usability – Usefulness – Users – Qs
       • Usability: How easy is it for me to use the system?
       • Usefulness: Does it make my work easier?
       • Users: Consultative; Representative; Consensus
    14. 16. Usability Testing Instruments (number of items in parentheses)
       • QUIS (27): Questionnaire for User Interface Satisfaction – Chin et al.
       • PUEU (12): Perceived Usefulness and Ease of Use – Davis
       • NAU (5): Nielsen’s Attributes of Usability – Nielsen
       • NHE (10): Nielsen’s Heuristic Evaluation – Nielsen
       • CSUQ (19): Computer System Usability Questionnaire – Lewis
       • ASQ (3): After-Scenario Questionnaire – Lewis
       • PHUE (13): Practical Heuristics for Usability Evaluation – Perlman
       • PUTQ (100): Purdue Usability Testing Questionnaire – Lin et al.
    15. 17. CSUQ – Lewis
    16. 18. CSUQ – Lewis
    17. 26. Score grading
       • Score 96–133 (Satisfied): “We are happy to note that you are satisfied with the new online exam system. We can assure you, your faith in the system is quite justified.”
       • Score 57–95 (Neither satisfied nor dissatisfied): “It seems you are still double-minded about the new online exam system. Is there anything we can do to help you make up your mind?”
       • Score 19–56 (Dissatisfied): “We are concerned that you are not entirely satisfied with the online examination system. We would like you to tell us about it in more detail.”
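The three grading bands above can be expressed as a small function. This is a sketch of the slide's banding logic only, not the actual InQsit code; the function name is illustrative. A CSUQ total spans 19–133 (19 questions on a 7-point scale).

```python
def grade_csuq_total(total):
    """Map a CSUQ total score (19 questions x 7-point scale, so 19-133)
    to one of the three feedback bands shown on the slide."""
    if not 19 <= total <= 133:
        raise ValueError("CSUQ total must lie between 19 and 133")
    if total >= 96:
        return "Satisfied"
    if total >= 57:
        return "Neither satisfied nor dissatisfied"
    return "Dissatisfied"

print(grade_csuq_total(101))  # Satisfied
```

For example, the beta test's reported mean total of 101.2 falls in the 96–133 "Satisfied" band.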
    18. 27. Beta test preliminary results
    19. 28. Beta Test Preliminary Results
       • Average question-wise score = 5.33 / 7 (mildly–moderately agree)
       • Average total score = 101.2 / 133 (Satisfied)
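As a quick arithmetic check (not from the slides themselves): with 19 CSUQ questions, the two reported averages are mutually consistent, since a mean total of 101.2 implies a mean per-question score of 101.2 / 19.

```python
# Consistency check on the two reported averages.
NUM_QUESTIONS = 19     # CSUQ has 19 items
avg_total = 101.2      # reported average total score (out of 133)

avg_per_question = avg_total / NUM_QUESTIONS
print(round(avg_per_question, 2))  # 5.33, matching the reported per-question average
```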
    20. 29. Beta Test Preliminary Results
       • Q 9: “The system gives error messages that clearly tell me how to fix problems” (this was an ambiguous question)
    21. 30. INQSIT for faculty
    22. 31. INQSIT for faculty
    23. 32. Conclusion – What was that, again?