U-9_e-Learning Browser Comparison


Published in: Technology, Health & Medicine

Comparative Usability Analysis of Two e-Learning Browser Interfaces: A Multi-tiered Methodology

INTRODUCTION

Electronic aids to medical education represent a quantum leap over traditional chalk-and-blackboard teaching. Interactivity holds students' attention longer, enables easier understanding, and its proactive nature engenders self-learning.[1] Creating simulation models that marry human anatomy with computed 3D imaging entails collaboration among anatomists, computer engineers, physicians and educators.[2] Visual displays and direct-manipulation interfaces enable users to undertake ambitious tasks. With such designs, the chaotic mass of data and flood of information can be streamlined into a productive river of knowledge.[3] The anatomy of the human brain is the Waterloo of most medical students. We therefore decided to critically evaluate and compare two e-Learning interfaces for studying 3D simulations of the human brain.[4] This mini-study was conducted at the University of Seychelles, American Institute of Medicine (USAIM) [https://web.usaim.edu] from May 2006 to June 2006.

MATERIALS

Two interfaces were selected from projects related to the Visible Human Dataset of the National Library of Medicine.[4] Both are e-Learning tools for studying brain anatomy from a 3D perspective. The first interface, an application for viewing 3D images, is the Interactive Atlas (brought by AstraZeneca) from the Visible Human Experience (VHE) project of the Center for Human Simulation (CHS), University of Colorado.[5] It deals with whole-body anatomy, but for comparison with the second browser in this study, only the brain interface was selected. The second is an award-winning 3D browser of the head/brain by Tom Conlin of the University of Oregon.[6] Both use dynamic Web pages, where the server executes code to dynamically deliver HTML-based content to the client browser.[7,8]

Colorado browser interface

This interface was tested first. It was accessed through the VHE link on the CHS homepage. The VHE page[5] opened in a new window.
This window has to remain open throughout the session. The 'Interactive Atlas' link led to a dynamic webpage in the same window. Finally, the 'Launch the Interactive Atlas' link on that page initiated the Java applet (see below), which loaded the applet windows [Figure-1]. Registration was required but involved no payment.

RCSEd + University of Bath, MSc Health Informatics Unit 9: Human Computer Interaction; Final Assignment July 2006. Tutor: KW Lam; Student: Sanjoy Sanyal
Figure-1: Composite screenshots showing the opening of the Interactive Atlas browser in the Visible Human Experience website, from the CHS website. See also the Java details below.

Java installation

The Interactive Atlas required a Java-enabled computer and GL4Java. First, Java (JRE 1.5.0_06 for <applet>) was downloaded from Sun's Java website (http://www.java.com), installed and enabled [Figure-2].

Figure-2: Composite screenshots showing Java download, installation and enabling on the computer. This is an essential prerequisite for the browsers.
Next, GL4Java was installed according to the instructions on the VHE website, and run on Windows. Each time the 3D Interactive Atlas browser was launched, the status bar showed the sequence 'Applet web3d loaded', 'Applet web3d inited', 'Applet web3d started', before the 3-in-1 Java applet windows opened simultaneously over the whole screen [Figure-3].

Figure-3: Opening of the initial Interactive Atlas 3-in-1 applet window: the model list / oblique section window; the 3D model window (the actual browser); and the tools window for manipulating the above.

Applet windows

The upper-right window gives a comprehensive list of 3D images. Under 'Model Available', 'All' was selected from the drop-down list. Double-clicking the 'Brain' option opened a 3D interactive brain simulation in the upper-left window through a 'Building Brain' sequence. This is the actual browser interface.
This has provision for rotation and visualization of the brain model in any axis or plane. It also has a virtual 'plane of section' to 'slice' the brain in any plane or axis. Under 'Display' in the bottom 'Tools' window, the '3D and Oblique' option was selected from the drop-down list. This generated a 'Getting oblique slice' sequence in the upper-right window and depicted 'slices' of the brain, selected through the upper-left window. The bottom window is the control panel, containing radio buttons and list boxes to customize the user's interactivity choices [Figure-4].

Figure-4: The final appearance of the browser and output windows: the virtual brain model with virtual plane of section (for manipulation); the Alpha server output in response to queries sent through the upper-left window; and the control tools for manipulating the browser. These windows provided the interfaces for the study.

Oregon browser interface

The 3D brain browser from the University of Oregon was tested next. This application required a Java 1.1-enabled client for online viewing of the webpage; this was downloaded, installed and enabled over about 45 minutes. While the page opens, it goes through an applet-loading sequence indicated by a progress bar, and the status bar shows 'Applet Sushi loaded'. Once the applet had read the data, three sectional images of the brain appeared in the same window, indicated by 'Applet Sushi started' in the status bar. The browser was activated by clicking anywhere on the window [Figure-5].

Figure-5: Oregon 3D brain browser applet-loading sequence; note the progress bar and the indication on the status bar.

The window has three interactive squares, depicting axial/transverse, coronal and sagittal sections of the brain, enclosed by red, green and blue lines respectively. Each square contains crosshairs of orthogonal gridlines, their colours being those of the linings of the other two squares.
Moving any crosshair in any square dynamically updates the figures in the other two squares to show the appearance of the brain in those sections. There is a fourth, optional square for viewing any arbitrary 'slice' of the brain, selected by checking the 'Arb slice' check-box. Another check-box enables 'depth cueing' of images. Different radio buttons allow visualisation in black-and-white (not shown), MRI-image style and infrared colour schemes [Figures 6-9].

Fig-6: Axial, coronal and sagittal brain sections (counter-clockwise), enclosed in red, green and blue squares, respectively. The cross-hairs in each square are of the other two colours. At start-up, clicking anywhere in the window activates the controls. Fig-7: Showing an arbitrary slice, enclosed in cyan and magenta. Fig-8: Showing the MRI type of appearance. Fig-9: Showing the infrared type of appearance.

All applets are stored in a special folder for quick viewing later [Figure-10].

Figure-10: Screenshot of the Java applet cache, where all applets are stored for quick viewing.

METHODS

We adopted a multi-tiered methodology[9-11] to analyse and compare the two browser interfaces. The underpinning principle was to check the interfaces against the following healthcare user-interface design principles: effectiveness; ease of use, learning and understanding; predictability; user control; adaptability; input flexibility; robustness; appropriateness of output; adequacy of help; error prevention; and response times.
These principles are enshrined in the 17 documents of ISO 9241,[12] in Nielsen's usability engineering[13] and in the TechDis accessibility/usability precepts.[14]

Usability inquiry

The first tier was a usability inquiry approach[15] applied to students of USAIM, Seychelles. We followed the first six phases of usability testing as described by Kushniruk et al.[16-18] The evaluation objectives were to test the usability and usefulness of the two interfaces, both individually and comparatively. Students from Pre-clinical 1 through 5 were recruited through bulletin-board and class announcements. Both browser interfaces were opened online on a computer that had been prepared by loading and enabling the Java applets. Students were shown how to use both interfaces, in small groups and individually. Each of them was then given 30-45 minutes to work on the interfaces in the students' library. In some cases pairs of students worked together, as in co-discovery learning.[15] They were also given some mock information-finding tasks, viz. locating the caudate nucleus. The entire proceedings used a wireless IEEE 802.11g 54 Mbps Internet connection at the 2.4 GHz ISM frequency. The students were then given a questionnaire to fill in and return. [Appendix]

Questionnaire

We modified an existing HCM questionnaire from Boulos,[19] incorporating some principles from the NIH website,[20] while adhering to standard practices of questionnaire design.[21,22] It contained twenty-seven close-ended questions covering interface usability (effectiveness, efficiency, satisfaction)[23] and usefulness issues, both individually and comparatively.[24] Most were on a 5-point rating scale, with some on a 3-point scale.[22] The data were analysed, tabulated and represented graphically.[9,21]

The last six questions were open-ended, qualitative types.[22] The responses were analysed and categorized according to the main themes: usability and usefulness issues.
Under these themes, we searched for patterns[25] pertaining to the ISO principles of design.[12]

Usability inspection

The second tier involved a heuristic evaluation under the usability inspection approach.[15,16,26] The author acted as usability specialist (user-interface 'heuristic expert'), judging the user interface and system functionality against a set of heuristics to see whether they conformed to established principles of usability and good design.[10,15,16] The underlying rationale was to counterbalance the usability inquiry approach, which used relatively inexperienced students.

Nielsen's ten heuristics[15,27,28] were enhanced with five more from Barber's project[29] [Appendix]. For each interface, the 15 heuristics were applied and usability was scored as 0 or 1 (No = 0; N/A = 0; Yes = 1).[27] Next, depending on the frequency, impact and persistence of each usability problem, a level of problem severity was assigned according to the following rating scale.[30] (Box-1)

Box-1

Automated testing

In the third tier we obtained objective scores from automated online tools: LIDA,[31] the Markup Validation Service[32] and WebXACT.[33] These tools use automated 'Web crawlers' to check webpages and stylesheets for errors in the underlying code and for accessibility issues. We used the main page of each resource for the tests.[8]
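The heuristic scoring just described, 15 heuristics each scored 0 or 1 plus a severity rating per problem, reduces to simple aggregation. A minimal sketch follows; the heuristic names and all scores here are illustrative, not the study's actual data, and the 0-4 severity range follows Nielsen's common convention rather than anything stated in Box-1.

```python
# Hedged sketch of the 0/1 heuristic scoring and severity aggregation
# described above. Heuristic names and values are illustrative only.

def usability_score(answers):
    """answers: dict heuristic -> 'Yes' | 'No' | 'N/A' (No/N/A score 0)."""
    return sum(1 for a in answers.values() if a == "Yes")

def average_severity(severities):
    """severities: one 0-4 violation-severity rating per heuristic."""
    return sum(severities) / len(severities)

answers = {
    "Visibility of system status": "Yes",
    "Match between system and real world": "No",
    "User control and freedom": "N/A",
}
print(usability_score(answers))      # 1
print(average_severity([0, 4, 2]))   # 2.0
```

Averaging the per-heuristic severity ratings in this way is what yields a single comparable figure per interface, such as the 2.07 vs 0.67 reported later in the Results.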
LIDA [Figure-11] is a validation package from Minervation, a company specialising in accessible, usable and reliable healthcare information resources.[34] It checks these three parameters of webpages under 3, 4 and 3 subheadings respectively, each of which contains several sub-subheadings.[31] We ran LIDA v1.2 [www.minervation.com/validation] to generate the accessibility scores automatically. The usability and reliability scores were calculated 'by hand' and tabulated.

Figure-11: Screenshot of the Minervation site, showing the LIDA validation tool. Figure-12: Screenshot of the W3C site, showing the Markup Validation Service.

The Markup Validation Service [Figure-12] from W3C checks HTML/XHTML documents for conformance to W3C recommendations/standards and W3C WAI guidelines.[32] WCAG attaches a three-point priority level to each checkpoint, based on its impact on Web accessibility. Priority-1 checkpoints demand mandatory compliance; Priority-3 checkpoints are optional.[8] We ran Validator Service v0.7.2 [http://validator.w3.org/detailed.html] on our test sites and generated reports on HTML violations.

Bobby was originally developed by CAST and is now maintained by Watchfire Corporation under the name WebXACT [Figure-13]. This automated tool examines single webpages for quality, accessibility and privacy issues. It reports on WCAG A, AA and AAA accessibility compliance, and also on conformance with Section 508 guidelines.[33,35,36] It generates an XML report from which violation data can be extracted.[8] It is good for checking accessibility for people with disabilities.[8,37] The Bobby logo is also a kite-mark indicating that the site has been 'endorsed' in some way by another organization.

Figure-13: Screenshot of the Watchfire site, showing the WebXACT validation tool. Inset: Sample of the Bobby-approved kite-mark, taken from the BDA website: http://www.bda-dyslexia.org.uk

WebXACT requires JavaScript and works on IE v5.5+.
We enabled scripting in our browser (IE v6.0 SP2), ran WebXACT (http://webxact.watchfire.com/) on our test pages and generated reports on general, quality, accessibility and privacy issues. We simplified the technique described by Zeng to calculate the Web Accessibility Barrier (WAB) score.[8] The steps are summarised in Box-2.

Box-2: Simplified steps for calculating WAB
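Box-2 itself is an image in the original, so the exact simplification is not reproducible here. The general shape of Zeng's WAB calculation, which weights each checkpoint violation by the inverse of its WCAG priority so that Priority-1 barriers weigh most, can be sketched as follows; the weighting scheme follows Zeng's published formulation, and the counts are illustrative, not this study's data.

```python
# Hedged sketch of a WAB-style score, loosely after Zeng: each
# checkpoint's violation ratio is weighted by 1/priority, so WCAG
# Priority-1 barriers contribute most. The study's own simplification
# (Box-2) is not reproduced; all numbers here are illustrative.

def wab_score(violations):
    """violations: list of (count, potential, priority) per checkpoint."""
    return sum(count / potential * (1 / priority)
               for count, potential, priority in violations
               if potential)

# e.g. 3 of 10 Priority-1 checks violated, 4 of 8 Priority-2 checks:
print(round(wab_score([(3, 10, 1), (4, 8, 2)]), 2))  # 0.55
```

A higher score means more, and more serious, accessibility barriers; the modified WAB scores of 86 and 72 reported later in the Results are on the study's own (simplified) scale, not this sketch's.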
Colour testing

Finally, a Vischeck analysis was performed to determine the appearance of the outputs to chromatically challenged individuals (protanopes, deuteranopes and tritanopes). Vischeck is a way of showing how coloured objects appear to colour-blind individuals. It is based on SCIELAB from the Wandell lab at Stanford University.[38] VischeckPS-Win v1.01 was downloaded [http://www.vischeck.com/downloads/] as a .zip file, extracted and installed to run as a plug-in with Adobe Photoshop 6.0. For each display by the two browsers, the corresponding 'colour-blind appearance' was noted and displayed for comparison purposes.

RESULTS

Questionnaire analysis

User demographics

Thirty usability inquiry respondents filled in the questionnaire, equally divided between genders [Appendix-Table-1a; Figure-14]. Their ages ranged from 18 to 22+ (mean = 19.2 years). There were proportionately more females (86% vs 53%) in the 18-19 age groups. Eighty-three percent (25/30) had a PC at home; 67% (20/30) had used computers for more than 2 years and averaged 1.7 hours of Internet usage per day. All used the Windows OS; 37% (11/30) had 1024x768 pixel resolution; 93% (28/30) used the Microsoft IE web browser; the majority (57%; 17/30) used broadband, always-connected Internet; and 80% (24/30) considered the Internet reliable for medical information. [Appendix-Table-1b]

Figure-14: 100% Stacked Column showing the age-gender distribution of respondents.

Searchability

Sixty-seven percent (20/30) found it easy/very easy to search through the Colorado interface, as opposed to 15/30 (50%) through the Oregon interface. Nearly four times more students found searching through the latter difficult/very difficult (37% vs 10%).
More females than males experienced various levels of difficulty in searching (M:F = 27%:40% for Colorado; 33%:67% for Oregon). [Appendix-Table-1c; Figure-15]
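The percentages quoted throughout these results (e.g. 20/30 rendered as 67%) follow straightforward proportion arithmetic over the 30-respondent sample. A minimal tabulation helper might look like this; the category labels and counts are illustrative:

```python
# Minimal sketch of the proportion arithmetic behind the questionnaire
# results; labels and counts are illustrative, using the study's
# 30-respondent sample size.

def percentages(counts, total=30):
    return {k: round(100 * v / total) for k, v in counts.items()}

print(percentages({"easy": 20, "acceptable": 7, "difficult": 3}))
# {'easy': 67, 'acceptable': 23, 'difficult': 10}
```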
Figure-15: 100% 3D Stacked Column showing the ease of searching for information through either interface, divided gender-wise.

Speed

Eighty-seven percent (26/30) found the Colorado browser moderately fast, compared with 50% (15/30) for the Oregon browser. However, almost four times more students felt the Oregon browser was very fast (37% vs 10%). There was no appreciable gender difference. [Appendix-Table-1d; Figure-16]

Figure-16: Exploded 3D pie charts showing the comparative browser speeds of both interfaces, irrespective of gender.

Success rate

Success in finding the required information or 'slice' of brain was considered a resultant of interface effectiveness, reliability, arrangement of information and output. There were no failures with the Colorado browser, while 30% (9/30) failed with the Oregon browser. Seventy percent (21/30) succeeded with the Colorado browser after one or more attempts, compared with 43% (13/30) with the Oregon browser. With the latter browser, 47% (7/15) of females failed, compared with 13% (2/15) of males. [Appendix-Table-1e; Figures-17a,b]
Figures-17a,b: 3D exploded pie charts showing success/failure rates with either interface, irrespective of gender.

Ease of use

Hardly anybody (3%; 1/30) needed extra help with the Colorado interface, while 43% (13/30) required more help than the Oregon interface provided. Almost all (97%; 29/30) found the former interface easy, while 57% (17/30) felt the same about the Oregon browser. With the latter browser, 60% (9/15) of females needed more help, compared with 27% (4/15) of males. [Appendix-Table-1f; Figure-18]

Figure-18: 100% 3D Stacked Column showing the gender-wise distribution of ease of use and help requirements with either interface.

Information quality

Information quality is an indication of usefulness. Eighty-three percent (25/30) felt the Colorado output was useful, vs 63% (19/30) for the Oregon output. Females were evenly divided with respect to the Oregon output, with equal proportions (47%; 7/15) contending that it was useless and useful. [Appendix-Table-1g; Figure-19]

Figure-19: 100% 3D Stacked Column showing the gender-wise distribution of opinion about information quality.
Information overload

Thirty percent (9/30) felt moderately/severely overloaded by the information provided through the Colorado interface, while 37% (11/30) felt the same with the Oregon interface. More females (47%; 7/15) felt overwhelmed by the Oregon information than males (27%; 4/15), while the reverse was true for the Colorado information output (M:F = 47%:13%). [Appendix-Table-1h; Figure-20]

Figure-20: 100% 3D Stacked Column showing the gender-wise distribution of perception of information overload.

Overall usefulness

Similar proportions of students found both interfaces very much/extremely useful (Colorado:Oregon = 47%:43%). Forty-seven percent (7/15) of each gender opined that the Colorado browser was very much/extremely useful. For the Oregon browser, 60% (9/15) of males felt it was highly useful, against 27% (4/15) of females sharing the same feeling. [Appendix-Table-1i; Figure-21]

Figure-21: 100% 3D Stacked Column showing the gender-wise distribution of perception of the overall usefulness of either interface.

Definitive resource

Regarding the usefulness of either as a definitive resource for studying neuroanatomy, 64% (19/30) stated that they would use them as definitive resources (M:F = 80%:47%). [Appendix-Table-1j; Figure-22]
Figure-22: 3D exploded pie chart showing the overall distribution of opinion about using either or both browser interfaces as a definitive neuroanatomy resource.

Actual usage

Which browser the students actually used to carry out their task provided an estimate of both interfaces' combined usability and usefulness. Forty-four percent (13/30) used the Colorado browser and 33% (10/30) the Oregon browser predominantly to carry out their task; 23% (7/30) used both. [Appendix-Table-1k; Figure-23]

Figure-23: 3D exploded pie chart showing the overall distribution of users who actually used either or both interface(s) for performing a task.

Future prospects

Students' opinions regarding the future prospects of these interfaces considered aspects such as usability, usefulness, robustness, reliability and cost. Sixty-seven percent (20/30) felt the Colorado browser interface had very good future prospects, as opposed to 43% (13/30) who felt the same about the Oregon browser. More females than males felt the Colorado interface had good future prospects (M:F = 47%:86%). The opposite ratio applied to the Oregon browser (M:F = 53%:33%). [Appendix-Table-1l; Figure-24]

Figure-24: 100% 3D Stacked Column showing the gender-wise distribution of perception of the future prospects of either interface.

Questionnaire qualitative analysis
Appropriate sample user comments (both positive and negative) about each browser interface, and the corresponding patterns to which they fit, based on the usability/usefulness themes, are given in Appendix-Table-2. There were constructive criticisms of both, but more of the Oregon browser. Generally, respondents across the gender divide showed a greater preference for the Colorado browser interface.

Heuristic violation severity

The average heuristic violation severity rating for the Oregon interface was three times that of the Colorado interface (2.07 vs 0.67) (Appendix-Tables-3a,b). Accessibility for colour-blind individuals was severely compromised in the Oregon interface; this secured a violation rating of 4 in that category. [Figure-25]

Figure-25: Clustered Column showing the severity of heuristic violation for each of the 15 heuristics, in each browser interface.

Automated test results

LIDA

Both browser interfaces failed validation, as quantitatively determined by LIDA. [Figure-26]

Figure-26: Composite screenshots from the LIDA tests showing the failure of both interface sites to meet UK legal standards.

Detailed results of the LIDA analysis for accessibility, usability and reliability are given in Appendix-Table-4. There is no significant difference in the overall results between the Colorado and Oregon interfaces (72% vs 67%), with comparable means and standard deviations.
The probability associated with Student's t-test (two-tailed distribution, unpaired two-sample with unequal variance) was 0.92 [Figure-27]. However, the break-up showed substantial differences (Colorado:Oregon accessibility 70%:80%; usability 72%:48%) [Figures-28,29].

Figure-27: Clustered Column showing the accessibility, usability and reliability results of both websites, as analysed by the LIDA tool. Overall results apparently do not show any significant difference.

Figure-28: Clustered 3D Column showing the break-up of the accessibility results. This was generated automatically by the LIDA tool, except for the last parameter. Differences between the two sites are more apparent.

Figure-29: Clustered 3D Column showing the break-up of the usability results. Differences between the two sites are even more apparent.

Validation Service

Both sites failed W3C validation, with 14 and 18 errors for the Colorado and Oregon sites, respectively. Additionally, the former was not valid HTML 4.01 Strict, while in the latter no DOCTYPE was found [Figures-30,31].
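The p = 0.92 quoted for the LIDA comparison corresponds to Welch's unpaired two-sample t-test (unequal variances). The statistic and Welch-Satterthwaite degrees of freedom can be sketched in pure Python; the sample data below are illustrative, not the study's actual LIDA sub-scores, and converting t to a p-value still needs a t-distribution CDF (e.g. from a statistics package), which is omitted here.

```python
import math

# Hedged sketch of Welch's unpaired two-sample t-test (unequal
# variances), the test used for the LIDA comparison. The data are
# illustrative, not the study's LIDA sub-scores.

def welch_t(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    se2 = va / len(a) + vb / len(b)
    t = (ma - mb) / math.sqrt(se2)
    # Welch-Satterthwaite degrees of freedom
    df = se2 ** 2 / ((va / len(a)) ** 2 / (len(a) - 1)
                     + (vb / len(b)) ** 2 / (len(b) - 1))
    return t, df

t, df = welch_t([0.70, 0.72, 0.74], [0.80, 0.48, 0.73])
print(round(t, 2), round(df, 1))  # 0.51 2.1
```

A small t with wide variance, as in this illustrative run, is exactly the situation that yields a large, non-significant p such as the study's 0.92.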
Figure-30: Screenshot from the W3C Markup Validation Service showing the result for the Colorado site: 'This page is not Valid HTML 4.01 Strict!' Figure-31: Screenshot from the W3C Markup Validation Service showing the result for the Oregon site: 'This page is not Valid (no Doctype found)!'

WebXACT

Both interface sites had no metadata description, non-serious quality issues and warnings, a non-serious page encryption level, no P3P compact policy, and issues about third-party content. Additionally, the Colorado browser site had no author or keywords in its metadata summary, and elements missing height/width attributes (page efficiency). [Appendix-Table-5]

WAB score

There were several instances (Colorado = 9; Oregon = 2) of Priority 2/3 automatic checkpoint errors, and several instances (Colorado = 36; Oregon = 35) of Priority 1/2/3 manual checkpoint warnings [Figure-32]. The Colorado and Oregon pages had modified WAB scores of 86 and 72 respectively. [Appendix-Table-6]

Figure-32: Composite screenshots showing Priority 1, 2 and 3 automatic and manual checkpoint errors and warnings in both Web pages, as determined by WebXACT. There is no significant difference between them.

Vischeck results

The appearance of each output under normal vision and under red/green/blue blindness is demonstrated in Figures 33-35. The red, green and blue borders and cross-hairs in the Oregon output are invisible to protanopes, deuteranopes and tritanopes respectively; its infrared type of output, which also uses these colour combinations, is likewise unappreciable to the colour-blind.
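Vischeck-style dichromat simulation works by mapping RGB into cone (LMS) space and replacing the missing cone's signal with a value projected from the two remaining cones. A crude sketch of the protanope case follows, using the RGB-to-LMS matrix and projection coefficients commonly attributed to Vienot et al. (1999); treat the exact numbers as approximate, and note that the real Vischeck pipeline also handles display gamma and the SCIELAB comparison, omitted here.

```python
# Crude sketch of protanope simulation in the spirit of Vischeck.
# Matrix and coefficients follow the commonly cited Vienot et al.
# (1999) formulation; treat the exact numbers as approximate.

RGB_TO_LMS = [[17.8824, 43.5161, 4.11935],
              [3.45565, 27.1554, 3.86714],
              [0.0299566, 0.184309, 1.46709]]

def mat_vec(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def simulate_protanope(rgb):
    l, m, s = mat_vec(RGB_TO_LMS, rgb)
    # Protanopes lack L cones: reconstruct L from the remaining signals.
    l = 2.02344 * m - 2.52581 * s
    # Inverting RGB_TO_LMS would map back to displayable RGB; omitted.
    return [l, m, s]

lms = simulate_protanope([1.0, 0.0, 0.0])  # pure red
print([round(c, 3) for c in lms])  # the L ('red') signal is much reduced
```

This is why the Oregon interface's red square borders and cross-hairs, reported above, effectively vanish for protanopes: the simulated L signal collapses toward what green alone would produce.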
Figure-33: Composite screenshots showing the appearance of the Colorado applet windows under normal and colour-deficit vision; from left to right, clockwise: normal, protanopic and tritanopic appearances. The deuteranopic appearance is almost the same as the protanopic.

Figure-34: A-D show screenshots of the normal Oregon browser window (A) and its protanopic (B, red missing), deuteranopic (C, green missing) and tritanopic (D, blue missing) appearances. For each type of blindness, the outer square lines and internal cross-hairs of that particular colour are invisible. The colours of the squares and cross-hairs are essential components of the interface.
Figure-35: A-D show screenshots of the Oregon interface with the infrared type of settings, as seen normally (A) and in the three forms of colour blindness: protanopic (B), deuteranopic (C) and tritanopic (D). For each type of blindness, that particular colour is replaced by a different colour scheme.

Summary of results

All test results are comparatively summarized in Appendix-Table-7 and Figure-36.

Figure-36: 100% Stacked Column comparing the percentages that Colorado and Oregon contribute to the total of each score in each test category. The first 13 are questionnaire results; the next is the heuristic violation score; categories 15-18 are LIDA results; the next is the W3C result; the two before last are WebXACT results; and the last is the Web Accessibility Barrier score.

DISCUSSION

Questionnaires are time-tested usability inquiry methods for evaluating user interfaces.[15,39] Since our interfaces are e-Learning tools, using questionnaires to evaluate their usability and usefulness to students was the most appropriate first step.
When an appropriate questionnaire already exists, adapting it for the current study is better than creating one from scratch.[40] That was our rationale for adapting Boulos' questionnaire.[19]
We secured exactly 30 respondents, the minimum stipulated for statistically valid data;[21] however, a larger figure would have been ideal. We followed all the precepts of a good questionnaire,[22] except that it ran to seven pages instead of two.

Our last six open-ended questions provided valuable qualitative input on users' perceptions of the usability and usefulness of the two interfaces. This played a significant role in recommending practical changes to the interfaces (see below). QDA software (QSR NUD*IST4, Sage, Berkeley, CA) to review and index the patterns and themes would have rendered the analysis of our qualitative data more efficient.[25]

The rationale behind conducting a heuristic evaluation was to evaluate the two interfaces from a heuristic 'expert's' perspective, namely this author's, as opposed to the users' (students').[10,15,16,26] Moreover, heuristic evaluation is a very efficient usability engineering method.[26,30] It can be conducted remotely and provides an indication of the effectiveness and efficiency of the interface, but not of user satisfaction.[15] The ideal heuristic evaluation requires 3-5 (average 4) independent, genuine heuristic experts;[15,26] that was not possible in our 'mini' study.

Implications of automated tests

Automated tools are designed to validate webpages vis-à-vis their underlying code, and to check their accessibility,[14] rather than determine end-user usability/usefulness. Thus they may give misleading findings compared with usability testing/inspection/inquiry methods. The LIDA and WebXACT/WAB scores showed that Colorado's accessibility was poorer and its usability better than Oregon's. However, most students found the Colorado interface superior in most categories. Heuristic evaluation also demonstrated a three-times-higher heuristic violation score for the Oregon interface.
Nevertheless, the automated tests served two purposes: they provided a means of triangulation (infra), and they formed the basis for suggesting improvements to the sites, discussed later.

Four-legged table model

Our study reinforced an established principle of evaluation studies: triangulation by several methods is better than one, because no single method gives a complete evaluation.[9] The ideal usability evaluation can be likened to a four-legged table. Usability testing methods (viz. usability labs) and usability inquiry approaches (viz. questionnaires) constitute the first two legs of the table, enabling one to assess end-user usability and usefulness.[15] Usability inspection methods, viz. cognitive walkthrough (psychology/cognitive experts) and heuristic evaluation (heuristic experts),[16] provide a usability assessment from the 'expert's' perspective and constitute the third leg. The automated methods give numerical figures for accessibility, usability and reliability, and constitute the fourth leg. Each method thus complements the others synergistically, identifying deficiencies that have slipped through the cracks of the other methods, besides cross-checking each other's validity. We tried to fit this model as closely as possible by employing a multi-tiered methodology.[9-11]

Lessons learned from the study

End-user characteristics

Technological excellence does not necessarily correlate with usability or usefulness. The award-winning 3D Oregon brain browser had ingeniously coded applets allowing users to perform stunning manipulations. However, as an e-Learning tool for studying brain anatomy it left much to be desired: images were too small, without a zoom facility, and there were no guiding hints/explanations and no search facility. Our pre-clinical undergraduates, although reasonably computer/Internet-savvy [Appendix Table-1a], needed instructions, hints and information both for manipulating the interfaces and for the medical content.
Thus, it was a perky tool for playing with, but not for serious Neuroanatomy study. This was the finding both from the end-user perspective and from the heuristic analysis.

Gender differences

Most usability studies do not explicitly consider gender differences, as we did. This provided valuable insight [Box-3].
Box-3: Gender-based differences gleaned from the study

In general terms, these relate to improving searchability, providing more help functions, improving information quality, reducing information overload and improving the interface as a whole. They apply more to female students, and more to the Oregon interface, though also to the Colorado interface. The proposed improvements are considered more explicitly below.

Colour-blind students

Approximately 8-10% of males and 0.5% of females suffer from some form of colour deficit, and more may have temporary alterations in the perception of blue [Box-4].[38,41,42]

Box-4: Spectrum of colour deficits in the population

The Oregon interface had red, green and blue as essential components. Our Vischeck simulation exercise proved that such an interface would be useless to the colour-blind. Our school of approximately 300 students has about 180 males (M:F = 60:40). This translates to 15-16 male and 0-1 female colour-blind students, so the impact is likely to be substantial.

Implications for user interfaces

Colour: e-Learning tools with multimedia and colour graphics should provide for colour-blind students. Ideally, red-green colour combinations (the most common form of colour-blindness)[42] should be avoided. Alternatively, there should be provision to Daltonize the images (projecting red/green variations into the lightness/darkness and blue/yellow dimensions), so that they are at least partly visible to the colour-blind.[38] One should also use secondary cues to convey information to the chromatically challenged: subtle grey-scale differentiation, or a different graphic or text label associated with each colour.[42]

Browser compatibility: Two respondents used browsers other than MSIE, so web designs should be tested to see how they appear in different browsers.
Browsershots [http://v03.browsershots.org/] is an online tool for this purpose.[43]

Implications for evaluation exercises

All accessibility/usability evaluation exercises should mandatorily check for colour-deficient accessibility through colour-checking engines like Vischeck. The systems should be Java-enabled.[38]

Practical recommendations
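As one cross-cutting recommendation, the Daltonize idea described above can be illustrated with a minimal sketch: the red-green difference, which red-green-deficient viewers cannot perceive, is re-expressed as a lightness shift that they can. This is a rough, assumption-laden illustration of the principle, not Vischeck's actual algorithm; the function name, the `strength` parameter and the simple opponent projection are all invented for the example.

```python
import numpy as np

def daltonize_sketch(rgb, strength=0.7):
    """Rough illustration of daltonization for red-green deficits.

    `rgb` is an HxWx3 float array in [0, 1]; `strength` is an assumed
    tuning parameter, not a standard value.  The red-green difference
    is projected onto lightness so the distinction survives for
    protanopic/deuteranopic viewers.
    """
    r, g = rgb[..., 0], rgb[..., 1]
    red_green = r - g                   # channel a colour-blind viewer loses
    shift = strength * red_green / 2.0  # project it onto lightness
    out = rgb + shift[..., None]        # lightens reds, darkens greens
    return np.clip(out, 0.0, 1.0)

# A pure-red and a pure-green pixel have identical mean intensity
# before processing; afterwards they differ in lightness as well:
img = np.array([[[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]])
out = daltonize_sketch(img)
```

A real implementation would work in a perceptual opponent colour space (such as LMS or Lab) rather than raw RGB, but the principle of moving chromatic information into a channel the viewer retains is the same.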
Colorado interface

The recommendations, based on user feedback and heuristic evaluation, are indicated in Figure-37. "You could always improve anything" (user comment).

- Provide functionality: a function button as an alternative to the table list
- Give notes/explanations for each item
- All three applet windows should load in under 10 seconds; otherwise, provide a page-loading progress bar
- The applet window should be larger
- Fonts of menu items should be at least 10 points
- Give a right-click 'What's This?'-type help function for each of the menu buttons
- The help function is too cumbersome; render it user-friendly
- The zoom function is ornamental; render it functional
- Add clinical correlations, and anatomical and functional connections between structures
- Make search blocks and labelled diagrams
- Blank areas of labelling should be filled in
- Correct the errors given by the slices while locating a particular area
- Give audio help (like a doctor speaking when one clicks on a part)

Figure-37: Composite screenshots showing all the recommendations for improvements to the Colorado browser, based on user comments and heuristic evaluation studies.

The following recommendations are based on the results of automated tests:

Improving accessibility/usability[31]
- Incorporate an HTTP-equivalent content-type in the header
- Insert table summaries for visually impaired visitors
- Increase font size to at least 10 points
- Incorporate a search facility

Priority-1/2/3 checkpoints[33]

- Provide extended descriptions for images conveying important information
- Ensure that pages are still readable/usable despite unsupported style sheets
- Add a descriptive title to links
- Provide alternative searches for different skills/preferences

Oregon interface

Figure-38 highlights the recommendations, based on user feedback and heuristic evaluation:

- Provide right-click information
- Include Save and Search items under Run
- Add a Daltonize! colour-blind feature (see text)
- Give explanations for items
- Provide good labelling
- Give better views
- Enlarge the images (Fitts's law)

Figure-38: All the recommendations for improvements to the Oregon browser are based on user comments and heuristic evaluation studies.

Image size

This was the most common complaint by students. Fitts's law states that pointing time to a target increases with its distance and decreases with its size.[42,44] Therefore, increasing image size would reduce effort, time and cognitive load.

The following recommendations are based on the results of automated tests:

Improving accessibility/usability[31]

- Eliminate the body background colour
- Include a clear purpose statement at the beginning
- Make 'blocks of text' scannable, in short easy-to-understand paragraphs
- Include navigation tools for moving through text
- Reduce user cognitive load

W3C markup validation[32]

- Place a DOCTYPE declaration [Box-5]

Box-5: Document Type Definition
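The Fitts's law reasoning above can be made concrete with the common Shannon formulation, MT = a + b log2(D/W + 1): enlarging a target of width W at a fixed distance D lowers the index of difficulty and hence the predicted pointing time. The constants a and b below are illustrative only; in practice they are fitted to a particular device and user population.

```python
from math import log2

def fitts_mt(distance, width, a=0.1, b=0.15):
    """Predicted movement time (seconds) under Fitts's law,
    Shannon formulation: MT = a + b * log2(D/W + 1).
    a and b are device/user-specific; these values are illustrative."""
    return a + b * log2(distance / width + 1)

# Doubling the target size at the same distance lowers the index of
# difficulty, and with it the predicted pointing time:
small = fitts_mt(distance=400, width=20)   # ID = log2(21) bits
large = fitts_mt(distance=400, width=40)   # ID = log2(11) bits
```

This is exactly the argument for enlarging the Oregon browser's images: the same on-screen distances become cheaper to traverse when the targets themselves are bigger.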
Priority-1/2/3 checkpoints[33]

- Ensure the usability of Web pages even if programmatic objects do not function
- Provide accessible alternatives to the information in the Java 1.1 applet
- Use CSS to control layout/presentation
- Avoid obsolete language features

Both interfaces

The following recommendations are based on the results of automated tests:

Improving accessibility/usability[31]

- Add an HTML language definition
- Add Dublin Core title tags
- Present material without necessitating plug-ins

Priority-1/2/3 checkpoints[33]

- Use simpler, more straightforward language
- Identify the language of the text
- Ensure foreground and background colours contrast
- Validate the document against formal published grammars
- Provide a description of the general site layout, access features and usage instructions
- Allow user customisation
- Provide metadata that identifies the document's location in a collection

Conducting better evaluation techniques

Using a Perlman-type Web-based CGI-scripted questionnaire would enable wider capture.[39] Given the resources of a formal usability lab (viz. Microsoft's),[45,46] we would adopt a combined usability testing and inquiry approach. The former would include Performance Measurement of the user combined with the Question-Asking Protocol (which is better than the Think-Aloud Protocol per se).[15] The latter would include automatic Logging of Actual Use.[15] Hardware requirements and other details[16,18] are in Figure-39. This combined methodology requires one usability expert and 4-6 users. All three usability issues (effectiveness, efficiency and satisfaction) are covered. We can obtain quantitative and qualitative data, and the process can be conducted remotely.[15]
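The Logging Actual Use leg of this combined approach can be as simple as timestamping every user event and summarising the log after the session. The sketch below is a minimal, hypothetical illustration (the class, method and event names are invented); a real study would hook the logger into keyboard/mouse drivers or browser event APIs rather than calling it by hand.

```python
import time

class InteractionLogger:
    """Minimal sketch of 'Logging Actual Use': timestamp each user
    event so usage statistics (e.g. interactions per widget) can be
    computed after a session."""

    def __init__(self):
        self.events = []

    def log(self, event, target):
        # Record what happened, to which interface element, and when.
        self.events.append({"t": time.time(), "event": event, "target": target})

    def counts_by_target(self):
        # Summarise how often each interface element was used.
        counts = {}
        for e in self.events:
            counts[e["target"]] = counts.get(e["target"], 0) + 1
        return counts

logger = InteractionLogger()
logger.log("click", "zoom-button")
logger.log("click", "zoom-button")
logger.log("keypress", "search-box")
stats = logger.counts_by_target()   # e.g. which widgets saw most use
```

Such per-widget counts are exactly the kind of objective efficiency data that questionnaires and heuristic evaluation cannot supply on their own.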
[Figure-39 depicts the usability lab set-up: users and tester linked by a two-way microphone through a pre-amplifier/sound mixer; a video camera and PC-VCR converter recording the user's facial expressions and reactions; an interface log (keyboard, mouse driver etc.) automatically collecting statistics about detailed use of the system; and an audio-video tape of the computer screen plus the question-asking protocol conversation.]

Figure-39: Composite usability testing and inquiry method, incorporating features of Performance Measurement, Question-Asking Protocol and Logging Actual Use.

CONCLUSION

Multi-tiered evaluation testing methods, and colour-checking and correction facilities, are mandatory for all interfaces and evaluation procedures. Both interfaces failed validation. The majority of respondents found the Colorado interface much easier to search with than the Oregon interface, and the former moderately faster than the latter. Nobody failed to perform the required task with the Colorado browser, very few required extra help with it, and the majority found its information useful. More students utilised the former than the latter for performing the task. Subjectively, most students could not understand the Oregon interface very well. The Oregon interface violated heuristics three times more often than Colorado. Overall LIDA scores were similar for both, but Oregon's usability was significantly lower than Colorado's. The Colorado site demonstrated a substantially higher accessibility barrier on the LIDA and WebXACT tests. Thus, the Colorado interface had higher usability from the users' perspective and by heuristic evaluation, and lower accessibility by automated testing.
Colorado output was not a significant handicap to the colour-blind, but the Oregon graphic output was partially invisible to various types of chromatically challenged individuals.

ACKNOWLEDGEMENTS

The President and Dean of the University of Seychelles American Institute of Medicine kindly permitted this study, and the infectious enthusiasm of the students of USAIM made it possible.

CONFLICTS OF INTEREST

The author is employed by USAIM.
REFERENCES

1. Bearman M. Centre of Medical Informatics, Monash University [homepage on the Internet]. Monash, AU: Monash University; © 1997 [cited 2006 July 1]. Why use technology?; [about 3 pages]. Available from: http://archive.bibalex.org/web/20010504064004/med.monash.edu.au/informatics/techme/whyuse.htm.

2. University of Colorado Health Science Center [homepage on the Internet]. Colorado: UCHSC; [cited 2006 July 1]. Overview; [about 2 screens]. Available from: http://www.uchsc.edu/sm/chs/overview/overview.html.

3. Computer Science, University of Maryland [homepage on the Internet]. Bethesda, MD: UMD; [cited 2006 July 1]. Visualization; [about 1 screen]. Available from: http://www.cs.umd.edu/hcil/research/visualization.shtml.

4. National Library of Medicine, National Institutes of Health [homepage on the Internet]. Bethesda, MD: NIH; [updated 2003 September 11; cited 2006 July 1]. The Visible Human Project – Overview; [about 1 page]. Available from: http://www.nlm.nih.gov/research/visible/visible_human.html.

5. Center for Human Simulation, University of Colorado. Visible Human Experience [homepage on the Internet]. Denver, CO: University of Colorado; [cited 2006 July 1]. Available from: http://www.visiblehumanexperience.com/.

6. Conlin T. Sushi Applet. University of Oregon; [modified 2003 September 19; cited 2006 July 1]. Available from: http://www.cs.uoregon.edu/~tomc/jquest/SushiPlugin.html.

7. Boulos MNK. Internet in Health and Healthcare. Bath, UK: University of Bath; [cited 2006 July 1]. Available from: http://www.e-courses.rcsed.ac.uk/mschi/unit5/KamelBoulos_Internet_in_Healthcare.ppt.

8. Zeng X, Parmanto B. Web Content Accessibility of Consumer Health Information Web Sites for People with Disabilities: A Cross Sectional Evaluation. J Med Internet Res [serial on the Internet]. 2004 June 21 [last update 2006 February 11; cited 2006 July 1]; 6(2):e19; [about 20 pages]. Available from: http://www.jmir.org/2004/2/e19/index.htm.

9. Boulos MNK.
A two-method evaluation approach for Web-based health information services: The HealthCyberMap experience. MEDNET-2003; 2003 December 5; University Hospital of Geneva; [cited 2006 July 1]. Available from: http://www.e-courses.rcsed.ac.uk/mschi/unit9/KamelBoulos_MEDNET2003.ppt.

10. Beuscart-Zéphir M-C, Anceaux F, Menu H, Guerlinger S, Watbled L, Evrard F. User-centred, multidimensional assessment method of Clinical Information Systems: a case-study in anaesthesiology. Int J Med Inform [serial on the Internet]. 2004 September 15 [cited 2006 July 1]; [about 10 pages]. Available from: http://www.e-courses.rcsed.ac.uk/mb3/msgs/dispattachment.asp?AID=AID20041021151346705.

11. Curé O. Evaluation methodology for a medical e-education patient-oriented information system. Med Inform Internet Med [serial on the Internet]. 2003 March [cited 2006 July 1]; 28(1):1-5; [about 5 pages]. Available from: http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=pubmed&dopt=Abstract&list_uids=12851053.

12. International standards for HCI and usability. UsabilityNet; © 2006 [cited 2006 July 1]. Available from: http://www.usabilitynet.org/tools/r_international.htm.

13. Nielsen J. useit.com: Jakob Nielsen's Website [homepage on the Internet]. Fremont, CA: Nielsen Norman Group; [cited 2006 July 1]. Available from: http://www.useit.com/.
14. TechDis [homepage on the Internet]. Sussex, UK: University of Sussex Institute of Education; © 2000-2002 [last major update 2002 Oct 26; cited 2006 July 1]. Web Accessibility & Usability Resource. Available from: http://www.techdis.ac.uk/seven/.

15. Zhang Z. Usability Evaluation [homepage on the Internet]. US: Drexel University; [cited 2006 July 1]. Available from: http://www.usabilityhome.com/.

16. Kushniruk AW, Patel VL. Cognitive and usability engineering methods for the evaluation of clinical information systems. J Biomed Inform [serial on the Internet]. 2004 Feb [published online 2004 Feb 21; cited 2006 July 1]; 37:56-76; [about 20 pages]. Available from: http://www.sciencedirect.com/science/journal/15320464.

17. Kaufman DR, Patel VL, Hilliman C, Morin PC, Pevzner J, Weinstock RS, Goland R, Shea S, Starren J. Usability in the real world: assessing medical information technologies in patients' homes. J Biomed Inform [serial on the Internet]. 2003 Feb-Apr [published online 2003 Sept 4; cited 2006 July 1]; 36(1-2):45-60; [about 16 pages]. Available from: http://www.sciencedirect.com/science/journal/15320464.

18. Kushniruk AW, Triola MM, Borycki EM, Stein B, Kannry JL. Technology induced error and usability: The relationship between usability problems and prescription errors when using a handheld application. Int J Med Inf [serial on the Internet]. 2005 August [available online 2005 April 8; cited 2006 July 1]; 74(7-8):519-26; [about 8 pages]. Available from: http://www.sciencedirect.com/science?_ob=GatewayURL&_origin=CONTENTS&_method=citationSearch&_piikey=S1386505605000110&_version=1&md5=e950841f1dbf4dd207d9a5d47d311908.

19. Boulos MNK. HealthCyberMap [homepage on the Internet]. HealthCyberMap.org; © 2001, 2002 [last revised 2002 April 17; cited 2006 July 1]. Formative Evaluation Questionnaire of HealthCyberMap Pilot Implementation; [about 6 pages]. Available from: http://healthcybermap.semanticweb.org/questionnaire.asp.

20.
National Institutes of Health [homepage on the Internet]. Bethesda, MD: NIH; [cited 2006 July 1]. Appendix A-2: Sample Survey of Webmasters; [about 15 pages]. Available from: http://irm.cit.nih.gov/itmra/weptest/app_a2.htm.

21. Boulos MNK. Royal College of Surgeons of Edinburgh [homepage on the Internet]. Edinburgh, UK: RCSED; [published 2004 June 16; cited 2006 July 1]. Notes on Evaluation Methods (Including User Questionnaires and Server Transaction Logs) for Web-based Medical/Health Information and Knowledge Services; [about 6 screens]. Available from: http://www.e-courses.rcsed.ac.uk/mschi/unit9/MNKB_evaluation.pdf.

22. Bonharme E, White I. Napier University [homepage on the Internet]. Marble; [last update 1996 June 18; cited 2006 July 1]. Questionnaires; [about 1 screen]. Available from: http://web.archive.org/web/20040228081205/www.dcs.napier.ac.uk/marble/Usability/Questionnaires.html.

23. Bailey B. Usability Updates from HHS. Usability.gov; 2006 March [cited 2006 July 1]. Getting the Complete Picture with Usability Testing; [about 1 screen]. Available from: http://www.usability.gov/pubs/030106news.html.

24. Kuter U, Yilmaz C. CHARM: Choosing Human-Computer Interaction (HCI) Appropriate Research Methods [homepage on the Internet]. College Park, MD: University of Maryland; 2001 November 2 [cited 2006 July 1]. Survey Methods: Questionnaires and Interviews; [about 6 screens]. Available from: http://www.otal.umd.edu/hci-rm/survey.html.
25. Ash JS, Gorman PN, Lavelle M, Payne TH, Massaro TA, Frantz GL, Lyman JA. A Cross-site Qualitative Study of Physician Order Entry. J Am Med Inform Assoc [serial on the Internet]. 2003 Mar-Apr [cited 2006 July 1]; 10(2); [about 13 pages]. Available from: http://www.jamia.rcsed.ac.uk/cgi/reprint/10/2/188.pdf.

26. Nielsen J. useit.com: Jakob Nielsen's Website [homepage on the Internet]. Fremont, CA: Nielsen Norman Group; [cited 2006 July 1]. How to Conduct a Heuristic Evaluation; [about 6 pages]. Available from: http://www.useit.com/papers/heuristic/heuristic_evaluation.html.

27. National Institutes of Health [homepage on the Internet]. Bethesda, MD: NIH; [cited 2006 July 1]. Appendix A-3: Heuristic Guidelines for Expert Critique of a Web Site; [about 5 pages]. Available from: http://irm.cit.nih.gov/itmra/weptest/app_a3.htm.

28. Nielsen J. useit.com: Jakob Nielsen's Website [homepage on the Internet]. Fremont, CA: Nielsen Norman Group; [cited 2006 July 1]. Ten Usability Heuristics; [about 2 pages]. Available from: http://www.useit.com/papers/heuristic/heuristic_list.html.

29. Barber C. Interaction Design [homepage on the Internet]. Sussex, UK; [cited 2006 July 1]. Interactive Heuristic Evaluation Toolkit; [about 9 pages]. Available from: http://www.id-book.com/catherb/index.htm.

30. Nielsen J. useit.com: Jakob Nielsen's Website [homepage on the Internet]. Fremont, CA: Nielsen Norman Group; [cited 2006 June 14]. Characteristics of Usability Problems Found by Heuristic Evaluation; [about 2 pages]. Available from: http://www.useit.com/papers/heuristic/usability_problems.html.

31. Minervation [homepage on the Internet]. Oxford, UK: Minervation Ltd; © 2005 [modified 2005 June 6; cited 2006 July 1]. The LIDA Instrument; [about 13 pages]. Available from: http://www.minervation.com/mod_lida/minervalidation.pdf.

32. World Wide Web Consortium [homepage on the Internet]. W3C®; © 1994-2006 [updated 2006 Feb 20; cited 2006 June 14]. W3C Markup Validation Service v0.7.2; [about 3 screens].
Available from: http://validator.w3.org/.

33. Watchfire Corporation. WebXACT [homepage on the Internet]. Watchfire Corporation; © 2003-2004 [cited 2006 July 1]. Available from: http://webxact.watchfire.com/.

34. Badenoch D, Tomlin A. How electronic communication is changing health care. BMJ [serial on the Internet]. 2004 June 26 [cited 2006 July 1]; 328:1564; [about 2 screens]. Available from: http://bmj.bmjjournals.com/cgi/content/full/328/7455/1564.

35. World Wide Web Consortium [homepage on the Internet]. W3C; © 1999 [cited 2006 July 1]. Web Content Accessibility Guidelines 1.0 – W3C Recommendation 5-May-1999; [about 24 pages]. Available from: http://www.w3.org/TR/WCAG10/.

36. The Access Board [homepage on the Internet]. The Access Board; [updated 2001 June 21; cited 2006 July 1]. Web-based Intranet and Internet Information and Applications (1194.22); [about 15 pages]. Available from: http://www.access-board.gov/sec508/guide/1194.22.htm.

37. Ceaparu I, Thakkar P. CHARM: Choosing Human-Computer Interaction (HCI) Appropriate Research Methods [homepage on the Internet]. College Park, MD: University of Maryland; [last updated 2001 October 28; cited 2006 July 1]. Logging & Automated Metrics; [about 8 screens]. Available from: http://www.otal.umd.edu/hci-rm/logmetric.html.

38. Vischeck [homepage on the Internet]. Stanford, CA: Stanford University; [last modified 2006 Mar 8; cited 2006 July 1]. Information & Links; [about 7 pages]. Available from: http://www.vischeck.com/info/.
39. Perlman G. ACM; [cited 2006 July 1]. Web-Based User Interface Evaluation with Questionnaires; [about 4 pages]. Available from: http://www.acm.org/~perlman/question.html.

40. National Institutes of Health [homepage on the Internet]. Bethesda, MD: NIH; [cited 2006 July 1]. Appendix A-9: Implementation Details of Web Site Evaluation Methodologies; [about 1 page]. Available from: http://irm.cit.nih.gov/itmra/weptest/app_a9.htm.

41. Hess R. Microsoft Corporation [homepage on the Internet]. Redmond, WA: Microsoft Corp; © 2006 [published 2000 October 9; cited 2006 July 1]. Can Color-Blind Users See Your Site?; [about 7 pages]. Available from: http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnhess/html/hess10092000.asp.

42. Tognazzini B. AskTog; © 2003 [cited 2006 July 1]. First Principles of Interaction Design; [about 7 pages]. Available from: http://www.asktog.com/basics/firstPrinciples.html.

43. Browsershots.org [homepage on the Internet]. Browsershots.org; [cited 2006 July 1]. Test your web design in different browsers; [about 1 page]. Available from: http://v03.browsershots.org/.

44. Giacoppo SA. CHARM: Choosing Human-Computer Interaction (HCI) Appropriate Research Methods [homepage on the Internet]. College Park, MD: University of Maryland; 2001 November 2 [cited 2006 July 1]. The Role of Theory in HCI; [about 11 screens]. Available from: http://www.otal.umd.edu/hci-rm/theory.html.

45. Usability.gov. Methods for Designing Usable Web Sites. Usability.gov; 2006 March [cited 2006 July 1]. Conducting and Using Usability Tests; [about 3 screens]. Available from: http://www.usability.gov/methods/usability_testing.html.

46. Berkun S. Microsoft Corporation [homepage on the Internet]. Redmond, WA: Microsoft Corporation; © 2006 [published 1999 Nov-Dec; cited 2006 July 1]. The Power of the Usability Lab; [about 3 printed pages].
Available from: http://msdn.microsoft.com/library/en-us/dnhfact/html/hfactor8_6.asp.

LIST OF ABBREVIATIONS

3D: Three-Dimensional
CAST: Center for Applied Special Technology
CGI: Common Gateway Interface
CHS: Center for Human Simulation (University of Colorado)
CSS: Cascading Style Sheets
GHz: Gigahertz
HCM: HealthCyberMap
HTML: HyperText Markup Language
MS: Microsoft
IE: Internet Explorer
IEEE: Institute of Electrical and Electronics Engineers
ISM: Instrumentation Scientific and Medical
ISO: International Organization for Standardization
NIH: National Institutes of Health, Bethesda, Maryland
QDA: Qualitative Data Analysis
QSR: Qualitative Solutions and Research
SP: Service Pack
UCHSC: University of Colorado Health Science Center
USAIM: University of Seychelles American Institute of Medicine
v: Version