Comparative Usability Analysis of Two e-Learning Browser Interfaces: A Multi-tiered Methodology

INTRODUCTION
Electronic aids to medical education represent a quantum jump over traditional chalk-and-blackboard teaching. Interactivity holds students' attention longer, enables easier understanding, and its proactive nature engenders self-learning.[1] Creating simulation models that marry human anatomy with computed 3D imaging entails collaboration among anatomists, computer engineers, physicians and educators.[2] Visual displays and direct-manipulation interfaces enable users to undertake ambitious tasks. With such designs, the chaotic mass of data and flood of information can be streamlined into a productive river of knowledge.[3] The anatomy of the human brain is the Waterloo of most medical students. We therefore decided to critically evaluate and compare two e-Learning interfaces for studying 3D simulations of the human brain.[4] The mini-study was conducted at the University of Seychelles, American Institute of Medicine (USAIM) [https://web.usaim.edu] from May 2006 to June 2006.

MATERIALS
Two interfaces were selected from projects related to the Visible Human Dataset of the National Library of Medicine.[4] Both are e-Learning tools for studying brain anatomy from a 3D perspective. The first interface, an application for viewing 3D images, is the Interactive Atlas (brought by AstraZeneca) from the Visible Human Experience (VHE) project of the Center for Human Simulation (CHS), University of Colorado.[5] It deals with whole-body anatomy, but for comparison with the second browser in this study, only the brain interface was selected. The second is an award-winning 3D browser of the head/brain by Tom Conlin of the University of Oregon.[6] Both use dynamic Web pages, where the server executes code to dynamically deliver HTML-based content to the client browser (illustrated below).[7,8]
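As a rough illustration of this server-side pattern (not the actual code behind either site; the class and file names are hypothetical), a Java servlet along the following lines could generate such a page, embedding the client-side viewer applet in the HTML it returns:

```java
import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

/** Illustrative only: server-side code that generates, on each request,
 *  the HTML page hosting a viewer applet (hypothetical names throughout). */
public class AtlasPageServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        resp.setContentType("text/html");
        PrintWriter out = resp.getWriter();
        out.println("<html><head><title>3D Atlas</title></head><body>");
        // The browser interface itself runs client-side as a Java applet.
        out.println("<applet code=\"AtlasApplet.class\" width=\"800\" height=\"600\">");
        out.println("Java is required to view this content.");
        out.println("</applet></body></html>");
    }
}
```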
Colorado browser interface
This interface was tested first. It was accessed through the VHE link on the CHS homepage. The VHE page[5] opened in a new window, which has to stay open for the whole proceedings. The 'Interactive Atlas' link led to the dynamic webpage in the same window. Finally, the 'Launch the Interactive Atlas' link on the page initiated the Java applet (infra) to load the applet windows [Figure-1].

Figure-1: Composite screenshots showing opening of the Interactive Atlas browser in the Visible Human Experience website, from the CHS website.

Java installation
The Interactive Atlas required a Java-enabled computer and GL4Java. First, Java (JRE 1.5.0_06 for <applet>) was downloaded from Sun's Java website (http://www.java.com), installed and enabled [Figure-2].

Figure-2: Composite screenshots showing Java download, installation and enabling on the computer. This is an essential prerequisite for the browsers.
Next, GL4Java was installed according to instructions on the VHE website, and run on Windows. Each time the 3D Interactive Atlas browser was launched, the status bar showed the sequence 'Applet web3d loaded', 'Applet web3d inited', 'Applet web3d started', before the 3-in-1 Java applet windows simultaneously opened on the whole screen [Figure-3].

Figure-3: Opening of the initial Interactive Atlas 3-in-1 applet window: model list / oblique section window (upper right); 3D model window, the actual browser (upper left); tools window for manipulating the above (bottom).

Applet windows
The upper-right window gives a comprehensive list of 3D images. Under 'Model Available', 'All' was selected from the drop-down list. Double-clicking the 'Brain' option opened a 3D interactive brain simulation in the upper-left window through a 'Building Brain' sequence. This is the actual browser interface.
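The 'loaded / inited / started' messages in the status bar correspond to the standard java.applet.Applet life cycle, in which the hosting browser calls init(), start() and stop() on the applet as the page loads and unloads. A minimal sketch follows; the class name and parameter are hypothetical, not the actual web3d source:

```java
import java.applet.Applet;
import java.awt.Graphics;

/** Minimal sketch of the applet life cycle behind the status-bar messages.
 *  "web3d" is the real applet's name; this skeleton is illustrative only. */
public class Web3dLikeApplet extends Applet {
    @Override
    public void init() {
        // Runs once after the class is loaded ("Applet ... inited"):
        // the usual place to read <applet> parameters and build the UI.
        String model = getParameter("model"); // hypothetical parameter
        if (model == null) model = "brain";
    }

    @Override
    public void start() {
        // Runs when the page becomes active ("Applet ... started"):
        // the usual place to begin rendering or animation threads.
    }

    @Override
    public void stop() {
        // Runs when the user leaves the page; pause background work here.
    }

    @Override
    public void paint(Graphics g) {
        g.drawString("3D model viewport would be drawn here", 20, 20);
    }
}
```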
The 3D model window has provision for rotation and visualisation of the brain model in any axis/plane. It also has a virtual 'plane of section' to 'slice' the brain in any plane/axis. Under 'Display' in the bottom 'Tools' window, the '3D and Oblique' option was selected from the drop-down list. This generated a 'Getting oblique slice' sequence in the upper-right window and depicted 'slices' of brain selected through the upper-left window. The bottom window is the control panel, containing radio buttons and list boxes to customise the user's interactivity choices [Figure-4].

Figure-4: The final appearance of the browser and output windows: virtual brain model with virtual plane of section, for manipulation; Alpha server output in response to queries sent through the upper-left window; control tools for manipulating the browser. These windows provided the interfaces for the study.

Oregon browser interface
The 3D brain browser from the University of Oregon was tested next. This application required a Java 1.1-enabled client for online viewing of the webpage. This was downloaded, installed and enabled over about 45 minutes. While the page is opening, it goes through an applet-loading sequence indicated by a progress bar, and the status bar indicates 'Applet Sushi loaded'. Once the applet had read the data, three sectional images of the brain appeared in the same window, indicated by 'Applet Sushi started' in the status bar. This was activated by clicking anywhere on the window [Figure-5].

Figure-5: Oregon 3D brain browser applet loading sequence; note the indication on the status bar.

The window has three interactive squares, depicting an axial/transverse, a coronal and a sagittal section of the brain, enclosed by red, green and blue lines respectively. Each square contains crosshairs of orthogonal gridlines, their colours being those of the linings of the other two squares. Moving any crosshair in any square dynamically updates the figures in the other two squares to show the appearance of the brain in those sections (modelled in the sketch below).
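This linked-views behaviour can be modelled as three orthogonal slice views sharing a single 3D cursor, so that moving the crosshair in any view re-slices the other two. The following sketch is illustrative only and is not the actual Sushi applet code:

```java
/** Illustrative model of the Oregon browser's linked sections: three
 *  orthogonal slice views share one 3D cursor into the volume. */
public class LinkedSliceViews {
    private int x, y, z; // shared 3D cursor (voxel coordinates)

    /** Crosshair moved in the axial square (shows the x-y plane at depth z). */
    public void onAxialCrosshair(int newX, int newY) {
        x = newX; y = newY;
        redrawSagittal(); // sagittal square re-slices the volume at the new x
        redrawCoronal();  // coronal square re-slices the volume at the new y
    }

    /** Crosshair moved in the coronal square (shows the x-z plane at depth y). */
    public void onCoronalCrosshair(int newX, int newZ) {
        x = newX; z = newZ;
        redrawSagittal();
        redrawAxial();    // axial square re-slices at the new z
    }

    /** Crosshair moved in the sagittal square (shows the y-z plane at depth x). */
    public void onSagittalCrosshair(int newY, int newZ) {
        y = newY; z = newZ;
        redrawAxial();
        redrawCoronal();
    }

    private void redrawAxial()    { /* repaint axial slice at current z */ }
    private void redrawCoronal()  { /* repaint coronal slice at current y */ }
    private void redrawSagittal() { /* repaint sagittal slice at current x */ }
}
```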
There is a fourth optional square for viewing any arbitrary 'slice' of brain, selected by checking the 'Arb slice' check-box. Another check-box enables 'depth cuing' of images. Different radio buttons allow visualisation in black-and-white (not shown), MRI-image style and infrared colour schemes [Figures 6-9].

Fig-6: Axial, coronal, sagittal brain sections (counter-clockwise), enclosed in red, green, blue squares, respectively. Crosshairs in each square are of the other two colours. At start-up, clicking anywhere in the window activates the controls.
Fig-7: Showing arbitrary slice, enclosed in cyan and magenta.
Fig-8: Showing MRI type of appearance.
Fig-9: Showing infrared type of appearance.

All applets are stored in a special folder for quick viewing later [Figure-10].

Figure-10: Screenshot of the Java applet cache, where all applets are stored for quick viewing.

METHODS
We adopted a multi-tiered methodology[9-11] to analyse and compare the two browser interfaces. The underpinning principle was to check the interfaces against the following healthcare user-interface design principles: effectiveness, ease of use/learning/understanding, predictability, user control, adaptability, input flexibility, robustness, appropriateness of output, adequacy of help, error prevention and response times.
These principles are enshrined in the 17 documents of ISO-9241,[12] in Nielsen's usability engineering[13] and in TechDis accessibility/usability precepts.[14]

Usability inquiry
The first was a usability inquiry approach[15] applied to students of USAIM, Seychelles. We followed the first six phases of usability testing as described by Kushniruk et al.[16-18] The evaluation objectives were to test the usability and usefulness of the two interfaces, both individually and comparatively. Students from Pre-clinical-1 through 5 were recruited through bulletin-board and class announcements. Both browser interfaces were opened online on a computer that had been prepared by loading/enabling the Java applets. The use of both interfaces was demonstrated to the students, in small groups and individually. Each of them was then given 30-45 minutes to work on the interfaces in the students' library. In some cases pairs of students worked together, as in co-discovery learning.[15] They were also given some mock information-finding tasks, viz. locating the caudate nucleus. The entire proceedings were conducted over a wireless IEEE 802.11g 54 Mbps Internet connection at the 2.4 GHz ISM frequency. The students were then given a questionnaire to fill in and return. [Appendix]

Questionnaire
We modified an existing HCM questionnaire from Boulos,[19] incorporating some principles from the NIH website,[20] while adhering to standard practices of questionnaire design.[21,22] It contained twenty-seven close-ended questions covering interface usability (effectiveness, efficiency, satisfaction)[23] and usefulness issues, both individually and comparatively.[24] They were mostly on a 5-point rating scale, with some on a 3-point scale.[22] The data was analysed, tabulated and represented graphically.[9,21] The last six questions were open-ended qualitative types.[22] The responses were analysed and categorised according to the main themes, usability and usefulness issues. Under these themes, we searched for patterns[25] pertaining to the ISO principles of design.[12]

Usability inspection
The second step involved a heuristic evaluation under the usability inspection approach.[15,16,26] The author acted as usability specialist (user-interface 'heuristic expert'), judging user interface and system functionality against a set of heuristics to see whether they conformed to established principles of usability and good design.[10,15,16] The underlying principle was to counter-balance the usability inquiry approach, which used the relatively inexperienced students. Ten Nielsen heuristics[15,27,28] were enhanced with five more from Barber's project[29] [Appendix]. For each interface, the 15 heuristics were applied and usability was scored as 0 or 1 (No=0; N/A=0; Yes=1).[27] Next, depending on the frequency, impact and persistence of each usability problem, a level of problem severity was assigned according to the rating scale in Box-1 (a sketch of this scoring follows below).[30]

Box-1: [Problem severity rating scale]
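In code terms, the scoring just described amounts to a simple tally. The sketch below assumes the severity scale of reference [30] is Nielsen's usual 0-4 scale (0 = no problem, 4 = usability catastrophe); the class and method names are illustrative:

```java
/** Sketch of the paper's heuristic scoring: each of 15 heuristics is
 *  marked Yes (1) or No / N-A (0), and each violation receives a severity
 *  rating (assumed here to be Nielsen's 0-4 scale, per reference [30]). */
public class HeuristicScorer {
    public enum Answer { YES, NO, NOT_APPLICABLE }

    /** Usability score: the number of heuristics the interface satisfies. */
    public static int usabilityScore(Answer[] answers) {
        int score = 0;
        for (Answer a : answers) {
            if (a == Answer.YES) score++; // No = 0, N/A = 0
        }
        return score;
    }

    /** Mean violation severity across all 15 heuristics, as compared in
     *  the Results (e.g., 2.07 vs 0.67 for Oregon vs Colorado). */
    public static double averageSeverity(int[] severities) {
        double sum = 0;
        for (int s : severities) sum += s;
        return sum / severities.length;
    }
}
```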
Automated testing
In the third step we obtained objective scores from automated online tools: LIDA,[31] the Markup Validation Service[32] and WebXACT.[33] These tools utilise automated 'Web crawlers' to check webpages/stylesheets for errors in the underlying code and for accessibility issues. We used the main page of each resource for the tests.[8]

LIDA [Figure-11] is a validation package from Minervation, a company specialising in accessible, usable and reliable healthcare information resources.[34] It checks these three parameters (accessibility, usability, reliability) of webpages under 3, 4 and 3 subheadings respectively, each of which contains several sub-subheadings.[31] We ran LIDA v1.2 [www.minervation.com/validation] to automatically generate the accessibility scores. The usability and reliability scores were calculated 'by hand', and tabulated.

Figure-11: Screenshot of the Minervation site, showing the LIDA validation tool. Figure-12: Screenshot of the W3C site, showing the Markup Validation Service.

The Markup Validation service [Figure-12] from W3C checks HTML/XHTML documents for conformance to W3C recommendations/standards and W3C WAI guidelines.[32] W3CAG attaches a three-point priority level to each checkpoint, according to its impact on Web accessibility: Priority-1 checkpoints demand mandatory compliance, while Priority-3 checkpoints are optional.[8] We ran Validator Service v0.7.2 [http://validator.w3.org/detailed.html] against our test sites and generated reports on HTML violations.

Bobby was originally developed by CAST and is now maintained by Watchfire Corporation under the name WebXACT [Figure-13]. This automated tool examines single webpages for quality, accessibility and privacy issues. It reports on W3CAG A, AA and AAA accessibility compliance, and also on conformance with Section-508 guidelines.[33,35,36] It generates an XML report from which violation data can be extracted.[8] It is good for checking accessibility for people with disabilities.[8,37] The Bobby logo is also a kite-mark indicating that the site has been 'endorsed' in some way by another organisation.

Figure-13: Screenshot of the Watchfire site, showing the WebXACT validation tool. Inset: sample of the Bobby-approved kite-mark, taken from the BDA website: http://www.bda-dyslexia.org.uk.

WebXACT requires JavaScript and works on IE v5.5+. We enabled scripting in our browser (IE v6.0 SP2), ran WebXACT (http://webxact.watchfire.com/) on our test pages and generated reports on general, quality, accessibility and privacy issues. We simplified the technique described by Zeng to calculate the Web Accessibility Barrier (WAB) score.[8] The steps are summarised in Box-2.

Box-2: [Simplified steps for calculating WAB]
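Box-2's simplified steps are not reproduced in this transcript, so the sketch below follows Zeng's original idea,[8] in which each checkpoint's violation ratio is weighted by the reciprocal of its WCAG priority; treat the weighting as an assumption rather than the paper's exact Box-2 procedure:

```java
/** Sketch of a Zeng-style Web Accessibility Barrier score [8]. Assumed
 *  reading: violations are weighted by the reciprocal of their WCAG
 *  priority, so Priority-1 barriers count the most. */
public class WabScore {
    /**
     * @param violations violations[i] = failed instances of checkpoint i
     * @param potential  potential[i]  = places checkpoint i could fail
     * @param priority   priority[i]   = WCAG priority of checkpoint i (1, 2 or 3)
     */
    public static double score(int[] violations, int[] potential, int[] priority) {
        double total = 0;
        for (int i = 0; i < violations.length; i++) {
            if (potential[i] == 0) continue;      // checkpoint not applicable
            double ratio = (double) violations[i] / potential[i];
            total += ratio * (1.0 / priority[i]); // weight = 1 / priority
        }
        return total;
    }
}
```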
Colour testing
Finally, a Vischeck analysis was performed to determine the appearance of the outputs to chromatically-challenged individuals (protanopes, deuteranopes and tritanopes). Vischeck is a way of showing how coloured objects appear to colour-blind individuals. It is based on SCIELAB from the Wandell lab at Stanford University.[38] VischeckPS-Win v1.01 was downloaded [http://www.vischeck.com/downloads/] as a .zip file, extracted and installed to run as a plug-in with Adobe Photoshop 6.0. For each display by the two browsers, the corresponding 'colour-blind appearance' was noted and displayed for comparison purposes.

RESULTS
Questionnaire analysis

User demographics
Thirty usability inquiry respondents filled in the questionnaire, equally divided between genders [Appendix-Table-1a; Figure-14]. Their ages ranged from 18 to 22+ (mean = 19.2 years). There were proportionately more females (86% vs 53%) in the 18-19 age groups. Eighty-three percent (25/30) had a PC at home; 67% (20/30) had used computers for more than 2 years and averaged 1.7 hours of Internet usage per day. All used the Windows OS; 37% (11/30) had 1024x768 pixel resolution; 93% (28/30) used the Microsoft IE web browser; the majority (57%; 17/30) utilised broadband always-connected Internet; and 80% (24/30) considered the Internet reliable for medical information. [Appendix-Table-1b]

Figure-14: 100% stacked column showing age-gender distribution of respondents.

Searchability
Sixty-seven percent (20/30) found it easy/very easy to search through the Colorado interface, as opposed to 50% (15/30) through the Oregon interface. Nearly four times more students found searchability through the latter difficult/very difficult (37% vs 10%). More females than males experienced various levels of difficulty in searching (M:F = 27%:40% for Colorado; M:F = 33%:67% for Oregon). [Appendix-Table-1c; Figure-15]
Figure-15: 100% 3D stacked column showing ease of search for information through either interface, divided gender-wise.

Speed
Eighty-seven percent (26/30) found the Colorado browser moderately fast, compared to 50% (15/30) for the Oregon browser. However, almost four times more students felt the Oregon browser was very fast (37% vs 10%). There was no appreciable gender difference. [Appendix-Table-1d; Figure-16]

Figure-16: Exploded 3D pie charts showing comparative browser speeds of both interfaces, irrespective of gender.

Success rate
Success in finding the required information/'slice' of brain was considered a resultant of interface effectiveness, reliability, arrangement of information and output. There were no failures with the Colorado browser, while 30% (9/30) failed with the Oregon browser. Seventy percent (21/30) succeeded with the Colorado browser after one or more attempts, compared to 43% (13/30) with the Oregon browser. With the latter browser, 47% (7/15) of females failed, compared to 13% (2/15) of males. [Appendix-Table-1e; Figures-17a,b]
Figures-17a,b: 3D exploded pie charts showing success/failure rates with either interface, irrespective of gender.

Ease of use
Hardly anybody (3%; 1/30) needed extra help with the Colorado interface, while 43% (13/30) required more help than was provided by the Oregon interface. Almost all (97%; 29/30) found the former interface easy, while 57% (17/30) felt the same about the Oregon browser. With the latter browser, 60% (9/15) of females needed more help, compared to 27% (4/15) of males. [Appendix-Table-1f; Figure-18]

Figure-18: 100% 3D stacked column showing gender-wise distribution of ease of use and help requirements with either interface.

Information quality
Information quality is an indication of usefulness. Eighty-three percent (25/30) felt the Colorado output was useful, vs. 63% (19/30) for the Oregon output. Females were evenly divided with respect to the Oregon output, with equal proportions (47%; 7/15) contending that it was useless and useful. [Appendix-Table-1g; Figure-19]

Figure-19: 100% 3D stacked column showing gender-wise distribution of opinion about information quality.
Information overload
Thirty percent (9/30) felt moderately/severely overloaded by the information provided through the Colorado interface, while 37% (11/30) felt the same with the Oregon interface. More females (47%; 7/15) felt overwhelmed by the Oregon information than males (27%; 4/15), while the reverse was true of the Colorado information output (M:F = 47%:13%). [Appendix-Table-1h; Figure-20]

Figure-20: 100% 3D stacked column showing gender-wise distribution of perception of information overload.

Overall usefulness
Similar proportions of students found both interfaces very much/extremely useful (Colorado:Oregon = 47%:43%). Forty-seven percent (7/15) of each gender opined that the Colorado browser was very much/extremely useful. For the Oregon browser, 60% (9/15) of males felt it was highly useful, against 27% (4/15) of females sharing the same feeling. [Appendix-Table-1i; Figure-21]

Figure-21: 100% 3D stacked column showing gender-wise distribution of perception of overall usefulness of either interface.

Definitive resource
Regarding the usefulness of either as a definitive resource for studying Neuroanatomy, 64% (19/30) stated that they would use them as definitive resources (M:F = 80%:47%). [Appendix-Table-1j; Figure-22]
Figure-22: 3D exploded pie chart showing overall distribution of opinion about using either or both browser interfaces as a definitive Neuroanatomy resource.

Actual usage
Which browser the students actually used to carry out their task provided an estimate of both interfaces' combined usability and usefulness. Forty-four percent (13/30) used the Colorado browser and 33% (10/30) the Oregon browser predominantly to carry out their task; 23% (7/30) used both. [Appendix-Table-1k; Figure-23]

Figure-23: 3D exploded pie chart showing overall distribution of users who actually used either or both interface(s) for performing a task.

Future prospects
Students' opinions regarding the future prospects of these interfaces considered aspects like usability, usefulness, robustness, reliability and cost. Sixty-seven percent (20/30) felt the Colorado browser interface had very good future prospects, as opposed to 43% (13/30) who felt the same about the Oregon browser. More females than males felt the Colorado interface had good future prospects (M:F = 47%:86%). The opposite ratio applied to the Oregon browser (M:F = 53%:33%). [Appendix-Table-1l; Figure-24]

Figure-24: 100% 3D stacked column showing gender-wise distribution of perception of future prospects of either interface.

Questionnaire qualitative analysis
Appropriate sample user comments (both positive and negative) about each browser interface, and the corresponding pattern to which they fit, based on usability/usefulness themes, are given in Appendix-Table-2. There were constructive criticisms of both, but more of the Oregon browser. Generally, respondents across the gender divide showed greater preference for the Colorado browser interface.

Heuristic violation severity
The average heuristic violation severity rating for the Oregon interface was three times that of the Colorado interface (2.07 vs 0.67) (Appendix-Tables-3a,b). Accessibility for colour-blind individuals was severely compromised in the Oregon interface, which secured a violation rating of 4 in this category. [Figure-25]

Figure-25: Clustered column showing severity of heuristic violation for each of the 15 heuristics, in each browser interface.

Automated test results

LIDA
Both browser interfaces failed validation, as quantitatively determined by LIDA. [Figure-26]

Figure-26: Composite screenshots from LIDA tests showing failure of both interface sites to meet UK legal standards.

Detailed results of the LIDA analysis for accessibility, usability and reliability are given in Appendix-Table-4. There is no significant difference in the overall results between the Colorado and Oregon interfaces (72% vs 67%), with comparable means and standard deviations; the probability associated with Student's t test (two-tailed, unpaired two-sample with unequal variance, sketched below) = 0.92 [Figure-27]. However, the break-up showed substantial differences (Colorado:Oregon; Accessibility: 70%:80%; Usability: 72%:48%) [Figures-28,29].

Figure-27: Clustered column showing accessibility, usability and reliability results of both websites, as analysed by the LIDA tool. Overall results do not show any significant difference, apparently.
Figure-28: Clustered 3D column showing break-up of accessibility results. This was automatically generated by the LIDA tool, except the last parameter. Differences between the two sites are more apparent.
Figure-29: Clustered 3D column showing break-up of usability results. Differences between the two sites are even more apparent.

Validation Service
Both sites failed W3C validation, with 14 and 18 errors for the Colorado and Oregon sites, respectively. Additionally, the former was not valid HTML 4.01 Strict, while in the latter no DOCTYPE was found [Figures-30,31].
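The unequal-variance form of the unpaired t test named here is Welch's test. The following self-contained sketch yields the statistic and its Welch-Satterthwaite degrees of freedom; the two-tailed p value (0.92 above) would then be read off the t distribution's CDF:

```java
/** Welch's unpaired two-sample t test (unequal variances), the form of
 *  Student's t test named in the text. Returns the statistic and the
 *  Welch-Satterthwaite degrees of freedom. */
public class WelchTTest {
    public static double mean(double[] xs) {
        double s = 0;
        for (double x : xs) s += x;
        return s / xs.length;
    }

    public static double variance(double[] xs) {
        double m = mean(xs), s = 0;
        for (double x : xs) s += (x - m) * (x - m);
        return s / (xs.length - 1); // unbiased sample variance
    }

    /** Returns {t, df} for samples a and b. */
    public static double[] test(double[] a, double[] b) {
        double va = variance(a) / a.length;  // s1^2 / n1
        double vb = variance(b) / b.length;  // s2^2 / n2
        double t = (mean(a) - mean(b)) / Math.sqrt(va + vb);
        double df = (va + vb) * (va + vb)
                / (va * va / (a.length - 1) + vb * vb / (b.length - 1));
        return new double[] { t, df };
    }
}
```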
Figure-30: Screenshot from the W3C Markup Validation Service showing the result for the Colorado site: "This page is not Valid HTML 4.01 Strict!" Figure-31: Screenshot from the W3C Markup Validation Service showing the result for the Oregon site: "This page is not Valid (no Doctype found)!"

WebXACT
Both interface sites had no metadata description, non-serious quality issues and warnings, non-serious page encryption levels, no P3P compact policy, and issues about third-party content. Additionally, the Colorado browser site had no author and keywords in its metadata summary, and elements missing height-width attributes (page efficiency). [Appendix-Table-5]

WAB score
There were several instances (Colorado = 9; Oregon = 2) of Priority 2/3 automatic checkpoint errors, and several instances (Colorado = 36; Oregon = 35) of Priority 1/2/3 manual checkpoint warnings [Figure-32]. The Colorado and Oregon pages had modified WAB scores of 86 and 72 respectively. [Appendix-Table-6]

Figure-32: Composite screenshots showing Priority 1, 2, 3 automatic and manual checkpoint errors and warnings in both Web pages, as determined by WebXACT. There is no significant difference between them.

Vischeck results
The appearance of each output under normal vision and under red/green/blue blindness is demonstrated in Figures 33-35. The red, green and blue borders and crosshairs in the Oregon output are invisible to protanopes, deuteranopes and tritanopes respectively; its infrared type of output, which also uses these colour combinations, is likewise unappreciable to the colour-blind.
Figure-33: Composite screenshots showing appearance of the Colorado applet windows under normal and colour-deficit vision; from left to right, clockwise: normal, protanopic and tritanopic appearances. The deuteranopic appearance is almost the same as the protanopic.

Figure-34: A-D show screenshots of the normal Oregon browser window and the three forms of colour blindness: A, normal; B, protanopic appearance (red missing); C, deuteranopic appearance (green missing); D, tritanopic appearance (blue missing). For each type of blindness, the outer square lines and internal crosshairs of that particular colour are invisible. The colours of the squares and crosshairs are essential components of the interface.
Figure-35: A-D show screenshots of the Oregon interface with the infrared type of settings, as seen normally and in the three forms of colour blindness: A, normal appearance (infrared type); B, protanopic appearance; C, deuteranopic appearance; D, tritanopic appearance. For each type of blindness, that particular colour is replaced by a different colour scheme.

Summary of results
All test results are comparatively summarised in Appendix-Table-7 and Figure-36.

Figure-36: 100% stacked column comparing the percentage that Colorado and Oregon contribute to the total of each score in each test category. The first 13 are results of the questionnaire, the next is the heuristic violation score, categories 15-18 are LIDA results, the next is the W3C result, the two before last are WebXACT results, and the last is the Web Accessibility Barrier score.

DISCUSSION
Questionnaires are time-tested usability inquiry methods for evaluating user interfaces.[15,39] Since our interfaces are e-Learning tools, using questionnaires to evaluate their usability and usefulness to students was the most appropriate first step. When an appropriate questionnaire already exists, adapting it for the current study is better than creating one from scratch.[40] That was our rationale for adapting Boulos' questionnaire.[19]
We secured exactly 30 respondents, the minimum stipulated to obtain statistically valid data,[21] though a larger figure would be ideal. We followed all the precepts of a good questionnaire[22] except that it had seven pages instead of two.

Our last six open-ended questions provided valuable qualitative input vis-à-vis users' perceptions of the usability and usefulness of the two interfaces. This played a significant role in recommending practical changes to the interfaces (infra). QDA software (QSR NUD*IST4, Sage, Berkeley, CA) to review and index the patterns and themes would have rendered the analysis of our qualitative data more efficient.[25]

The rationale behind conducting a heuristic evaluation was to evaluate the two interfaces from a heuristic 'expert's' perspective, namely this author's, as opposed to the users' (the students').[10,15,16,26] Moreover, heuristic evaluation is a very efficient usability engineering method.[26,30] It can be conducted remotely and provides an indication of the effectiveness and efficiency of the interface, but not of user satisfaction.[15] The ideal heuristic evaluation requires 3-5 (average 4) independent actual heuristic experts,[15,26] which was not possible in our 'mini' study.

Implications of automated tests
Automated tools are designed to validate webpages vis-à-vis their underlying code, and to check their accessibility,[14] rather than to determine end-user usability/usefulness. Thus they may give misleading findings compared to usability testing/inspection/inquiry methods. LIDA and WebXACT/WAB scores showed Colorado accessibility was poorer, and usability better, than Oregon's. However, most students found the Colorado interface superior in most categories, and heuristic evaluation also demonstrated three times higher heuristic violation in the Oregon interface. Nonetheless, the automated tests served two purposes: they provided a means of triangulation (infra), and they formed the basis for suggesting improvements to the sites, discussed later.

Four-legged table model
Our study reinforced an established principle of evaluation studies: triangulation by several methods is better than one, because no single method gives a complete evaluation.[9] The ideal usability evaluation can be likened to a four-legged table. Usability testing methods (viz. usability labs) and usability inquiry approaches (viz. questionnaires) constitute the first two legs of the table, enabling one to assess end-user usability/usefulness.[15] Usability inspection methods, viz. cognitive walkthrough (psychology/cognitive experts) and heuristic evaluation (heuristic experts),[16] provide usability from the 'expert's' perspective and constitute the third leg. The automated methods give numerical figures for accessibility, usability and reliability, and constitute the fourth leg. Each method therefore complements the others in a synergistic way, identifying areas of deficiency that have slipped through the cracks of the other methods, besides cross-checking each other's validity. We have tried to fit this model as closely as possible by employing a multi-tiered methodology.[9-11]

Lessons learned from the study

End-user characteristics
Technological excellence does not necessarily correlate with usability/usefulness. The award-winning 3D Oregon brain browser had ingeniously coded applets allowing users to perform stunning manipulations. However, as an e-Learning tool for studying brain anatomy, it left much to be desired. Images were too small, without a zoom facility. There were no guiding hints/explanations and no search facility.
Our pre-clinical undergraduates, reasonably computer/Internet-savvy [Appendix-Table-1b], needed instructions, hints and information for manipulating the interfaces and for the medical content. Thus, it was a perky tool for playing, but not for serious Neuroanatomy study. This was the finding both from the end-user perspective and from the heuristic analysis.

Gender differences
Most usability studies do not explicitly consider gender differences, as we did. This provided valuable insight [Box-3].
Box-3: [Gender-based differences gleaned from the study]

In general terms, this relates to improving searchability, providing more help functions, improving information quality, reducing information overload and improving the interface as a whole. These apply more to female students, and more to the Oregon interface, though also to the Colorado interface. The proposed improvements are considered more explicitly below.

Colour-blind students
Approximately 8-10% of males and 0.5% of females suffer from some form of colour deficit, and more may have temporary alterations in the perception of blue [Box-4].[38,41,42]

Box-4: [Spectrum of colour deficits in the population]

The Oregon interface had red, green and blue as essential components. Our Vischeck simulation exercise proved that such an interface would be useless to the colour-blind. Our school of approximately 300 students has about 180 males (M:F = 60:40). This translates to 15-16 male and 0-1 female colour-blind students, so the impact is likely to be substantial.

Implications for user interfaces
Colour: e-Learning tools with multimedia and colour graphics should provide for colour-blind students. Ideally, red-green colour combinations (the most common form of colour-blindness)[42] should be avoided. Alternatively, there should be provision to Daltonize the images (projecting red/green variations into the lightness/darkness and blue/yellow dimensions), so that they are somewhat visible to the colour-blind (a crude sketch follows at the end of this subsection).[38] One should also use secondary cues to convey information to the chromatically-challenged: subtle gray-scale differentiation, or a different graphic or text label associated with each colour.[42]

Browser compatibility: Two respondents used browsers other than MS IE. Web designs should therefore be tested to see how they appear in different browsers. Browsershots [http://v03.browsershots.org/] is an online tool for this purpose.[43]

Implications for evaluation exercises
All accessibility/usability evaluation exercises should mandatorily check for colour-deficient accessibility through colour-checking engines like Vischeck. The systems should be Java-enabled.[38]
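As a toy illustration of the Daltonize idea above (emphatically not Vischeck's actual SCIELAB-based algorithm), one might fold the red-green opponent signal, invisible to red-green-deficient viewers, into lightness and the blue channel so that it survives as a secondary cue:

```java
/** Illustrative only: a crude "Daltonize-like" remap per the idea in the
 *  text. Moves part of the red-green difference into lightness and part
 *  into the blue/yellow dimension. Not Vischeck's real algorithm. */
public class RedGreenCue {
    /** r, g, b in 0..255; returns a remapped {r, g, b} pixel. */
    public static int[] remap(int r, int g, int b) {
        int diff = r - g;               // red-green opponent signal
        int lift = diff / 2;            // portion projected onto lightness
        int nr = clamp(r + lift);
        int ng = clamp(g + lift);       // brighten/darken both channels equally
        int nb = clamp(b - diff / 2);   // push the remainder into blue/yellow
        return new int[] { nr, ng, nb };
    }

    private static int clamp(int v) {
        return Math.max(0, Math.min(255, v));
    }
}
```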
Practical recommendations

Colorado interface
The recommendations, based on user feedback and heuristic evaluation, are indicated in Figure-37. As one user commented, "You could always improve anything."

Figure-37: Composite screenshots showing all the recommendations for improvements to the Colorado browser, based on user comments and heuristic evaluation studies: provide functionality; provide a function button as an alternative to the table list; give notes/explanations for each item; all three applet windows should load in under 10 seconds, or a page-loading progress bar should be provided; the applet window should be larger; fonts of menu items should be at least 10 points; give a right-click 'What's This?' type of help function for each of the menu buttons; the help function is too cumbersome and should be rendered user-friendly; the zoom function is ornamental and should be rendered functional; add clinical correlations and anatomical/functional connections between structures; make search blocks and labelled diagrams; fill in blank areas of labelling; correct the errors given by the slices while locating a particular area; give audio help (like a doctor speaking when a part is clicked).

The following recommendations are based on the results of the automated tests:

Improving accessibility/usability[31]
- Incorporate HTTP-equivalent content-type in the header
- Insert table summaries for visually-impaired visitors
- Increase font size to at least 10 points
- Incorporate a search facility

Priority-1/2/3 checkpoints[33]
- Provide extended descriptions for images conveying important information
- Ensure that pages are still readable/usable even when style sheets are unsupported
- Add a descriptive title to links
- Provide alternative searches for different skills/preferences

Oregon interface
Figure-38 highlights the recommendations, based on user feedback and heuristic evaluation.

Figure-38: All the recommendations for improvements to the Oregon browser, based on user comments and heuristic evaluation studies: provide right-click information; include 'Save', 'Search', 'Run' and 'Daltonize!' items; give explanations for items; provide good labelling; give better views; enlarge images (Fitts's law); add a colour-blind feature (see text).

Image size
This was the most common complaint by students. Fitts's law states that pointing time to a target is inversely proportional to its size and directly proportional to its distance.[42,44] Increasing image size would therefore reduce effort, time and cognitive load.

The following recommendations are based on the results of the automated tests:

Improving accessibility/usability[31]
- Eliminate the body background colour
- Include a clear purpose statement at the beginning
- Make 'blocks of text' scannable, in short easy-to-understand paragraphs
- Include navigation tools for moving through text
- Reduce user cognitive load

W3C markup validation[32]
- Place a DOCTYPE declaration [Box-5]

Box-5: [Document Type Definition]
Priority-1/2/3 checkpoints[33]
- Ensure usability of webpages even if programmatic objects do not function
- Provide accessible alternatives to the information in the Java 1.1 applet
- Use CSS to control layout/presentation
- Avoid obsolete language features

Both interfaces
The following recommendations are based on the results of the automated tests:

Improving accessibility/usability[31]
- Add an HTML language definition
- Add Dublin Core title tags
- Present material without necessitating plug-ins

Priority-1/2/3 checkpoints[33]
- Use simpler, more straightforward language
- Identify the language of the text
- Ensure foreground and background colours contrast
- Validate the document against formal published grammars
- Provide a description of the general site layout, access features and usage instructions
- Allow user customisation
- Provide metadata that identifies the document's location in a collection

Conducting better evaluation techniques
Using a Perlman-type Web-based CGI-scripted questionnaire would enable wider capture.[39] Given the resources of a formal usability lab (viz. Microsoft's),[45,46] we would adopt a combined usability testing and inquiry approach. The former would include performance measurement of the user combined with the question-asking protocol (which is better than the think-aloud protocol per se).[15] The latter would include automatically logging actual use (sketched below).[15] Hardware requirements and other details[16,18] are in Figure-39. This combined methodology requires one usability expert and 4-6 users. All three usability issues (effectiveness, efficiency and satisfaction) are covered. We can obtain quantitative and qualitative data, and the process can be conducted remotely.[15]
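For the logging-actual-use leg, Java's AWT already offers a global event hook. A minimal sketch of an interface logger of the kind Figure-39 describes (timestamped file handling omitted; the class name is hypothetical):

```java
import java.awt.AWTEvent;
import java.awt.Toolkit;
import java.awt.event.AWTEventListener;

/** Sketch of "Logging Actual Use": a global AWT listener that records
 *  every keyboard and mouse event a Java interface receives, i.e. the
 *  interface log (keyboard, mouse driver etc.) shown in Figure-39. */
public class UsageLogger {
    public static void install() {
        AWTEventListener logger = new AWTEventListener() {
            @Override
            public void eventDispatched(AWTEvent event) {
                // A real study would write this to a timestamped log file.
                System.out.println(System.currentTimeMillis() + " " + event);
            }
        };
        Toolkit.getDefaultToolkit().addAWTEventListener(
                logger, AWTEvent.KEY_EVENT_MASK | AWTEvent.MOUSE_EVENT_MASK);
    }
}
```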
Figure-39: Composite usability testing and inquiry method, incorporating features of performance measurement, question-asking protocol and logging actual use. A two-way microphone, pre-amplifier/sound mixer, video camera, PC-VCR converter and VCR record the user's facial expressions and reactions, and an audio-video tape captures the computer screen plus the Q-A protocol conversation, while an interface log (keyboard, mouse driver etc.) automatically collects statistics about detailed use of the system.

CONCLUSION
Multi-tiered evaluation testing methods and colour-checking and correction facilities are mandatory for all interfaces and evaluation procedures. Both interfaces failed validation. The majority of respondents found the Colorado interface much easier to search with than the Oregon interface, and the former moderately faster than the latter. Nobody failed to perform the required task with the Colorado browser, and very few required extra help with it. The majority found the Colorado information useful, and more utilised the former than the latter for performing the task. Subjectively, most students could not understand the Oregon interface very well. The Oregon interface violated heuristics three times more than the Colorado one. Overall LIDA scores were similar for both, but Oregon usability was significantly lower than Colorado's. The Colorado site demonstrated a substantially higher accessibility barrier by the LIDA and WebXACT tests. Thus, the Colorado interface had higher usability from the users' perspective and by heuristic evaluation, and lower accessibility by automated testing. The Colorado output was not a significant handicap to the colour-blind, but the Oregon graphic output was partially invisible to the various types of chromatically-challenged individuals.

ACKNOWLEDGEMENTS
The President and Dean of the University of Seychelles American Institute of Medicine kindly permitted this study, and the infectious enthusiasm of the students of USAIM made it possible.

CONFLICTS OF INTEREST
The author is employed by USAIM.
REFERENCES
1. Bearman M. Centre of Medical Informatics, Monash University [homepage on the Internet]. Monash, AU: Monash University; © 1997 [cited 2006 July 1]. Why use technology?; [about 3 pages]. Available from: http://archive.bibalex.org/web/20010504064004/med.monash.edu.au/informatics/techme/whyuse.htm.
2. University of Colorado Health Science Center [homepage on the Internet]. Colorado: UCHSC; [cited 2006 July 1]. Overview; [about 2 screens]. Available from: http://www.uchsc.edu/sm/chs/overview/overview.html.
3. Computer Science, University of Maryland [homepage on the Internet]. Bethesda, MD: UMD; [cited 2006 July 1]. Visualization; [about 1 screen]. Available from: http://www.cs.umd.edu/hcil/research/visualization.shtml.
4. National Library of Medicine, National Institutes of Health [homepage on the Internet]. Bethesda, MD: NIH; [updated 2003 September 11; cited 2006 July 1]. The Visible Human Project – Overview; [about 1 page]. Available from: http://www.nlm.nih.gov/research/visible/visible_human.html.
5. Center for Human Simulation, University of Colorado. Visible Human Experience [homepage on the Internet]. Denver, CO: University of Colorado; [cited 2006 July 1]. Available from: http://www.visiblehumanexperience.com/.
6. Conlin T. Sushi Applet. University of Oregon; [modified 2003 September 19; cited 2006 July 1]. Available from: http://www.cs.uoregon.edu/~tomc/jquest/SushiPlugin.html.
7. Boulos MNK. Internet in Health and Healthcare. Bath, UK: University of Bath; [cited 2006 July 1]. Available from: http://www.e-courses.rcsed.ac.uk/mschi/unit5/KamelBoulos_Internet_in_Healthcare.ppt.
8. Zeng X, Parmanto B. Web Content Accessibility of Consumer Health Information Web Sites for People with Disabilities: A Cross Sectional Evaluation. J Med Internet Res [serial on the Internet]. 2004 June 21 [last update 2006 February 11; cited 2006 July 1]; 6(2):e19; [about 20 pages]. Available from: http://www.jmir.org/2004/2/e19/index.htm.
9. Boulos MNK. A two-method evaluation approach for Web-based health information services: The HealthCyberMap experience. MEDNET-2003; 2003 December 5; University Hospital of Geneva; [cited 2006 July 1]. Available from: http://www.e-courses.rcsed.ac.uk/mschi/unit9/KamelBoulos_MEDNET2003.ppt.
10. Beuscart-Zéphir M-C, Anceaux F, Menu H, Guerlinger S, Watbled L, Evrard F. User-centred, multidimensional assessment method of Clinical Information Systems: a case-study in anaesthesiology. Int J Med Inform [serial on the Internet]. 2004 September 15 [cited 2006 July 1]; [about 10 pages]. Available from: http://www.e-courses.rcsed.ac.uk/mb3/msgs/dispattachment.asp?AID=AID20041021151346705.
11. Curé O. Evaluation methodology for a medical e-education patient-oriented information system. Med Inform Internet Med [serial on the Internet]. 2003 March [cited 2006 July 1]; 28(1):1-5; [about 5 pages]. Available from: http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=pubmed&dopt=Abstract&list_uids=12851053.
12. International standards for HCI and usability. UsabilityNet; © 2006 [cited 2006 July 1]. Available from: http://www.usabilitynet.org/tools/r_international.htm.
13. Nielsen J. useit.com: Jakob Nielsen's Website [homepage on the Internet]. Fremont, CA: Nielsen Norman Group; [cited 2006 July 1]. Available from: http://www.useit.com/.
14. TechDis [homepage on the Internet]. Sussex, UK: University of Sussex Institute of Education; © 2000-2002 [last major update 2002 Oct 26; cited 2006 July 1]. Web Accessibility & Usability Resource. Available from: http://www.techdis.ac.uk/seven/.
15. Zhang Z. Usability Evaluation [homepage on the Internet]. US: Drexel University; [cited 2006 July 1]. Available from: http://www.usabilityhome.com/.
16. Kushniruk AW, Patel VL. Cognitive and usability engineering methods for the evaluation of clinical information systems. J Biomed Inform [serial on the Internet]. 2004 Feb [published online 2004 Feb 21; cited 2006 July 1]; 37:56-76; [about 20 pages]. Available from: http://www.sciencedirect.com/science/journal/15320464.
17. Kaufman DR, Patel VL, Hilliman C, Morin PC, Pevzner J, Weinstock RS, Goland R, Shea S, Starren J. Usability in the real world: assessing medical information technologies in patients' homes. J Biomed Inform [serial on the Internet]. 2003 Feb-Apr [published online 2003 Sept 4; cited 2006 July 1]; 36(1-2):45-60; [about 16 pages]. Available from: http://www.sciencedirect.com/science/journal/15320464.
18. Kushniruk AW, Triola MM, Borycki EM, Stein B, Kannry JL. Technology induced error and usability: The relationship between usability problems and prescription errors when using a handheld application. Int J Med Inf [serial on the Internet]. 2005 August [available online 2005 April 8; cited 2006 July 1]; 74(7-8):519-26; [about 8 pages]. Available from: http://www.sciencedirect.com/science?_ob=GatewayURL&_origin=CONTENTS&_method=citationSearch&_piikey=S1386505605000110&_version=1&md5=e950841f1dbf4dd207d9a5d47d311908.
19. Boulos MNK. HealthCyberMap [homepage on the Internet]. HealthCyberMap.org; © 2001, 2002 [last revised 2002 April 17; cited 2006 July 1]. Formative Evaluation Questionnaire of HealthCyberMap Pilot Implementation; [about 6 pages]. Available from: http://healthcybermap.semanticweb.org/questionnaire.asp.
20. National Institutes of Health [homepage on the Internet]. Bethesda, MD: NIH; [cited 2006 July 1]. APPENDIX A-2. SAMPLE SURVEY OF WEBMASTERS; [about 15 pages]. Available from: http://irm.cit.nih.gov/itmra/weptest/app_a2.htm.
21. Boulos MNK. Royal College of Surgeons of Edinburgh [homepage on the Internet]. Edinburgh, UK: RCSED; [published 2004 June 16; cited 2006 July 1]. Notes on Evaluation Methods (Including User Questionnaires and Server Transaction Logs) for Web-based Medical/Health Information and Knowledge Services; [about 6 screens]. Available from: http://www.e-courses.rcsed.ac.uk/mschi/unit9/MNKB_evaluation.pdf.
22. Bonharme E, White I. Napier University [homepage on the Internet]. Marble; [last update 1996 June 18; cited 2006 July 1]. Questionnaires; [about 1 screen]. Available from: http://web.archive.org/web/20040228081205/www.dcs.napier.ac.uk/marble/Usability/Questionnaires.html.
23. Bailey B. Usability Updates from HHS. Usability.gov; 2006 March [cited 2006 July 1]. Getting the Complete Picture with Usability Testing; [about 1 screen]. Available from: http://www.usability.gov/pubs/030106news.html.
24. Kuter U, Yilmaz C. CHARM: Choosing Human-Computer Interaction (HCI) Appropriate Research Methods [homepage on the Internet]. College Park, MD: University of Maryland; 2001 November 2 [cited 2006 July 1]. Survey Methods: Questionnaires and Interviews; [about 6 screens]. Available from: http://www.otal.umd.edu/hci-rm/survey.html.
25. Ash JS, Gorman PN, Lavelle M, Payne TH, Massaro TA, Frantz GL, Lyman JA. A Cross-site Qualitative Study of Physician Order Entry. J Am Med Inform Assoc [serial on the Internet]. 2003 Mar-Apr [cited 2006 July 1]; 10(2); [about 13 pages]. Available from: http://www.jamia.rcsed.ac.uk/cgi/reprint/10/2/188.pdf.
26. Nielsen J. useit.com: Jakob Nielsen's Website [homepage on the Internet]. Fremont, CA: Nielsen Norman Group; [cited 2006 July 1]. How to Conduct a Heuristic Evaluation; [about 6 pages]. Available from: http://www.useit.com/papers/heuristic/heuristic_evaluation.html.
27. National Institutes of Health [homepage on the Internet]. Bethesda, MD: NIH; [cited 2006 July 1]. APPENDIX A-3. HEURISTIC GUIDELINES FOR EXPERT CRITIQUE OF A WEB SITE; [about 5 pages]. Available from: http://irm.cit.nih.gov/itmra/weptest/app_a3.htm.
28. Nielsen J. useit.com: Jakob Nielsen's Website [homepage on the Internet]. Fremont, CA: Nielsen Norman Group; [cited 2006 July 1]. Ten Usability Heuristics; [about 2 pages]. Available from: http://www.useit.com/papers/heuristic/heuristic_list.html.
29. Barber C. Interaction Design [homepage on the Internet]. Sussex, UK; [cited 2006 July 1]. Interactive Heuristic Evaluation Toolkit; [about 9 pages]. Available from: http://www.id-book.com/catherb/index.htm.
30. Nielsen J. useit.com: Jakob Nielsen's Website [homepage on the Internet]. Fremont, CA: Nielsen Norman Group; [cited 2006 June 14]. Characteristics of Usability Problems Found by Heuristic Evaluation; [about 2 pages]. Available from: http://www.useit.com/papers/heuristic/usability_problems.html.
31. Minervation [homepage on the Internet]. Oxford, UK: Minervation Ltd; © 2005 [modified 2005 June 6; cited 2006 July 1]. The LIDA Instrument; [about 13 pages]. Available from: http://www.minervation.com/mod_lida/minervalidation.pdf.
32. World Wide Web Consortium [homepage on the Internet]. W3C®; © 1994-2006 [updated 2006 Feb 20; cited 2006 June 14]. W3C Markup Validation Service v0.7.2; [about 3 screens]. Available from: http://validator.w3.org/.
33. Watchfire Corporation. WebXACT [homepage on the Internet]. Watchfire Corporation; © 2003-2004 [cited 2006 July 1]. Available from: http://webxact.watchfire.com/.
34. Badenoch D, Tomlin A. How electronic communication is changing health care. BMJ [serial on the Internet]. 2004 June 26 [cited 2006 July 1]; 328:1564; [about 2 screens]. Available from: http://bmj.bmjjournals.com/cgi/content/full/328/7455/1564.
35. World Wide Web Consortium [homepage on the Internet]. W3C; © 1999 [cited 2006 July 1]. Web Content Accessibility Guidelines 1.0 – W3C Recommendation 5-May-1999; [about 24 pages]. Available from: http://www.w3.org/TR/WCAG10/.
36. The Access Board [homepage on the Internet]. The Access Board; [updated 2001 June 21; cited 2006 July 1]. Web-based Intranet and Internet Information and Applications (1194.22); [about 15 pages]. Available from: http://www.access-board.gov/sec508/guide/1194.22.htm.
37. Ceaparu I, Thakkar P. CHARM: Choosing Human-Computer Interaction (HCI) Appropriate Research Methods [homepage on the Internet]. College Park, MD: University of Maryland; [last updated 2001 October 28; cited 2006 July 1]. Logging & Automated Metrics; [about 8 screens]. Available from: http://www.otal.umd.edu/hci-rm/logmetric.html.
38. Vischeck [homepage on the Internet]. Stanford, CA: Stanford University; [last modified 2006 Mar 8; cited 2006 July 1]. Information & Links; [about 7 pages]. Available from: http://www.vischeck.com/info/.
39. Perlman G. ACM; [cited 2006 July 1]. Web-Based User Interface Evaluation with Questionnaires; [about 4 pages]. Available from: http://www.acm.org/~perlman/question.html.
40. National Institutes of Health [homepage on the Internet]. Bethesda, MD: NIH; [cited 2006 July 1]. APPENDIX A-9: IMPLEMENTATION DETAILS OF WEB SITE EVALUATION METHODOLOGIES; [about 1 page]. Available from: http://irm.cit.nih.gov/itmra/weptest/app_a9.htm.
41. Hess R. Microsoft Corporation [homepage on the Internet]. Redmond, Wash: Microsoft Corp; © 2006 [published 2000 October 9; cited 2006 July 1]. Can Color-Blind Users See Your Site?; [about 7 pages]. Available from: http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnhess/html/hess10092000.asp.
42. Tognazzini B. AskTog; © 2003 [cited 2006 July 1]. First Principles of Interaction Design; [about 7 pages]. Available from: http://www.asktog.com/basics/firstPrinciples.html.
43. Browsershots.org [homepage on the Internet]. Browsershots.org; [cited 2006 July 1]. Test your web design in different browsers; [about 1 page]. Available from: http://v03.browsershots.org/.
44. Giacoppo SA. CHARM: Choosing Human-Computer Interaction (HCI) Appropriate Research Methods [homepage on the Internet]. College Park, MD: University of Maryland; 2001 November 2 [cited 2006 July 1]. The Role of Theory in HCI; [about 11 screens]. Available from: http://www.otal.umd.edu/hci-rm/theory.html.
45. Usability.gov. Methods for Designing Usable Web Sites. Usability.gov; 2006 March [cited 2006 July 1]. Conducting and Using Usability Tests; [about 3 screens]. Available from: http://www.usability.gov/methods/usability_testing.html.
46. Berkun S. Microsoft Corporation [homepage on the Internet]. Redmond, Wash: Microsoft Corporation; © 2006 [published 1999 Nov-Dec; cited 2006 July 1]. The Power of the Usability Lab; [about 3 printed pages]. Available from: http://msdn.microsoft.com/library/en-us/dnhfact/html/hfactor8_6.asp.

LIST OF ABBREVIATIONS
3D: Three-Dimensional
CAST: Center for Applied Special Technology
CGI: Common Gateway Interface
CHS: Center for Human Simulation (University of Colorado)
CSS: Cascading Style Sheets
GHz: Gigahertz
HCM: HealthCyberMap
HTML: HyperText Markup Language
MS: Microsoft
IE: Internet Explorer
IEEE: Institute of Electrical and Electronics Engineers
ISM: Industrial, Scientific and Medical
ISO: International Organization for Standardization
NIH: National Institutes of Health, Bethesda, Maryland
QDA: Qualitative Data Analysis
QSR: Qualitative Solutions and Research
SP: Service Pack
UCHSC: University of Colorado Health Science Center
USAIM: University of Seychelles American Institute of Medicine
v: Version
VHE: Visible Human Experience
W3C: World Wide Web Consortium
W3CAG: W3C Accessibility Guidelines
WAB: Web Accessibility Barrier
WAI: Web Accessibility Initiative
XHTML: eXtensible HTML (a hybrid between HTML and XML)
XML: eXtensible Markup Language

APPENDIX

Appendix-Table-1a: Gender-based age distribution of respondents

  Age (years)    Male (%)     Female (%)
  18             5 (33%)      8 (53%)
  19             3 (20%)      5 (33%)
  20             2 (13%)      1 (7%)
  21             1 (7%)       0
  22 or above    4 (27%)      1 (7%)
  Total          15           15

Appendix-Table-1b: Students' computer/Internet knowledge, skills and experience

                                            Male    Female
  PC at home
    Yes                                     12      13
    No                                      3       2
  Duration of computer usage
    Not at all                              1       0
    Few weeks                               1       3
    2-6 months                              0       2
    6-24 months                             1       2
    >2 years                                12      8
  Hours/day Internet usage
    <1 hour                                 8       8
    1-2 hours                               4       6
    2-3 hours                               1       1
    3-4 hours                               0       0
    >4 hours                                2       0
  Type of Internet connection               (2 non-responders / 1 non-responder)
    Dialup/modem                            5       5
    Broadband/always connected              8       9
  Web browser
    Microsoft Internet Explorer             13      15
    Netscape Navigator                      0       0
    Mozilla Firefox                         1       0
    Other(s)                                1       0
  Desktop screen resolution                 (1 non-responder / 3 non-responders)
    640x480 pixels                          4       3
    800x600 pixels                          7       1
    1024x768 pixels                         3       8
  Operating system
    Windows                                 15      15
    Others (Mac/Unix/Linux/WebTV)           0       0
  Internet as reliable source of medical information
    Not at all                              1       0
    To some extent                          1       4
    Definitely                              13      11
Summary of all 30 respondents:

                                                        Yes, definitely    No, not always
  Internet as reliable source of medical information    24 (80%)           6 (20%)
  Prefer conventional search engine                     26 (87%)           4 (13%)
  Prefer either/both Neuroanatomy browsers as
  definitive knowledge resource                         29 (97%)           1 (3%)

Appendix-Table-1c: Searchability, including navigation, input flexibility and output

                          Interactive 3D atlas (Colorado)     3D brain browser (Oregon)
                          Male       Female     Both          Male       Female     Both
  (Very) / Difficult      1 (7%)     2 (13%)    3 (10%)       3 (20%)    8 (53%)    11 (37%)
  Acceptable difficulty   3 (20%)    4 (27%)    7 (23%)       2 (13%)    2 (13%)    4 (13%)
  Easy / (Very)           11 (73%)   9 (60%)    20 (67%)      10 (67%)   5 (33%)    15 (50%)
  Total                   15         15         30            15         15         30

Appendix-Table-1d: Browser interface speed (page loading, response times)

                          Interactive 3D atlas (Colorado)     3D brain browser (Oregon)
                          Male       Female     Both          Male       Female     Both
  Very fast               1 (7%)     2 (13%)    3 (10%)       3 (20%)    8 (53%)    11 (37%)
  Moderately fast         13 (87%)   13 (87%)   26 (87%)      9 (60%)    6 (40%)    15 (50%)
  Moderately slow         1 (7%)     0          1 (3%)        3 (20%)    1 (7%)     4 (13%)
  Very slow               0          0          0             0          0          0
  Total                   15         15         30            15         15         30

Appendix-Table-1e: Success rate (effectiveness, output, logical arrangement of information, reliability)

                          Interactive 3D atlas (Colorado)     3D brain browser (Oregon)
                          Male       Female     Both          Male       Female     Both
  From 1st attempt        5 (33%)    4 (27%)    9 (30%)       5 (33%)    3 (20%)    8 (27%)
  After 1+ failure        10 (67%)   11 (73%)   21 (70%)      8 (53%)    5 (33%)    13 (43%)
  Not successful          0          0          0             2 (13%)    7 (47%)    9 (30%)
  Total                   15         15         30            15         15         30

Appendix-Table-1f: Adequacy of user help/instructions/hints; ease of learning, understanding and predictability; error recovery/correction/prevention

                             Interactive 3D atlas (Colorado)    3D brain browser (Oregon)
                             Male       Female     Both         Male       Female     Both
  Easy, no help needed       5 (33%)    5 (33%)    10 (33%)     1 (7%)     1 (7%)     2 (7%)
  Easy, instructions useful  9 (60%)    10 (67%)   19 (63%)     10 (67%)   5 (33%)    15 (50%)
  Need more help             1 (7%)     0          1 (3%)       4 (27%)    9 (60%)    13 (43%)
  Total                      15         15         30           15         15         30

Appendix-Table-1g: Usefulness - information quality

                          Interactive 3D atlas (Colorado)     3D brain browser (Oregon)
                          Male       Female     Both          Male       Female     Both
  (Strongly) / Agree      13 (87%)   12 (80%)   25 (83%)      12 (80%)   7 (47%)    19 (63%)
  Ambiguous               2 (13%)    2 (13%)    4 (13%)       0          1 (7%)     1 (3%)
  Disagree / (Strongly)   0          1 (7%)     1 (3%)        3 (20%)    7 (47%)    10 (33%)
  Total                   15         15         30            15         15         30
Appendix-Table-1h: Usefulness - information overload

                                 Interactive 3D atlas (Colorado)    3D brain browser (Oregon)
                                 Male       Female     Both         Male       Female     Both
  No / Slight problem            8 (53%)    13 (87%)   21 (70%)     11 (73%)   8 (53%)    19 (63%)
  Moderate problem               6 (40%)    2 (13%)    8 (27%)      1 (7%)     2 (13%)    3 (10%)
  Significant / Extreme problem  1 (7%)     0          1 (3%)       3 (20%)    5 (33%)    8 (27%)
  Total                          15         15         30           15         15         30

Appendix-Table-1i: Comparison of overall usefulness

                          Interactive 3D atlas (Colorado)     3D brain browser (Oregon)
                          Male       Female     Both          Male       Female     Both
  Not at all / slightly   2 (13%)    2 (13%)    4 (13%)       4 (27%)    8 (53%)    12 (40%)
  Somewhat                6 (40%)    6 (40%)    12 (40%)      2 (13%)    3 (20%)    5 (17%)
  Very much / extremely   7 (47%)    7 (47%)    14 (47%)      9 (60%)    4 (27%)    13 (43%)
  Total                   15         15         30            15         15         30

Appendix-Table-1j: Usefulness of either/both as a definitive resource

                          Male (%)    Female (%)    Both (%)
  (Strongly) / Disagree   1 (7%)      0             1 (3%)
  Ambiguous               2 (13%)     8 (53%)       10 (33%)
  Agree / (Strongly)      12 (80%)    7 (47%)       19 (63%)
  Total                   15          15            30

Appendix-Table-1k: Actual task-based usage - overall usability and usefulness

                                   Male (%)    Female (%)    Both (%)
  Interactive 3D atlas (Colorado)  6 (40%)     7 (47%)       13 (43%)
  3D brain browser (Oregon)        4 (27%)     6 (40%)       10 (33%)
  Both interfaces equally          5 (33%)     2 (13%)       7 (23%)
  Total                            15          15            30

Appendix-Table-1l: Perceived future prospects (robustness, reliability, cost)

                          Interactive 3D atlas (Colorado)     3D brain browser (Oregon)
                          Male       Female     Both          Male       Female     Both
  No / Slight             5 (33%)    1 (7%)     6 (20%)       6 (40%)    6 (40%)    12 (40%)
  Somewhat                3 (20%)    1 (7%)     4 (13%)       1 (7%)     4 (27%)    5 (17%)
  Very / Extreme          7 (47%)    13 (87%)   20 (67%)      8 (53%)    5 (33%)    13 (43%)
  Total                   15         15         30            15         15         30
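The percentages in Appendix-Tables 1a-1l are within-gender proportions (n = 15 per gender, N = 30 combined), with the questionnaire's five-point scales collapsed into the three bands shown. The following is a minimal sketch of that tally-and-round step, using the male Colorado column of Appendix-Table-1c as the worked example; the band mapping and helper name are illustrative, not taken from the study's actual tabulation.

```python
from collections import Counter

# Collapse the five-point scale of Question 18 into the three bands
# used in Appendix-Table-1c (illustrative mapping).
BANDS = {
    "Very difficult": "(Very) / Difficult",
    "Difficult": "(Very) / Difficult",
    "Of moderate (acceptable) difficulty": "Acceptable difficulty",
    "Easy": "Easy / (Very)",
    "Very easy": "Easy / (Very)",
}

def band_percentages(responses):
    """Tally banded responses; report each count with a rounded percentage."""
    tally = Counter(BANDS[r] for r in responses)
    n = len(responses)
    return {band: f"{k} ({round(100 * k / n)}%)" for band, k in tally.items()}

# Male responses to Q18.1 (Colorado searchability), reconstructed from the
# counts in Appendix-Table-1c: 1 difficult, 3 acceptable, 11 easy.
male_colorado = (["Difficult"] * 1
                 + ["Of moderate (acceptable) difficulty"] * 3
                 + ["Easy"] * 11)
print(band_percentages(male_colorado))
# {'(Very) / Difficult': '1 (7%)', 'Acceptable difficulty': '3 (20%)',
#  'Easy / (Very)': '11 (73%)'}
```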
Appendix-Table-2: Subjective opinions of users on both interfaces, grouped by usability and usefulness themes

Ease of use
  Colorado: "Interface 1 (was) so very user-friendly…"; "Interface 1 handling was not so easy"
  Oregon: "Second was better as it was more user-friendly"

Effectiveness, fitness for purpose (both interfaces)
  "…helps students to learn better and teachers to teach better."; "Helps student in self-study"; "…it is useful, helps us to understand better…"; "…it gives us the actual 3-D images which could not be obtained from textbooks."; "Very informative"; "Can relate Anatomy with clinical lesions"; "I could not identify exactly where the brain stem is located"; "Tracts were bit difficult to understand"

Ease of learning (learnability), predictability, adaptation to user levels and styles
  "Only the first time was difficult…"

Ease of understanding
  Colorado: "Most of the difficulty to understand (is because we don't) know the basics about the topic, before we go to the program"
  Oregon: "Interface 2: I really didn't understand anything!!"

Performance, robustness and reliability
  Colorado: "Image planes not always in synch with direction of view"
  Oregon: "…very difficult for orientation"

Adequate user help, error correction/recovery (fault tolerance)
  "No HELP!"; "No info provided"

Estimated user satisfaction
  Colorado: "Interface 1 shows good images"; "Good imagination!"; "…brain slice pieces are very good"; "Great images"
  Oregon: "I didn't like it at all"
  Both: "For both sites, I liked the color 3-D images"

Adequate response times
  Colorado: "Interface 1 takes some time to load"
  Oregon: "Interface 2 quick to load"

Appropriate amount of output
  Colorado: "…more explanation needed; needs to be updated."; "Not accurately labeled"
  Oregon: "…too small for visualization"; "Small, not very informative"; "…no proper explanation"; "no proper labeling, can't identify the structures"; "…(had) CT MRI options…"; "…does not give a proper view"

User control, input flexibility
  "Zoom did not work"

Appendix-Table-3a: Comparison of heuristic violation severity ratings (rated 0-4)

  Heuristic                                Interactive 3-D Atlas    3-D Brain Browser
                                           (Colorado)               Interface (Oregon)
  Visibility of system status              1                        2
  Match between system and real world      0                        3
  User control and freedom                 0                        0
  Consistency and standards                0                        0
  Error prevention                         0                        2
  Recognition rather than recall           0                        3
  Flexibility and efficiency of use        1                        3
  Aesthetic and minimalist design          0                        3
  Help users recover from errors           0                        0
  Help and documentation                   1                        3
  Navigation                               1                        2
  Use of modes                             0                        0
  Structure of information                 0                        3
  Physical constraints                     3                        3
  Extraordinary users                      3                        4
  Average                                  0.67                     2.07
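The Average row is the arithmetic mean of the fifteen per-heuristic severity ratings. The following sketch simply reproduces the two reported values from the table above:

```python
# Severity ratings (0-4) for the fifteen heuristic categories of
# Appendix-Table-3a, in row order.
colorado = [1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 3, 3]
oregon   = [2, 3, 0, 0, 2, 3, 3, 3, 0, 3, 2, 0, 3, 3, 4]

for name, ratings in (("Colorado", colorado), ("Oregon", oregon)):
    print(f"{name}: {sum(ratings) / len(ratings):.2f}")
# Colorado: 0.67
# Oregon: 2.07
```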
Appendix-Table-3b: Heuristic evaluation detailed results
(Answers are listed as Colorado / Oregon.)

Visibility of system status (Max score = 3)
  Is status feedback provided continuously (e.g. progress indicators/messages)?  No / Yes
  Are warning messages displayed for long enough?  Yes / No
  Is there provision for some form of feedback?  Yes / No
  Usability score: Colorado 2/3, Oregon 1/3. Severity rating: Colorado 1, Oregon 2.

Match between system and real world (Max score = 9)
  Are the words, phrases and concepts used familiar to the user?  Yes / No
  Does the task sequence parallel the user's work processes?  Yes / No
  Is information presented in a simple, natural and logical order?  Yes / No
  Is the use of metaphors easily understandable by the user?  Yes / No
  Does the system cater for users with no prior experience of electronic devices?  No / No
  Does the system make the user's work easier and quicker than without the system?  Yes / Yes
  Does the system fit in with the environment in which the user's tasks are carried out?  Yes / Yes
  Can the system realistically reflect real-world situations and appear to respond to the user?  Yes / Yes
  Are important controls represented onscreen, with obvious mapping to real ones?  Yes / Yes
  Usability score: Colorado 8/9, Oregon 4/9. Severity rating: Colorado 0, Oregon 3.

User control and freedom (Max score = 3)
  Are facilities provided to "undo" (or "cancel") and "redo" actions?  Yes / Yes
  Are there clearly marked exits (when the user finds themselves somewhere unexpected)?  Yes / Yes
  Are facilities provided to return to the top level at any stage?  N/A / N/A
  Usability score: 2/3 for both. Severity rating: 0 for both.

Consistency and standards (Max score = 8)
  Is use of terminology, controls, graphics and menus consistent throughout the system?  Yes / Yes
  Is there a consistent look and feel to the system interface?  Yes / Yes
  Is there consistency between data entry and data display?  N/A / N/A
  Is the interface consistent with any platform conventions?  Yes / Yes
  Have ambiguous phrases/actions been avoided?  Yes / No
  Is the interface consistent with standard PC conventions?  Yes / Yes
  Is interactive TV consistent with the related TV programmes?  N/A / N/A
  Have colour and style conventions been followed for links (and no other text)?  Yes / Yes
  Usability score: Colorado 6/8, Oregon 5/8. Severity rating: 0 for both.

Error prevention (Max score = 8)
  Is a selection method provided (e.g. from a list) as an alternative to direct entry of information?  Yes / No
  Is user confirmation required before deleting something?  N/A / N/A
  Does the system ensure work is not lost, either by user or system error?  Yes / No
  Does the system prevent calls being accidentally made?  No / No
  Are the options given in dialog boxes obvious?  N/A / N/A
  Does the system provide foolproof synchronization with a PC?  Yes / Yes
  Has the possibility of the user making errors been removed?  Yes / Yes
  Is the system robust and safe enough for its surroundings?  Yes / Yes
  Usability score: Colorado 5/8, Oregon 3/8. Severity rating: Colorado 0, Oregon 2.

Recognition rather than recall (Max score = 6)
  Are help and instructions visible or easily accessible when needed?  Yes / No
  Is the relationship between controls and their actions obvious?  Yes / No
  Is it possible to search for information (e.g. a phone number) rather than entering the information directly?  N/A / N/A
  Is the functionality of the buttons on the device obvious from their labels?  Yes / No
  Are input formats (e.g. dates or lengths of names) and units of values indicated?  N/A / N/A
  Is the functionality of novel device controls (e.g. thumbwheels) obvious?  N/A / N/A
  Usability score: Colorado 3/6, Oregon 0/6. Severity rating: Colorado 0, Oregon 3.
Flexibility and efficiency of use (Max score = 8)
  Does the system allow for a range of user expertise?  Yes / No
  Does the system guide novice users sufficiently?  Yes / No
  Is it possible for expert users to use shortcuts and to tailor frequent actions?  Yes / No
  Is it possible to access and re-use a recent history of instructions?  Yes / No
  Does the system allow for a range of user goals and interaction styles?  Yes / Yes
  Does the system allow all functionality to be accessed either using function buttons or using the stylus?  Yes / Yes
  Is it possible to replace and restore default settings easily?  Yes / Yes
  Have unnecessary registrations been avoided?  No / Yes
  Usability score: Colorado 7/8, Oregon 4/8. Severity rating: Colorado 1, Oregon 3.

Aesthetic and minimalist design (Max score = 8)
  Is the design simple, intuitive, easy to learn and pleasing?  Yes / No
  Is the system free from irrelevant, unnecessary and distracting information?  Yes / Yes
  Are icons clear, are buttons labelled, and is the use of graphic controls obvious?  Yes / No
  Is the information displayed at any one time kept to a minimum?  Yes / Yes
  Is the number of applications provided appropriate (has featuritis been avoided)?  Yes / No
  Has the need to scroll been minimized and, where necessary, are navigation facilities repeated at the bottom of the screen?  Yes / Yes
  Is the system easy to remember how to use?  Yes / Yes
  Have excessive scripts, applets, movies, graphics and images been avoided?  No / No
  Usability score: Colorado 7/8, Oregon 4/8. Severity rating: Colorado 0, Oregon 3.

Help users recover from errors (Max score = 3)
  Do error messages describe problems sufficiently, assist in their diagnosis and suggest ways of recovery in a constructive way?  N/A / N/A
  Are error messages written in a non-derisory tone, refraining from attributing blame to the user?  N/A / N/A
  Is it clear how the user can recover from errors?  Yes / Yes
  Usability score: 1/3 for both. Severity rating: 0 for both.

Help and documentation (Max score = 3)
  Is help clear, direct, simply expressed in plain English and free from jargon?  Yes / No
  Is help provided in a series of steps that can be easily followed?  No / No
  Is it easy for the user to search, understand and apply help text?  Yes / No
  Usability score: Colorado 2/3, Oregon 0/3. Severity rating: Colorado 1, Oregon 3.

Navigation (Max score = 4)
  Is navigational feedback provided (e.g. showing the user's current and initial states, where they've been and what options they have for where to go)?  No / No
  Are any navigational aids provided (e.g. find facilities)?  Yes / No
  Does the system track where the user was in the last session?  Yes / Yes
  Has opening unnecessary new browser windows been avoided?  Yes / Yes
  Usability score: Colorado 3/4, Oregon 2/4. Severity rating: Colorado 1, Oregon 2.
Use of modes (Max score = 2)
  Does the system use different modes appropriately and effectively?  Yes / Yes
  Is it easy to exit from each mode of use?  Yes / Yes
  Usability score: 2/2 for both. Severity rating: 0 for both.

Structure of information (Max score = 10)
  Is there a hierarchical organisation of information from general to specific?  Yes / No
  Are related pieces of information clustered together?  Yes / No
  Is the length of a piece of text appropriate to the display size and interaction device?  Yes / Yes
  Has the number of screens required per task been minimized?  Yes / Yes
  Does each screen comprise one document on one topic, with the most important information appearing at the top?  Yes / Yes
  Has hypertext been used appropriately to structure content, and are links intuitive and descriptive?  Yes / Yes
  Have pages been structured to facilitate scanning by the reader?  Yes / No
  Are the URLs, page titles and headlines straightforward, short and descriptive?  Yes / No
  Has excessive use of white space been avoided?  Yes / No
  Has textual content been kept to a maximum of two columns?  Yes / No
  Usability score: Colorado 10/10, Oregon 4/10. Severity rating: Colorado 0, Oregon 3.

Physical constraints (Max score = 5)
  Are function buttons large enough to be usable?  No / No
  Is the screen visible at a range of distances and in various types of lighting?  No / No
  Does the touch-screen cater for users touching the screen quickly and slowly?  N/A / N/A
  Is the distance between targets (e.g. icons) and the size of targets appropriate (size should be proportional to distance)?  Yes / No
  Has the use of text in images and large or irregular imagemaps been avoided?  Yes / Yes
  Usability score: Colorado 2/5, Oregon 1/5. Severity rating: 3 for both.

Extraordinary users (Max score = 6)
  Is the use of colour restricted appropriately (and suitable for colour-blind users)?  No / No
  Do the buttons allow for use by older, less agile fingers or people wearing gloves?  No / No
  Do buttons give tactile feedback when selected?  No / No
  Is the touch-screen usable by people of all heights and those in wheelchairs?  N/A / N/A
  Are equivalent alternatives provided for visual and auditory content?  No / No
  Have accessibility and internationalization guidelines been applied if appropriate?  Yes / Yes
  Usability score: 1/6 for both. Severity rating: Colorado 3, Oregon 4.

Appendix-Table-4: LIDA detailed results
(Scores as percentage of maximum; Colorado / Oregon.)

  Accessibility (Max = 60): 70% (42/60) / 80% (48/60)
    Automated tests (Max = 57): 72% (41/57) / 79% (45/57)
    Different browsers (Max = 3): N/A / N/A
    Registration (Max = 3): 33% (1/3) / 100% (3/3)
  Usability (Max = 54): 72% (39/54) / 48% (26/54)
    Clarity (Max = 18): 72% (13/18) / 28% (5/18)
    Consistency (Max = 9): 100% (9/9) / 100% (9/9)
    Functionality (Max = 15): 53% (8/15) / 20% (3/15)
    Engagibility (Max = 12): 100% (9/9) / 100% (9/9)
  Reliability (Max = 27): 78% (21/27) / 74% (20/27)
    Currency (Max = 9): 33% (3/9) / 22% (2/9)
    Conflicts of interest (Max = 9): 100% (9/9) / 100% (9/9)
    Content production (Max = 9): 100% (9/9) / 100% (9/9)
  Overall score (Max = 141): 72% (102/141) / 67% (94/141)
  Mean of sub-scale percentages: 73.7 / 72.1
  SD: 28.54 / 37.26
  Probability associated with Student's t test (two-tailed, unpaired two-sample with unequal variance): 0.92 (reproduced in the sketch following Level 3 below)

Level 1: Accessibility (Max = 57)                         Colorado         Oregon
1. Page setup (Max = 15)                                  40% (6/15)       60% (9/15)
   Document Type Definition                               3                0
   HTTP-Equiv Content-Type (in header)                    0                3
   HTML language definition                               0                0
   Page title                                             3                3
   Meta tag keywords                                      0                3
2. Access restrictions (Max = 12)                         67% (8/12)       100% (12/12)
   Image alt tags                                         3                3
   Specified image widths                                 2                3
   Table summaries                                        0                3
   Frames                                                 3                3
3. Outdated code (Max = 27)                               100% (27/27)     89% (24/27)
   Body tags: body background colour                      3                0
   Body tags: body topmargin                              3                3
   Body tags: body margin height                          3                3
   Table tags: table background colour                    3                3
   Table tags: table column height                        3                3
   Table tags: table row height                           3                3
   Font tags: font color                                  3                3
   Font tags: font size                                   3                3
   Align (non style sheet)                                3                3
4. Dublin Core tags (Max = 3)                             0% (0/3)         0% (0/3)
   Dublin Core title tag                                  0                0
ACCESSIBILITY RATING                                      72% (41/57)      79% (45/57)

Level 2: Usability (Max = 54)                             Colorado         Oregon
1. Clarity (Max = 18)                                     72% (13/18)      28% (5/18)
   Clear statement                                        3                0
   Detail level                                           2                1
   'Block of content'                                     2 (font too small)   0 (not scannable, small graphics)
   Navigation                                             3                1
   Site location                                          3                3
   Colour scheme                                          2 (colour-blind unfriendly*)   0 (significantly so*)
2. Consistency (Max = 9)                                  100% (9/9)       100% (9/9)
   Page layout                                            3                3
   Navigation link                                        3                3
   Site organisation                                      3                3
3. Functionality (Max = 15)                               53% (8/15)       20% (3/15)
   Search facility                                        0                0
   Browsing facility                                      3                0
   Cognitive overhead                                     2                0
   Browser navigational tools                             3                3
   Plug-ins                                               0                0
4. Engagibility (Max = 12)                                100% (9/9)       100% (9/9)
   Effective judgment                                     3                3
   Interactive                                            3                3
   Personalise                                            3                3
USABILITY RATING                                          72% (39/54)      48% (26/54)
*See Vischeck results for colour-blind accessibility
Level 3: Reliability (Max = 27)                           Colorado         Oregon
1. Currency (Max = 9)                                     33% (3/9)        22% (2/9)
   Recent events                                          1                1
   User comments                                          0                0
   Updated                                                2                1
2. Conflicts of interest (Max = 9)                        100% (9/9)       100% (9/9)
   Who runs site                                          3                3
   Who pays for site                                      3                3
   Objective                                              3                3
3. Content production (Max = 9)                           100% (9/9)       100% (9/9)
   Clear method                                           3                3
   Robust method                                          3                3
   Original source check                                  3                3
RELIABILITY RATING                                        78% (21/27)      74% (20/27)
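The reported mean, SD and p-value can be reproduced from the nine sub-scale percentages of Appendix-Table-4 (automated tests, registration, clarity, consistency, functionality, engagibility, currency, conflicts of interest, content production). The following is a minimal sketch, assuming those nine values are the samples behind the reported statistics; it uses SciPy's unequal-variance (Welch) t test:

```python
import statistics
from scipy import stats  # ttest_ind(equal_var=False) gives the Welch t test

# Nine LIDA sub-scale percentages per browser, from Appendix-Table-4.
colorado = [72, 33, 72, 100, 53, 100, 33, 100, 100]
oregon = [79, 100, 28, 100, 20, 100, 22, 100, 100]

print(f"{statistics.mean(colorado):.1f} {statistics.stdev(colorado):.2f}")  # 73.7 28.54
print(f"{statistics.mean(oregon):.1f} {statistics.stdev(oregon):.2f}")      # 72.1 37.26

# Two-tailed, unpaired two-sample t test with unequal variances.
t_stat, p_value = stats.ttest_ind(colorado, oregon, equal_var=False)
print(f"p = {p_value:.2f}")  # p = 0.92
```

The high p-value confirms the interpretation implied by the table: despite the Oregon browser's poor usability sub-scores, the overall LIDA profiles of the two interfaces do not differ significantly.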
Appendix-Table-5: WebXACT automated test results

General issues                                                Colorado          Oregon
  Metadata: author                                            No author         Yes
  Metadata: description                                       No description    No description
  Metadata: keywords                                          No keywords       Yes
  Page content: server-side image maps                        0                 0

Quality issues (warnings)                                     Colorado          Oregon
  Broken links                                                0                 0
  Broken anchors                                              0                 0
  Links to local files                                        0                 0
  Elements missing alt text                                   0                 0
  Elements missing height and width attributes                1                 0
  Warnings when accessing this page                           0                 0
  First-party cookies denied for default IE privacy setting   0                 0

Accessibility issues: neither page complies with all automatic and manual checkpoints of the W3C WCAG; both require repairs and manual verification.

                                      Colorado                       Oregon
  Priority 1 automatic checkpoints    0 errors, 0 instances          0 errors, 0 instances
  Priority 1 manual checkpoints       7 warnings, 11 instances       6 warnings, 8 instances
  Priority 2 automatic checkpoints    2 errors, 4 instances          1 error, 1 instance
  Priority 2 manual checkpoints       13 warnings, 15 instances      14 warnings, 18 instances
  Priority 3 automatic checkpoints    4 errors, 5 instances          1 error, 1 instance
  Priority 3 manual checkpoints       10 warnings, 10 instances      9 warnings, 9 instances

Privacy issues                                                Colorado          Oregon
  Page encryption level                                       0 bit             0 bit
  Forms using GET                                             0                 0
  First-party cookies denied for default IE privacy setting   0                 0
  Third-party cookies                                         0                 0
  P3P compact policy                                          None              None
  Web beacons (graphics from external sites)                  0                 0
  Third-party links                                           2                 3

Appendix-Table-6: Calculation of Web Accessibility Barrier (WAB) score

                                      Colorado webpage             Oregon webpage
  Priority 1 manual checkpoints       11 instances x Wf 3 = 33     8 instances x Wf 3 = 24
  Priority 2 automatic checkpoints    4 instances x Wf 2 = 8       1 instance x Wf 2 = 2
  Priority 2 manual checkpoints       15 instances x Wf 2 = 30     18 instances x Wf 2 = 36
  Priority 3 automatic checkpoints    5 instances x Wf 1 = 5       1 instance x Wf 1 = 1
  Priority 3 manual checkpoints       10 instances x Wf 1 = 10     9 instances x Wf 1 = 9
  WAB score                           86                           72

WCAG attaches a three-point priority level to each checkpoint according to its impact on Web accessibility: Priority 1 checkpoints mandate the largest level of compliance, while Priority 3 checkpoints are optional for Web content developers. In weighting the WAB score we therefore used the priority levels in reverse order: the weighting factor (Wf) is 3 for Priority 1 violations, 2 for Priority 2 violations, and 1 for Priority 3 violations. A higher WAB score indicates more accessibility barriers; the arithmetic is shown in the sketch below.
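This is a minimal sketch of the weighted sum as applied in Appendix-Table-6 to a single page per browser; the data structures and function name are illustrative, not part of WebXACT's output format:

```python
# Weighting factors from Appendix-Table-6: WCAG priority levels in reverse order.
WEIGHTS = {1: 3, 2: 2, 3: 1}

def wab_score(instances):
    """Weighted sum of checkpoint-violation instances for a single page.

    `instances` maps (priority, 'auto' | 'manual') to an instance count.
    """
    return sum(WEIGHTS[priority] * count
               for (priority, _kind), count in instances.items())

# Instance counts from Appendix-Tables 5 and 6.
colorado = {(1, "manual"): 11, (2, "auto"): 4, (2, "manual"): 15,
            (3, "auto"): 5, (3, "manual"): 10}
oregon = {(1, "manual"): 8, (2, "auto"): 1, (2, "manual"): 18,
          (3, "auto"): 1, (3, "manual"): 9}

print(wab_score(colorado), wab_score(oregon))  # 86 72
```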
Appendix-Table-7: Summary of results

                                     Colorado browser              Oregon browser
Questionnaire
  Very easy searchability            67%                           50%
  Very difficult searchability       10%                           37%
  Moderately fast browser speed      87%                           50%
  Very fast browser speed            10%                           37%
  Task failure rate                  0%                            30% (M:F = 13%:47%)
  Task success rate (1+ attempts)    70%                           43%
  Extra help requirement             3%                            43%
  Ease of use                        97%                           57%
  Information usefulness             83%                           63%
  Information overload               30%                           37%
  Very useful                        47% (M:F = 47%:47%)           43% (M:F = 60%:27%)
  Actual task-based usage            43%                           33%
  Future prospects                   67% (M:F = 47%:87%)           43% (M:F = 53%:33%)
  Heuristic violation rating         0.67                          2.07
LIDA
  Accessibility                      70% (failed validation)       80% (failed validation)
  Usability                          72%                           48%
  Reliability                        78%                           74%
  Overall score                      72%                           67%
W3C validation service               Failed; 14 errors;            Failed; 18 errors;
                                     not valid HTML 4.01 Strict    not valid (no Doctype)
WebXACT
  Automatic checkpoints              6 errors, 9 instances         2 errors, 2 instances
  Manual checkpoints                 30 warnings, 36 instances     29 warnings, 35 instances
  WAB score                          86                            72
Vischeck                             Mild difficulty for           Useless to the
                                     colour-blind users            colour-blind

Evaluation Questionnaire for Browser Interface Usability of Neuroanatomy Applications

This is a questionnaire to evaluate the two Neuroanatomy browser interfaces that you have worked with. Your personal details and opinions will be strictly anonymised and compiled only for statistical calculations. It should take about 30 minutes to complete the questionnaire. Thank you very much for your cooperation.

Questions (you may leave blank any question you don't want to answer)

Question 1: Gender:
  [ ] Male  [ ] Female

Question 2: Your age is in which of the following ranges?
  [ ] 17  [ ] 18  [ ] 19  [ ] 20  [ ] 21  [ ] 22 or more

Question 3: In which class/semester are you studying? (PC = Pre-clinical)
  [ ] PC 1  [ ] PC 2  [ ] PC 3  [ ] PC 4  [ ] PC 5

Question 4: How many hours per day, including college and home, do you currently use the Internet?
  [ ] Less than 1 hour
  [ ] Between 1 and 2 hours
  [ ] Between 2 and 3 hours
  [ ] Between 3 and 4 hours
  [ ] More than 4 hours

Question 5: For how long have you been regularly using a computer?
  [ ] Not at all
  [ ] A few weeks
  [ ] 2-6 months
  [ ] 6-24 months
  [ ] More than 2 years

Question 6: Do you have a working personal computer (PC) in your home?
  [ ] Yes  [ ] No

Question 7: From which location are you most likely to use the Web?
  [ ] Home  [ ] Internet café  [ ] School/College  [ ] Library  [ ] Other

Question 8: What type of Internet connection do you have currently? (If you use more than one type of connection in different places, select the one you use most.)
  [ ] Modem
  [ ] ISDN/ADSL/Broadband/Always connected

Question 9: What web browser do you use most frequently?
  [ ] Microsoft Internet Explorer
  [ ] Netscape Navigator / Communicator
  [ ] Mozilla Firefox
  [ ] Other
Question 10: The desktop area/screen resolution you most commonly use is:
  [ ] 640 by 480 pixels
  [ ] 800 by 600 pixels
  [ ] 1024 by 768 pixels or more

Question 11: Which operating system are you currently running on your most frequently used PC?
  [ ] Windows  [ ] Macintosh  [ ] Unix/Linux  [ ] WebTV  [ ] Other

Question 12: Do you consider the Internet an important source of reliable medical information?
  [ ] Not at all
  [ ] To some extent
  [ ] Definitely

Question 13: Based on what you have seen of the two 3-D Neuroanatomy browsers so far, would you use either or both of them as definitive Neuroanatomy knowledge resources?
  [ ] Yes  [ ] No

Question 14: The two browser interfaces meet my information retrieval and navigation needs better than other medical information portals/gateways:
  [ ] Strongly disagree
  [ ] Disagree
  [ ] Neither disagree nor agree
  [ ] Agree
  [ ] Strongly agree

Question 15: I prefer to use a conventional search engine (e.g. Google) to gather information, or to have a librarian, staff member or family member gather information for me:
  [ ] True  [ ] False

Question 16: How well does each of the two browser interfaces that you tried perform its intended purpose? (Please rate each interface from 'not at all well' to 'extremely well', considering aspects like ease of access, use and navigation, and the logical arrangement of information on the page(s).)

Question 16.1: The interactive Neuroanatomy 3-D browser interface from Visible Human Experience (#1):
  [ ] Not at all well
  [ ] Slightly well
  [ ] Somewhat well
  [ ] Very well
  [ ] Extremely well

Question 16.2: The Java plug-in 3-D browser from University of Oregon (#2):
  [ ] Not at all well
  [ ] Slightly well
  [ ] Somewhat well
  [ ] Very well
  [ ] Extremely well

Question 17.1: Did you find the medical information factually correct in Website #1?
  [ ] Yes
  [ ] Somewhat (not always)
  [ ] No

Question 17.2: Did you find the medical information factually correct in Website #2?
  [ ] Yes
  [ ] Somewhat (not always)
  [ ] No

Question 18.1: Searching for the right parts of the brain, and finding the right areas, through browser #1 was:
  [ ] Very difficult
  [ ] Difficult
  [ ] Of moderate (acceptable) difficulty
  [ ] Easy
  [ ] Very easy
Question 18.2: Searching for the right parts of the brain, and finding the right areas, through browser #2 was:
  [ ] Very difficult
  [ ] Difficult
  [ ] Of moderate (acceptable) difficulty
  [ ] Easy
  [ ] Very easy

Question 19: How important/useful to you is each of the two browser interfaces? (Please rate each interface from 'not at all important' to 'extremely important'.)

Question 19.1: Browser interface #1:
  [ ] Not at all important
  [ ] Slightly important
  [ ] Somewhat important
  [ ] Very important
  [ ] Extremely important

Question 19.2: Browser interface #2:
  [ ] Not at all important
  [ ] Slightly important
  [ ] Somewhat important
  [ ] Very important
  [ ] Extremely important

Question 20.1: Did you find the interface 'Help' (and other instructions/hints) around browser interface #1 adequate?
  [ ] Yes, it is easy to use once you understand how it works; the online help and instructions provided were useful
  [ ] It is very easy to use; I didn't need any help (or made very little use of it)
  [ ] No, I need more help to locate the information I need on the Website

Question 20.2: Did you find the interface 'Help' (and other instructions/hints) around browser interface #2 adequate?
  [ ] Yes, it is easy to use once you understand how it works; the online help and instructions provided were useful
  [ ] It is very easy to use; I didn't need any help (or made very little use of it)
  [ ] No, I need more help to locate the information I need on the Website

Question 21.1: The speed at which browser interface #1 loads on my Internet connection is:
  [ ] Very fast
  [ ] Moderately fast
  [ ] Moderately slow
  [ ] Very slow

Question 21.2: The speed at which browser interface #2 loads on my Internet connection is:
  [ ] Very fast
  [ ] Moderately fast
  [ ] Moderately slow
  [ ] Very slow

Question 22.1: You tried to find some structures in the brain images using browser interface #1. Generally speaking, how successful were you in completing this task?
  [ ] Successful from the first attempt
  [ ] Successful after one or more failed attempts
  [ ] Not successful

Question 22.2: You tried to find some structures in the brain images using browser interface #2. Generally speaking, how successful were you in completing this task?
  [ ] Successful from the first attempt
  [ ] Successful after one or more failed attempts
  [ ] Not successful

Question 23: Which interface did you use most of the time to accomplish the above task?
  [ ] Browser interface #1
  [ ] Browser interface #2
  [ ] Both were equally used

Question 24.1: Interface #1's pointers to information are of good quality, accurate, up-to-date and useful:
  [ ] Strongly agree
  [ ] Agree
  [ ] Neither agree nor disagree
  [ ] Disagree
  [ ] Strongly disagree

Question 24.2: Interface #2's pointers to information are of good quality, accurate, up-to-date and useful:
  [ ] Strongly agree
  [ ] Agree
  [ ] Neither agree nor disagree
  [ ] Disagree
  [ ] Strongly disagree

Question 25.1: When I looked for information in site #1 I got overloaded quickly with too much detail:
  [ ] Not at all a problem
  [ ] A slight problem
  [ ] A moderate problem
  [ ] A significant problem
  [ ] An extreme problem
Question 25.2: When I looked for information in site #2 I got overloaded quickly with too much detail:
  [ ] Not at all a problem
  [ ] A slight problem
  [ ] A moderate problem
  [ ] A significant problem
  [ ] An extreme problem

Question 26: Do you think the two 3-D visual sites are useful additions to, or improvements over, conventional text-based Web portal interfaces?
  [ ] Yes, definitely
  [ ] Could be an improvement (but not always)
  [ ] No, visual interfaces are useless

Question 27: What do you think are the future prospects of the two interfaces?

Question 27.1: Browser interface #1:
  [ ] Not at all important
  [ ] Slightly important
  [ ] Somewhat important
  [ ] Very important
  [ ] Extremely important

Question 27.2: Browser interface #2:
  [ ] Not at all important
  [ ] Slightly important
  [ ] Somewhat important
  [ ] Very important
  [ ] Extremely important

Question 28: Were there any parts of the service that you found especially helpful? What do you like most about these sites, and why? (Mention each site separately.)

Question 29: Were there any parts of the service that you found especially difficult to use or understand? What do you dislike most about them, and why? (Mention each site separately.)

Question 30: Should money be invested to continue developing and implementing these interfaces? Why/why not?

Question 31: What are your suggestions or comments about what would make these browser interfaces better? (e.g. "I would like the following added to, changed in, or deleted from them")

Question 32: Could you tell us your thoughts about this questionnaire? Are we asking the right questions? Are we asking the questions in the right way? Your feedback will help us design better questionnaires in the future.

Thank you once again for your feedback.