    1. Evaluating Educational Software
       Instructor: Heiko Spallek, DMD, PhD
       Based on papers/presentations by: T. Schleyer, DMD, PhD, University of Pittsburgh; L. Johnson, PhD, University of Iowa; D. Rubright, MFA, MA, University of Iowa; H. Spallek, DMD, PhD, University of Pittsburgh
       February 20th, 2004, Dental Information Systems #2201
    2. Outline
       - A framework for evaluating educational software
       - Asking the learners …
       - Guidelines for the Design of Educational Software
       - ADEA Software Competition
       - Heuristic Evaluation (your assignment)
    3. What constitutes “good” information? What makes a computer-based course “good”?
    4. Domains of quality criteria
       - Pedagogical Issues
       - Subject Matter
       - Language/Grammar
       - Surface Features
       - Menus
       - Questions
       - Feedback
       - Invisible Functions
       - Off-line Materials
       - Evaluation
    5. Purpose of study
       Designed to:
       - verify the applicability of quality criteria
       - survey the state of the art in online continuing dental education
       Results are applicable to a broad range of computer-based materials, not just Web-based courses.
    6. Design
       - Complete survey of Web-based CDE using indices and search engines
       - Manual review and coding for 34 criteria
       - Summarization of raw data for each criterion
    7. Criteria
       - Provider and course listing
       - Course description
       - Course format
       - Course content and interaction
    8. Results - Providers and courses
       - Total yield: 157 courses (25 hours of searching!)
       - 32 providers
       - Universities currently provide the highest number of courses per institution
    9. Course topics
       Topic              N    % of total
       Periodontology     31   19.7
       Oral Diagnosis     18   11.5
       Pathology           9    5.7
       Prosthodontics      8    5.1
       Implantology        7    4.5
       Basic Science       6    3.8
       Dental Materials    6    3.8
    10. Course formats
        - Brochure or book format
        - Slide show
        - Case report
        - Newsletter or composite report
    11. Credit hours
        - 78 courses offered credit hours
        - Price per credit hour: $5 to $25
    12. Credit hours vs. length in screens (chart)
    13. Media used
        - Text: 100%
        - Images: 84%
        - Video: 2%
        - PDF: 7%
    14. Questions
        - 77 (53%) of all courses included questions:
          - multiple choice: 85%
          - true/false: 34%
          - open-ended: 8%
        - Questions appeared mostly at the end, in a few cases throughout
        - 28% of tests were scored online
    15. Dates
        - Date of creation: 11%
        - Date of last update: 24%
    16. Course length (chart)
    17. Content
        - Author not indicated: 71%
        - No goals and objectives: 23%
        - No references: 85%
    18. Navigation
        - Direct indication of progress: 23%
        - Indirect indication of progress: 45%
        - Progress actively obscured: 32%
        - Navigation: approx. 60% easy or very easy to navigate
    19. Interaction
        - No e-mail contact possible: 47%
        - Author’s e-mail listed: 24%
        - Other e-mail: 29%
    20. Discussion
        Limitations:
        - design guidelines are preliminary only
        - study used only a subset of design criteria
        - some criteria are subjective (navigation, length)
        - not certain that all online courses in dentistry were found
        - password-protected courses were not reviewed
    21. Discussion
        - Variation of credit hours vs. course length
        - Dearth of true multimedia courses
        - Testing and feedback use Internet capabilities only marginally
        - Low compliance with accepted standards for educational materials
    22. Discussion (cont.)
        - Poor use of navigational design
        - Interaction obviously not desired in most courses
        - Advanced functions of educational software not used (e.g., customization after pretest)
    23. Recommendations
        - Disseminate the Guidelines widely
        - Online CDE should be peer-reviewed
        - Develop valid instruments for assessing courses
        - Insert TITLE and KEYWORD tags into HTML (see the sketch below)
        - Establish a central index of courses
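The TITLE/KEYWORD recommendation lends itself to an automated check, since those tags were the main hooks indices and search engines had for discovering courses. Below is a minimal Python sketch a course provider might run against a saved page to verify both tags are present; the file name course_page.html is hypothetical, and this is an illustration rather than anything from the study's protocol:

```python
# Minimal sketch: verify that a saved course page declares the TITLE and
# KEYWORD tags recommended above. The file name is hypothetical; this is
# illustrative only, not part of the study's protocol.
from html.parser import HTMLParser

class HeadTagChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.has_title = False
        self.has_keywords = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.has_title = True
        elif tag == "meta" and (dict(attrs).get("name") or "").lower() == "keywords":
            self.has_keywords = True

checker = HeadTagChecker()
with open("course_page.html", encoding="utf-8") as f:  # hypothetical file
    checker.feed(f.read())
print("TITLE tag present:   ", checker.has_title)
print("KEYWORDS tag present:", checker.has_keywords)
```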
    24. Evaluation of Web-based Dental CE-Courses
        Heiko Spallek DMD PhD*, Elizabeth Pilcher DMD**, Ji-Young Lee***, Titus Schleyer DMD PhD*
        * Center for Dental Informatics, University of Pittsburgh, School of Dental Medicine, Pittsburgh, PA
        ** Medical University of South Carolina, College of Dental Medicine, Charleston, SC
        *** Temple University School of Dentistry, Philadelphia, PA
        Study goals:
        - evaluate the outcomes of online CDE courses through analysis of courses from 6 organizations
        - focused on:
          - how the participants of online CDE can be characterized
          - whether the participants' expectations were met by the courses
          - how the participants evaluated the content of the courses
          - why they enrolled
          - how they experienced the online environment
        -> develop recommendations for the design of future courses
    25. Study design
        - Exploratory study:
          - survey of 436 past course participants
          - from 9 online CDE courses
          - from 6 organizations (health care schools and commercial CE providers)
        - Courses varied in content, length, type of provider, and tuition
        - Inclusion criteria:
          - continuing education courses in dentistry that granted continuing education credits
          - online for at least a year
    26. Evaluated CDE courses (table)
    27. Study design: survey
        - "The Tailored Design Method" by Don Dillman
        - Self-administered, dual-mode (e-mail and postal mail), partially branched survey (concise, with a limited number of open-ended questions)
        - 3 demographic questions
        - 5 computer literacy questions
        - 4 specific course material questions
        - 6 online environment questions
        - 3 course content questions
        - 4 marketing questions
        -> instrument available at http://di.dental.pitt.edu/cesurvey/
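For orientation, the section sizes above can be tallied programmatically. In this sketch the section keys are paraphrases and the dictionary layout is purely illustrative, not the instrument's actual structure:

```python
# Sketch of the survey's section structure as reported on the slide;
# section keys are paraphrases, not the instrument's actual wording.
survey_sections = {
    "demographics": 3,
    "computer_literacy": 5,
    "course_material": 4,
    "online_environment": 6,
    "course_content": 3,
    "marketing": 4,
}
total = sum(survey_sections.values())  # 25 questions overall
print(f"{total} questions across {len(survey_sections)} sections")
```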
    28. Results/discussion: response rate (chart)
    29. Results/discussion: demographics
        Responses to the question “On the whole, how sophisticated a computer user do you consider yourself?” (chart)
    30. Results/discussion: marketing
        Responses to the question “How did you learn about this particular course?”
        - 19% Internet search engine
        - 15% course provider's homepage
        - 15% personal recommendation
        - 9% professional journal
        - 2% alumni journal
        - 10% other sources
    31. Results/discussion: marketing
        Participants' perceived respectability of an online CDE course:
        Agreement with the statement "I am more likely to tell others about my participation in an online course than my participation in a traditional classroom-based lecture."
        - 19% strongly agreed
        - 11% somewhat agreed
        - 24% neither agreed nor disagreed
        - 6% somewhat disagreed
        - 1% strongly disagreed
        - 8% no opinion
    32. Results/discussion: online environment
        Time spent working online for the course:
        - "Dentistry on the Internet" => 14.5 hours
        - "Submitting an Invention to a Dental Manufacturer" => 1 hour
        When?
        - 27% accessed the course material during work hours
        - 79% after work hours
        - 6% specified both
        From where?
        - 31% home
        - 64% their office
        - 3% a library
    33. Results/discussion: online environment
        What was the single most important reason that attracted you to an online course?
        - “convenience”: 47%
        What was the biggest disadvantage of the online format?
        - “lack of human interaction”: 13%
        - “cannot ask questions”: 12%
        “The lack of face-to-face contact with a teacher was a stumbling block for your learning”:
        - agreed: 18%
        - disagreed: 65%
        - neither agreed nor disagreed: 17%
    34. Results/discussion: meeting initial expectations
        - Courses ranked equally well in most categories:
          - "exploit the convenience of online learning"
          - "fit the course into my schedule"
        - Problem spots:
          - cannot "communicate with peers online"
          - cannot "interact one-to-one with the instructor"
        - Authors' experiences suggest participants seldom raise content-related questions:
          - since inception, the instructor of the course “Top 40 Drugs” has received a total of four content-related questions
    35. Conclusions
        - The evaluated online CDE courses do meet some of the needs and expectations of dental professionals
        - Generally limited number of participants -> no ROI*
        - Participants mainly originated in the United States
        - Recommendations for online course development follow on the next slide
        * Carr, Sarah. Is anyone making money on distance education? The Chronicle of Higher Education, 2-16-2001
    36. Recommendations
        Online CDE courses need to:
        - be current
        - cover the subject matter in depth
        - be guided by an experienced instructor
        - define the average time necessary to complete the entire course
        - be marketed among dental professionals
    37. Pedagogical Issues: How well does it teach? Are activities appropriate for the audience, objectives, and content?
        - Benefits of the computer
        - Instructional techniques match the audience
        - Instructional techniques match the content
        - Assessment strategy
        - Customizable content
        - Content is reinforced
        - Interactions vary
    38. Pedagogical Issues: Benefits of the computer
        Example: www.lib.uiowa.edu/commons/skullvr/index.html
    39. Pedagogical Issues: Instructional techniques match audience and content
        - D4 (Dx & Tx Planning): DiagnosticBytes
        - D1 & D2 (Assessment): Assessment of Geriatric Patients
    40. Pedagogical Issues: Assessment strategy
        Example: Diagnosis of Head and Neck Pain
    41. Pedagogical Issues: Interactions vary
        Example: Diagnosis of Head and Neck Pain
    42. Subject Matter: Is the content accurate and appropriate for the audience?
        - Information is complete, accurate, and logically organized
        Example: www.uiowa.edu/~oprm/AtlasHome.html
    43. Language/Grammar: Is language usage appropriate and understandable?
        - Glossary
        Example: Assessment of Geriatric Patients
    44. Surface Features: Does media play correctly, is text readable, and is the overall look pleasing?
        - Media is easily used
        - Screen & application enhance learning
        - Color is coordinated
        - Text is easy to read
        - Bookmarks used
        - Opportunity for errors minimized
        Example: Diagnosis of Head and Neck Pain
    45. Surface Features (Menus): Will students get lost?
        - Clear user controls
        - Menus have clear labels
        - Completed sections indicated
        - Esthetic screen displays
        Example: research.dentistry.uiowa.edu/summaries/index.html
    46. Questions: Are there questions for students to answer to gauge whether they are learning or making progress in the program?
        - Reflect objectives
        - Interspersed
        - Easy to answer
        - Change answers
        - Skip questions
    47. Feedback: Is feedback given to guide the student and make learning more efficient and effective?
        - Indicates correct/incorrect answers
        - Provides informative feedback
        - Uses media when appropriate
    48. Invisible Functions: What is going on behind the scenes to check on student progress?
        - Data continuously stored
        - Data collection turned on/off
        - Data secure
        - Reports generated
        Example: Assessment of Geriatric Patients
    49. Off-line Materials: How is the program supported and connected to other resources?
        - Equipment requirements
        - Troubleshooting information
        - Operating instructions
        - Curriculum integration suggestions
        - Support materials provided
        Example: Assessment of Geriatric Patients
    50. Formative Evaluation: Can I suggest changes to make the program better in the future?
        - 47 students
          - 15 schools
          - 3 countries
        - Computer support
          - installation problems
          - operation problems
        - Observation
        - Student
        Example: DiagnosticBytes
    51. Formative Evaluation: Strengths (DiagnosticBytes)
        - Easy manipulation of images
        - Easy to enter a diagnosis
        - Highly interactive
        - Element of fun
        - Futuristic setting
        - Patient "spoke" to responses
        - Good graphics
        - Element of surprise
        - Great deal of decision-making
    52. Formative Evaluation: Weaknesses (DiagnosticBytes)
        - Certain images too small
        - Instrument interface clumsy
        - Mark, the assistant, became annoying
        - Help needs expansion
        - Consent is difficult to find
        - Patient did not say why a proposed treatment was rejected
        - Expand the evaluation
        - Evaluation needs to display images during discussion
    53. Summative Evaluation: Design
        Is this program effective? What measures are used? Is it fair?
        Example: Assessment of Geriatric Patients
    54. Summative Evaluation: Results (Assessment of Geriatric Patients)
    55. A vision for the future (diagram: content modules)
    56. (diagram, cont.: content modules, user interface, knowledge base, decision support, education, analysis)
    57. Developing a Protocol for an Educational Software Competition
    58. Background
        - Standards Committee for Dental Informatics is developing Guidelines for the Design of Educational Software* (ANSI accredited)
        - 133 criteria in 8 categories:
          - Pedagogical Issues
          - Subject Matter
          - Language and Grammar
          - Surface Features
          - Questions, Answers and Feedback
          - Invisible Functions
          - Off-line Materials
          - Evaluation
        * http://www.temple.edu/dentistry/di/edswstd
    59. Background
        - Educational Computing in Dentistry Competition
        - No published, validated protocol for evaluating educational software has evolved into a rating instrument
        - Applying the Guidelines assists their further development and validation
        - Applying the Guidelines assists adoption of standards
        - The competition promotes educational software
    60. Methods
        - Use the Guidelines to develop a rating instrument covering:
          - Pedagogy
          - Subject Matter
          - Technical Aspects
        - Include the most relevant criteria
        - A judge completes a review in 2 hours
    61. Methods – Rating Scale
        Four-point scale:
        - 3 = agree
        - 2 = somewhat agree
        - 1 = somewhat disagree
        - 0 = disagree
    62. Methods: Pedagogy
        - The computer is appropriate for the instructional objective.
        - The program offers a variety of interactions.
    63. Methods: Subject Matter
        - The goals and objectives of the program are clearly stated.
        - The subject matter presented is accurate.
        - The subject matter presented matches the knowledge level of the audience.
    64. Methods: Technical Aspects
        - The screen displays draw attention to important information.
        - Type styles are easy to read.
    65. Methods – Participants
        - Open to all ADEA members
        - 3,000 individual members
        - Dental institutions (n=55)
        - Dental hygiene programs (n=67)
    66. Methods – Competition
        Two categories:
        - CD-ROM
        - World Wide Web
        Awards:
        - 1st: $1,000
        - 2nd: $500
        - 3rd: $250
        - Honorable Mention
    67. Methods – Entry Process
        - Submission via Web (n=30), including:
          - required hardware and software
          - objectives
          - audience
          - description
          - formative evaluation
          - summative evaluation
    68. Methods – Review
        Recruit qualified judges (n=13):
        - 6 dental faculty active in dental informatics
        - 6 instructional designers
        - 1 dental hygienist/instructional designer
    69. Methods – Review
        Calibrate judges:
        - 2 groups
        - one-hour conference call
        - review of process & instrument
        - gathered feedback & made changes to instrument & process
    70. Methods – Review
        - Enter/submit ratings & open-ended comments on a spreadsheet
        - Random assignment of programs
        - Assignments avoided conflicts of interest
        - 2 judges per program
        - 5 to 7 programs per judge (2 weeks)
    71. Analysis
        - Calculated raw summary scores:
          - Pedagogy – 96 points maximum
          - Subject Matter – 33 points maximum
          - Technical Aspects – 75 points maximum
          - TOTAL – 204 points maximum
        - Adjusted for N/A ratings
        - Averaged scores
        - Evaluation scores weighted by a factor of 5
        (one possible reading of this scoring is sketched below)
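The slide does not spell out the N/A adjustment or the averaging, so the following Python sketch is only one plausible reading: N/A items rescale a section to the points that were actually ratable, and the two judges' composites are averaged.

```python
# Sketch of the composite scoring on this slide. Assumptions: ratings use
# the four-point scale from slide 61 (3 = agree ... 0 = disagree, with
# None standing in for N/A); the N/A adjustment rescales each section to
# the items actually rated; the item lists below are truncated for
# brevity. The 5x weighting of evaluation criteria is omitted because the
# slide does not say which items it applies to.
MAX_PER_ITEM = 3
SECTION_MAXIMA = {"pedagogy": 96, "subject_matter": 33, "technical": 75}

def section_score(ratings, max_points):
    """Sum ratings, rescaling so N/A items do not penalize the section."""
    rated = [r for r in ratings if r is not None]
    if not rated:
        return 0.0
    return sum(rated) / (len(rated) * MAX_PER_ITEM) * max_points

def composite(sections):
    """sections maps section name -> list of 0-3 ratings (None = N/A)."""
    return sum(section_score(sections[name], max_pts)
               for name, max_pts in SECTION_MAXIMA.items())

# Two judges per program (slide 70); average their composite scores.
judge_a = {"pedagogy": [3, 2, None, 3], "subject_matter": [3, 3], "technical": [2, 2, 3]}
judge_b = {"pedagogy": [2, 2, 1, 3], "subject_matter": [3, 2], "technical": [3, 2, 2]}
final = (composite(judge_a) + composite(judge_b)) / 2
print(f"Final composite score: {final:.1f} of 204")
```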
    72. Results – Judges’ Time
        - Mean = 50 minutes
        - Range = 30 minutes to 4 hours
    73. WWW Composite Scores (chart)
    74. CD-ROM Composite Scores (chart)
    75. Formative Evaluation of Review Process
        - Two conference calls with judges
        - Instrument feedback:
          - different program types reviewed with the same instrument
          - 4-point scale constraining
          - program “presentation” may impact rating
          - need to assign weights to criteria, especially evaluation
    76. Formative Evaluation of Review
        - Process:
          - time consuming
          - required a team approach: technologist and content expert
        - Judge calibration:
          - requires additional training
          - currently limited by costs
    77. Discussion
        Strengths of process & instrument:
        - dental and instructional design experts involved
        - instrument based on national guidelines
        - calibration of judges (limited)
        - formative evaluation will improve process & instrument
    78. Your assignment
        There are a variety of expert review methods to choose from:
        - Heuristic evaluation
        - Guidelines review
        - Consistency inspection
        - Cognitive walkthrough
        - Formal usability inspection
    79. Expert Review Method: Heuristic Evaluation
        - A systematic inspection of a user interface design for usability
        - The most popular usability inspection method
        - Goal: find usability problems (and fix them as part of an iterative design process)
        - A small set of evaluators examines the interface
        - Evaluators judge compliance with recognized usability principles (the "heuristics")
        - General heuristics + category-specific heuristics
    80. Expert Review: Ten Usability Heuristics by Jakob Nielsen
        1. Visibility of system status
        2. Match between system and the real world
        3. User control and freedom
        4. Consistency and standards
        5. Error prevention
        6. Recognition rather than recall
        7. Flexibility and efficiency of use
        8. Aesthetic and minimalist design
        9. Help users recognize, diagnose, and recover from errors (bad and good examples)
        10. Help and documentation
        Nielsen, J. (1994). Heuristic evaluation. In Nielsen, J., and Mack, R.L. (Eds.), Usability Inspection Methods, John Wiley & Sons, New York, NY.
        (a sketch for recording findings against these heuristics follows)
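One minimal way to log findings during the assignment is sketched below in Python. The 0-4 severity scale follows Nielsen's published convention (0 = not a problem, 4 = usability catastrophe); the data layout and the sample finding are illustrative assumptions, not a prescribed format.

```python
# Sketch for recording heuristic-evaluation findings. The 0-4 severity
# scale follows Nielsen's convention; the data layout and the sample
# finding are illustrative assumptions.
from dataclasses import dataclass

HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognize, diagnose, and recover from errors",
    "Help and documentation",
]

@dataclass
class Finding:
    heuristic: str    # one of HEURISTICS
    location: str     # where in the interface the problem appears
    description: str
    severity: int     # 0-4 on Nielsen's severity scale

findings = [
    Finding(HEURISTICS[0], "course menu",
            "No indication of which sections are complete", severity=2),
]

# Sort the combined findings so the worst problems lead the report.
for f in sorted(findings, key=lambda f: f.severity, reverse=True):
    print(f"[{f.severity}] {f.heuristic}: {f.description} ({f.location})")
```

Each evaluator inspects the interface independently and only then are the findings merged, which is why a simple per-finding record like this aggregates cleanly across evaluators.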