
Arrogance or Apathy: The Need for Formative Evaluation + Current & Emerging Strategies

Dr. Michael M. Grant presents a rationale for using formative evaluation. He also presents a number of methods currently used, such as usability, and emerging strategies for capturing user data.


  1. 1. Arrogance or Apathy? The need for formative evaluation + current & emerging strategies Michael M. Grant, PhD University of South Carolina michaelmgrant@sc.edu Michael M. Grant 2015
  2. 2. Michael M. Grant The University of South Carolina http://viral-notebook.com @michaelmgrant
  3. 3. Arrogance or Apathy? We just don’t have time to do evaluation. Our HR folks won’t let us do evals. There’s really no point because we’re going to deploy it anyways. It’s just not going to make a difference. We don’t have access to testers. Our managers don’t care. We’re just doing it for compliance.
  4. 4. Name a reason for evaluation. http://pollev.com/mgrant
  5. 5. Level 5: ROI Level 4: Organization Level 3: Transfer Level 2: Learning Level 1: Reaction Kirkpatrick (& Phillips) Levels
  6. 6. Level 2 Evaluations Level 5: ROI Level 4: Organization Level 3: Transfer Level 2: Learning Level 1: Reaction Appeal Effectiveness Efficiency
  7. 7. Level 5: ROI Level 4: Organization Level 3: Transfer Level 2: Learning Level 1: Reaction Kirkpatrick Levels 91.3% 53.9% 22.9% 7.6% 2.1% (ASTD, 2005)   in practice
  8. 8. Kirkpatrick Levels in practice: Reaction 79%, Cognitive 38%, Behavior 15%, Results 9% (R. A. Noe, 2005, Employee Training and Development)
  9. 9. Level 5: ROI Level 4: Organization Level 3: Transfer Level 2: Learning Level 1: Reaction Kirkpatrick Levels (ASTD, 2009)   92% 53.9% 22.9% 7.6% 17.9% in practice
  10. 10. Level 5: ROI Level 4: Organization Level 3: Transfer Level 2: Learning Level 1: Reaction Kirkpatrick Levels (TrainingMag, 2013)   97% 94% 94% 88% 71% in practice Training Magazine Top 125 Companies in 2013
  11. 11. The New World Kirkpatrick 4 Levels Learning Context Performance Context
  12. 12. Level 4 and Level 5 Level 5: ROI Level 4: Organization How do we measure the impact on business? How do we measure the return on investment?
  13. 13. Level 4 and Level 5 How do we measure the impact on business? Level 4 (Organization) measures changes in business impact variables (productivity, incidents, compliance discrepancies, customer service, etc.). How do we measure the return on investment? Level 5 (ROI) compares benefits to costs (benefit-cost ratio): ROI (%) = (Net Monetary Benefits ÷ Program Costs) × 100
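To make the Level 4/Level 5 arithmetic concrete, here is a minimal sketch in Python. The function names and dollar figures are illustrative assumptions, not values from the presentation; only the two formulas come from the slide above.

```python
def benefit_cost_ratio(total_benefits, program_costs):
    """Benefit-cost ratio: total program benefits divided by program costs."""
    return total_benefits / program_costs

def roi_percent(total_benefits, program_costs):
    """ROI (%): net monetary benefits divided by program costs, times 100."""
    net_benefits = total_benefits - program_costs
    return (net_benefits / program_costs) * 100

# Hypothetical example: a program costing $40,000 that yields $100,000 in benefits.
print(benefit_cost_ratio(100_000, 40_000))  # 2.5 (benefits returned per dollar spent)
print(roi_percent(100_000, 40_000))         # 150.0 (% return over and above cost)
```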
  14. 14. Effectiveness Evaluation Activities include field tests, observations, interviews and performance assessments. Purpose? Determine whether the ILE accomplishes its objectives within the immediate or short-term context of its implementation.
  15. 15. Formative Evaluation What’s the purpose?
  16. 16. A focus on improvement during development.
  17. 17. User Review Observations from one-on-ones and small groups
  18. 18. “Vote early and often.” The sooner formative evaluation is conducted during development, the more likely that substantive improvements will be made and costly errors avoided. — Reeves & Hedberg (2003), p. 142
  19. 19. Formative Evaluation Stages •  Design team review •  Expert review •  One-to-one •  Small group •  Field trials
  20. 20. Questions for Evaluation 1.  What are the logistical requirements for implementing the ILE? –  Hardware –  Software –  Adjunct materials –  Help and support 2.  What are the user reactions to the ILE? –  Appeal –  Motivation –  Usability –  Comprehension 3.  What are the trainer/ instructor reactions to the ILE? –  Appeal –  Utility 4.  What are the expert reactions to the ILE? –  Content –  Instructional design –  Human-computer interface –  Aesthetics 5.  What corrections must be made to the ILE? 6.  What enhancements can be made to the ILE?
  21. 21. Data Collection Matrix Methods 1. What are the logistical requirements? 2. What are user reactions ? 3. What are trainer reactions? 4. What are expert reactions? 5. What corrections must be made? 6. What enhancements can be made? Anecdotal records X X X X X User questionnaires X X X X User interviews X X X X User focus groups X X X Usability observations X X X X Online data collection X X Expert reviews X X X
  22. 22. Evaluation informs development • Review → Project Conceptualization • Needs Assessment → Design • Formative Evaluation → Development • Effectiveness Evaluation → Implementation • Impact Evaluation → Institutionalization • Maintenance Evaluation → Project Re-conceptualization from Reeves & Hedberg (2003)
  23. 23. Contemporary Development Models Michael Allen/Allen Interactions’ Successive Approximation Model (SAM)
  24. 24. Contemporary Development Models Concurrent Design Image from http://www.intechopen.com/source/html/19453/media/image2.jpeg
  25. 25. Contemporary Development Models Rapid Prototyping from Tripp, S., & Bichelmeyer, B. (1990)
  26. 26. Contemporary Development Models •  Originated in manufacturing •  ID hijacked from software development •  Focused on development primarily •  Types of prototypes §  Look-and-feel: colors, effects, gross screen layouts §  Media: use of sound effects, narration, 3D illustrations, video, etc. §  Navigation: move through sections, access support (glossary, calculator, etc.) §  Interactivity: content, activities, feedback Rapid Prototyping
  27. 27. Contemporary Development Models 1.  Active user involvement is imperative 2.  The team must be empowered to make decisions 3.  Requirements evolve but the timescale is fixed 4.  Capture requirements at a high level; lightweight & visual 5.  Develop small, incremental releases and iterate 6.  Focus on frequent delivery of products 7.  Complete each feature before moving on to the next 8.  Apply the 80/20 rule 9.  Testing is integrated throughout the project lifecycle – test early and often 10.  A collaborative & cooperative approach between all stakeholders is essential Agile Software Development
  28. 28. What to consider with effectiveness . . . •  An approved evaluation plan – e.g., union, stakeholders, management •  Feasibility •  Reliability •  Validity •  Implementation logs
  29. 29. “Training evaluation provides the data needed to demonstrate that training does provide benefits to the company.” (R. Krishnaveni, 2008, p. 311)
  30. 30. “Vote early and often.” The sooner formative evaluation is conducted during development, the more likely that substantive improvements will be made and costly errors avoided. (Reeves & Hedberg, 2003, p. 142)
  31. 31. Formative Evaluation 3 Methods: expert review during development, user review during development, and usability testing
  32. 32. “Experts are anyone with specialized knowledge that is relevant to the design of your interactive learning environment.” (Reeves & Hedberg, 2003, p. 145)
  33. 33. Expert Review SMEs: scope, sequence, accuracy, scenarios, examples. Instructional experts: instructional strategies, sequence, practice, mnemonics. Graphic designers: aesthetics, metaphors, icons, navigation. Teachers/Trainers: logistics. Interaction designers: user experience, story/narrative, connections with other systems. LMS Administrators: SCORM compliance/packaging, LMS integration, metadata.
  34. 34. Interface Review Guidelines from http://it.coe.uga.edu/~treeves/edit8350/UIRF.html
  35. 35. How usable is this interface?
  36. 36. Here’s what I wrote …
  37. 37. User review in development 1-on-1 Observations •  Prototype Revision 1 •  Try-out impressions; obvious flaws; examples/scenarios •  2 to 3 people •  Instruments –  Observation Notes Form –  Interview Protocol –  Attitude Survey Small Group Trials •  Prototype Revision 2 •  Identify strengths and weaknesses •  3 to 4 people •  Instruments –  Observation Notes Form –  Attitude Survey –  Interview Protocol –  Posttest/Learner Performance     Observations from one-on-ones and small groups
  38. 38. User review in development     In a contemporary model, users are likely involved early and through multiple iterations and multiple prototypes.
  39. 39. What Is Usability?
  40. 40. Two Major Methods to Evaluate Usability Heuristic Evaluation • Quick • Expert Analyses • No user involvement User Testing • Finds more problems • User involvement increases validity • Seeing problems has a huge impact on developers
  41. 41. “The most common user action on a Web site is to flee.” — Edward Tufte
  42. 42. “At least 90% of all commercial Web sites are overly difficult to use…. the average outcome of Web usability studies is that test users fail when they try to perform a test task on the Web. Thus, when you try something new on the Web, the expected outcome is failure.” — Jakob Nielsen
  43. 43. Nielsen Web Usability Rules 1.  Visibility of system status 2.  Match between system and real world 3.  User control and freedom 4.  Consistency and standards 5.  Error prevention 6.  Recognition rather than recall 7.  Flexibility and efficiency of use 8.  Help users recognize, diagnose, and recover from errors 9.  Help and documentation 10.  Aesthetic and minimalist design
  44. 44. •  Ease of learning - How fast can a user who has never seen the user interface before learn it sufficiently well to accomplish basic tasks? •  Efficiency of use - Once an experienced user has learned to use the system, how fast can he or she accomplish tasks? •  Memorability - If a user has used the system before, can he or she remember enough to use it effectively the next time or does the user have to start over again learning everything? •  Error frequency and severity - How often do users make errors while using the system, how serious are these errors, and how do users recover from these errors? •  Subjective satisfaction - How much does the user like using the system?
  45. 45. Heuristic Evaluation Process 1. Several experts individually compare a product to a set of usability heuristics 2. Violations of the heuristics are evaluated for their severity and extent, and solutions are suggested 3. At a group meeting, violation reports are categorized and assigned 4. The final report lists average severity ratings, extents, heuristics violated, and a description of each opportunity for improvement
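As a rough illustration of steps 2 through 4, the sketch below averages the severity ratings that individual evaluators assign to the same categorized violation. The violation data, heuristic labels, and 0-4 severity scale are hypothetical placeholders, not data from the presentation.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical violation reports: (violation id, heuristic violated, severity 0-4),
# one entry per evaluator who reported the problem.
reports = [
    ("V1", "Visibility of system status", 3),
    ("V1", "Visibility of system status", 2),
    ("V2", "Error prevention", 4),
    ("V2", "Error prevention", 3),
    ("V2", "Error prevention", 4),
]

# Group the individual reports into categorized violations.
grouped = defaultdict(list)
for violation_id, heuristic, severity in reports:
    grouped[(violation_id, heuristic)].append(severity)

# Report categorized violations with average severity, most severe first.
for (violation_id, heuristic), severities in sorted(
        grouped.items(), key=lambda item: -mean(item[1])):
    print(f"{violation_id} ({heuristic}): mean severity {mean(severities):.1f}, "
          f"reported by {len(severities)} evaluator(s)")
```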
  46. 46. Heuristic Evaluation Comparisons Advantages •  Quick: Do not need to find or schedule users •  Easy to review problem areas many times •  Inexpensive: No fancy equipment Disadvantages •  Validity: No users involved •  Finds fewer problems (40-60% less??) •  Getting good experts •  Building consensus with experts
  47. 47. Heuristic Evaluation Report
  48. 48. Heuristic Evaluation Report
  49. 49. User Testing
  50. 50. User Testing •  People whose characteristics (or profiles) match those of the Web site’s target audience perform a sequence of typical tasks using the site. •  Examines: –  Ease of learning –  Speed of task performance –  Error rates –  User satisfaction –  User retention over time
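A small sketch of how the measures listed on this slide might be tallied from observation notes. The session records and field names are hypothetical; in practice they would come from whatever instrument the observer uses.

```python
from statistics import mean

# Hypothetical session records captured while observing test users.
sessions = [
    {"task": "enroll in course", "completed": True,  "seconds": 95,  "errors": 1},
    {"task": "enroll in course", "completed": True,  "seconds": 142, "errors": 0},
    {"task": "enroll in course", "completed": False, "seconds": 300, "errors": 4},
]

completion_rate = sum(s["completed"] for s in sessions) / len(sessions)
mean_time = mean(s["seconds"] for s in sessions)
mean_errors = mean(s["errors"] for s in sessions)

print(f"Completion rate: {completion_rate:.0%}")       # 67%
print(f"Mean time on task: {mean_time:.0f} s")         # 179 s
print(f"Mean errors per session: {mean_errors:.1f}")   # 1.7
```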
  51. 51. Image from (nz)dave at http://www.flickr.com/photos/nzdave/491411546/
  52. 52. “For most companies…it's fine to conduct tests in a conference room or an office — as long as you can close the door to keep out distractions. What matters is that you get hold of real users and sit with them while they use the design. A notepad is the only equipment you need.” — Jakob Nielsen http://www.nngroup.com/articles/usability-101-introduction-to-usability/
  53. 53. Elements of User Testing • Define target users • Have users perform representative tasks • Observe users • Report results Often called a user profile or persona. Images from http://www.optimum-web.co.uk/services/user-needs/personas/ & http://uxsuccess.com/2009/12/01/agile-personas-and-context-scenario/
  54. 54. Why Multiple Evaluators? •  Single evaluator achieves poor results – Only finds about 35% of usability problems – 5 evaluators find more than 75%
  55. 55. Why only 5 Users? (Nielsen,  2000)  
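The curve behind Nielsen's (2000) argument follows the model that the share of usability problems found by n users is 1 − (1 − L)^n, with L ≈ 31% as the proportion a single typical user surfaces. A quick check of the arithmetic:

```python
# Nielsen & Landauer's model: share of usability problems found by n users
# is 1 - (1 - L)**n, with L = 0.31 for a typical project (Nielsen, 2000).
L = 0.31

for n in (1, 3, 5, 10, 15):
    found = 1 - (1 - L) ** n
    print(f"{n:2d} users: {found:.0%} of problems found")

# Five users already surface roughly 85% of the problems; additional users
# mostly re-observe what earlier users have already shown.
```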
  56. 56. Reporting User Testing •  Overall goals/objectives •  Methodology •  Target profile •  Testing outline with test script •  Specific task list to perform •  Data analysis & results •  Recommendations
  57. 57. User Experience (UX) from Jesse James Garrett | http://www.jjg.net/ia
  58. 58. User Experience (UX) from Peter Morville | http://semanticstudios.com/user_experience_design/
  59. 59. User Experience (UX) • Project Management • User Research • Usability Evaluation • Information Architecture (IA) • User Interface Design • Interaction Design (IxD) • Visual Design • Content Strategy • Accessibility • Web Analytics → Learner
  60. 60. Current & Emerging Strategies for User Testing
  61. 61. Now defunct!
  62. 62. A/B Testing
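A/B testing splits users between two versions of a screen and compares outcomes. A minimal sketch of that comparison using a standard two-proportion z-test; the visitor and completion counts are hypothetical, and nothing here is specific to any particular testing service.

```python
from math import sqrt

# Hypothetical counts: task completions out of visitors for two page variants.
a_success, a_total = 120, 1000   # variant A
b_success, b_total = 155, 1000   # variant B

p_a = a_success / a_total
p_b = b_success / b_total
p_pool = (a_success + b_success) / (a_total + b_total)

# Standard two-proportion z-test for the difference in completion rates.
se = sqrt(p_pool * (1 - p_pool) * (1 / a_total + 1 / b_total))
z = (p_b - p_a) / se

print(f"A: {p_a:.1%}   B: {p_b:.1%}   z = {z:.2f}")
# |z| > 1.96 corresponds to p < 0.05 two-tailed, so a difference this large
# is unlikely to arise from random assignment alone.
```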
  63. 63. Usabilla, Morae, UserZoom
  64. 64. 10 Second Usability Test 1.  Disable stylesheets 2.  Check for the following: 1.  Semantic markup 2.  Logical organization 3.  Only images related to content appear
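A hedged sketch of how the markup checks above could be scripted rather than eyeballed, assuming the page has been saved locally and the beautifulsoup4 package is available; the file name and tag list are illustrative assumptions.

```python
from bs4 import BeautifulSoup  # assumes the beautifulsoup4 package is installed

# "page.html" is a placeholder for a locally saved copy of the page under review.
with open("page.html", encoding="utf-8") as f:
    soup = BeautifulSoup(f.read(), "html.parser")

# 1. Semantic markup: structural elements exist rather than styled <div>s only.
semantic_tags = ["h1", "h2", "nav", "main", "header", "footer", "ul", "ol"]
print({tag: len(soup.find_all(tag)) for tag in semantic_tags})

# 2. Logical organization: heading levels appear in a sensible order.
print("heading outline:", [h.name for h in soup.find_all(["h1", "h2", "h3", "h4"])])

# 3. Content-related images: every image should carry meaningful alt text.
print("images missing alt text:",
      [img.get("src") for img in soup.find_all("img") if not img.get("alt")])
```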
  65. 65. References & Acknowledgements
American Society for Training & Development. (2009). The value of evaluation: Making training evaluations more effective. Author.
Follett, A. (2009, October 9). 10 qualitative tools to improve your web site. Instant Shift. Retrieved March 18, 2010 from http://www.instantshift.com/2009/10/08/10-qualitative-tools-to-improve-your-website/
Image from http://www.flickr.com/photos/mutsmuts/4695658106/sizes/z/in/photostream/
Nielsen, J. (2000, March 19). Why you only need to test with 5 users. Jakob Nielsen's Alertbox. Retrieved from http://www.useit.com/alertbox/20000319.html
Nielsen, J. (2012, January 4). Usability 101: Introduction to usability. Nielsen Norman Group. Retrieved from http://www.nngroup.com/articles/usability-101-introduction-to-usability/
Reeves, T. C. (2004, December 9). Design research for advancing the integration of digital technologies into teaching and learning: Developing and evaluating educational interventions. Paper presented to the Columbia Center for New Media Teaching and Learning, New York, NY. Available at http://ccnmtl.columbia.edu/seminars/reeves/CCNMTLFormative.ppt
Reeves, T. C., & Hedberg, J. C. (2003). Interactive learning systems evaluation. Englewood Cliffs, NJ: Educational Technology Publications.
Usability.gov
Wu, S. (2015, June 22). 7 best pieces of user testing software. Creative Bloq. Retrieved from http://www.creativebloq.com/ux/best-user-testing-software-61515337
  66. 66. Michael M. Grant 2015
