National e-Learning Laboratory



  1. Information evening: National e-Learning Laboratory
  2. Introduction
     • Agenda
       - Why we want to study usability at NELL
       - Introduction to usability research
       - Demonstration of systems
       - Questions and answers
  3. What is usability?
     • Usability is the "effectiveness, efficiency and satisfaction with which a specified set of users can achieve a specified set of tasks in a particular environment." (ISO)
  4. Why study usability?
     • From a user's perspective
       - The interface is the system
     • From a business perspective
       - Critical processes are located at the human-computer interface
     • From a design perspective
       - Early insights save €€€ over late fixes
  5. Some Usability Criteria: Learning
     • How quickly can a user learn to use a new system in order to perform tasks?
     • At the level of organisation, structure, interface, and navigation.
  6. Some Usability Criteria: Efficiency/Efficacy
     • What effort is necessary for users to perform tasks?
  7. Some Usability Criteria: Reliability/Robustness
     • How does a system react to errors provoked by the user, and what consequences do user errors have?
  8. Some Usability Criteria: Recall
     • How quickly can a user remember the way to use a rarely used system?
  9. Some Usability Criteria: Satisfaction/Completion
     • How satisfactorily can the system be used?
  10. Usability Testing Timeline
     • Planning: framing the question
     • Planning: target group recruitment
     • Session recording: n users by x sessions
     • Analysis
     • Report
  11. Framing the question
     • Spend time on getting the right question
     • Pick your target group
     • Describe the process and behavioural indicators
     • Set a time frame
     • A precise question will get a precise answer
     • Decide on how you want to report results, and for whom
  12. What are your users doing?
     • How are they using your product?
     • What is their experience?
     • How could it be improved?
  13. Server-side Logs
     • Web-server logging: navigation paths, performance, task completion.
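Server-side logs are usually the cheapest data source to start with. As a rough illustration of how navigation paths can be reconstructed from them, the sketch below parses Apache-style Common Log Format lines and groups requested paths by client host; the sample lines are invented, and the deck does not describe NELL's actual logging setup.

```python
import re
from collections import defaultdict

# Apache Common Log Format: host, identity, user, [timestamp],
# "METHOD path protocol", status, size.
LOG_RE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+'
)

def navigation_paths(lines):
    """Group requested paths by client host, in log order."""
    paths = defaultdict(list)
    for line in lines:
        m = LOG_RE.match(line)
        if m:
            paths[m.group("host")].append(m.group("path"))
    return dict(paths)

# Invented sample log lines for illustration.
sample = [
    '10.0.0.1 - - [10/Oct/2008:13:55:36 +0000] "GET /course HTTP/1.1" 200 512',
    '10.0.0.1 - - [10/Oct/2008:13:56:01 +0000] "GET /course/quiz HTTP/1.1" 200 734',
    '10.0.0.2 - - [10/Oct/2008:13:56:10 +0000] "GET /login HTTP/1.1" 200 128',
]
```

Per-host path sequences like these are the raw material for the navigation and task-completion measures the slide mentions.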
  14. Each mouse click is recorded: time stamp, location, and application. Each key press is recorded, too.
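A minimal sketch of what such client-side event recording might look like; the field names are invented to mirror the slide's "time stamp, location, application", and the lab uses dedicated recording hardware and software rather than a script like this.

```python
import csv
import time

class InteractionLogger:
    """Records mouse clicks and key presses with a timestamp."""

    def __init__(self):
        self.events = []

    def log_click(self, x, y, application):
        self.events.append({"t": time.time(), "type": "click",
                            "x": x, "y": y, "app": application})

    def log_keypress(self, key, application):
        self.events.append({"t": time.time(), "type": "key",
                            "key": key, "app": application})

    def save(self, path):
        # Missing fields (e.g. "key" for a click) are left empty by DictWriter.
        with open(path, "w", newline="") as f:
            writer = csv.DictWriter(
                f, fieldnames=["t", "type", "x", "y", "key", "app"])
            writer.writeheader()
            for e in self.events:
                writer.writerow(e)

logger = InteractionLogger()
logger.log_click(120, 340, "browser")
logger.log_keypress("a", "browser")
```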
  15. Example of Screen Recording
     • Screen behaviour: dynamic websites and applications, mouse clicks, keyboard use.
  16. [Diagram: data sources so far: Server-side Logs; Screen Behaviour (dynamic websites and applications, mouse clicks, keyboard use)]
  17. Example of User Video from Different Perspectives
  18. The remote eyetracker is hidden under the screen. It uses infrared light to track the eyes.
  19. The eyetracker records the gaze position and the duration of each fixation.
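One standard way to turn raw gaze samples into fixations with positions and durations is dispersion-based detection (I-DT). The sketch below is illustrative: the thresholds are made up, and analysis packages such as BeGaze, mentioned later in the deck, implement their own tuned event detection.

```python
def detect_fixations(samples, max_dispersion=25.0, min_duration=0.1):
    """samples: list of (t, x, y) gaze points, sorted by time.
    Returns fixations as (start_time, duration, centre_x, centre_y)."""
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        j = i
        # Extend the window while adding the next sample keeps the
        # summed x/y dispersion below the threshold.
        while j + 1 < n:
            xs = [p[1] for p in samples[i:j + 2]]
            ys = [p[2] for p in samples[i:j + 2]]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            j += 1
        duration = samples[j][0] - samples[i][0]
        if duration >= min_duration:
            window = samples[i:j + 1]
            cx = sum(p[1] for p in window) / len(window)
            cy = sum(p[2] for p in window) / len(window)
            fixations.append((samples[i][0], duration, cx, cy))
            i = j + 1
        else:
            i += 1
    return fixations

# Synthetic gaze trace: a fixation near (100, 100), then a saccade
# to a second fixation near (400, 300).
samples = [(0.00, 100, 100), (0.05, 102, 101), (0.10, 101, 99),
           (0.15, 103, 100), (0.20, 100, 102),
           (0.25, 400, 300), (0.30, 401, 299), (0.35, 399, 301),
           (0.40, 400, 300)]
```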
  20. User Behaviour and Reaction
     • User behaviour: gesture and posture, facial expression, off-screen activities, gaze position.
     • [Diagram: data sources so far: Server-side Logs, Screen Behaviour, User Behaviour and Reaction]
  21. User feedback: quantitative feedback, e.g., a usability questionnaire.
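The deck does not name the questionnaire used at NELL, but a common standardised instrument for this kind of quantitative feedback is the System Usability Scale (SUS): ten 5-point items with alternating positive and negative wording. Its scoring rule can be sketched as:

```python
def sus_score(responses):
    """responses: ten answers, each 1 (strongly disagree) .. 5 (strongly agree).
    Returns the SUS score on a 0..100 scale."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses in the range 1..5")
    total = 0
    for i, r in enumerate(responses):
        if i % 2 == 0:       # items 1, 3, 5, ...: positively worded
            total += r - 1
        else:                # items 2, 4, 6, ...: negatively worded
            total += 5 - r
    return total * 2.5
```

For example, an ideal respondent (5 on every positive item, 1 on every negative one) scores 100, and an entirely neutral one (all 3s) scores 50.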
  22. User feedback: qualitative feedback, e.g., open-ended questions or an interview.
  23. User Feedback
     • User feedback: qualitative feedback, e.g., narrative over a screen recording.
     • [Diagram: data sources so far: Server-side Logs, Screen Behaviour, User Behaviour and Reaction, User Feedback]
  24. [Diagram: the four data sources combined: Server-side Logs, Screen Behaviour, User Behaviour and Reaction, User Feedback]
  25. What is NELL?
     • Equipment to observe and analyse learner behaviour
       - 4 workstations
       - 4 observation desks
     • Analysis software
     • Participant panel
  26. Microphone
  27. Dome Camera
  28. Observation Equipment
     • Desk camera
     • Dome camera
     • Screen recorder
     • Keyboard log
     • Desk and ceiling microphones
     • Remote eye-tracker
  29. Test and Observation Rooms
     • User workstations 1 & 2
     • User workstations 3 & 4
     • Control desk
  30. Observation Desk
  31. Screen Recorder Machines
  32. Switchboard
  33. Observation
     • Dome camera control
     • Switchboard
     • Video and screen-recorder machines
     • Data aggregation
     • Analysis software
  34. Analysis Software
     • Qualitative and quantitative analysis of sequential data (Observer XT)
     • Emotional expression recognition (FaceReader)
     • Eye-tracking analysis (BeGaze)
  35. Analysis
     • Qualitative & quantitative research
  36. Analysis
     • Qualitative & quantitative research
     • Deep analysis of a small sample
     • Formative evaluation during the development lifecycle
     • Analysis supported by software
  37. Eyetracking
  38. We developed a Peer Finder for on-line learning. Design A is straightforward: a list of peers.
  39. In design B, users can indicate their preferences.
  40. In design C, users can search for suitable peers.
  41. Which works best: A, B, or C? Let's explore how learners are using the different designs.
  42. The remote eyetracker is hidden under the screen. It uses infrared light to track the eyes.
  43. The eyetracker records the gaze position and the duration of each fixation.
  44. Areas of Interest
     • We can define Areas of Interest to track what learners look at, e.g., which information about a user is considered. For example, in this area learners can see whether a peer is available or not.
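Attributing fixations to Areas of Interest amounts to summing fixation durations inside each named screen region. A sketch with invented AOI names and coordinates; tools like BeGaze do this out of the box:

```python
def aoi_dwell_times(fixations, aois):
    """fixations: list of (start, duration, x, y).
    aois: dict name -> (left, top, right, bottom) screen rectangle.
    Returns dict name -> total fixation time inside that AOI."""
    dwell = {name: 0.0 for name in aois}
    for _, duration, x, y in fixations:
        for name, (left, top, right, bottom) in aois.items():
            if left <= x <= right and top <= y <= bottom:
                dwell[name] += duration
    return dwell

# Hypothetical AOIs for one row of the Peer Finder list.
aois = {
    "name":         (0, 0, 200, 40),      # peer-name column
    "availability": (400, 0, 500, 40),    # availability column
}
# Three fixations: two on the name, one on availability.
fixations = [(0.0, 0.30, 50, 20), (0.4, 0.25, 120, 25), (0.8, 0.10, 450, 20)]
```

Comparing per-AOI dwell times across designs is exactly the kind of evidence behind the finding on the next slides, that learners looked mostly at the names.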
  45. In fact, they spent most of the time reading the names rather than assessing their peers' knowledge or availability.
  46. Usability Questionnaire Results
     • While everybody said all three designs are easy to use ... the eyetracker data revealed that ...
  47. ... many users ignored the instructions and the peers' knowledge completely.
  48. Further Information
     • Abi Reynolds
     • Stephan Weibelzahl
     • Leo Casey
     • Centre for Research and Innovation in Learning and Teaching
     • www.ncirl.ie
     • [email_address]
     • +353 1 4498600
