Transcript

  • 1. Asking users & experts
  • 2. The aims
      • Discuss the role of interviews & questionnaires in evaluation.
      • Teach basic questionnaire design.
      • Describe how to do interviews, heuristic evaluation & walkthroughs.
      • Describe how to collect, analyze & present data.
      • Discuss strengths & limitations of these techniques
  • 3. Interviews
    • Unstructured - are not directed by a script. Rich but not replicable.
    • Structured - are tightly scripted, often like a questionnaire. Replicable but may lack richness.
    • Semi-structured - guided by a script but interesting issues can be explored in more depth. Can provide a good balance between richness and replicability.
  • 4. Basics of interviewing
    • Remember the DECIDE framework
    • Goals and questions guide all interviews
    • Two types of questions: - ‘closed questions’ have a predetermined answer format, e.g., ‘yes’ or ‘no’ - ‘open questions’ do not have a predetermined format
    • Closed questions are quicker and easier to analyze
  • 5. Things to avoid when preparing interview questions
    • Long questions
    • Compound sentences - split into two
    • Jargon & language that the interviewee may not understand
    • Leading questions that make assumptions, e.g., ‘Why do you like …?’
    • Unconscious biases e.g., gender stereotypes
  • 6. Components of an interview
    • Introduction - introduce yourself, explain the goals of the interview, reassure about the ethical issues, ask to record, present an informed consent form.
    • Warm-up - make first questions easy & non-threatening.
    • Main body – present questions in a logical order
    • A cool-off period - include a few easy questions to defuse tension at the end
    • Closure - thank the interviewee, signal the end, e.g., switch the recorder off.
  • 7. The interview process
    • Use the DECIDE framework for guidance
    • Dress in a similar way to participants
    • Check recording equipment in advance
    • Devise a system for coding names of participants to preserve confidentiality.
    • Be pleasant
    • Ask participants to complete an informed consent form
  • 8. Probes and prompts
    • Probes - devices for getting more information, e.g., ‘would you like to add anything?’
    • Prompts - devices to help interviewee, e.g., help with remembering a name
    • Remember that probing and prompting should not create bias.
    • Too much can encourage participants to try to guess the answer.
  • 9. Group interviews
    • Also known as ‘focus groups’
    • Typically 3-10 participants
    • Provide a diverse range of opinions
    • Need to be managed to: - ensure everyone contributes - prevent one person from dominating the discussion - cover the agenda of topics
  • 10. Analyzing interview data
    • Depends on the type of interview
    • Structured interviews can be analyzed like questionnaires
    • Unstructured interviews generate data like that from participant observation
    • It is best to analyze unstructured interviews as soon as possible to identify topics and themes from the data
    • (refer to page 395)
  • 11. Questionnaires
    • Questions can be closed or open
    • Closed questions are easiest to analyze, and the analysis can be done by computer
    • Can be administered to large populations
    • Paper, email & the web are used for dissemination
    • An advantage of electronic questionnaires is that data goes straight into a database & is easy to analyze
    • Sampling can be a problem when the size of the population is unknown, as is common online
  • 12. Questionnaire style
    • Varies according to goal so use the DECIDE framework for guidance
    • Questionnaire format can include: - ‘yes’, ‘no’ checkboxes - checkboxes that offer many options - Likert rating scales - semantic scales - open-ended responses (example items are shown after this list)
    • Likert scales have a range of points
    • 3, 5, 7 & 9 point scales are common
    • Debate about which is best
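As a hypothetical illustration (the wording of the items is invented, not from the slides), a 5-point Likert item and a semantic scale item might look like this:

```
Likert item:
  "The system was easy to use."
  Strongly disagree   1   2   3   4   5   Strongly agree

Semantic scale item:
  Rate the checkout process:
  Confusing   1   2   3   4   5   Clear
```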
  • 13. Developing a questionnaire
    • Provide a clear statement of purpose & guarantee participants anonymity
    • Plan questions - if developing a web-based questionnaire, design off-line first
    • Decide whether phrasing will be all positive, all negative, or mixed
    • Pilot-test questions - are they clear? Is there sufficient space for responses?
    • Decide how data will be analyzed & consult a statistician if necessary
  • 14. Encouraging a good response
    • Make sure purpose of study is clear
    • Promise anonymity
    • Ensure questionnaire is well designed
    • Offer a short version for those who do not have time to complete a long questionnaire
    • If mailed, include a stamped addressed envelope (s.a.e.)
    • Follow-up with emails, phone calls, letters
    • Provide an incentive
    • A 40% response rate is high; 20% is often acceptable
  • 15. Advantages of online questionnaires
    • Responses are usually received quickly
    • No copying and postage costs
    • Data can be collected in database for analysis
    • Time required for data analysis is reduced
    • Errors can be corrected easily
  • 16. Problems with online questionnaires
    • Sampling is problematic if population size is unknown
    • Preventing individuals from responding more than once is difficult (one common mitigation is sketched after this list)
    • Individuals have also been known to change questions in email questionnaires
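A minimal sketch (not from the slides) of one common mitigation for repeat responses: give each invited respondent a single-use token and enforce uniqueness in the response database. The table layout, column names and example answers here are assumptions for illustration only.

```python
import sqlite3

conn = sqlite3.connect("survey.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS responses (
        token TEXT PRIMARY KEY,   -- single-use token sent to each invitee
        q1 INTEGER,               -- e.g. a 5-point Likert rating
        q2 TEXT                   -- e.g. an open-ended answer
    )
""")

def record_response(token, q1, q2):
    """Store a response; reject a second submission with the same token."""
    try:
        with conn:
            conn.execute(
                "INSERT INTO responses (token, q1, q2) VALUES (?, ?, ?)",
                (token, q1, q2),
            )
        return True
    except sqlite3.IntegrityError:
        return False  # token already used: duplicate submission

print(record_response("abc123", 4, "Easy to navigate"))  # True
print(record_response("abc123", 5, "Second attempt"))    # False (duplicate rejected)
```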
  • 17. Questionnaire data analysis & presentation
    • Present results clearly - tables may help
    • Simple statistics can say a lot, e.g., mean, median, mode, standard deviation (see the sketch after this list)
    • Percentages are useful, but report the sample size as well
    • Bar graphs show categorical data well
    • More advanced statistics can be used if needed
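For instance, a minimal sketch of the kind of summary the slide describes, using Python's standard statistics module on invented 5-point Likert responses:

```python
from statistics import mean, median, mode, stdev

# Invented responses to one 5-point Likert item (1 = strongly disagree ... 5 = strongly agree)
responses = [4, 5, 3, 4, 2, 5, 4, 4, 3, 5]

print("n =", len(responses))                       # always report the sample size
print("mean =", mean(responses))
print("median =", median(responses))
print("mode =", mode(responses))                   # most frequent rating
print("std dev =", round(stdev(responses), 2))
print("rating 4 or 5:",
      round(100 * sum(r >= 4 for r in responses) / len(responses)), "% of respondents")
```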
  • 18. Standardized usability questionnaires
    • SUMI (Software Usability Measurement Inventory)
    • MUMMS (Measuring the Usability of Multi-Media Systems)
    • QUIS (Questionnaire for User Interaction Satisfaction)
  • 19. Asking experts
    • Experts use their knowledge of users & technology to review software usability
    • Expert critiques (crits) can be formal or informal reports
    • Heuristic evaluation is a review guided by a set of heuristics
    • Walkthroughs involve stepping through a pre-planned scenario noting potential problems
  • 20. Heuristic evaluation
    • Developed by Jakob Nielsen in the early 1990s
    • Based on heuristics distilled from an empirical analysis of 249 usability problems
    • These heuristics have been revised for current technology, e.g., HOMERUN for evaluating commercial websites
    • Heuristics still needed for mobile devices, wearables, virtual worlds, etc.
    • Design guidelines form a basis for developing heuristics
  • 21. H O M E R U N
    • High-quality content
    • Often updated
    • Minimal download time
    • Ease of use
    • Relevant to users’ needs
    • Unique to the online medium
    • Netcentric corporate culture
    • Nielsen (1999)
  • 22. Nielsen’s heuristics
    • Visibility of system status
    • Match between system and real world
    • User control and freedom
    • Consistency and standards
    • Help users recognize, diagnose, recover from errors
    • Error prevention
    • Recognition rather than recall
    • Flexibility and efficiency of use
    • Aesthetic and minimalist design
    • Help and documentation
  • 23. Discount evaluation
    • Heuristic evaluation is referred to as discount evaluation when 5 evaluators are used.
    • Empirical evidence suggests that on average five evaluators identify 75-80% of usability problems (a commonly cited model for this is sketched below).
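The 75-80% figure is usually explained with the problem-discovery model of Nielsen and Landauer (1993), Found(i) = N(1 - (1 - λ)^i), where N is the total number of problems and λ the proportion a single evaluator finds. A minimal sketch, assuming λ = 0.25 purely for illustration (reported single-evaluator rates vary from study to study):

```python
# Proportion of problems found by i independent evaluators: 1 - (1 - lam)**i
lam = 0.25  # assumed single-evaluator detection rate (illustrative only)

for i in range(1, 11):
    found = 1 - (1 - lam) ** i
    print(f"{i:2d} evaluators -> {found:5.1%} of problems")
# With lam = 0.25, five evaluators find about 76% and ten about 94%,
# which is why adding evaluators beyond five gives diminishing returns.
```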
  • 24. 3 stages for doing heuristic evaluation
    • Briefing session to tell experts what to do
    • Evaluation period of 1-2 hours in which: - each expert works separately - takes one pass to get a feel for the product - takes a second pass to focus on specific features
    • Debriefing session in which experts work together to prioritize problems (one way to merge and rank their findings is sketched below)
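As an illustration of the debriefing step (the problems and ratings below are invented), a minimal sketch that pools the problems reported by each evaluator and ranks them by mean severity; the 0-4 severity scale follows Nielsen's commonly used rating, with 4 as a usability catastrophe:

```python
from collections import defaultdict
from statistics import mean

# Each evaluator independently lists (problem, severity) pairs; severity 0-4.
evaluator_reports = [
    [("No feedback after 'Save'", 3), ("Jargon in error messages", 2)],
    [("No feedback after 'Save'", 4), ("Inconsistent button placement", 2)],
    [("Jargon in error messages", 3), ("No feedback after 'Save'", 3)],
]

ratings = defaultdict(list)
for report in evaluator_reports:
    for problem, severity in report:
        ratings[problem].append(severity)

# Prioritize: highest mean severity first, then by how many evaluators reported it.
prioritized = sorted(
    ratings.items(),
    key=lambda item: (mean(item[1]), len(item[1])),
    reverse=True,
)
for problem, sevs in prioritized:
    print(f"{problem}: mean severity {mean(sevs):.1f}, "
          f"reported by {len(sevs)} of {len(evaluator_reports)} evaluators")
```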
  • 25. Advantages and problems
    • Few ethical & practical issues to consider
    • Can be difficult & expensive to find experts
    • Best experts have knowledge of application domain & users
    • Biggest problems - important problems may get missed - many trivial problems are often identified
  • 26. Cognitive walkthroughs
    • Focus on ease of learning
    • Designer presents an aspect of the design & usage scenarios
    • One or more experts walk through the design prototype with the scenario
    • The experts are told the assumptions about the user population, context of use & task details
    • Experts are guided by 3 questions
  • 27. The 3 questions
    • Will the correct action be sufficiently evident to the user?
    • Will the user notice that the correct action is available?
    • Will the user associate and interpret the response from the action correctly?
    • As the experts work through the scenario they note problems
  • 28. Pluralistic walkthrough
    • Variation on the cognitive walkthrough theme
    • Performed by a carefully managed team
    • The panel of experts begins by working separately
    • Then there is managed discussion that leads to agreed decisions
    • The approach lends itself well to participatory design
  • 29. Key points
    • Structured, unstructured, semi-structured interviews, focus groups & questionnaires
    • Closed questions are easiest to analyze & can be replicated
    • Open questions are richer
    • Check boxes, Likert & semantic scales
    • Expert evaluation: heuristic evaluation & walkthroughs
    • Relatively inexpensive because no users are needed
    • Heuristic evaluation relatively easy to learn
    • May miss key problems & identify false ones
