NUX Manchester – 3rd February 2014
Critical Incident Technique: collecting data from a user’s perspective (aka What users really think)
Jonathan Willson, Principal Lecturer in Information and Communications at MMU. Jonathan works closely with Richard Eskins (known to many) teaching web development and our new UX unit, plus Digital Rights. Jonathan has applied CIT in a number of research projects as an inexpensive way to obtain rich data and deep insights.
1. SUCCESSFUL
SURVEYS
verb: to investigate the opinions or experience of
(a group of people) by asking them questions
100% of attendees surveyed were very
satisfied with the NUX event*
Oxford Dictionaries
http://oxforddictionaries.com/definition/english/survey
* I made that bit up
2. Usability tool?
"Surveys are not always thought of as
usability tools since they are normally
employed more for marketing-related tasks
– asking users about their likes and
dislikes."
Braun et al: Usability: the site speaks for itself.
3. Advantages
• carefully constructed surveys can:
• provide good usability data
• validate user requirements
• expand upon user requirements
• be used to preview new features or design
changes
• generate meaningful answers
• supplement expert review
• elicit facts
• collect demographics about your users
4. When to survey?
• at the beginning of a project
• learn about current users and their needs
• identify stakeholders
• before a redesign
• learn about current users and their needs
• after launching a new or revised site
• assess whether needs are met / not met
• identify areas for improvement
• have features or content rated or ranked
• seek ideas for further improvements
5. Questions concerning …
• If users are able to find the information they seek
• How satisfied users are with your site
• What experiences users have had with your site or
similar sites
• What users like and dislike about your site
• What frustrations or issues users have with your site
• If users would recommend your site to others
• If users have any ideas or suggestions for
improvements
usability.gov
6. Online surveys
• structured questionnaire that the target
audience completes over the internet
• data (responses) stored in a database
• survey tools provide some level of analysis
• broad reach
• low cost
• quick and easy to launch
• you need to have a clear idea about:
• your purpose
• how you will find participants
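As a rough illustration of the point above (responses stored in a database, with the survey tool providing some level of analysis), here is a minimal sketch in Python using SQLite. The question, the 1–5 satisfaction scale, and the responses are all invented for illustration:

```python
import sqlite3

# Hypothetical sketch: store online survey responses in a database and
# run the kind of simple summary a survey tool might provide.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE responses (participant TEXT, satisfaction INTEGER)")

# Invented responses to a closed question: "How satisfied are you? (1-5)"
conn.executemany(
    "INSERT INTO responses VALUES (?, ?)",
    [("p1", 5), ("p2", 4), ("p3", 2), ("p4", 5)],
)

# Basic analysis of the stored responses: count and mean satisfaction
count, avg = conn.execute(
    "SELECT COUNT(*), AVG(satisfaction) FROM responses"
).fetchone()
print(f"{count} responses, mean satisfaction {avg:.1f}")
```

A real tool would of course add percentage-complete tracking, cross-tabulation and so on, but the shape is the same: structured answers in, summary statistics out.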
7. Points to consider
• length of survey
• keep as brief as possible
• indicate how much time will be required
• online surveys typically indicate progress - %
complete
• mix of questions
• open ended and closed
• follow up survey or interview
• opportunity to ask more in-depth questions
9. “Beware number fetishism”
“Number fetishism leads usability studies
astray by focusing on statistical analyses
that are often false, biased, misleading, or
overly narrow. Better to emphasize insights
and qualitative research”
“qualitative delivers the best results for the
least money”
(Nielsen, 2004)
10. Quantitative methods
• quantitative research is sometimes considered more
scientific or credible than insight-based studies
• may reduce a complex situation down to mere numbers
• beware bogus findings
• be aware of limitations
• e.g. test subjects in student projects may be other students who
are not representative of mainstream users
• tests of websites that are scaled-back designs with fewer pages
and more limited content
• good quantitative research may …
• allow comparison and identifying trends over time
• be expensive and difficult
11. Avoid taking the ‘wrong path’
• user interfaces and usability are highly
contextual
• their effectiveness depends on a broad
understanding of human behaviour
• experts tend to get better results than
beginners from qualitative studies
• even beginners will get (some) usable
results
13. … to find out how something is used
“What people say they do and what they
actually do are often very different things.”
“If you base your design decisions on what
people tell you they do, and not what they
actually do, your product will not be
designed to support actual user behaviour.”
However, there is another way …
16. Origins
• developed during WW2
• to identify effective and ineffective
behaviours in a variety of military activities
• subsequently developed as a tool for the
systematic study of human behaviour
• “think of some occasion during combat flying in which
you personally experienced feelings of acute
disorientation … describe what you saw, felt or heard
that brought on the experience”
(Flanagan, 1954)
17. Purpose
• to gather certain important facts concerning
behaviour in a defined situation
• no single rigid set of rules governing data
collection
• rather a flexible set of principles that must
be modified and adapted to meet the
specific situation at hand
• a way of focusing the respondent’s mind on
specific occurrence(s)
18. Key stages
• General aims
• aims or intended outcomes of the activity under investigation
• Plans and specifications
• instructions given to the observers of the behaviour
• Collecting the data
• interviews, group interviews, questionnaires and record forms
• while the activity is ongoing or fairly recent
• Analysing the data
• summarise and describe the data with a focus on thematic content
• Interpreting and reporting
• present the findings in a usable form
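The "analysing the data" stage above (summarise with a focus on thematic content) can be sketched very simply: once each reported incident has been read and coded with themes, counting theme frequency shows what recurs. The incidents and theme codes below are invented for illustration:

```python
from collections import Counter

# Hypothetical coded incidents: each critical incident report has been
# assigned one or more themes by the analyst (all values invented).
coded_incidents = [
    {"id": 1, "themes": ["navigation", "error message"]},
    {"id": 2, "themes": ["navigation"]},
    {"id": 3, "themes": ["search", "navigation"]},
]

# Tally how often each theme appears across all incidents
theme_counts = Counter(
    theme for incident in coded_incidents for theme in incident["themes"]
)

# Summarise, most frequent first, ready for interpreting and reporting
for theme, n in theme_counts.most_common():
    print(f"{theme}: {n} incident(s)")
```

This is only the mechanical tally; the judgement in CIT lies in the coding itself, which stays a human, interpretive step.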
19. Relevance
• various studies have judged the method to
be valid and reliable
• capable of generating a comprehensive
and detailed description of the chosen
domain
• set against more general debates in the
social sciences about the validity and
reliability of similar qualitative methods
21. Reflective process
• identifying effective and ineffective ways of
doing something
• factors that help and hinder
• aspects that are critical in a specific situation
• encourages participants to explore new ideas
for problem-solving and make
recommendations
• can be used with concerned, informed insiders
and interested outsiders
22. Uses
• gathering data for the design or re-design
of equipment
• investigating the information needs or
information-seeking behaviour of users
• task analysis
• identifying how decisions are made
• performance evaluation and appraisal
• determining critical requirements for a job
23. Can you recall …
• an incident when …
• you had no support or backup… ?
• you felt good about something you had done … ?
• you realised you did not know enough … ?
• you were too busy … ?
• aim of study to identify staff development needs
• Harrington, 1992
24. Prompts
• what device were you using?
• who else was involved?
• how did the task start / proceed / conclude?
• did you know what you were doing at each stage?
• what were your feelings during / after the
incident?
• what are your feelings now?
• why was the incident critical for you?
25. Benefits
• flexible
• can be applied to multi-user systems
• focus on important issues, e.g. safety critical
• identify cause and severity
• pick up issues that may not be detected by other methods
• can be adapted to questionnaires or interviews
• cost-effective
Be aware that …
• routine issues may not be reported
• not suited to general task analysis
• relies on memory, so focus on recent events