Lecture 20: Evaluating medical software
Dr. Martin Chapman
Principles of Health Informatics (7MPE1000). https://martinchapman.co.uk/teaching
Lecture structure
Prologue: What is usability?
The Usability Engineering Lifecycle (Part 1) - Design
Lecture 19: Understanding the user, participatory design and guidelines
and heuristics.
The Usability Engineering Lifecycle (Part 2) - Evaluation
Lecture 20: User testing, user assessment methods and iterative design
and follow-up.
Epilogue: Making the lifecycle work
Learning outcomes
1. Understand how the heuristics seen in Lecture 19 can also be
used in interface evaluation.
2. Be able to define the different stages in user testing.
3. Be able to critique the use of user testing as an evaluation
mechanism.
4. Understand the difference between objective and subjective data
collection.
5. Determine mechanisms for supporting iterative design.
6. Be able to critique the usability engineering lifecycle, and what
makes it work.
Short prologue: What is interface evaluation?
Interface evaluation, at a high level, involves determining whether a
designed interface is actually usable.
In other words, a decision is made, by either the designer of the
software or other users, on whether the interface meets expectations.
The data gathered for an interface evaluation may be objective,
consisting of (for example) information relating to how easily users
complete tasks, or may be subjective, consisting of, for example,
whether users think an interface is easy to use.
Remember lots of the principles we
discuss could also be applied to whole
systems, as well as user interfaces.
Short prologue: Heuristic evaluation
You might have noticed that if we flip the heuristics we saw in
Lecture 19 on their head and consider them as metrics rather than
guidelines, they actually match (most of) the description just seen.
Heuristics can be used to determine usability and they can be applied
by the designer themselves, and, as such, can be used to conduct an
interface evaluation.
We will now again consider the Usability Engineering Lifecycle, a
specific UI development model.
This model has two distinct phases, the design phase, which was
covered in Lecture 19, and the evaluation phase, which we will cover
here.
Evaluation phase: user testing, user assessment methods, and iterative
design and follow-up.
Back to The Usability Engineering Lifecycle
User testing
The Usability Engineering Lifecycle
Short prologue: What is user testing?
We’ve seen that interface evaluation involves determining whether a
designed interface is usable.
When we involve actual users in this process, we call this user
testing.
In the context of software, there
are many different types of
testing, including tests that
confirm the correctness of code.
One typically observes an individual interacting
with an interface, gathers data on this in some
way, and then uses this as a part of the
interface evaluation process.
Heuristic evaluation vs user testing
Using guidelines and heuristics for evaluation (as well as design) is a
good and, critically, inexpensive approach.
This is mostly because it doesn’t tend to involve actual users.
However, sometimes more rigorous approaches are needed. This is
where full user testing, which does involve including users in the
evaluation process, comes into play.
To cover user testing, we will consider three (sub-)stages:
preparation (goals, tasks and plans), data gathering (think-aloud
evaluation and performance evaluation) and debriefing.
1. Preparation
User testing
Recall: Formative and summative evaluation
In Lecture 6, we saw that the goal of system evaluation can be either
formative or summative.
Goals
User interface test preparation usually involves formalising several
things, and determining the goal of the overall evaluation being
supported (formative or summative) is one.
Evaluation goals impact the way data
is collected during a test, as we will
see later.
Formative interface evaluation typically takes
place when the UI development lifecycle is
iterative, so the focus is on improvement.
Summative evaluation has an air of finality
about it, and often takes place after the
development process, to, for example,
determine the quality of a piece of software.
Tasks
With a goal in mind, the next element of a test that needs to be
formalised is the tasks a user should perform as a part of the test(s).
The tasks chosen should reflect closely the way the system will be
used in practice. As such, this is an important point at which to
come back to task analysis.
It’s important to try and break tasks down into a size that makes
them easy to complete, but not to a degree that the tasks are trivial.
Tasks
Example: testing a spreadsheet program for recording clinic visits.
1. 'Enter number of encounters' (easy)
2. 'Obtain totals'
3. 'Construct a chart showing trends' (challenging)
Starting with easier tasks is likely to increase a user's confidence.
Plans
All the decisions made when preparing for a test (such as goals and
tasks, as we've seen) are formalised in what we call a test plan.
Several formal frameworks exist for structuring test plans.
Pilot test vs. full tests
Before launching into a full user test, it might be useful to try out
the plan on a smaller group of users first.
We refer to this as pilot testing.
Pilot testing can be used to identify issues with any aspect of the
test procedure, such as the phrasing of instructions for test tasks, or
to clarify the experimental procedure itself, such as what is to be
measured.
Example: CONSULT pilot study
Goals:
(1) Determine suitability of test protocol
(2) Identify usability and technical problems in
system
7-day pilot study (protocol):
Phase 1 (Day 1): induction and familiarisation
Phase 2 (Days 2 – 4): use with chatbot
Phase 3 (Days 5 – 7): use without chatbot
Panos Balatsoukas, Isabel Sassoon,
Martin Chapman, et al. In the wild
usability assessment of a connected health
system for stroke self-management: a pilot.
In 8th IEEE International Conference
on Healthcare Informatics, 2020.
2. Data collection
User testing
We are now in a position to run our tests, and need to collect data
from our users that we can then translate into insights about the
usability of our software.
Think-Aloud
One of the most common and important ways to collect data.
Asks users to verbalise their thoughts while they complete the test
tasks.
Allows an evaluator to see how the user views the system and any
issues they have (e.g. misconceptions or points of confusion).
Think-aloud best supports formative
evaluation.
Very fine-grained insight can be obtained as to which parts of the
interface may cause issues, as every thought is expressed.
Performance measurement
In contrast to the think-aloud approach, performance measurements
don’t involve direct communication with the user, but instead let
users complete each test task, while numeric data is collected, such
as completion time and error rates.
When collecting data in this way, there needs to be a clear
connection between the goal and the collection method to make sure
the right data is collected:
Performance measurements best
support summative evaluation.
Performance measurement
To make this connection, we can move from goal to data in steps:
1. Provide more detail on the goal, e.g. by relating it to the Five
Usability Principles from Lecture 19.
2. Quantify what this usability principle actually means in the
context of this test.
3. Determine how to actually measure the user's performance, which
may be more specific depending on what is being tested.
4. Further still, determine the activities that will generate data.
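As an illustration of the kind of numeric data involved, completion times and error counts might be summarised as below; the figures, task names and acceptance criterion are hypothetical, not taken from a real test:

```python
from statistics import mean

# Hypothetical measurements from one test session: completion time
# (seconds) and error count for each of the three spreadsheet tasks.
measurements = [
    {"task": "Enter number of encounters", "seconds": 42.0, "errors": 0},
    {"task": "Obtain totals", "seconds": 68.5, "errors": 1},
    {"task": "Construct a chart showing trends", "seconds": 190.2, "errors": 3},
]

mean_time = mean(m["seconds"] for m in measurements)
total_errors = sum(m["errors"] for m in measurements)

# A (hypothetical) summative acceptance criterion: mean task time under
# two minutes and, on average, no more than one error per task.
meets_criterion = mean_time < 120 and total_errors / len(measurements) <= 1

print(f"Mean task time: {mean_time:.1f}s, total errors: {total_errors}")
```

Deciding the criterion before the test, as part of the test plan, is what ties the collected numbers back to the evaluation goal.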
User testing critique: User testing ethics
We can close out our discussion on data collection during the testing
process by considering some limitations.
As with any use of ‘human subjects’,
there are ethical considerations to be
made. Not doing so risks harm to the
users involved.
User testing critique: reliability and validity
Test results are only useful if they are reliable and valid.
Reliable test results are repeatable (the same results would be
received if the tests were repeated), which is often hard in this
domain as users differ greatly.
Valid test results actually reflect the usability of the software and
don't, for example, simply reflect an artefact of the testing
environment. Is a new piece of EHR software better received because
its usability has increased, or because it was displayed on a better
monitor?
Other factors also affect the quality of our tests, such as the
variety of test tasks created.
3. Debriefing
User testing
Debrief
This is the opportunity for the evaluator to ask for further comments
or clarifications about anything that happened during the tests.
This stage in the process is also an opportunity
to collect subjective data…
User assessment methods
The Usability Engineering Lifecycle
Subjective data
We noted earlier that the data collected for interface evaluation can
be either objective or subjective.
The data collected during the main user testing process is, arguably,
objective, in that it focuses on quantifiable task completion.
Other approaches that attempt to gauge, for example, users' views
on the usability of a system collect subjective data that may not
reflect actual usability.
Despite its subjectivity, this supplementary data is still incredibly
important.
Questionnaires and Interviews
Questionnaires and interviews, as subjective
data collection techniques, are inherently
linked as both involve asking users a set of
questions and recording their answers.
Both may be literally linked, as an interviewer
goes through a questionnaire live with a
participant, obtaining both the quantitative
data associated with a questionnaire and also
having the opportunity to follow up with
longer-form questions.
Questionnaires that use a specific
scale (Lecture 19) may again be
useful here.
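For instance, responses on the widely used System Usability Scale (SUS), a ten-item questionnaire with 1-5 Likert responses, can be converted into a single 0-100 score. A sketch of the standard scoring rule (the example responses are hypothetical):

```python
def sus_score(responses):
    """Convert ten 1-5 Likert responses into a System Usability Scale
    (SUS) score out of 100. Odd-numbered items are positively worded
    and contribute (response - 1); even-numbered items are negatively
    worded and contribute (5 - response); the sum is scaled by 2.5."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly ten responses")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

# Hypothetical participant who strongly agreed with every positive item
# and strongly disagreed with every negative item:
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```

An interviewer walking through such a questionnaire live can then follow up on any surprising individual responses with longer-form questions.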
Focus Groups
Focus groups involve multiple participants and are often more
informal, with the added benefit of the facilitator being able to see,
and record the impact of, group dynamics.
Note that all of these techniques can
also be used in the design phase.
Example: PPIE at KCL and beyond
For health software, an important group of users to consult about
usability, or indeed research projects in general, may be patients.
https://digitalmimic.ai
User assessment methods
Subjective data collection methods form part of a wider category of
user assessment methods that are inherently less formal than user
testing methods.
We refer to these as user assessment methods (rather than user
testing methods) to reflect this.
There are several other user assessment methods worth mentioning
before we review all the evaluation techniques we have seen.
We could perhaps classify heuristic
evaluation as a user assessment
method.
User feedback
The first notable (additional) user assessment method also collects
subjective data, except rather than an interviewer proactively asking
users for their insight, data collection mechanisms are built directly
into the software.
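As a minimal sketch of such a built-in mechanism (the function, field names and example call are hypothetical), a feedback hook might record a rating and comment alongside context about where the user was in the software:

```python
from datetime import datetime, timezone

feedback_log = []

def submit_feedback(screen, rating, comment=""):
    """Hypothetical in-app feedback hook, called when the user presses
    a 'give feedback' button: records subjective data (a 1-5 rating
    and a free-text comment) alongside where and when it was given."""
    entry = {
        "screen": screen,
        "rating": rating,
        "comment": comment,
        "at": datetime.now(timezone.utc).isoformat(),
    }
    feedback_log.append(entry)
    return entry

submit_feedback("chart-view", 2, "Hard to find the trends button")
```

Because the context is captured automatically, this feedback can later be grouped by screen to locate problem areas without an interviewer being present.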
(Fly-on-the-wall) Observation and Logging Use
Watching how users interact with software can be an alternative
approach to obtaining reasonably objective data without the cost of
running a full user test. It is also less obtrusive.
In its simplest form, watching interactions can involve visiting a user
in the location in which a system is deployed and observing them
(‘fly-on-the-wall’).
More complex approaches actually log user interactions within the
system itself and monitor usability in this way.
Logging Use
For example, we can compare the areas in which users could click (the
interactive areas of an interface) with the areas in which users
actually clicked. When the interactive areas don't match the areas
that are actually clicked, we have a usability problem.
There are certainly some potential privacy issues with this approach.
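This comparison of clicks against interactive areas can be sketched as follows; the coordinates, areas and click log are all hypothetical:

```python
# Hypothetical interactive areas of an interface, as
# (left, top, right, bottom) rectangles in screen coordinates.
interactive_areas = [
    (0, 0, 100, 40),    # e.g. a 'Save' button
    (0, 50, 100, 90),   # e.g. a 'Totals' field
]

# Hypothetical logged clicks, as (x, y) points.
clicks = [(20, 30), (50, 70), (150, 200), (160, 210)]

def inside(click, area):
    """True if the click falls within the rectangle."""
    x, y = click
    left, top, right, bottom = area
    return left <= x <= right and top <= y <= bottom

# Clicks that hit no interactive area; a cluster of these suggests
# users expect something to be clickable there.
missed = [c for c in clicks
          if not any(inside(c, a) for a in interactive_areas)]

print(f"{len(missed)} of {len(clicks)} clicks hit no interactive area")
```

Here the two clicks around (150, 200) land outside every interactive area, which in a real log might indicate users reaching for a control that doesn't exist.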
Bringing it all together…
So far, we’ve seen how heuristic evaluation offers us techniques to
informally evaluate an interface.
We then looked at a more formal approach to interface evaluation,
which involves the collection of objective data via user testing.
Most recently, we considered user assessment techniques, which
mostly consist of subjective data collection.
We can now summarise all of these techniques…
Iterative design and follow-up
The Usability Engineering Lifecycle
Iterative design
The results of summative evaluation techniques, or of data collection
techniques that rely on looking at deployed software, usually aren't
fed directly back into the development process.
They are instead usually part of follow-up studies of installed
systems, the results of which feed into the next versions of the
software.
However, formative techniques, like think-aloud, are very much
conducted with the intent of accommodating feedback into the
design process. Once users have given feedback, a new iteration of the
design is produced.
Iterative design – Repeat evaluation
We need to repeatedly evaluate our UI after each iteration to check
not only that our changes have been effective, but also to check that
one change hasn’t introduced other usability issues or made the
interface worse for users who didn’t experience any problems in the
first place.
Inexpensive, heuristic evaluation
may be more appropriate if
there are lots of iterations.
If switching from images/numbers to full text is
applied globally to an interface, it may be an
improvement in some areas, but not others.
Critique: Iterative design
Even with proper evaluation, issues may still appear over design
iterations due to having an improper record of why changes were made.
For example, if a change was made to fix a significant usability
problem, but this problem is not recorded, future iterations may undo
this change in favour of addressing less significant concerns.
For example: 'move the controls to the top to solve efficiency-of-use
issues' (objective) vs. 'move the controls to the bottom because it
looks better' (subjective).
Solution: Design provenance
We can use different forms of design provenance to record the
rationale for why certain design choices were made and ensure
significant design choices are not undone at later stages.
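As a minimal sketch of what a design provenance record might contain (the structure and example entry are illustrative, rather than a standard such as W3C PROV):

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DesignDecision:
    """One entry in a design provenance log: what was changed, why,
    and the evidence behind it (all fields are illustrative)."""
    change: str
    rationale: str
    evidence: str
    decided_on: date = field(default_factory=date.today)

log = [
    DesignDecision(
        change="Move the controls to the top of the screen",
        rationale="Solve efficiency-of-use issues",
        evidence="Performance measurement: mean task time fell",
    ),
]

# Before undoing a change in a later iteration, consult the log:
relevant = [d for d in log if "controls" in d.change.lower()]
```

A later iteration proposing to move the controls again would find the earlier rationale and evidence, making it harder to accidentally reintroduce the original usability problem.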
The Usability Engineering Lifecycle
We can now combine what we saw in Lecture 19 with what we’ve
seen in this lecture to look at the full Usability Engineering Lifecycle:
The (full) Usability Engineering Lifecycle
Design phase: understanding the user, participatory design, and
guidelines and heuristics.
Evaluation phase: user testing, user assessment methods, and iterative
design and follow-up.
Epilogue: Making the lifecycle work
Meta-methods
The usability engineering lifecycle will only be successful if several
supplementary activities take place.
We call these meta-methods (methods that support another
method):
1. Ensure that supplementary planning takes place (a test plan is a
good example of this).
2. All plans should be independently reviewed.
3. Conduct a pilot study (as discussed).
4. Construct an overall plan of which steps of the lifecycle will be
used (as not every step can always be used), to guide activities.
Prioritising usability activities
Our final meta-method noted that not every step in the lifecycle may
be used.
Indeed, factors like cost or time constraints may mean that only a
subset of activities can be selected. In the presence of constraints like
these, which activities should be selected?
The literature suggests that task analysis and iterative design are the
most important aspects of the lifecycle, followed by user testing,
participatory design and observation.
‘Ad-hoc’ usability engineering
While usability engineering activities should be planned in advance
and/or take place continuously, sometimes there may be a need for
ad-hoc (unplanned) activities.
To prepare for this, several steps can be taken:
invest in prototyping tools (e.g. wireframing
software), be familiar with the latest UI
guidelines, understand the characteristics of your
average user, maintain ongoing relationships with
users and keep up to date with the latest
literature.
It all comes back to interventions…
When we talked about usability design, we saw that an interface
needs to be correctly designed so as not to hinder interventions.
While it is preferable to remove any potential usability issues during
the design phase, the next best solution is to identify issues
afterwards during the evaluation, using the techniques we’ve looked
at, and correct them.
The usability of both software used by clinicians to deliver
interventions and software used by patients to access an intervention
directly needs to be considered.
Summary
Evaluating the usability of a user interface after it has been designed
is important to identify any (additional) issues (with intervention
delivery) and to continuously improve.
We have broadly seen two classes of interface evaluation techniques:
more objective approaches, such as user testing, and more subjective
user assessment methods.
We have identified that a single evaluation (and subsequent interface
alterations) is not sufficient, and multiple iterations are needed to
improve.
The Usability Engineering Lifecycle only works if it is supported by
supplementary activities, if activities are prioritised and if there is
proper preparation.
References and Images
Enrico Coiera. Guide to Health Informatics (3rd ed.). CRC Press, 2015.
Jakob Nielsen. Usability Engineering. Morgan Kaufmann, 1993.
Bella Martin. Universal Methods of Design: 100 Ways to Research Complex Problems, Develop Innovative Ideas, and Design Effective Solutions. Rockport Publishers, 2012.
https://www.invespcro.com/blog/usability-testing-on-a-budget/
https://www.researchgate.net/figure/Screenshot-showing-coding-of-public-health-measures-in-EMIS-web_fig2_340137102
https://open.spotify.com/album/2AmrUwvN3mjWvqjzgh1RCM
https://dribbble.com/shots/6655316-Iterative-Design-Process-UX-UI-Card-Design
https://www.microsoft.com/en-gb/microsoft-teams/group-chat-software
https://www.w3.org/TR/prov-dm/
https://wireframesketcher.com/sample-mockups.html

Phenoflow 2021Phenoflow 2021
Phenoflow 2021
 

Recently uploaded

LAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UP
LAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UPLAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UP
LAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UP
RAHUL
 
Jemison, MacLaughlin, and Majumder "Broadening Pathways for Editors and Authors"
Jemison, MacLaughlin, and Majumder "Broadening Pathways for Editors and Authors"Jemison, MacLaughlin, and Majumder "Broadening Pathways for Editors and Authors"
Jemison, MacLaughlin, and Majumder "Broadening Pathways for Editors and Authors"
National Information Standards Organization (NISO)
 
Lifelines of National Economy chapter for Class 10 STUDY MATERIAL PDF
Lifelines of National Economy chapter for Class 10 STUDY MATERIAL PDFLifelines of National Economy chapter for Class 10 STUDY MATERIAL PDF
Lifelines of National Economy chapter for Class 10 STUDY MATERIAL PDF
Vivekanand Anglo Vedic Academy
 
Chapter wise All Notes of First year Basic Civil Engineering.pptx
Chapter wise All Notes of First year Basic Civil Engineering.pptxChapter wise All Notes of First year Basic Civil Engineering.pptx
Chapter wise All Notes of First year Basic Civil Engineering.pptx
Denish Jangid
 
Présentationvvvvvvvvvvvvvvvvvvvvvvvvvvvv2.pptx
Présentationvvvvvvvvvvvvvvvvvvvvvvvvvvvv2.pptxPrésentationvvvvvvvvvvvvvvvvvvvvvvvvvvvv2.pptx
Présentationvvvvvvvvvvvvvvvvvvvvvvvvvvvv2.pptx
siemaillard
 
NEWSPAPERS - QUESTION 1 - REVISION POWERPOINT.pptx
NEWSPAPERS - QUESTION 1 - REVISION POWERPOINT.pptxNEWSPAPERS - QUESTION 1 - REVISION POWERPOINT.pptx
NEWSPAPERS - QUESTION 1 - REVISION POWERPOINT.pptx
iammrhaywood
 
Level 3 NCEA - NZ: A Nation In the Making 1872 - 1900 SML.ppt
Level 3 NCEA - NZ: A  Nation In the Making 1872 - 1900 SML.pptLevel 3 NCEA - NZ: A  Nation In the Making 1872 - 1900 SML.ppt
Level 3 NCEA - NZ: A Nation In the Making 1872 - 1900 SML.ppt
Henry Hollis
 
BÀI TẬP DẠY THÊM TIẾNG ANH LỚP 7 CẢ NĂM FRIENDS PLUS SÁCH CHÂN TRỜI SÁNG TẠO ...
BÀI TẬP DẠY THÊM TIẾNG ANH LỚP 7 CẢ NĂM FRIENDS PLUS SÁCH CHÂN TRỜI SÁNG TẠO ...BÀI TẬP DẠY THÊM TIẾNG ANH LỚP 7 CẢ NĂM FRIENDS PLUS SÁCH CHÂN TRỜI SÁNG TẠO ...
BÀI TẬP DẠY THÊM TIẾNG ANH LỚP 7 CẢ NĂM FRIENDS PLUS SÁCH CHÂN TRỜI SÁNG TẠO ...
Nguyen Thanh Tu Collection
 
writing about opinions about Australia the movie
writing about opinions about Australia the moviewriting about opinions about Australia the movie
writing about opinions about Australia the movie
Nicholas Montgomery
 
B. Ed Syllabus for babasaheb ambedkar education university.pdf
B. Ed Syllabus for babasaheb ambedkar education university.pdfB. Ed Syllabus for babasaheb ambedkar education university.pdf
B. Ed Syllabus for babasaheb ambedkar education university.pdf
BoudhayanBhattachari
 
Pharmaceutics Pharmaceuticals best of brub
Pharmaceutics Pharmaceuticals best of brubPharmaceutics Pharmaceuticals best of brub
Pharmaceutics Pharmaceuticals best of brub
danielkiash986
 
Film vocab for eal 3 students: Australia the movie
Film vocab for eal 3 students: Australia the movieFilm vocab for eal 3 students: Australia the movie
Film vocab for eal 3 students: Australia the movie
Nicholas Montgomery
 
Gender and Mental Health - Counselling and Family Therapy Applications and In...
Gender and Mental Health - Counselling and Family Therapy Applications and In...Gender and Mental Health - Counselling and Family Therapy Applications and In...
Gender and Mental Health - Counselling and Family Therapy Applications and In...
PsychoTech Services
 
Electric Fetus - Record Store Scavenger Hunt
Electric Fetus - Record Store Scavenger HuntElectric Fetus - Record Store Scavenger Hunt
Electric Fetus - Record Store Scavenger Hunt
RamseyBerglund
 
Wound healing PPT
Wound healing PPTWound healing PPT
Wound healing PPT
Jyoti Chand
 
Walmart Business+ and Spark Good for Nonprofits.pdf
Walmart Business+ and Spark Good for Nonprofits.pdfWalmart Business+ and Spark Good for Nonprofits.pdf
Walmart Business+ and Spark Good for Nonprofits.pdf
TechSoup
 
RHEOLOGY Physical pharmaceutics-II notes for B.pharm 4th sem students
RHEOLOGY Physical pharmaceutics-II notes for B.pharm 4th sem studentsRHEOLOGY Physical pharmaceutics-II notes for B.pharm 4th sem students
RHEOLOGY Physical pharmaceutics-II notes for B.pharm 4th sem students
Himanshu Rai
 
مصحف القراءات العشر أعد أحرف الخلاف سمير بسيوني.pdf
مصحف القراءات العشر   أعد أحرف الخلاف سمير بسيوني.pdfمصحف القراءات العشر   أعد أحرف الخلاف سمير بسيوني.pdf
مصحف القراءات العشر أعد أحرف الخلاف سمير بسيوني.pdf
سمير بسيوني
 
Leveraging Generative AI to Drive Nonprofit Innovation
Leveraging Generative AI to Drive Nonprofit InnovationLeveraging Generative AI to Drive Nonprofit Innovation
Leveraging Generative AI to Drive Nonprofit Innovation
TechSoup
 
Stack Memory Organization of 8086 Microprocessor
Stack Memory Organization of 8086 MicroprocessorStack Memory Organization of 8086 Microprocessor
Stack Memory Organization of 8086 Microprocessor
JomonJoseph58
 

Recently uploaded (20)

LAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UP
LAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UPLAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UP
LAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UP
 
Jemison, MacLaughlin, and Majumder "Broadening Pathways for Editors and Authors"
Jemison, MacLaughlin, and Majumder "Broadening Pathways for Editors and Authors"Jemison, MacLaughlin, and Majumder "Broadening Pathways for Editors and Authors"
Jemison, MacLaughlin, and Majumder "Broadening Pathways for Editors and Authors"
 
Lifelines of National Economy chapter for Class 10 STUDY MATERIAL PDF
Lifelines of National Economy chapter for Class 10 STUDY MATERIAL PDFLifelines of National Economy chapter for Class 10 STUDY MATERIAL PDF
Lifelines of National Economy chapter for Class 10 STUDY MATERIAL PDF
 
Chapter wise All Notes of First year Basic Civil Engineering.pptx
Chapter wise All Notes of First year Basic Civil Engineering.pptxChapter wise All Notes of First year Basic Civil Engineering.pptx
Chapter wise All Notes of First year Basic Civil Engineering.pptx
 
Présentationvvvvvvvvvvvvvvvvvvvvvvvvvvvv2.pptx
Présentationvvvvvvvvvvvvvvvvvvvvvvvvvvvv2.pptxPrésentationvvvvvvvvvvvvvvvvvvvvvvvvvvvv2.pptx
Présentationvvvvvvvvvvvvvvvvvvvvvvvvvvvv2.pptx
 
NEWSPAPERS - QUESTION 1 - REVISION POWERPOINT.pptx
NEWSPAPERS - QUESTION 1 - REVISION POWERPOINT.pptxNEWSPAPERS - QUESTION 1 - REVISION POWERPOINT.pptx
NEWSPAPERS - QUESTION 1 - REVISION POWERPOINT.pptx
 
Level 3 NCEA - NZ: A Nation In the Making 1872 - 1900 SML.ppt
Level 3 NCEA - NZ: A  Nation In the Making 1872 - 1900 SML.pptLevel 3 NCEA - NZ: A  Nation In the Making 1872 - 1900 SML.ppt
Level 3 NCEA - NZ: A Nation In the Making 1872 - 1900 SML.ppt
 
BÀI TẬP DẠY THÊM TIẾNG ANH LỚP 7 CẢ NĂM FRIENDS PLUS SÁCH CHÂN TRỜI SÁNG TẠO ...
BÀI TẬP DẠY THÊM TIẾNG ANH LỚP 7 CẢ NĂM FRIENDS PLUS SÁCH CHÂN TRỜI SÁNG TẠO ...BÀI TẬP DẠY THÊM TIẾNG ANH LỚP 7 CẢ NĂM FRIENDS PLUS SÁCH CHÂN TRỜI SÁNG TẠO ...
BÀI TẬP DẠY THÊM TIẾNG ANH LỚP 7 CẢ NĂM FRIENDS PLUS SÁCH CHÂN TRỜI SÁNG TẠO ...
 
writing about opinions about Australia the movie
writing about opinions about Australia the moviewriting about opinions about Australia the movie
writing about opinions about Australia the movie
 
B. Ed Syllabus for babasaheb ambedkar education university.pdf
B. Ed Syllabus for babasaheb ambedkar education university.pdfB. Ed Syllabus for babasaheb ambedkar education university.pdf
B. Ed Syllabus for babasaheb ambedkar education university.pdf
 
Pharmaceutics Pharmaceuticals best of brub
Pharmaceutics Pharmaceuticals best of brubPharmaceutics Pharmaceuticals best of brub
Pharmaceutics Pharmaceuticals best of brub
 
Film vocab for eal 3 students: Australia the movie
Film vocab for eal 3 students: Australia the movieFilm vocab for eal 3 students: Australia the movie
Film vocab for eal 3 students: Australia the movie
 
Gender and Mental Health - Counselling and Family Therapy Applications and In...
Gender and Mental Health - Counselling and Family Therapy Applications and In...Gender and Mental Health - Counselling and Family Therapy Applications and In...
Gender and Mental Health - Counselling and Family Therapy Applications and In...
 
Electric Fetus - Record Store Scavenger Hunt
Electric Fetus - Record Store Scavenger HuntElectric Fetus - Record Store Scavenger Hunt
Electric Fetus - Record Store Scavenger Hunt
 
Wound healing PPT
Wound healing PPTWound healing PPT
Wound healing PPT
 
Walmart Business+ and Spark Good for Nonprofits.pdf
Walmart Business+ and Spark Good for Nonprofits.pdfWalmart Business+ and Spark Good for Nonprofits.pdf
Walmart Business+ and Spark Good for Nonprofits.pdf
 
RHEOLOGY Physical pharmaceutics-II notes for B.pharm 4th sem students
RHEOLOGY Physical pharmaceutics-II notes for B.pharm 4th sem studentsRHEOLOGY Physical pharmaceutics-II notes for B.pharm 4th sem students
RHEOLOGY Physical pharmaceutics-II notes for B.pharm 4th sem students
 
مصحف القراءات العشر أعد أحرف الخلاف سمير بسيوني.pdf
مصحف القراءات العشر   أعد أحرف الخلاف سمير بسيوني.pdfمصحف القراءات العشر   أعد أحرف الخلاف سمير بسيوني.pdf
مصحف القراءات العشر أعد أحرف الخلاف سمير بسيوني.pdf
 
Leveraging Generative AI to Drive Nonprofit Innovation
Leveraging Generative AI to Drive Nonprofit InnovationLeveraging Generative AI to Drive Nonprofit Innovation
Leveraging Generative AI to Drive Nonprofit Innovation
 
Stack Memory Organization of 8086 Microprocessor
Stack Memory Organization of 8086 MicroprocessorStack Memory Organization of 8086 Microprocessor
Stack Memory Organization of 8086 Microprocessor
 

Principles of Health Informatics: Evaluating medical software

  • 1. Lecture 20: Evaluating medical software Dr. Martin Chapman Principles of Health Informatics (7MPE1000). https://martinchapman.co.uk/teaching Different book
  • 2. Lecture structure Prologue: What is usability? The Usability Engineering Lifecycle (Part 1) - Design Lecture 19: Understanding the user, participatory design and guidelines and heuristics. The Usability Engineering Lifecycle (Part 2) - Evaluation Lecture 20: User testing, user assessment methods and iterative design and follow-up. Epilogue: Making the lifecycle work
  • 3. Learning outcomes 1. Understand how the heuristics seen in Lecture 19 can also be used in interface evaluation. 2. Be able to define the different stages in user testing. 3. Be able to critique the use of user testing as an evaluation mechanism. 4. Understand the difference between objective and subjective data collection. 5. Determine mechanisms for supporting iterative design. 6. Be able to critique the usability engineering lifecycle, and what makes it work.
  • 4. Short prologue: What is interface evaluation? Interface evaluation, at a high level, involves determining whether an interface, once it has been designed, is actually usable. In other words, a decision is made, by either the designer of the software or other users, on whether the interface meets expectations. The data gathered for an interface evaluation may be objective, consisting of (for example) information relating to how easily users complete tasks, or may be subjective, consisting of, for example, whether users think an interface is easy to use. Remember that lots of the principles we discuss could also be applied to whole systems, as well as user interfaces.
  • 5. Short prologue: Heuristic evaluation You might have noticed that if we flip the heuristics we saw in Lecture 19 on their head and consider them as metrics rather than guidelines, they actually match (most of) the description just seen. Heuristics can be used to determine usability and they can be applied by the designer themselves, and, as such, can be used to conduct an interface evaluation.
  • 6. We will now again consider the Usability Engineering Lifecycle, a specific UI development model. This model has two distinct phases, the design phase, which was covered in Lecture 19, and the evaluation phase, which we will cover here. Evaluation phase: User testing Iterative design and follow-up User assessment methods Back to The Usability Engineering Lifecycle
  • 7. User testing The Usability Engineering Lifecycle
  • 8. Short prologue: What is user testing? We’ve seen that interface evaluation involves determining whether a designed interface is usable. When we involve actual users in this process, we call this user testing. In the context of software, there are many different types of testing, including tests that confirm the correctness of code. One typically observes an individual interacting with an interface, gathers data on this in some way, and then uses this as a part of the interface evaluation process.
  • 9. Heuristic evaluation vs user testing Using guidelines and heuristics for evaluation (as well as design) is a good and, critically, inexpensive approach. This is mostly because it doesn’t tend to involve actual users. However, sometimes more rigorous approaches are needed. This is where full user testing, which does involve including users in the evaluation process, comes into play. To cover user testing, we will consider three (sub-)stages: preparation (goals, tasks and plans), data gathering (think-aloud evaluation and performance evaluation) and debriefing.
  • 11. Recall: Formative and summative evaluation In Lecture 6, we saw that the goal of system evaluation can be either formative or summative. Formal High-level
  • 12. Goals User interface test preparation usually involves formalising several things, and determining the goal of the overall evaluation being supported (formative or summative) is one. Evaluation goals impact the way data is collected during a test, as we will see later. Formative interface evaluation typically takes place when the UI development lifecycle is iterative, so the focus is on improvement. Summative evaluation has an air of finality about it, and often takes place after the development process, to, for example, determine the quality of a piece of software.
  • 13. Tasks With a goal in mind, the next element of a test that needs to be formalised is the tasks a user should perform as a part of the test(s). The tasks chosen should reflect closely the way the system will be used in practice. As such, this is an important point at which to come back to task analysis. It’s important to try and break tasks down into a size that makes them easy to complete, but not to a degree that the tasks are trivial.
  • 14. Tasks 1. ‘Enter number of encounters’ 2. ‘Obtain totals’ 3. ‘Construct a chart showing trends’ Easy Challenging Starting with easier tasks is likely to increase a user’s confidence. Testing a spreadsheet program for recording clinic visits.
  • 15. Plans All the decisions made when preparing for a test (such as goals and tasks, as we’ve seen) are formalised in what we call a test plan. Several formal frameworks exist for structuring test plans.
  • 16. Pilot tests vs. full tests Before launching into a full user test, it might be useful to try out the plan on a smaller group of users first. We refer to this as pilot testing. Pilot testing can be used to identify issues with any aspect of the test procedure, such as the phrasing of instructions for test tasks, or to clarify the experimental procedure itself, such as what is to be measured.
  • 17. Example: CONSULT pilot study Goals: (1) Determine suitability of test protocol (2) Identify usability and technical problems in system 7-day pilot study (protocol): Phase 1 (Day 1): induction and familiarisation Phase 2 (Days 2 – 4): use with chatbot Phase 3 (Days 5 – 7): use without chatbot Panos Balatsoukas, Isabel Sassoon, Martin Chapman, et al. In the wild usability assessment of a connected health system for stroke self-management: a pilot. In 8th IEEE International Conference on Healthcare Informatics, 2020.
  • 18. 2. Data collection User testing We are now in a position to run our tests, and need to collect data from our users that we can then translate into insights about the usability of our software.
  • 19. Think-Aloud One of the most common and important ways to collect data. Asks users to verbalise their thoughts while they complete the test tasks. Allows an evaluator to see how the user views the system and any issues they have (e.g. misconceptions or points of confusion). Think-aloud best supports formative evaluation. Very fine-grained insight can be obtained as to which parts of the interface may cause issues, as every thought is expressed.
  • 20. Performance measurement In contrast to the think-aloud approach, performance measurements don’t involve direct communication with the user, but instead let users complete each test task, while numeric data is collected, such as completion time and error rates. When collecting data in this way, there needs to be a clear connection between the goal and the collection method to make sure the right data is collected: Performance measurements best support summative evaluation.
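A minimal Python sketch of how performance data of this kind might be summarised per task; the observation records and field names are hypothetical, but the measures (mean completion time, error counts) are the ones described above:

```python
from statistics import mean

# Hypothetical raw observations: one record per (participant, task) run.
observations = [
    {"task": "Enter number of encounters", "seconds": 42.0, "errors": 0},
    {"task": "Enter number of encounters", "seconds": 55.5, "errors": 1},
    {"task": "Obtain totals", "seconds": 90.2, "errors": 2},
]

def summarise(observations, task):
    """Mean completion time and total error count for one test task."""
    runs = [o for o in observations if o["task"] == task]
    return {
        "mean_seconds": mean(o["seconds"] for o in runs),
        "total_errors": sum(o["errors"] for o in runs),
        "runs": len(runs),
    }

print(summarise(observations, "Enter number of encounters"))
# {'mean_seconds': 48.75, 'total_errors': 1, 'runs': 2}
```

Summaries like this support summative comparisons, e.g. between two versions of an interface tested on the same tasks.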
  • 21. Performance measurement Quantify what this usability principle actually means in the context of this test. Provide more detail, e.g. by relating to the Five Usability Principles from Lecture 19. Determine how to actually measure the user’s performance Further still, determine the activities that will generate data. May be more specific, depending on what is being tested.
  • 22. User testing critique: User testing ethics We can close out our discussion on data collection during the testing process by considering some limitations. As with any use of ‘human subjects’, there are ethical considerations to be made. Not doing so risks harm to the users involved.
  • 23. User testing critique: reliability and validity Other factors affect the quality of our tests, such as the variety of test tasks created. Valid test results actually reflect the usability of the software and don’t, for example, simply reflect an artefact of the testing environment. Is a new piece of EHR software better received because its usability has increased or because it was displayed on a better monitor? Test results are only useful if they are reliable and valid. Reliable test results are repeatable (the same results would be received if the tests were repeated), which is often hard in this domain as users differ greatly.
  • 25. Debrief This is the opportunity for the evaluator to ask for further comments or clarifications about anything that happened during the tests. This stage in the process is also an opportunity to collect subjective data…
  • 26. User assessment methods The Usability Engineering Lifecycle
  • 27. Subjective data We noted earlier that the data collected for interface evaluation can be either objective or subjective. The data collected during the main user testing process is, arguably, objective, in that it focuses on quantifiable task completion. Other approaches that attempt to gauge, for example, users’ views on the usability of a system, collect subjective data, which reflects perceived rather than actual usability. Despite its subjectivity, this supplementary data is still incredibly important.
  • 28. Questionnaires and Interviews Questionnaires and interviews, as subjective data collection techniques, are inherently linked as both involve asking users a set of questions and recording their answers. Both may be literally linked, as an interviewer goes through a questionnaire live with a participant, obtaining both the quantitative data associated with a questionnaire and also having the opportunity to follow up with longer-form questions. Questionnaires that use a specific scale (Lecture 19) may again be useful here.
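One widely used scale of the kind mentioned above is the System Usability Scale (SUS): ten Likert items answered 1–5, where odd-numbered items are positively worded and even-numbered items negatively worded. Its standard scoring can be sketched in a few lines of Python:

```python
def sus_score(responses):
    """Compute a System Usability Scale (SUS) score from ten responses.

    `responses` are the raw 1-5 answers to the ten SUS items, in order.
    Odd items contribute (score - 1), even items (5 - score), so that
    higher always means more usable; the 0-40 sum is scaled to 0-100.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses on a 1-5 scale")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# A participant who strongly agrees with every positive item and
# strongly disagrees with every negative item scores the maximum.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```

A neutral participant (all 3s) scores 50, which gives a useful midpoint when comparing systems.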
  • 29. Focus Groups Focus groups involve multiple participants and are often more informal, with the added benefit of the facilitator being able to see, and record the impact of, group dynamics. Note that all of these techniques can also be used in the design phase.
  • 30. Example: PPIE at KCL and beyond For health software, an important group of users to consult about usability, or indeed research projects in general, may be patients. https://digitalmimic.ai
  • 31. User assessment methods Subjective data collection methods form part of a wider category of methods that are inherently less formal than user testing methods. We refer to these as user assessment methods (rather than user testing methods) to reflect this. There are several other user assessment methods worth mentioning before we review all the evaluation techniques we have seen. We could perhaps classify heuristic evaluation as a user assessment method.
  • 32. User feedback The first notable (additional) user assessment method also collects subjective data, except rather than an interviewer proactively asking users for their insight, data collection mechanisms are built directly into the software.
  • 33. (Fly-on-the-wall) Observation and Logging Use Watching how users interact with software can be an alternative approach to obtaining reasonably objective data without the cost of running a full user test. It is also less obtrusive. In its simplest form, watching interactions can involve visiting a user in the location in which a system is deployed and observing them (‘fly-on-the-wall’). More complex approaches actually log user interactions within the system itself and monitor usability in this way.
  • 34. Logging Use Area in which users could click Area in which users actually clicked When the interactive areas don’t match the areas that are actually clicked, we have a usability problem. A click There are certainly some potential privacy issues with this approach.
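The mismatch described above can be detected automatically from a click log. A minimal sketch, with hypothetical region names and coordinates, flags clicks that land outside every interactive area:

```python
# Hypothetical interactive regions: (name, x0, y0, x1, y1) bounding boxes.
CLICKABLE = [
    ("save_button", 10, 10, 110, 40),
    ("menu", 0, 50, 200, 80),
]

def misdirected_clicks(clicks, regions=CLICKABLE):
    """Return clicks that landed outside every interactive region.

    A high proportion of such clicks suggests users perceive parts of
    the interface as interactive when they are not: a usability problem.
    """
    def inside(x, y, region):
        _, x0, y0, x1, y1 = region
        return x0 <= x <= x1 and y0 <= y <= y1

    return [c for c in clicks
            if not any(inside(c[0], c[1], r) for r in regions)]

clicks = [(50, 25), (300, 300), (150, 60)]
print(misdirected_clicks(clicks))  # [(300, 300)]
```

As the slide notes, logging real clicks raises privacy issues, so in practice coordinates should be collected with consent and anonymised.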
  • 35. Bringing it all together… So far, we’ve seen how heuristic evaluation offers us techniques to informally evaluate an interface. We then looked at a more formal approach to interface evaluation, which involves the collection of objective data via user testing. Most recently, we considered user assessment techniques, which mostly consist of subjective data collection. We can now summarise all of these techniques…
  • 36.
  • 37. Iterative design and follow-up The Usability Engineering Lifecycle
  • 38. Iterative design The results of summative evaluation techniques, or data collection techniques that rely on looking at deployed software, usually aren’t fed directly back into the development process. Instead, they are usually part of follow-up studies of installed systems, the results of which feed into the next versions of the software. However, formative techniques, like think-aloud, are very much conducted with the intent of accommodating feedback into the design process. Once users have given feedback, a new iteration of the design is produced.
  • 39. Iterative design – Repeat evaluation We need to repeatedly evaluate our UI after each iteration to check not only that our changes have been effective, but also to check that one change hasn’t introduced other usability issues or made the interface worse for users who didn’t experience any problems in the first place. Inexpensive, heuristic evaluation may be more appropriate if there are lots of iterations. If switching from images/numbers to full text is applied globally to an interface, it may be an improvement in some areas, but not others.
  • 40. Critique: Iterative design Even with proper evaluation, issues may still appear over design iterations due to having an improper record of why changes were made. For example, if a change was made to fix a significant usability problem, but this problem is not recorded, future iterations may undo this change in favour of fixing less significant usability problems. Move the controls to the top to solve efficiency of use issues (objective) Move the controls to the bottom because it looks better (subjective)
  • 41. Solution: Design provenance We can use different forms of design provenance to record the rationale for why certain design choices were made and ensure significant design choices are not undone at later stages.
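A minimal sketch of what a design provenance record might look like (the field names and helper function are hypothetical; a real system might adopt a standard model such as W3C PROV, listed in the references):

```python
from datetime import date

# Each entry records what changed, why, and the evidence behind it, so
# later iterations can check the rationale before undoing a change.
decision_log = []

def record_decision(change, rationale, evidence, kind):
    """Append one design decision to the provenance log.

    `kind` distinguishes objectively motivated changes ("objective",
    e.g. backed by user-test data) from subjective ones ("subjective").
    """
    decision_log.append({
        "date": date.today().isoformat(),
        "change": change,
        "rationale": rationale,
        "evidence": evidence,
        "kind": kind,
    })

record_decision(
    change="Move controls to top of screen",
    rationale="Efficiency-of-use issues found in user testing",
    evidence="Mean task completion time fell in the follow-up test",
    kind="objective",
)

# Before undoing a change, check whether it was objectively motivated.
objective = [d for d in decision_log if d["kind"] == "objective"]
print(len(objective))  # 1
```

Even a log this simple makes the "move the controls back because it looks better" scenario above visible as a conflict with a recorded, objectively motivated decision.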
  • 43. We can now combine what we saw in Lecture 19 with what we’ve seen in this lecture to look at the full Usability Engineering Lifecycle: Design phase: Evaluation phase: The (full) Usability Engineering Lifecycle User testing Iterative design and follow-up User assessment methods Participatory design Understanding the user Guidelines and heuristics
  • 44. Epilogue: Making the lifecycle work
  • 45. Meta-methods The usability engineering lifecycle will only be successful if several supplementary activities take place. We call these meta-methods (methods that support another method): 1. Ensure that supplementary planning takes place (a test plan is a good example of this). 2. All plans should be independently reviewed. 3. Conduct a pilot study (as discussed). 4. Construct an overall plan of which steps of the lifecycle will be used (as not every step can always be used), to guide activities.
  • 46. Prioritising usability activities Our final meta-method noted that not every step in the lifecycle may be used. Indeed, factors like cost or time constraints may mean that only a subset of activities can be selected. In the presence of constraints like these, which activities should be selected? The literature suggests that task analysis and iterative design are the most important aspects of the lifecycle, followed by user testing, participatory design and observation.
  • 47. ‘Ad-hoc’ usability engineering While usability engineering activities should be planned in advance and/or take place continuously, sometimes there may be a need for ad-hoc (unplanned) activities. To prepare for this, several steps can be taken: invest in prototyping tools (e.g. wireframing software), be familiar with the latest UI guidelines, understand the characteristics of your average user, maintain ongoing relationships with users and keep up to date with the latest literature.
  • 48. It all comes back to interventions… When we talked about usability design, we saw that an interface needs to be correctly designed so as not to hinder interventions. While it is preferable to remove any potential usability issues during the design phase, the next best solution is to identify issues afterwards during the evaluation, using the techniques we’ve looked at, and correct them. The usability of both software used by clinicians to deliver interventions (left), and software used by patients to access an intervention directly (right), needs to be considered.
  • 49. Summary Evaluating the usability of a user interface after it has been designed is important to identify any (additional) issues (with intervention delivery) and to continuously improve. We have broadly seen two classes of interface evaluation techniques: more objective approaches, such as user testing, and more subjective user assessment methods. We have identified that a single evaluation (and subsequent interface alterations) is not sufficient, and multiple iterations are needed to improve. The Usability Engineering Lifecycle only works if it is supported by supplementary activities, if activities are prioritised and if there is proper preparation.
  • 50. References and Images Enrico Coiera. Guide to Health Informatics (3rd ed.). CRC Press, 2015. Jakob Nielsen. Usability Engineering. Morgan Kaufmann, 1993. Bella Martin. Universal methods of design 100 ways to research complex problems, develop innovative ideas, and design effective solutions. Rockport Publishers, 2012. https://www.invespcro.com/blog/usability-testing-on-a-budget/ https://www.researchgate.net/figure/Screenshot-showing-coding-of-public-health-measures-in-EMIS- web_fig2_340137102 https://open.spotify.com/album/2AmrUwvN3mjWvqjzgh1RCM https://dribbble.com/shots/6655316-Iterative-Design-Process-UX-UI-Card-Design https://www.microsoft.com/en-gb/microsoft-teams/group-chat-software https://www.w3.org/TR/prov-dm/ https://wireframesketcher.com/sample-mockups.html