Quality Assurance for Non-Accredited Training:
Recommendations and Guidance for Commissioners,
Training Providers and Clinicians
Pam Donovan
NE and SE London Communication Skills Training for End of Life Care Pilot Project
NE London Cancer Network
1st Floor Outpatients Department
Royal London Hospital
Whitechapel
London E1 1BB
October 2010
Intended Audience
The recommendations and guidance contained in this document are intended to help
commissioners of training, training providers and clinicians to commission, deliver
and evaluate training which meets the needs of their organisation. The contents of
the document may be used by anyone who is interested in improving healthcare
provision through the education of the workforce.
Contents

Intended Audience
Summary and Recommendations
1. Introduction
2. The education and training commissioning cycle
3. RARPA: an established quality assurance framework for non-accredited training
4. A framework for evaluating training
5. The QA framework in action: evaluation of outcomes for a sample of communication skills training courses for end of life care
6. Specification of the aims of training
7. Practical considerations for conducting an outcome evaluation
   7.1 Maximising the response rate
   7.2 Resources required
   7.3 Quality of feedback
8. Using the results of an outcome evaluation
9. Examples of draft outcome measures for training
10. Further information
Acknowledgements
Summary and Recommendations
Summary
This report describes a new approach to quality assurance for non-accredited
training commissioned within the NHS. This approach has been developed as part of
the NE and SE London Communication Skills Training for End of Life Care Pilot
Project. The quality assurance process described in this report can be applied to
training on any topic commissioned for any staff group. The approach is based on
RARPA, the national quality-assurance framework specified for non-accredited
training commissioned by the Learning and Skills Council.
The quality assurance process uses outcome measures based on participants’ own assessment, some time after the training, of the extent to which they have applied their learning in their jobs. Although these outcome measurements are not precise, they are capable of highlighting important quality issues, such as a mismatch between the training and participants’ roles, and thus support both commissioners and providers in achieving value for money from training.
Recommendations
This section summarises the recommendations arising from this project for anyone
involved in the commissioning of training courses or other training resources (e.g.
posts which include training responsibilities).
1. Specify the aims of the training and the intended outcomes of training carefully
when the training is commissioned.
2. When specifying the aims and outcomes of training, obtain informed input from
service commissioners and service providers to ensure that the training
addresses service needs and current and future service priorities.
3. Specify the aims of training in terms of learning outcomes, i.e. exactly what
participants are expected to learn. Ensure that service providers and service
commissioners agree the learning outcomes.
4. Choose outcome measures which measure the intended learning outcomes as
directly as possible, even if the measurement is not very precise.
5. Use a very small number of outcome measures. Choose measures relating to the
topics of highest priority to the service or organisation commissioning the training.
6. Choose outcome measures which give an intermediate score so that issues can
be highlighted and improvements demonstrated.
7. Do not expect to collect evaluation data from every participant who attends the
training, but set a minimum acceptable number of responses in advance.
8. Tailor the data-collection method used for measuring the outcomes to the
requirements and preferences of the particular group(s) of staff who attend the
training.
9. Ensure that the data-collection method preserves the anonymity of the
participants in order to maximise the quality of feedback.
10. Design the outcome evaluation as an integral part of the training in order to
maximise response rates and minimise resource requirements, e.g. by making
provision of an attendance certificate conditional on completing the evaluation
survey. For extended training programmes this could involve including planned
evaluation sessions as part of the training programme.
11. Identify the resources required to implement an outcome evaluation at the time
when the outcome evaluation is specified and include the necessary resource in
contracts.
12. Keep evaluation questions short, simple and unambiguous to avoid
misinterpretation.
13. Do not ask for more than 5 answers in an evaluation survey, including answers to
parts of questions; i.e. respondents should not be expected to tick more than 5
boxes in total. The fewer questions asked, the better.
14. Include an ‘open’ question requesting general comments.
15. When piloting a quality measure, consider external processes and factors which
may influence how participants apply their learning from the training and how
support for these processes may be needed to achieve the target.
16. Inform all stakeholders in advance of how the results of the outcome evaluation
will be used.
17. When commissioning training, ensure that the managers of the staff who attend
bear joint responsibility for the achievement of outcome quality measures, e.g. for
ensuring their staff attend, for supporting the application of learning in practice
and for collecting and reporting evaluation data where required.
1. Introduction
This report describes a new approach to quality assurance for non-accredited
training commissioned within the NHS. This approach has been developed as part of
the NE and SE London Communication Skills Training for End of Life Care Pilot
Project, which was one of 12 pilot projects supported during 2009-2010 by the
National End of Life Care Programme and Connected, the national communication
skills programme for cancer services. The remit of the project was to pilot
improvements in communication skills training for the end of life care workforce.
However, the quality assurance process described in this report is completely
general and could be applied to education and training on any topic commissioned
for any staff group. The approach is based on RARPA, the national quality-assurance framework specified for non-accredited training commissioned by the Learning and Skills Council.¹
During the first stage of the project a series of informal interviews was held with
commissioners and providers of training for end of life care in NHS Trusts across NE
and SE London. These interviews confirmed that commissioning of training is
currently highly fragmented. Formally accredited education is commissioned by Trust
Education Leads through block contracts with higher education institutions while a
range of non-accredited training courses is commissioned by service providers from
their service budgets and by clinical service commissioners from their commissioning
budgets. In addition, in-house training is delivered to NHS staff by clinical and non-
clinical specialists funded through block commissions. There is generally little
coordination between the various training commissioners and training is seldom
evaluated in terms of its outcomes, so the impact of training on service quality and
the patient experience can rarely be demonstrated.
Two pressing needs for improvement in education and training commissioning for
end of life care were identified through the interviews:
• The need for closer links between training commissioning and organisational
priorities, risks and goals.
• The need for a process for measuring the training outcomes in terms of their
application in practice.
In the second stage of the project post-course outcome evaluations were conducted
for a sample of non-accredited communication skills training courses delivered to
members of the end of life workforce in NE or SE London. The aims of this part of the
project were:
• to develop outcome quality measures for non-accredited communication skills
training for use in future commissioning
• to explore the practicalities of conducting outcome evaluations and assess the
resources required.
This report presents a summary of the relevant results from the project and provides
recommendations for commissioners wishing to implement an outcomes-based
quality-assurance process for training they commission.
¹ LSC (2005) Recognising and Recording Progress and Achievement in Non-accredited Learning, Coventry: Learning and Skills Council.
2. The education and training commissioning cycle
Figure 1 is a schematic representation of the education and training commissioning
cycle for end of life care. The cycle begins with strategic planning, followed by
procurement, performance managing, monitoring and evaluation. The evaluation
results should then be used to refresh the strategy before the cycle begins again.
Training in the NHS underpins service delivery so the needs and priorities of clinical
service delivery should drive the education commissioning cycle. Figure 1 indicates
points in the education commissioning cycle where links to service commissioning
and delivery should occur. The strategic review which begins and ends each cycle
should also involve both education and service commissioners to ensure common
priorities are agreed.
The two pressing needs identified during this project relate to the monitoring and
evaluation section of Figure 1 and to the frequent omission of a strategic review.
However, a strategic review cannot achieve its purpose without measurements of the
outcomes of the training which has been delivered. Thus development of outcome
measurements was identified as the first priority.
Figure 1: The NHS Education and Training Commissioning cycle and its relationship to service commissioning²

² This diagram was developed by and is reproduced with the permission of Juliette Bigley, Programme Manager, Marie Curie Delivering Choice Programme.
3. RARPA: an established quality assurance framework for
non-accredited training
The RARPA framework (Recognising and Recording Progress and Achievement in
non-accredited learning) was developed to assure quality in education and training
not leading to an accredited qualification. Its development was initiated by the
Learning and Skills Council (LSC) which funds adult education delivered by further
education colleges, voluntary organisations and in specialised settings such as
prisons. The LSC wished to demonstrate whether the training they commissioned
provided value for money and contributed to regional or national educational priorities
and goals. RARPA was developed in 2002, piloted in 2003-4 and all LSC-funded
providers of education and training were required to ensure full implementation by
September 2006.
RARPA comprises the following five-stage process:
1. Specification of aims appropriate to an individual learner or groups of learners
2. Initial assessment to establish the learner’s starting point
3. Identification of appropriately challenging learning objectives
4. Recognition and recording of progress and achievement during the
programme
5. End of programme learner self-assessment, tutor assessment, review of
overall progress and achievement
The priority of the NHS is provision of high-quality healthcare, with the training and
education of the healthcare workforce forming a means towards this goal. This
perspective requires input at RARPA stage 1 from the employing service or
organisation on its needs, priorities and goals, while RARPA stage 5 (assessment of
learning) requires a focus on the application of learning in practice to deliver patient
care. The three intermediate steps are the core business of the training provider and
remain unchanged. Commissioners of training should thus be involved in the
specification of the intended outcomes of the training (stage 1) and in reviewing the
measured outcomes after the training (stage 5) but considerations relating to how the
training is delivered and the management of individual learners (stages 2-4) are the
responsibility of the training provider.
Commissioners should implement a quality assurance (QA) process to ensure that
the training they commission leads to a measurable improvement in patient care. In
planning the QA process it is necessary to begin by considering how this outcome
will be measured. In the next two sections of this report we discuss how training can
be evaluated in the final RARPA stage, and in the following section the first stage of
RARPA (specification of the aims of training) will be considered.
4. A framework for evaluating training
Three distinct processes for evaluating training can be identified which aim to answer
three fundamentally different questions, as follows:
• Monitoring: did staff attend the training?
• Evaluation: did staff learn from the training?
• Assessment: have staff applied their learning from the training in practice?
Each of the three processes of monitoring, evaluation and assessment can be
conducted formally or informally, using a variety of techniques and at a range of
levels of sophistication, as shown in Figure 2. Moving upwards and to the right
across Figure 2 the standard of quality assurance improves, but the resources
(including input of staff time) needed to implement the evaluation process increase.
This is indicated in Figure 2 by colour-coding in bronze, silver or gold. The ‘gold
standard’ of quality assurance is achieved by independent assessment of an
individual’s practice, conducted either formally (through accreditation of the training)
or informally. However, this gold standard is expensive to implement. Commissioners
of training should choose the appropriate level of evaluation for any specific training
intervention on the basis of cost-benefit considerations.
Figure 2: The three evaluation processes for training

[Diagram: a grid showing the three processes at increasing levels of sophistication, from low assurance and low cost (bronze) to high assurance and high cost (gold). Monitored: numbers; percentages; percentages by workforce group. Evaluated: exit surveys; entry and exit surveys; entry, exit and post-course surveys. Assessed: self-assessment; observation by a colleague; accreditation.]
Monitoring of training attendance is routinely carried out throughout the NHS for
statutory and mandatory training. In the course of this project it was found that
conferences and training workshops relating to end of life care delivered in the NE
and SE London sectors were usually evaluated at the lowest level in Figure 2, i.e. by
means of exit questionnaires administered at the end of the training. A few instances
were found where entrance and exit surveys were used, and a very small number of
training courses were evaluated using entrance, exit and post-course surveys.
Assessment of whether training has been applied in participants’ subsequent practice
was seldom attempted.
Assessment, rather than monitoring or evaluation, is clearly necessary to judge the
value of training in improving service delivery. Most NHS Trust Education Leads
interviewed during the project recognised the value of assessment and stated that
they would like to assess the outcomes of more of the training they commission. One
reason given for not undertaking outcome evaluations routinely was shortage of
resources. In view of this, the project concentrated on developing assessment at the
lowest (and therefore cheapest) level in Figure 2, i.e. self-assessment by trainees.
This method of assessment is subjective, since it relies on self-reporting, but
commissioners do not need a highly precise measure because they are interested in
checking the quality of the training rather than ‘grading’ participants’ performance.
The project also examined how measurement of training outcomes can be made
most effective and efficient, and this is discussed in section 7.
5. The QA framework in action: evaluation of outcomes for a
sample of communication skills training courses for end of
life care
During the project post-course outcome evaluations were carried out for a sample of non-accredited³ training courses on communication skills for end of life care. Courses
and training events were selected for evaluation using the following criteria:
• Training delivery completed between 1 Jan 2008 and 31 July 2010
• A range of types of training event, including conferences, workshops, short
courses (up to 1 day) and longer courses (up to 3 days)
• A range of delivery methods, including conference presentations, role play,
simulations, group discussion and didactic teaching
• A range of training providers, including in-house clinical specialists, in-house
training departments and external training providers
• A range of target groups spanning workforce Groups A, B and C identified in the National End of Life Care Strategy⁴, different professional disciplines and different
work contexts including acute hospitals, intermediate and continuing care,
community healthcare, nursing homes, mental healthcare settings and primary
care.
In total, 19 outcome evaluations were undertaken. Three of these were unsuccessful, and the reasons for this are discussed further in section 7. Details of the training courses, the target groups of participants and the data collection methods used are summarised in Table 1.

³ The sample included four workshops for postgraduate trainee doctors which form part of an accredited Foundation Programme but are not accredited separately in their own right.
⁴ Department of Health, National End of Life Care Strategy, July 2008.
The key feature of all the outcome evaluations was that course participants were
asked some time after the course whether they had used their learning in the
workplace. The evaluations thus implemented the third evaluation process,
assessment, at the lowest level identified in Figure 2.
All the evaluations included the following questions:
• Did you find the training useful?
• Have you used what you learned in your role?
The answers received are summarised for the 16 successful evaluations in Figure 3.
The effectiveness, judged as the proportion of participants reporting application of
learning in their practice some months after the training, varied between 35% and
100%. The highest possible effectiveness of 100% was reported for three of the
training courses: the palliative care training delivered by the Redbridge Community
Health Services Macmillan Specialist Palliative Care team to qualified nurses working
in Nursing Homes (EoLC workforce Group B), the ‘I don’t know what to say’ course
for reception staff (EoLC workforce Group C) delivered by St Christopher’s Hospice
and the Cancer Support Groups Facilitator Training delivered by the SE London Cancer
Network (EoLC workforce Group A). This demonstrates that highly effective training
can be commissioned for all levels of the End of Life Care workforce.
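
For illustration, the effectiveness calculation can be sketched in a few lines of Python. The response data below are invented for the example, not taken from the project's evaluations:

```python
from collections import Counter

# The three categories reported in Figure 3 of this report.
APPLIED = "Learning applied in practice"
USEFUL_NOT_APPLIED = "Useful but not applied in practice"
NOT_USEFUL = "Training not useful"

def categorise(useful: bool, applied: bool) -> str:
    """Map a respondent's answers to the two core questions ('Did you find
    the training useful?' and 'Have you used what you learned in your
    role?') onto the three Figure 3 categories."""
    if applied:
        return APPLIED
    if useful:
        return USEFUL_NOT_APPLIED
    return NOT_USEFUL

def effectiveness(responses):
    """Effectiveness = proportion of respondents reporting that they have
    applied their learning in practice."""
    counts = Counter(categorise(useful, applied) for useful, applied in responses)
    return counts[APPLIED] / len(responses), counts

# Hypothetical responses: (found it useful?, applied it in role?) per respondent.
responses = [(True, True), (True, True), (True, False), (False, False)]
score, breakdown = effectiveness(responses)
print(f"Effectiveness: {score:.0%}")  # Effectiveness: 50%
```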
Three factors were identified in the evaluations which had reduced the measured
effectiveness of the other training courses in the sample:
• Some participants had not had an opportunity to apply their learning in the period between the training and its evaluation.
• Some participants said they had previously attended similar training and had learned nothing new.
• Some participants reported that topics included in the training were not relevant to their role.
These findings highlight the following quality issues to be considered carefully when
commissioning training:
• Which groups of staff will be in a position to apply their learning, how often they
will have the opportunity to apply it and, in the case of practical skills, whether
they will be able to do this sufficiently frequently to maintain their competence
• The starting levels of potential participants
• The match between training content and potential participants’ roles
The commissioners and providers of the training were unaware of these quality
issues before the evaluation. This confirms the importance of measuring outcomes
from training when commissioning for quality.
Table 1: Communication Skills training evaluations

| Training Title | Training Provider | Target group | EoLC strategy workforce category⁵ | Competences⁶ | Training delivery date(s) | Data collection method | Number of responses | Interval between training and evaluation |
|---|---|---|---|---|---|---|---|---|
| Advanced Communication Skills Training (ACST) | Connected | Cancer specialists | Group A | 1a, 1b, 1d, 1e | Sept 2008 – Oct 2009 | Telephone interviews | 119 (70%) | 3–17 months |
| Breaking Bad News | Medical Education Department, Acute Hospital A | Junior doctors | Group B | 1a, 1b, 1c | October 2009 | On-line survey | 6 (17%) | 4 months |
| Communication Skills Workshop | Medical Education Department, Acute Hospital B | Junior doctors | Group B | 1a, 1b, 1c | Sept 2009 – March 2010 | On-line survey | 6 (17%) | 1–5 months |
| Gold Standards Framework in Action conference | St Christopher’s Hospice | GPs | Group B | 1a, 1c, 2a, 3d, 3e, 3f, 4a, 4b | July 2009 | On-line survey | 8 (23%) | 8 months |
| The CNS Role in Leading EoLC conference | NE London End of Life Programme | Clinical Nurse Specialists | Groups A and B | Not applicable | June 2009 | On-line survey | 22 (33%) | 9 months |
| Palliative Training for Qualified Nursing Home Staff | Redbridge Macmillan Specialist Palliative Care Team | Qualified nurses working in Nursing Homes | Group B | 1a, 1d, 1e, 2a, 2c, 2e, 3a, 3b, 3c, 3d, 3f, 4a | January – April 2010 | Focus group and individual questionnaires | 8 (100%) | 1–3 months |
| Initiating Difficult Discussions | St Joseph’s Hospice | Health and social care professionals and support staff | Group B | 1a, 1b, 1d, 1e | December 2009 | Telephone interviews | 10 (100%) | 5 months |
| I don’t know what to say | St Christopher’s Hospice | Reception and other support staff | Group C | 1b, 1d, 1e | Sept 2009 | On-line survey | 7 (39%) | 8 months |
| Advance Care Planning for End of Life Care | NE London End of Life Programme | Health and social care professionals and support staff | Groups A and B | 1a, 1b, 1c, 1d, 1e, 1f | April 2009 | On-line survey | 22 (18%) | 13 months |
| End of Life Care Workshop | City University/East London Foundation Trust | Mental health nurses and support staff | Groups B and C | 1a, 1e, 2a, 3e, 3f, 3g, 4a, 4b, 4c, 4d | November 2009 – April 2010 | On-line survey | 9 (35%) | 2–7 months |
| Bereavement Care in the Hospital Setting | Barts and the London NHS Trust | Hospital nurses and support staff | Group B | 1a, 1b, 1e | April 2010 | Face-to-face or telephone interviews | 13 (100%) | 2 months |
| Cancer Support Groups Facilitator training | SE London Cancer Network | Clinical nurse specialists, specialist AHPs and psychological support staff | Group A | 1a, 1b, 1c, 1d, 1e | April 2008 – May 2010 | On-line survey | 25 (41%) | 3–27 months |
| Communication Skills for Administration Staff | St Francis’ Hospice | Administration staff | Group C | 1a, 1b, 1d, 1e | May 2010 | On-line survey | 11 (55%) | 2 months |
| Sage and Thyme | Guy’s and St Thomas’ NHS Foundation Trust | Hospital nurses | Group B | 1d, 1e | June 2010 | On-line survey | 15 (68%) | 1 month |
| Liverpool Care Pathway training | Homerton University Hospital NHS Foundation Trust | Hospital nurses | Group B | 1a, 1c, 1e | July 2009 | Face-to-face or telephone interviews | 15 (50%) | 12 months |
| Getting the Message Across | Medical Education Department, Acute Hospital D | Junior doctors | Group B | 1a, 1b, 1c | May 2010 | On-line survey | 5 (10%) | 2 months |
| GSF Learning Sets | NHS Lambeth | GPs and practice staff | Group B | 1a, 1b, 1c, 1d, 1e | Jan – March 2010 | On-line survey | 14 (30%) | 1–4 months |
| Breaking Bad News | Medical Education Department, Acute Hospital C | Junior doctors | Group B | 1a, 1b, 1c | May 2010 | On-line survey | 2 (6%) | 1 month |
| Step into Palliative Care | Whipps Cross University Hospital Palliative Care Team | District Nurses, Nursing Home nurses and support staff | Group B | 1a, 1d, 1e, 2a, 2c, 2e, 3a, 3b, 3c, 3d, 3f, 4a | Sept – Oct 2009 | On-line or postal questionnaire | 3 (13%) | 6 months |

⁵ Department of Health, National End of Life Care Strategy, July 2008.
⁶ NHS National End of Life Care Programme, Common core competences and principles for health and social care workers working with adults at the end of life, June 2009.
Figure 3: Measured effectiveness of communication skills training courses for end of life care

[Stacked bar chart: for each of the 16 successfully evaluated courses, the percentage of respondents (0–100%) in each of three categories: ‘Learning applied in practice’, ‘Useful but not applied in practice’ and ‘Training not useful’.]
6. Specification of the aims of training
If a training commissioner intends to use outcome measures as part of a quality
assurance framework it is important to start with a very clear specification of the aims of
the training. This is the first stage of the RARPA process, and Figure 1 shows that this is
one of the stages of the training commissioning cycle that requires input from service
commissioners and providers as well as from the training commissioner and training
providers.
► Recommendation 1: Specify the aims of the training and the
intended outcomes of training carefully when the training is
commissioned.
► Recommendation 2: When specifying the aims and outcomes of
training, obtain informed input from service commissioners and
service providers to ensure that the training addresses service needs
and current and future service priorities.
► Recommendation 3: Specify the aims of training in terms of learning
outcomes, i.e. exactly what participants are expected to learn. Ensure
that service providers and service commissioners agree the learning
outcomes.
Once the learning outcomes of the training have been decided the next stage is to
decide how these outcomes will be measured. Outcome measures defined in terms of
changes in practice are not applicable to all training. Some training is intended to raise
awareness of situations which participants may only encounter occasionally. Training
conferences may cover a range of topics, not all of which may be directly relevant to the
practice of every participant.
A useful classification for considering the type of outcome measure required is to assign
each intended learning outcome into one of three categories: increased knowledge,
increased skills or increased understanding. The best method of measuring the learning
outcomes will be different for these three categories:
• Outcome of knowledge-based training - measure by quiz or test
• Outcome of skills-based training - measure by assessment of the skills in practice
• Outcome of understanding-based training – measure by probing questioning.
This project was specifically concerned with skills-based training so the outcome
measure was based on self-assessment of participants’ skills in practice.
Commissioners do not need a highly precise outcome measure. The measure is
intended to be a quality check for the training, not a means of ‘grading’ participants’
performance.
16
► Recommendation 4: Choose outcome measures which measure the intended learning outcomes as directly as possible, even if the measurement is not very precise. “Crude measures of the right things are better than precise measures of the wrong things.”⁷
The number of outcome measures should be limited to make the assessment process
quick and easy to implement. A single, well-designed outcome measure gives more
valuable results than several poorly-designed measures. There is no need to try to
measure every intended learning outcome. It is best to choose measures for one or two
learning outcomes, preferably those which relate to the highest priorities for the service
or organisation.
► Recommendation 5: Use a very small number of outcome
measures. Choose measures relating to the topics of highest
priority to the service or organisation commissioning the training.
Outcome measures should be chosen which are capable of revealing any problems with
the training and demonstrating improvements. There is little value in measuring something which will always come out 100% successful or 100% unsuccessful.
► Recommendation 6: Outcome measures should be chosen which
give an intermediate score so that issues can be highlighted and
improvements demonstrated.
7. Practical considerations for conducting an outcome
evaluation
Once the aims of the training have been specified and outcome measures chosen and
agreed in partnership with the training provider, the final consideration is what resources
must be provided to measure the outcomes.
Whether the intended learning outcomes from training are classified as increased
knowledge, skills or understanding, a commissioner will be interested in how these have
been applied in participants’ practice. It will therefore be necessary to contact course
participants some time after the training, when they have had an opportunity to put into
practice what they learned. Contacting participants for post-course evaluations involves
extra time, trouble and expense and some training commissioners who were interviewed
cited this as a reason for not conducting outcome evaluations.
In this section the following practicalities of conducting an outcome evaluation are
considered:
• How to get sufficient feedback in a post-course evaluation
• How to minimise the resources required
• How to get good quality feedback

⁷ Measurement for Improvement, NHS Institute for Innovation and Improvement (2005), p. 10.
7.1 Maximising the response rate
An outcome evaluation does not require a very high response rate. The aim is to assess
quality rather than to produce research-level data. However, training courses and
workshops often include only a small number of participants and unless a reasonable
proportion of them provide feedback the evaluation is not useful. It is a good idea to
decide on a minimum number of responses for an acceptable result before conducting
the evaluation. For the evaluations which formed part of this project this minimum
number was set at 5 participants or 10% of participants (whichever was smaller). Two of
the evaluations shown in Table 1 were considered failures because they did not achieve
this minimum response level.
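
As a minimal sketch, the project's minimum-response rule can be written as follows (the function names are illustrative, not part of the project's materials):

```python
def minimum_acceptable_responses(participants: int) -> int:
    """The rule used in this project: 5 responses or 10% of participants
    (rounded up), whichever is smaller."""
    ten_percent = (participants + 9) // 10  # integer ceiling of 10%
    return min(5, ten_percent)

def evaluation_usable(participants: int, responses: int) -> bool:
    """An evaluation counts as successful only if it met the minimum
    number of responses set in advance."""
    return responses >= minimum_acceptable_responses(participants)

print(minimum_acceptable_responses(30))                 # 3
print(evaluation_usable(participants=80, responses=4))  # False (needs 5)
```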
► Recommendation 7: Do not expect to collect evaluation data from
every participant who attends the training, but set a minimum
acceptable number of responses in advance.
Different data-collection methods produced different response rates. Table 1 shows that
focus groups, face-to-face and telephone interviews produced higher response rates
than surveys. However, none of the methods tested proved the best for every group of
staff. The advantages and disadvantages of each method for different staff groups are
summarised as follows:
• Face-to-face interviews: Found to be the most effective method for collecting
feedback from nursing assistants and nurses, who responded best to personal
contact. The most inclusive method: people with sensory disabilities, learning
disabilities, those with poor standards of literacy and those who speak English as an
additional language (or not at all) can participate.
• Focus groups: Found to be reasonably effective for nurses and nursing assistants
but less effective than face-to-face interviews because not everyone is prepared to
speak in a group. Achieved 100% response rate when supplemented by other
measures to collect feedback from ‘silent’ group members.
• Telephone interviews: Found to be effective for healthcare professionals. Support
staff found it disconcerting to be questioned on the telephone by someone they had
not met. An effective means of collecting feedback from staff working weekend,
twilight-hours or night shifts.
• On-line surveys: The most effective method for doctors and for administration staff
who use computers as an integral part of their role. Less effective with nurses who
responded better to personal contact.
• Paper questionnaires: Found to be the least effective method. In this project this
method was tried only once. The evaluation failed because there were very few
responses.
► Recommendation 8: Tailor the data-collection method used for
measuring the outcomes to the requirements and preferences of the
particular group(s) of staff who attend the training.
Participants may be more willing to participate in the outcome evaluation (and the quality
of their responses will be higher) if their anonymity is preserved.
► Recommendation 9: Ensure that the data-collection method
preserves the anonymity of the participants in order to maximise
the quality of feedback.
7.2 Resources required
This project demonstrated that conducting an outcome evaluation need not be onerous
or resource-intensive. However, some additional resources will be required. In this
section the resources needed for the various types of evaluation are summarised.
• On-line surveys: on-line survey tools make designing and administering a survey
very quick and simple. Data is collected automatically and can be downloaded in
several formats. Participants must have access to the internet to respond to the
survey. Easily the most efficient method if feedback is required from a large number
of people.
• Paper questionnaires: requires substantially more resources than the on-line
survey for designing, printing, reproducing and distributing the survey and for
entering the responses onto the computer. Administrative support will be required.
• Telephone interviews: quick and efficient for relatively small groups of people
(fewer than about 30) when participants have easy access to the telephone at work.
• Face-to-face interviews: not as slow as might be thought where a group of trainees
work in the same location. During the project up to 8 participants were interviewed in
a one-hour visit to a ward. Not suitable when feedback is required from large
numbers of participants.
• Focus group: requires the most resources if arranged as a ‘free-standing’ group.
However, the focus group included in this project was planned as an integral
component of the training and thus required no extra resources at all.
► Recommendation 10: Design the outcome evaluation as an integral part of the training in order to maximise response rates and minimise resource requirements, e.g. by making provision of an attendance certificate conditional on completing the evaluation survey. For extended training programmes this could involve including planned evaluation sessions as part of the training programme.
Whatever the data collection method chosen, additional administrative resources will be
required to analyse the results of the outcome evaluation and report them to the relevant
stakeholders.
The amount and nature of resources needed to conduct outcome evaluations will thus
depend on how many evaluations a commissioner chooses to carry out and the methods
chosen for conducting them. If the data collection is to be carried out by service
providers and reported to the commissioner the necessary resources should be provided
within the service contract.
► Recommendation 11: Identify the resources required to
implement an outcome evaluation at the time when the outcome
evaluation is specified and include the necessary resource in
contracts.
7.3 Quality of feedback
In the course of the project it was found that the key consideration for obtaining useful
feedback was careful design of the evaluation questions. One evaluation shown in Table
1 failed because errors in design led to some respondents misinterpreting questions.
► Recommendation 12: Keep evaluation questions short, simple and unambiguous to avoid misinterpretation.
A second consideration is the total number of questions included in the evaluation. The
likelihood of respondents omitting to answer some of the questions increases as the
survey gets longer. Some surveys issued in this project were too long and the high
proportion of missing answers created a problem when analysing the responses.
► Recommendation 13: Do not ask for more than 5 answers in an
evaluation survey, including answers to parts of questions; i.e.
respondents should not be expected to tick more than 5 boxes in
total. The fewer questions asked, the better.
It was found to be useful to include at least one ‘open’ question asking for general
comments on the training. Although most respondents will not answer this, any answers
that are received tend to be unexpected and useful.
► Recommendation 14: Include an ‘open’ question requesting
general comments.
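
These design rules can be checked mechanically when a survey is drafted. The following sketch assumes a simple, hypothetical representation of survey questions:

```python
from dataclasses import dataclass

@dataclass
class Question:
    text: str
    open_ended: bool = False   # free-text 'general comments' style question
    answers_required: int = 1  # tick-boxes expected, including sub-parts

def validate_survey(questions, max_answers: int = 5):
    """Apply Recommendations 13 and 14: at most `max_answers` tick-box
    answers in total, and at least one open question."""
    closed = sum(q.answers_required for q in questions if not q.open_ended)
    problems = []
    if closed > max_answers:
        problems.append(f"{closed} tick-box answers requested (max {max_answers})")
    if not any(q.open_ended for q in questions):
        problems.append("no open question requesting general comments")
    return problems

survey = [
    Question("Did you find the training useful?"),
    Question("Have you used what you learned in your role?"),
    Question("Any other comments?", open_ended=True),
]
print(validate_survey(survey) or "Survey meets Recommendations 13-14")
```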
8. Using the results of an outcome evaluation
Outcome evaluation produces tangible benefits for all training stakeholders: training
participants, training providers, training commissioners and strategic management. Post-
course evaluation encourages training participants to reflect on what they have learned
and its application to their roles. Evaluation of RARPA pilot projects demonstrated that
the benefits of the quality framework included better motivation, faster progress and
better engagement with learning.⁸ These benefits arise from learners having a sense
that someone is paying attention to what they are learning, rather than just leaving them
to get on with it.
Outcome evaluation is useful for trainers and training providers because it provides them
with feedback on their role. Examples of how the training providers who participated in
this project have used their evaluation results include the following:
• Adjustment of the course content
• Adjustment of the training delivery methods
• Alterations to the grades of staff admitted to the training
• As evidence for senior management of the value of the training
• To support applications for funding to continue the training
Of the training providers who participated in the project, 91% reported that they would definitely take part in an outcome evaluation again; the remaining 9% reported that they would probably do so.
For training commissioners the evaluation results form a vital component of the
commissioning cycle, allowing commissioners to assess value for money and to work in
partnership with providers to improve quality and productivity. For strategic management
the results help to track progress towards organisational goals.
During the project a number of different stakeholders expressed concerns about how the
results of outcome evaluations might be used. These concerns included the following:
• Selection of participants. Training providers often do not have control of who attends
their training, and are sometimes left to deal with unsuitable, unwilling or ill-prepared
participants. Training providers were concerned that in order to achieve quality
targets it would be necessary for service management to share responsibility for the
performance of the learners they send on the training.
• Support for application of learning in practice. Stakeholders thought that in order to
ensure quality targets are met, robust systems are needed to assist training
participants in putting into practice what they have learned.
• Definition of course aims. The first step of the RARPA process is key to a successful
quality assurance process. Unless commissioners give a clear definition of the aims
of the training the training provider cannot deliver quality outcomes.
⁸ LSDA and NIACE (2004) Recognising and Recording Progress and Achievement in Non-Accredited Learning: Summary of the Evaluation Report on the Pilot Projects April 2003 to March 2004, Leicester: NIACE.
These concerns highlight the importance of putting systems in place to support the
process when piloting or implementing an outcome evaluation. Introducing an outcome
measure into a contract without consideration of the necessary supporting systems may
do more harm than good.
► Recommendation 15: When piloting a quality measure, consider
external processes and factors which may influence how
participants apply their learning from the training and how support
for these processes may be needed to achieve the target. For
example, working practices may need to be changed to give
participants opportunities to apply what they have learned.
► Recommendation 16: Inform all stakeholders in advance of how
the results of the outcome evaluation will be used.
► Recommendation 17: When commissioning training, ensure that
the managers of the staff who attend bear joint responsibility for the
achievement of outcome quality measures, e.g. for ensuring their
staff attend, for supporting the application of learning in practice
and for collecting and reporting evaluation data where required.
9. Examples of draft outcome measures for training
Using the results obtained in this project, draft outcome measures are proposed for four
different kinds of non-accredited training:
• In-house training, e.g. provided by a Specialist Palliative Care team or Clinical
Facilitator, where there is an explicit intention that participants’ practice will be
improved by the training
• A communication skills training course commissioned from an external training
provider, e.g. a hospice, a training agency or a higher education institution
• A more general training course or conference combining elements of awareness-
raising and skills improvement
• An awareness-raising session included as part of corporate induction for all staff.
Since these four types of training are designed to deliver different outcomes the
suggested outcome measure is different in each case. The expected outcomes of the
training must be clearly set out in the training specification so that appropriate outcome
measures can be chosen when the training is commissioned.
The results shown in Table 1 suggest that it is reasonable to set a target value of 80%
for the proportion of people attending communication skills training who report a benefit
from it some months afterwards. If fewer than 80% of participants report a benefit, an enquiry is needed to identify the factors limiting the training’s effectiveness.
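
As an illustration, a commissioner could operationalise this check as follows; the course names and figures below are placeholders, not results from Table 1:

```python
TARGET = 0.80  # minimum proportion of participants reporting a benefit

def courses_needing_enquiry(results, target=TARGET):
    """Flag courses whose reported-benefit proportion falls below the
    target, so the factors limiting effectiveness can be investigated."""
    return {course: share for course, share in results.items() if share < target}

# Placeholder figures for illustration only.
results = {
    "In-house palliative care teaching": 0.85,
    "External communication skills course": 0.72,
}
for course, share in courses_needing_enquiry(results).items():
    print(f"Below target: {course} ({share:.0%} < {TARGET:.0%})")
```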
Example 1: In-house training provided by Specialist Palliative Care team
• Proportion of the non-specialist palliative care workforce caring for patients with palliative and end of life care needs, provided with teaching and support, who report increased confidence and competence in practice 2–3 months after the teaching intervention: 80%
Example 2: Communication skills training course commissioned from external
training provider
• Proportion of participants attending the course reporting changes in their practice 2–3 months after attending: 80%
Example 3: Conference combining skills training and awareness-raising
• Proportion of participants attending the conference reporting increased knowledge, understanding or competence 2–3 months after attending: 80%
Example 4: Awareness-raising training at Corporate Staff Induction
• Proportion of participants attending the session reporting increased awareness of specific issues/needs relating to end of life care 2–3 months after attending: 80%
These draft outcome measures are not firm recommendations but are intended to form a
basis for discussions between the training providers and commissioners of training who
wish to introduce outcome measures into their commissioning. Further work will be
required to develop and agree outcome measures to be included in commissioners’
contracts.
10. Further information
Further information on the results of the NE and SE London Communication Skills
Training for End of Life Care Pilot Project can be found in the following reports,
obtainable from the NE London Cancer Network website.
• Report on Evaluation of Advanced Communication Skills Training
• Evaluation of Communication Skills training for End of Life Care
• Training Commissioning by NHS Trust Training Leads in NE and SE London
Acknowledgements
The NE and SE London Cancer Networks would like to thank the following partner
organisations for their contributions to this work:
Barking, Havering and Redbridge University Hospitals NHS Trust
Barts and the London NHS Trust
Bexley Care Trust
City University
Coloma Court Nursing Home
East London NHS Foundation Trust
Greenwich and Bexley Community Hospice
Guy’s and St Thomas’ NHS Foundation Trust
Homerton University Hospital NHS Foundation Trust
King’s College Hospital NHS Foundation Trust
Lewisham University Hospital NHS Trust
Macmillan Cancer Support
Marie Curie Delivering Choice Programme
Newham University Hospital NHS Trust
NHS Barking and Dagenham
NHS Bromley
NHS City and Hackney
NHS Greenwich
NHS Havering
NHS Lambeth
NHS Lewisham
NHS Newham
NHS Redbridge
NHS Southwark
NHS Tower Hamlets
NHS Waltham Forest
North East London NHS Foundation Trust
Outer North East London Community Services
Oxleas NHS Foundation Trust
St Christopher’s Hospice
St Francis’ Hospice
St Joseph’s Hospice
South London and the Maudsley NHS Foundation Trust
South London Healthcare NHS Trust
Whipps Cross University Hospital NHS Trust
© North East London Cancer Network 2010

More Related Content

What's hot

Super User Certification Program - Logistics Final v1.1
Super User Certification Program - Logistics Final v1.1Super User Certification Program - Logistics Final v1.1
Super User Certification Program - Logistics Final v1.1Calvin Yong
 
Berhanu topic introduction to ttlm
Berhanu topic  introduction to ttlmBerhanu topic  introduction to ttlm
Berhanu topic introduction to ttlmberhanu taye
 
Prosystem participant handbook
Prosystem participant handbookProsystem participant handbook
Prosystem participant handbookBarbara Verner
 
Management quota seats for mds in top dental colleges in maharashtra
Management quota seats for mds in top dental colleges in maharashtraManagement quota seats for mds in top dental colleges in maharashtra
Management quota seats for mds in top dental colleges in maharashtraVikram D V
 

What's hot (6)

Super User Certification Program - Logistics Final v1.1
Super User Certification Program - Logistics Final v1.1Super User Certification Program - Logistics Final v1.1
Super User Certification Program - Logistics Final v1.1
 
Training and development
Training and developmentTraining and development
Training and development
 
Berhanu topic introduction to ttlm
Berhanu topic  introduction to ttlmBerhanu topic  introduction to ttlm
Berhanu topic introduction to ttlm
 
Prosystem participant handbook
Prosystem participant handbookProsystem participant handbook
Prosystem participant handbook
 
Volume of Learning
Volume of LearningVolume of Learning
Volume of Learning
 
Management quota seats for mds in top dental colleges in maharashtra
Management quota seats for mds in top dental colleges in maharashtraManagement quota seats for mds in top dental colleges in maharashtra
Management quota seats for mds in top dental colleges in maharashtra
 

Viewers also liked

兜率天傳法生態林 20160220
兜率天傳法生態林 20160220兜率天傳法生態林 20160220
兜率天傳法生態林 20160220earth dhamma
 
Derecho informatico
Derecho informaticoDerecho informatico
Derecho informaticomelipert
 
Tcl corporate presentation 2015 campus 08 02 2016
Tcl corporate presentation   2015 campus 08 02 2016Tcl corporate presentation   2015 campus 08 02 2016
Tcl corporate presentation 2015 campus 08 02 2016Sre jith
 
Viviana cortes presentacion recuperativa analisis politico
Viviana cortes presentacion recuperativa analisis politicoViviana cortes presentacion recuperativa analisis politico
Viviana cortes presentacion recuperativa analisis politicovivianauft
 
18746 2016 LGBT Team Pride Recap_Web (1)
18746 2016 LGBT Team Pride Recap_Web (1)18746 2016 LGBT Team Pride Recap_Web (1)
18746 2016 LGBT Team Pride Recap_Web (1)Vera Rees
 
Zawód weterynarz
Zawód weterynarzZawód weterynarz
Zawód weterynarzbystrzaki
 
PLCs for a Change? Setting up and Maintaining a Professional Learning Communi...
PLCs for a Change? Setting up and Maintaining a Professional Learning Communi...PLCs for a Change? Setting up and Maintaining a Professional Learning Communi...
PLCs for a Change? Setting up and Maintaining a Professional Learning Communi...Susan Hillyard
 
Handouts Teaching for Diversity
Handouts Teaching for Diversity Handouts Teaching for Diversity
Handouts Teaching for Diversity Susan Hillyard
 
Teaching English through Drama using ActionSacks
Teaching English through Drama using ActionSacksTeaching English through Drama using ActionSacks
Teaching English through Drama using ActionSacksSusan Hillyard
 
Slide show durability_of_concrete
Slide show durability_of_concreteSlide show durability_of_concrete
Slide show durability_of_concreteDr J.D. Bapat
 
Photoshop CS6 -Rosto Rachado
Photoshop CS6 -Rosto RachadoPhotoshop CS6 -Rosto Rachado
Photoshop CS6 -Rosto RachadoPedro Friedrich
 
Big data usage in government
Big data usage in governmentBig data usage in government
Big data usage in governmentIntellipaat
 

Viewers also liked (15)

兜率天傳法生態林 20160220
兜率天傳法生態林 20160220兜率天傳法生態林 20160220
兜率天傳法生態林 20160220
 
Derecho informatico
Derecho informaticoDerecho informatico
Derecho informatico
 
Tcl corporate presentation 2015 campus 08 02 2016
Tcl corporate presentation   2015 campus 08 02 2016Tcl corporate presentation   2015 campus 08 02 2016
Tcl corporate presentation 2015 campus 08 02 2016
 
Klasa 2c
Klasa 2cKlasa 2c
Klasa 2c
 
Viviana cortes presentacion recuperativa analisis politico
Viviana cortes presentacion recuperativa analisis politicoViviana cortes presentacion recuperativa analisis politico
Viviana cortes presentacion recuperativa analisis politico
 
18746 2016 LGBT Team Pride Recap_Web (1)
18746 2016 LGBT Team Pride Recap_Web (1)18746 2016 LGBT Team Pride Recap_Web (1)
18746 2016 LGBT Team Pride Recap_Web (1)
 
Zawód weterynarz
Zawód weterynarzZawód weterynarz
Zawód weterynarz
 
TITULO V DEL CÓDIGO ORGÁNICO TRIBUTARIO
TITULO V   DEL CÓDIGO ORGÁNICO TRIBUTARIO  TITULO V   DEL CÓDIGO ORGÁNICO TRIBUTARIO
TITULO V DEL CÓDIGO ORGÁNICO TRIBUTARIO
 
adeniran_CV Updated_April_2016
adeniran_CV Updated_April_2016adeniran_CV Updated_April_2016
adeniran_CV Updated_April_2016
 
PLCs for a Change? Setting up and Maintaining a Professional Learning Communi...
PLCs for a Change? Setting up and Maintaining a Professional Learning Communi...PLCs for a Change? Setting up and Maintaining a Professional Learning Communi...
PLCs for a Change? Setting up and Maintaining a Professional Learning Communi...
 
Handouts Teaching for Diversity
Handouts Teaching for Diversity Handouts Teaching for Diversity
Handouts Teaching for Diversity
 
Teaching English through Drama using ActionSacks
Teaching English through Drama using ActionSacksTeaching English through Drama using ActionSacks
Teaching English through Drama using ActionSacks
 
Slide show durability_of_concrete
Slide show durability_of_concreteSlide show durability_of_concrete
Slide show durability_of_concrete
 
Photoshop CS6 -Rosto Rachado
Photoshop CS6 -Rosto RachadoPhotoshop CS6 -Rosto Rachado
Photoshop CS6 -Rosto Rachado
 
Big data usage in government
Big data usage in governmentBig data usage in government
Big data usage in government
 

Summary and Recommendations

Summary

This report describes a new approach to quality assurance for non-accredited training commissioned within the NHS. This approach has been developed as part of the NE and SE London Communication Skills Training for End of Life Care Pilot Project. The quality assurance process described in this report can be applied to training on any topic commissioned for any staff group. The approach is based on RARPA, the national quality-assurance framework specified for non-accredited training commissioned by the Learning and Skills Council.

The quality assurance process uses outcome measures based on participants self-evaluating, some time after the training, the extent to which they have applied their learning in their jobs. Although these outcome measurements are not precise, they are capable of highlighting important quality issues relating to mismatch of training to participants’ roles, and can thus support both commissioners and providers in achieving value for money in relation to training.

Recommendations

This section summarises the recommendations arising from this project for anyone involved in the commissioning of training courses or other training resources (e.g. posts which include training responsibilities).

1. Specify the aims of the training and the intended outcomes of training carefully when the training is commissioned.
2. When specifying the aims and outcomes of training, obtain informed input from service commissioners and service providers to ensure that the training addresses service needs and current and future service priorities.
3. Specify the aims of training in terms of learning outcomes, i.e. exactly what participants are expected to learn. Ensure that service providers and service commissioners agree the learning outcomes.
4. Choose outcome measures which measure the intended learning outcomes as directly as possible, even if the measurement is not very precise.
5. Use a very small number of outcome measures. Choose measures relating to the topics of highest priority to the service or organisation commissioning the training.
6. Choose outcome measures which give an intermediate score so that issues can be highlighted and improvements demonstrated.
7. Do not expect to collect evaluation data from every participant who attends the training, but set a minimum acceptable number of responses in advance.
8. Tailor the data-collection method used for measuring the outcomes to the requirements and preferences of the particular group(s) of staff who attend the training.
9. Ensure that the data-collection method preserves the anonymity of the participants in order to maximise the quality of feedback.
10. Design the outcome evaluation as an integral part of the training in order to maximise response rates and minimise resource requirements, e.g. by making provision of an attendance certificate conditional on completing the evaluation survey. For extended training programmes this could involve including planned evaluation sessions as part of the training programme.
11. Identify the resources required to implement an outcome evaluation at the time when the outcome evaluation is specified and include the necessary resource in contracts.
12. Keep evaluation questions short, simple and unambiguous to avoid misinterpretation.
13. Do not ask for more than 5 answers in an evaluation survey, including answers to parts of questions; i.e. respondents should not be expected to tick more than 5 boxes in total. The fewer questions asked, the better.
14. Include an ‘open’ question requesting general comments.
15. When piloting a quality measure, consider external processes and factors which may influence how participants apply their learning from the training and how support for these processes may be needed to achieve the target.
16. Inform all stakeholders in advance of how the results of the outcome evaluation will be used.
17. When commissioning training, ensure that the managers of the staff who attend bear joint responsibility for the achievement of outcome quality measures, e.g. for ensuring their staff attend, for supporting the application of learning in practice and for collecting and reporting evaluation data where required.
1. Introduction

This report describes a new approach to quality assurance for non-accredited training commissioned within the NHS. This approach has been developed as part of the NE and SE London Communication Skills Training for End of Life Care Pilot Project, which was one of 12 pilot projects supported during 2009-2010 by the National End of Life Care Programme and Connected, the national communication skills programme for cancer services. The remit of the project was to pilot improvements in communication skills training for the end of life care workforce. However, the quality assurance process described in this report is completely general and could be applied to education and training on any topic commissioned for any staff group. The approach is based on RARPA, the national quality-assurance framework specified for non-accredited training commissioned by the Learning and Skills Council.¹

During the first stage of the project a series of informal interviews was held with commissioners and providers of training for end of life care in NHS Trusts across NE and SE London. These interviews confirmed that commissioning of training is currently highly fragmented. Formally accredited education is commissioned by Trust Education Leads through block contracts with higher education institutions, while a range of non-accredited training courses is commissioned by service providers from their service budgets and by clinical service commissioners from their commissioning budgets. In addition, in-house training is delivered to NHS staff by clinical and non-clinical specialists funded through block commissions. There is generally little coordination between the various training commissioners, and training is seldom evaluated in terms of its outcomes, so the impact of training on service quality and the patient experience can rarely be demonstrated.

Two pressing needs for improvement in education and training commissioning for end of life care were identified through the interviews:

• The need for closer links between training commissioning and organisational priorities, risks and goals.
• The need for a process for measuring training outcomes in terms of their application in practice.

In the second stage of the project, post-course outcome evaluations were conducted for a sample of non-accredited communication skills training courses delivered to members of the end of life workforce in NE or SE London. The aims of this part of the project were:

• to develop outcome quality measures for non-accredited communication skills training for use in future commissioning
• to explore the practicalities of conducting outcome evaluations and assess the resources required.

This report presents a summary of the relevant results from the project and provides recommendations for commissioners wishing to implement an outcomes-based quality-assurance process for training they commission.

¹ LSC (2005) Recognising and Recording Progress and Achievement in Non-accredited Learning, Coventry, Learning and Skills Council
2. The education and training commissioning cycle

Figure 1 is a schematic representation of the education and training commissioning cycle for end of life care. The cycle begins with strategic planning, followed by procurement, performance management, monitoring and evaluation. The evaluation results should then be used to refresh the strategy before the cycle begins again.

Training in the NHS underpins service delivery, so the needs and priorities of clinical service delivery should drive the education commissioning cycle. Figure 1 indicates points in the education commissioning cycle where links to service commissioning and delivery should occur. The strategic review which begins and ends each cycle should also involve both education and service commissioners to ensure common priorities are agreed.

The two pressing needs identified during this project relate to the monitoring and evaluation section of Figure 1 and to the frequent omission of a strategic review. However, a strategic review cannot achieve its purpose without measurements of the outcomes of the training which has been delivered. Development of outcome measurements was therefore identified as the first priority.
Figure 1: The NHS Education and Training Commissioning cycle and its relationship to service commissioning² [diagram not reproduced]

² This diagram was developed by and is reproduced with the permission of Juliette Bigley, Programme Manager, Marie Curie Delivering Choice Programme
3. RARPA: an established quality assurance framework for non-accredited training

The RARPA framework (Recognising and Recording Progress and Achievement in non-accredited learning) was developed to assure quality in education and training not leading to an accredited qualification. Its development was initiated by the Learning and Skills Council (LSC), which funds adult education delivered by further education colleges, voluntary organisations and in specialised settings such as prisons. The LSC wished to demonstrate whether the training it commissioned provided value for money and contributed to regional or national educational priorities and goals. RARPA was developed in 2002 and piloted in 2003-4, and all LSC-funded providers of education and training were required to ensure full implementation by September 2006.

RARPA comprises the following five-stage process:

1. Specification of aims appropriate to an individual learner or groups of learners
2. Initial assessment to establish the learner’s starting point
3. Identification of appropriately challenging learning objectives
4. Recognition and recording of progress and achievement during the programme
5. End-of-programme learner self-assessment, tutor assessment, and review of overall progress and achievement

The priority of the NHS is the provision of high-quality healthcare, with the training and education of the healthcare workforce forming a means towards this goal. This perspective requires input at RARPA stage 1 from the employing service or organisation on its needs, priorities and goals, while RARPA stage 5 (assessment of learning) requires a focus on the application of learning in practice to deliver patient care. The three intermediate stages are the core business of the training provider and remain unchanged. Commissioners of training should thus be involved in specifying the intended outcomes of the training (stage 1) and in reviewing the measured outcomes after the training (stage 5), but considerations relating to how the training is delivered and the management of individual learners (stages 2-4) are the responsibility of the training provider.

Commissioners should implement a quality assurance (QA) process to ensure that the training they commission leads to a measurable improvement in patient care. In planning the QA process it is necessary to begin by considering how this outcome will be measured. The next two sections of this report discuss how training can be evaluated in the final RARPA stage; section 6 then considers the first RARPA stage, the specification of the aims of training.
4. A framework for evaluating training

Three distinct processes for evaluating training can be identified, each aiming to answer a fundamentally different question:

• Monitoring: did staff attend the training?
• Evaluation: did staff learn from the training?
• Assessment: have staff applied their learning from the training in practice?

Each of the three processes of monitoring, evaluation and assessment can be conducted formally or informally, using a variety of techniques and at a range of levels of sophistication, as shown in Figure 2. Moving upwards and to the right across Figure 2, the standard of quality assurance improves, but the resources (including input of staff time) needed to implement the evaluation process increase. This is indicated in Figure 2 by colour-coding in bronze, silver or gold. The ‘gold standard’ of quality assurance is achieved by independent assessment of an individual’s practice, conducted either formally (through accreditation of the training) or informally. However, this gold standard is expensive to implement. Commissioners of training should choose the appropriate level of evaluation for any specific training intervention on the basis of cost-benefit considerations.

Figure 2: The three evaluation processes for training. [Grid ranging from low assurance/low cost to high assurance/high cost. Monitoring ranges from attendance numbers, through percentages, to percentages by workforce group; evaluation from exit surveys, through entry and exit surveys, to entry, exit and post-course surveys; assessment from self-assessment, through observations by a colleague, to accreditation.]
Monitoring of training attendance is routinely carried out throughout the NHS for statutory and mandatory training. In the course of this project it was found that conferences and training workshops relating to end of life care delivered in the NE and SE London sectors were usually evaluated at the lowest level in Figure 2, i.e. by means of exit questionnaires administered at the end of the training. A few instances were found where entrance and exit surveys were used, and a very small number of training courses were evaluated using entrance, exit and post-course surveys. Assessment of whether training had been applied in participants’ subsequent practice was seldom attempted.

Assessment, rather than monitoring or evaluation, is clearly necessary to judge the value of training in improving service delivery. Most NHS Trust Education Leads interviewed during the project recognised the value of assessment and stated that they would like to assess the outcomes of more of the training they commission. One reason given for not undertaking outcome evaluations routinely was shortage of resources. In view of this, the project concentrated on developing assessment at the lowest (and therefore cheapest) level in Figure 2, i.e. self-assessment by trainees. This method of assessment is subjective, since it relies on self-reporting, but commissioners do not need a highly precise measure because they are interested in checking the quality of the training rather than ‘grading’ participants’ performance. The project also examined how measurement of training outcomes can be made most effective and efficient; this is discussed in section 7.

5. The QA framework in action: evaluation of outcomes for a sample of communication skills training courses for end of life care

During the project, post-course outcome evaluations were carried out for a sample of non-accredited³ training courses on communication skills for end of life care. Courses and training events were selected for evaluation using the following criteria:

• Training delivery completed between 1 Jan 2008 and 31 July 2010
• A range of types of training event, including conferences, workshops, short courses (up to 1 day) and longer courses (up to 3 days)
• A range of delivery methods, including conference presentations, role play, simulations, group discussion and didactic teaching
• A range of training providers, including in-house clinical specialists, in-house training departments and external training providers
• A range of target groups spanning workforce Groups A, B and C identified in the National End of Life Care Strategy⁴, different professional disciplines and different work contexts including acute hospitals, intermediate and continuing care, community healthcare, nursing homes, mental healthcare settings and primary care.

19 outcome evaluations were undertaken in total. Three of these were unsuccessful; the reasons for this are discussed further in section 7.

³ The sample included four workshops for postgraduate trainee doctors which form part of an accredited Foundation Programme but are not accredited separately in their own right
⁴ Department of Health, National End of Life Care Strategy, July 2008
Details of the training courses, the target groups of participants and the data collection methods used are summarised in Table 1.

The key feature of all the outcome evaluations was that course participants were asked, some time after the course, whether they had used their learning in the workplace. The evaluations thus implemented the third evaluation process, assessment, at the lowest level identified in Figure 2. All the evaluations included the following questions:

• Did you find the training useful?
• Have you used what you learned in your role?

The answers received are summarised for the 16 successful evaluations in Figure 3. The effectiveness, judged as the proportion of participants reporting application of learning in their practice some months after the training, varied between 35% and 100%. The highest possible effectiveness of 100% was reported for three of the training courses: the palliative care training delivered by the Redbridge Community Health Services Macmillan Specialist Palliative Care team to qualified nurses working in Nursing Homes (EoLC workforce Group B), the ‘I don’t know what to say’ course for reception staff (EoLC workforce Group C) delivered by St Christopher’s Hospice, and the Cancer Support Groups Facilitator Training delivered by the SE London Cancer Network (EoLC workforce Group A). This demonstrates that highly effective training can be commissioned for all levels of the End of Life Care workforce.

Three factors were identified in the evaluations which had reduced the measured effectiveness of the other training courses in the sample:

• Participants who had not had an opportunity to apply their learning in the period between the training and its evaluation.
• Participants who said they had previously attended similar training and had learned nothing new.
• Participants who reported that topics included in the training were not relevant to their role.

These findings highlight the following quality issues to be considered carefully when commissioning training:

• Which groups of staff will be in a position to apply their learning, how often they will have the opportunity to apply it and, in the case of practical skills, whether they will be able to do this sufficiently frequently to maintain their competence
• The starting levels of potential participants
• The match between training content and potential participants’ roles

The commissioners and providers of the training were unaware of these quality issues before the evaluation. This confirms the importance of measuring outcomes from training when commissioning for quality.
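By way of illustration, the sketch below shows one way the two core questions could be turned into the three categories reported in Figure 3 and an overall effectiveness figure. It is a minimal example for clarity, not the analysis actually used in the project; the field names (found_useful, applied_in_role) are hypothetical, and it assumes that anyone who applied their learning also found the training useful.

    # Minimal sketch (not project code): deriving the Figure 3 categories
    # from the two core evaluation questions. Field names are hypothetical.

    def categorise(response):
        """Map one participant's answers to a Figure 3 category."""
        if not response["found_useful"]:
            return "training not useful"
        if response["applied_in_role"]:
            return "learning applied in practice"
        return "useful but not applied in practice"

    def effectiveness(responses):
        """Proportion of respondents reporting application of learning in practice."""
        applied = sum(1 for r in responses
                      if categorise(r) == "learning applied in practice")
        return applied / len(responses)

    # Example: 10 respondents, 8 of whom applied their learning in their role.
    sample = [{"found_useful": True, "applied_in_role": True}] * 8 + [
        {"found_useful": True, "applied_in_role": False},
        {"found_useful": False, "applied_in_role": False},
    ]
    print(f"Measured effectiveness: {effectiveness(sample):.0%}")  # 80%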
Table 1: Communication Skills training evaluations

Training Title | Training Provider | Target group | EoLC strategy workforce category⁵ | Competences⁶ | Training delivery date(s) | Data collection method | Number of responses | Interval between training and evaluation
Advanced Communication Skills Training (ACST) | Connected | Cancer specialists | Group A | 1a, 1b, 1d, 1e | Sept 2008 – Oct 2009 | Telephone interviews | 119 (70%) | 3 – 17 months
Breaking Bad News | Medical Education Department, Acute Hospital A | Junior doctors | Group B | 1a, 1b, 1c | October 2009 | On-line survey | 6 (17%) | 4 months
Communication Skills Workshop | Medical Education Department, Acute Hospital B | Junior doctors | Group B | 1a, 1b, 1c | Sept 2009 – March 2010 | On-line survey | 6 (17%) | 1 – 5 months
Gold Standards Framework in Action conference | St Christopher’s Hospice | GPs | Group B | 1a, 1c, 2a, 3d, 3e, 3f, 4a, 4b | July 2009 | On-line survey | 8 (23%) | 8 months
The CNS Role in Leading EoLC conference | NE London End of Life Programme | Clinical Nurse Specialists | Groups A and B | Not applicable | June 2009 | On-line survey | 22 (33%) | 9 months
Palliative Training for Qualified Nursing Home Staff | Redbridge Macmillan Specialist Palliative Care Team | Qualified nurses working in Nursing Homes | Group B | 1a, 1d, 1e, 2a, 2c, 2e, 3a, 3b, 3c, 3d, 3f, 4a | January – April 2010 | Focus group and individual questionnaires | 8 (100%) | 1 – 3 months
Initiating Difficult Discussions | St Joseph’s Hospice | Health and social care professionals and support staff | Group B | 1a, 1b, 1d, 1e | December 2009 | Telephone interviews | 10 (100%) | 5 months
I don’t know what to say | St Christopher’s Hospice | Reception and other support staff | Group C | 1b, 1d, 1e | Sept 2009 | On-line survey | 7 (39%) | 8 months
Advance Care Planning for End of Life Care | NE London End of Life Programme | Health and social care professionals and support staff | Groups A and B | 1a, 1b, 1c, 1d, 1e, 1f | April 2009 | On-line survey | 22 (18%) | 13 months
End of Life Care Workshop | City University/East London Foundation Trust | Mental health nurses and support staff | Groups B and C | 1a, 1e, 2a, 3e, 3f, 3g, 4a, 4b, 4c, 4d | November 2009 – April 2010 | On-line survey | 9 (35%) | 2 – 7 months
Bereavement Care in the Hospital Setting | Barts and the London NHS Trust | Hospital nurses and support staff | Group B | 1a, 1b, 1e | April 2010 | Face-to-face or telephone interviews | 13 (100%) | 2 months
Cancer Support Groups Facilitator training | SE London Cancer Network | Clinical nurse specialists, specialist AHPs and psychological support staff | Group A | 1a, 1b, 1c, 1d, 1e | April 2008 – May 2010 | On-line survey | 25 (41%) | 3 – 27 months
Communication Skills for Administration Staff | St Francis’ Hospice | Administration staff | Group C | 1a, 1b, 1d, 1e | May 2010 | On-line survey | 11 (55%) | 2 months
Sage and Thyme | Guy’s and St Thomas’ NHS Foundation Trust | Hospital nurses | Group B | 1d, 1e | June 2010 | On-line survey | 15 (68%) | 1 month
Liverpool Care Pathway training | Homerton University Hospital NHS Foundation Trust | Hospital nurses | Group B | 1a, 1c, 1e | July 2009 | Face-to-face or telephone interviews | 15 (50%) | 12 months
Getting the Message Across | Medical Education Department, Acute Hospital D | Junior doctors | Group B | 1a, 1b, 1c | May 2010 | On-line survey | 5 (10%) | 2 months
GSF Learning Sets | NHS Lambeth | GPs and practice staff | Group B | 1a, 1b, 1c, 1d, 1e | Jan – March 2010 | On-line survey | 14 (30%) | 1 – 4 months
Breaking Bad News | Medical Education Department, Acute Hospital C | Junior doctors | Group B | 1a, 1b, 1c | May 2010 | On-line survey | 2 (6%) | 1 month
Step into Palliative Care | Whipps Cross University Hospital Palliative Care Team | District Nurses, Nursing Home nurses and support staff | Group B | 1a, 1d, 1e, 2a, 2c, 2e, 3a, 3b, 3c, 3d, 3f, 4a | Sept – Oct 2009 | On-line or postal questionnaire | 3 (13%) | 6 months

⁵ Department of Health, National End of Life Care Strategy, July 2008
⁶ NHS National End of Life Care Programme, Common core competences and principles for health and social care workers working with adults at the end of life, June 2009
Figure 3: Measured effectiveness of communication skills training courses for end of life care. [Stacked bar chart, 0-100%, showing for each of the 16 successfully evaluated courses the proportions of respondents reporting: learning applied in practice; useful but not applied in practice; training not useful. Courses shown: I don’t know what to say; Cancer Support Groups Facilitator training; Palliative Care for qualified nursing home staff; LCP training; Communication Skills for Administration Staff; Breaking Bad News; Communication Skills workshop; CNS leading EoLC; ACP for EoLC; GSF in Action; Initiating Difficult Discussions; ACST; Bereavement care; Sage and Thyme; Getting the message across; EoLC workshop.]
6. Specification of the aims of training

If a training commissioner intends to use outcome measures as part of a quality assurance framework, it is important to start with a very clear specification of the aims of the training. This is the first stage of the RARPA process, and Figure 1 shows that this is one of the stages of the training commissioning cycle that requires input from service commissioners and providers as well as from the training commissioner and training providers.

Recommendation 1: Specify the aims of the training and the intended outcomes of training carefully when the training is commissioned.

Recommendation 2: When specifying the aims and outcomes of training, obtain informed input from service commissioners and service providers to ensure that the training addresses service needs and current and future service priorities.

Recommendation 3: Specify the aims of training in terms of learning outcomes, i.e. exactly what participants are expected to learn. Ensure that service providers and service commissioners agree the learning outcomes.

Once the learning outcomes of the training have been decided, the next stage is to decide how these outcomes will be measured. Outcome measures defined in terms of changes in practice are not applicable to all training. Some training is intended to raise awareness of situations which participants may only encounter occasionally. Training conferences may cover a range of topics, not all of which may be directly relevant to the practice of every participant.

A useful classification for considering the type of outcome measure required is to assign each intended learning outcome to one of three categories: increased knowledge, increased skills or increased understanding. The best method of measuring the learning outcomes differs for these three categories:

• Outcome of knowledge-based training: measure by quiz or test
• Outcome of skills-based training: measure by assessment of the skills in practice
• Outcome of understanding-based training: measure by probing questioning.

This project was specifically concerned with skills-based training, so the outcome measure was based on self-assessment of participants’ skills in practice. Commissioners do not need a highly precise outcome measure. The measure is intended to be a quality check for the training, not a means of ‘grading’ participants’ performance.
Recommendation 4: Choose outcome measures which measure the intended learning outcomes as directly as possible, even if the measurement is not very precise.

“Crude measures of the right things are better than precise measures of the wrong things.”⁷

⁷ Measurement for Improvement, NHS Institute for Innovation and Improvement (2005), p. 10

The number of outcome measures should be limited to make the assessment process quick and easy to implement. A single, well-designed outcome measure gives more valuable results than several poorly designed measures. There is no need to try to measure every intended learning outcome. It is best to choose measures for one or two learning outcomes, preferably those which relate to the highest priorities for the service or organisation.

Recommendation 5: Use a very small number of outcome measures. Choose measures relating to the topics of highest priority to the service or organisation commissioning the training.

Outcome measures should be chosen which are capable of revealing any problems with the training and of demonstrating improvements. There is no point in measuring something which will come out 100% successful or 100% unsuccessful.

Recommendation 6: Outcome measures should be chosen which give an intermediate score so that issues can be highlighted and improvements demonstrated.

7. Practical considerations for conducting an outcome evaluation

Once the aims of the training have been specified and outcome measures chosen and agreed in partnership with the training provider, the final consideration is what resources must be provided to measure the outcomes. Whether the intended learning outcomes from training are classified as increased knowledge, skills or understanding, a commissioner will be interested in how these have been applied in participants’ practice. It will therefore be necessary to contact course participants some time after the training, when they have had an opportunity to put into practice what they learned. Contacting participants for post-course evaluations involves extra time, trouble and expense, and some training commissioners who were interviewed cited this as a reason for not conducting outcome evaluations. This section considers the following practicalities of conducting an outcome evaluation:

• How to get sufficient feedback in a post-course evaluation
• How to minimise the resources required
• How to get good-quality feedback

7.1 Maximising the response rate

An outcome evaluation does not require a very high response rate. The aim is to assess quality rather than to produce research-level data. However, training courses and workshops often include only a small number of participants, and unless a reasonable proportion of them provide feedback the evaluation is not useful. It is a good idea to decide on a minimum number of responses for an acceptable result before conducting the evaluation. For the evaluations which formed part of this project this minimum number was set at 5 participants or 10% of participants (whichever was smaller); a sketch of this check appears after the list of data-collection methods below. Two of the evaluations shown in Table 1 were considered failures because they did not achieve this minimum response level.

Recommendation 7: Do not expect to collect evaluation data from every participant who attends the training, but set a minimum acceptable number of responses in advance.

Different data-collection methods produced different response rates. Table 1 shows that focus groups, face-to-face interviews and telephone interviews produced higher response rates than surveys. However, no single method proved best for every group of staff. The advantages and disadvantages of each method for different staff groups are summarised as follows:

• Face-to-face interviews: found to be the most effective method for collecting feedback from nursing assistants and nurses, who responded best to personal contact. The most inclusive method: people with sensory disabilities, learning disabilities, those with poor standards of literacy and those who speak English as an additional language (or not at all) can participate.
• Focus groups: found to be reasonably effective for nurses and nursing assistants, but less effective than face-to-face interviews because not everyone is prepared to speak in a group. Achieved a 100% response rate when supplemented by other measures to collect feedback from ‘silent’ group members.
• Telephone interviews: found to be effective for healthcare professionals. Support staff found it disconcerting to be questioned on the telephone by someone they had not met. An effective means of collecting feedback from staff working weekend, twilight-hours or night shifts.
• On-line surveys: the most effective method for doctors and for administration staff who use computers as an integral part of their role. Less effective with nurses, who responded better to personal contact.
• Paper questionnaires: found to be the least effective method. This method was tried only once in the project; the evaluation failed because there were very few responses.
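The minimum-response rule described above is simple arithmetic; the following minimal sketch (not project code) expresses it as a reusable check. The report does not state how fractional percentages were rounded, so the rounding used here is an assumption.

    # Minimal sketch (not project code): the minimum-response rule used in this
    # project - 5 respondents or 10% of participants, whichever is smaller.
    # Rounding of fractional percentages is an assumption.

    def minimum_responses(participants):
        """Smallest acceptable number of responses for a cohort of this size."""
        return min(5, max(1, round(participants * 0.10)))

    def evaluation_usable(participants, responses):
        """True if the evaluation met the pre-set minimum response level."""
        return responses >= minimum_responses(participants)

    # A 2-response evaluation of a 35-person course fails the check (needs 4);
    # 3 responses from a 23-person course pass (10% of 23 rounds to 2).
    print(evaluation_usable(35, 2))  # False
    print(evaluation_usable(23, 3))  # True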
Recommendation 8: Tailor the data-collection method used for measuring the outcomes to the requirements and preferences of the particular group(s) of staff who attend the training.

Participants may be more willing to take part in the outcome evaluation, and the quality of their responses will be higher, if their anonymity is preserved.

Recommendation 9: Ensure that the data-collection method preserves the anonymity of the participants in order to maximise the quality of feedback.

7.2 Resources required

This project demonstrated that conducting an outcome evaluation need not be onerous or resource-intensive. However, some additional resources will be required. The resources needed for the various types of evaluation are summarised below:

• On-line surveys: on-line survey tools make designing and administering a survey very quick and simple. Data is collected automatically and can be downloaded in several formats. Participants must have access to the internet to respond to the survey. Easily the most efficient method if feedback is required from a large number of people.
• Paper questionnaires: require substantially more resources than an on-line survey for designing, printing, reproducing and distributing the survey and for entering the responses onto the computer. Administrative support will be required.
• Telephone interviews: quick and efficient for relatively small groups of people (fewer than about 30) when participants have easy access to the telephone at work.
• Face-to-face interviews: not as slow as might be thought where a group of trainees work in the same location. During the project up to 8 participants were interviewed in a one-hour visit to a ward. Not suitable when feedback is required from large numbers of participants.
• Focus groups: require the most resources if arranged as a ‘free-standing’ group. However, the focus group included in this project was planned as an integral component of the training and thus required no extra resources at all.

Recommendation 10: Design the outcome evaluation as an integral part of the training in order to maximise response rates and minimise resource requirements, e.g. by making provision of an attendance certificate conditional on completing the evaluation survey. For extended training programmes this could involve including planned evaluation sessions as part of the training programme.

Whatever the data-collection method chosen, additional administrative resources will be required to analyse the results of the outcome evaluation and report them to the relevant stakeholders.
The amount and nature of the resources needed to conduct outcome evaluations will thus depend on how many evaluations a commissioner chooses to carry out and the methods chosen for conducting them. If the data collection is to be carried out by service providers and reported to the commissioner, the necessary resources should be provided within the service contract.

Recommendation 11: Identify the resources required to implement an outcome evaluation at the time when the outcome evaluation is specified and include the necessary resource in contracts.

7.3 Quality of feedback

In the course of the project it was found that the key consideration for obtaining useful feedback was careful design of the evaluation questions. One evaluation shown in Table 1 failed because errors in design led to some respondents misinterpreting questions.

Recommendation 12: Keep evaluation questions short, simple and unambiguous to avoid misinterpretation.

A second consideration is the total number of questions included in the evaluation. The likelihood of respondents omitting to answer some of the questions increases as the survey gets longer. Some surveys issued in this project were too long, and the high proportion of missing answers created a problem when analysing the responses.

Recommendation 13: Do not ask for more than 5 answers in an evaluation survey, including answers to parts of questions; i.e. respondents should not be expected to tick more than 5 boxes in total. The fewer questions asked, the better.

It was found to be useful to include at least one ‘open’ question asking for general comments on the training. Although most respondents will not answer such a question, any answers that are received tend to be unexpected and useful.

Recommendation 14: Include an ‘open’ question requesting general comments.
8. Using the results of an outcome evaluation

Outcome evaluation produces tangible benefits for all training stakeholders: training participants, training providers, training commissioners and strategic management.

Post-course evaluation encourages training participants to reflect on what they have learned and its application to their roles. Evaluation of the RARPA pilot projects demonstrated that the benefits of the quality framework included better motivation, faster progress and better engagement with learning.⁸ These benefits arise from learners having a sense that someone is paying attention to what they are learning, rather than just leaving them to get on with it.

Outcome evaluation is useful for trainers and training providers because it provides them with feedback on their role. Examples of how the training providers who participated in this project have used their evaluation results include the following:

• Adjustment of the course content
• Adjustment of the training delivery methods
• Alterations to the grades of staff admitted to the training
• As evidence for senior management of the value of the training
• To support applications for funding to continue the training

91% of the training providers who participated in the project reported that they would definitely take part in an outcome evaluation again; 9% reported that they would probably do so.

For training commissioners the evaluation results form a vital component of the commissioning cycle, allowing commissioners to assess value for money and to work in partnership with providers to improve quality and productivity. For strategic management the results help to track progress towards organisational goals.

During the project a number of different stakeholders expressed concerns about how the results of outcome evaluations might be used. These concerns included the following:

• Selection of participants. Training providers often do not have control over who attends their training, and are sometimes left to deal with unsuitable, unwilling or ill-prepared participants. Training providers were concerned that, in order to achieve quality targets, it would be necessary for service management to share responsibility for the performance of the learners they send on the training.
• Support for application of learning in practice. Stakeholders thought that, in order to ensure quality targets are met, robust systems are needed to assist training participants in putting into practice what they have learned.
• Definition of course aims. The first step of the RARPA process is key to a successful quality assurance process. Unless commissioners give a clear definition of the aims of the training, the training provider cannot deliver quality outcomes.

⁸ LSDA and NIACE (2004) Recognising and Recording Progress and Achievement in Non-Accredited Learning: Summary of the Evaluation Report on the Pilot Projects April 2003 to March 2004, Leicester, NIACE
These concerns highlight the importance of putting systems in place to support the process when piloting or implementing an outcome evaluation. Introducing an outcome measure into a contract without consideration of the necessary supporting systems may do more harm than good.

Recommendation 15: When piloting a quality measure, consider external processes and factors which may influence how participants apply their learning from the training and how support for these processes may be needed to achieve the target. For example, working practices may need to be changed to give participants opportunities to apply what they have learned.

Recommendation 16: Inform all stakeholders in advance of how the results of the outcome evaluation will be used.

Recommendation 17: When commissioning training, ensure that the managers of the staff who attend bear joint responsibility for the achievement of outcome quality measures, e.g. for ensuring their staff attend, for supporting the application of learning in practice and for collecting and reporting evaluation data where required.

9. Examples of draft outcome measures for training

Using the results obtained in this project, draft outcome measures are proposed for four different kinds of non-accredited training:

• In-house training, e.g. provided by a Specialist Palliative Care team or Clinical Facilitator, where there is an explicit intention that participants’ practice will be improved by the training
• A communication skills training course commissioned from an external training provider, e.g. a hospice, a training agency or a higher education institution
• A more general training course or conference combining elements of awareness-raising and skills improvement
• An awareness-raising session included as part of corporate induction for all staff.

Since these four types of training are designed to deliver different outcomes, the suggested outcome measure is different in each case. The expected outcomes of the training must be clearly set out in the training specification so that appropriate outcome measures can be chosen when the training is commissioned.

The results shown in Table 1 suggest that it is reasonable to set a target value of 80% for the proportion of people attending communication skills training who report a benefit from it some months afterwards. If fewer than 80% of participants report a benefit, an enquiry should be made to identify the factors which are limiting the training’s effectiveness; a minimal sketch of such a screening check follows.
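As an illustration of how a commissioner might screen evaluation results against this target, the sketch below flags courses falling under 80%. The course names and figures are invented for the example; this is not project code.

    # Minimal sketch (not project code): screening measured effectiveness
    # against the 80% target proposed above. Names and figures are examples.

    TARGET = 0.80

    def courses_needing_enquiry(results):
        """Return courses whose reported-benefit proportion is below target."""
        return [course for course, proportion in results.items()
                if proportion < TARGET]

    results = {"Course A": 1.00, "Course B": 0.72, "Course C": 0.85}
    for course in courses_needing_enquiry(results):
        print(f"{course}: below {TARGET:.0%} target - investigate limiting factors")
    # Prints: Course B: below 80% target - investigate limiting factors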
Example 1: In-house training provided by a Specialist Palliative Care team

• Proportion of the non-specialist palliative care workforce caring for patients with palliative and end of life care needs, provided with teaching and support, reporting increased confidence and competence in practice 2-3 months after the teaching intervention: 80%

Example 2: Communication skills training course commissioned from an external training provider

• Proportion of participants attending the course reporting changes in their practice 2-3 months after attending: 80%

Example 3: Conference combining skills training and awareness-raising

• Proportion of participants attending the conference reporting increased knowledge, understanding or competence 2-3 months after attending: 80%

Example 4: Awareness-raising training at Corporate Staff Induction

• Proportion of participants attending the session reporting increased awareness of specific issues/needs relating to end of life care 2-3 months after attending: 80%

These draft outcome measures are not firm recommendations but are intended to form a basis for discussions between training providers and commissioners who wish to introduce outcome measures into their commissioning. Further work will be required to develop and agree outcome measures for inclusion in commissioners’ contracts.

10. Further information

Further information on the results of the NE and SE London Communication Skills Training for End of Life Care Pilot Project can be found in the following reports, obtainable from the NE London Cancer Network website:

• Report on Evaluation of Advanced Communication Skills Training
• Evaluation of Communication Skills training for End of Life Care
• Training Commissioning by NHS Trust Training Leads in NE and SE London
Acknowledgements

The NE and SE London Cancer Networks would like to thank the following partner organisations for their contributions to this work:

Barking, Havering and Redbridge University Hospitals NHS Trust
Barts and the London NHS Trust
Bexley Care Trust
City University
Coloma Court Nursing Home
East London NHS Foundation Trust
Greenwich and Bexley Community Hospice
Guy’s and St Thomas’ NHS Foundation Trust
Homerton University Hospital NHS Foundation Trust
King’s College Hospital NHS Foundation Trust
Lewisham University Hospital NHS Trust
Macmillan Cancer Support
Marie Curie Delivering Choice Programme
Newham University Hospital NHS Trust
NHS Barking and Dagenham
NHS Bromley
NHS City and Hackney
NHS Greenwich
NHS Havering
NHS Lambeth
NHS Lewisham
NHS Newham
NHS Redbridge
NHS Southwark
NHS Tower Hamlets
NHS Waltham Forest
North East London NHS Foundation Trust
Outer North East London Community Services
Oxleas NHS Foundation Trust
St Christopher’s Hospice
St Francis’ Hospice
St Joseph’s Hospice
South London and the Maudsley NHS Foundation Trust
South London Healthcare NHS Trust
Whipps Cross University Hospital NHS Trust

©North East London Cancer Network 2010