Myscience is an initiative of the White Rose University Consortium
(comprising the Universities of Leeds, Sheffield and York) and
Sheffield Hallam University. Myscience manages the national
network of Science Learning Centres, the National STEM Centre
and other programmes supporting STEM education.
Registered Office:
Myscience.co Limited, University of York, Heslington, York, North Yorkshire, YO10 5DD
Tel +44 (0) 1904 328300 Fax +44 (0) 1904 328328 Email enquiries@national.slcs.ac.uk
Websites www.sciencelearningcentres.org.uk and www.nationalstemcentre.org.uk
Myscience.co Limited, registered in England and Wales, Company Number 05081097.
Effective evaluation of the impact of CPD: What does research
tell us?
Background
Over the past ten years, a number of systematic reviews have supported the evidence for the critical role of the teacher in pupil outcomes (Curee, 2012; Schleicher, 2011).
The central question is:
“How do we determine the effects and effectiveness of activities designed to enhance the professional
knowledge and skills of educators so that they might in turn improve the learning of students?”
Guskey (2000)
Well-designed evaluation serves multiple purposes. It tells us about the quality of current practice as
well as informing us about how to develop this practice (Guskey, 2000).
Historically, evaluation has been seen as costly and time-consuming, and has often been left to ‘experts’ who are called in at the end and asked to determine whether what has been done has made a difference. As a result, much evaluation of CPD programmes has focused on collecting ‘soft’ evidence of teachers’ reactions to the CPD, with limited regard to collecting more rigorous evidence of either teacher development or the impact on pupils’ outcomes.
This paper is concerned with identifying how to evaluate the impact of CPD on teachers’ development and pupil outcomes, and with the implications for the provision of CPD to enable this to occur.
What is the research evidence of effective professional development?
We start from the premise that effective CPD should deliver outcomes for both teacher development and pupil learning, so what can research tell us about what needs to be included in CPD to support this process?
The CUREE report (2012) draws on published research and other evidence to address the question
of: “what are the characteristics of high quality professional learning for practitioners in education?”
The main focus of the report is the features of professional learning, for teachers and their leaders,
which lead to benefits for their pupils and students. They conclude that CPD for teachers is more
likely to benefit students if it is:
• collaborative – involves staff working together, identifying starting points, sharing evidence about practice and trying out new approaches.
• supported by specialist expertise – usually drawn from beyond the learning setting.
• focused on aspirations for students – which provides the moral imperative and shared focus.
• sustained over time – professional development sustained over weeks or months had substantially more impact on practice benefiting students than shorter engagement.
• exploring evidence from trying new things – to connect practice to theory, enabling practitioners to transfer new approaches and practices, and the concepts underpinning them, to multiple contexts.
The CPD approaches which demonstrated the characteristics linked to effectiveness included:
• collaborative enquiry – peer-supported, collaborative, evidence-based learning activities taking place over an extended period, coupled with risk taking and structured professional dialogue about evidence.
• coaching and mentoring – a vehicle for contextualising CPD and for embedding enquiry-oriented learning in day-to-day practice.
• networks – collaborations within and between schools that depend on, and are propelled to success by, CPD.
• structured dialogue and group work – practised in pairs and small groups, providing multiple opportunities for exploring beliefs and assumptions, trying out new approaches and giving and receiving structured feedback.
However, they found that, in the main, CPD practice in England does not reflect these characteristics. For example, despite the strength of evidence about the effectiveness of coaching, Ofsted found that approaches to coaching were generally very weak (Ofsted, 2006). Additionally, Curee (2012) reviewed a number of international and UK-based studies that expand, build understanding of, or reinforce the knowledge base about effective CPD and its impact on student outcomes. These found that an average of 49 hours spent on staff CPD over a year boosted student achievement by 21 percentile points, whereas more limited time (5 to 14 hours) showed no statistically significant effect on student learning.
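As a rough illustration of what a percentile-point gain means in the effect-size terms used by meta-analyses such as Hattie (2009): under the usual assumption of normally distributed outcomes, a standardised effect size maps to a percentile gain via the normal CDF. The function name and the example value of 0.55 below are illustrative assumptions, not figures from the studies cited:

```python
from math import erf, sqrt

def percentile_gain(effect_size: float) -> float:
    """Percentile-point gain for an average student whose outcome is shifted
    by `effect_size` standard deviations, assuming normal outcomes."""
    phi = 0.5 * (1.0 + erf(effect_size / sqrt(2.0)))  # standard normal CDF
    return 100.0 * phi - 50.0

# An effect size of about 0.55 SD corresponds to roughly 21 percentile
# points, comparable to the gain CUREE associates with ~49 hours of CPD.
print(round(percentile_gain(0.55), 1))  # prints 20.9
```

This is only a reading aid: it shows why a "21 percentile point" gain and a "moderate" effect size of around half a standard deviation describe the same magnitude of improvement.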
Guskey (2000) developed imperatives for improving professional development: the CPD needs to be an ongoing process that supports change; it should include a series of extended, job-embedded learning experiences that give educators the opportunity to discuss, think about, try out and develop new practices in an environment that values enquiry and experiment; and it should be part of a wider, intentional process to make positive changes and improvements, rather than mainly a remedy for those professionals doing an inadequate job.
The recent TALIS (2013) survey reports that teachers across 36 European countries indicated that CPD in different topics has had an impact on their teaching. The proportions reporting a moderate or large positive impact were:
• CPD on subject knowledge: 66% (50% in England)
• CPD on the pedagogy of their subject fields: 59% (45% in England)
• CPD on student evaluation and assessment practices: 47% (50% in England)
• CPD on knowledge of the curriculum: 47% (42% in England)
• CPD on ICT skills for teaching: 44% (25% in England)
The survey does not give details of exactly how the CPD had impacted on their teaching.
What is effective practice in evaluating the impact of professional development?
Guskey (2000) describes three different types of evaluation:
1. Planning evaluation. Checking that the CPD is fit for purpose, has clearly identified goals and outcomes, and that these are achievable and situated in the existing research.
2. Formative evaluation. The purpose of this type of evaluation is to provide ongoing evidence
of whether expected progress is being made by considering intermediate benchmarks of
success to determine what is working as expected and what difficulties must be overcome.
3. Summative evaluation. This evaluation is conducted at the completion of a programme or
activity to make judgements about its overall success. It describes what was accomplished,
what were the consequences (positive and negative), what were the final results (intended and
unintended), and whether benefits justified the costs. Unlike formative evaluation, which is
used to guide improvements, summative evaluations provide decision-makers with the
information they need to make crucial decisions about the long term future of a programme or
activity.
These different phases of evaluation require different data to be collected using a variety of methods.
Test scores for example are often used in summative evaluations, while interviews and survey data
may be used more often to guide formative evaluation (Goodall et al, 2005).
Burns (2007) acknowledges that educational research has been weak in its ability to continuously develop and refine a body of knowledge that is accepted as valid and reliable. So the
evaluation of professional development needs to identify what CPD practices are most likely to impact
on pupil outcomes and this needs to guide reforms in CPD specifically and educational programmes
in general.
In England, accountability measures for schools and colleges require them to show the impact of their
actions. For example, the revised Ofsted Framework (September 2013) requires schools to show the
impact of professional development on the quality of teaching overall and also on specific teachers.
To achieve an ‘outstanding judgement’ on leadership and management, schools need to focus
relentlessly on improving teaching and learning and provide focused professional development for all
staff which has a positive impact on pupil outcomes and teacher development. To do this effectively,
senior leaders need to collect evidence of the impact of the professional development over a period of
time using tools such as staff and pupil voice, lesson observations, recording and sharing reflections
as part of mentoring and coaching, and work scrutiny. Effective analysis of data, both formative and
summative, can be used to monitor impact on short, medium and long term pupil outcomes.
How can we improve the impact of evaluations on CPD?
Guskey outlines three key reasons why evaluations may not be as effective as they could be:
1. A focus on documentation rather than evaluation. Many evaluations consist merely of
summarising the activities undertaken as part of the CPD programme.
2. The evaluation is too shallow and does not address meaningfully the indicators of success.
Where some evaluation does exist, this often takes the form of participation satisfaction
questionnaires.
3. Evaluations are typically brief one-off events, often not planned for, that take place after the
event. As most meaningful change will be long term the evaluation of CPD should reflect this.
One key barrier to evaluation in education may be the perceived need to demonstrate a causal
relationship between professional development and improvements in pupil outcomes. Most schools
are engaged in systematic reform involving the simultaneous implementation of multiple innovations
(Fullan, 1992). Unlike research and evaluation in healthcare or other fields, the ability to control other
factors is limited. In most cases there are too many intervening variables to allow for simple causal
inferences (Guskey, 2000). However, it is possible to collect a strong body of evidence about whether
or not CPD is contributing to specific gains in pupil outcomes. By clarifying goals and identifying the
desired impact, collecting the necessary evidence becomes easier. Evaluation is often difficult because the collection of evidence of impact is not planned for, or not gathered, early enough.
“Always seek proof but collect lots of evidence along the way.” (Guskey, 2000)
To ensure the systematic gathering of evidence Guskey suggests a clear set of evaluation guidelines:
• Planning guidelines
  o Clarify intended goals.
  o Assess the value of the goals.
  o Analyse the context and assess how it might influence implementation.
  o Estimate the programme’s potential to meet the desired goals. Include a thorough cost-benefit analysis.
  o Determine how the goals can be assessed. Decide, upfront, what evidence you trust.
  o Outline strategies for gathering evidence, including critical intermediate indicators that might be used to identify problems or forecast final results.
• Formative and summative guidelines
  o Gather and analyse evidence on all five of the levels described by Guskey (see below) as well as an ongoing cost-benefit analysis.
  o Prepare interim and final reports that are clear, meaningful, and comprehensible to those who will use the results.
Teachers often report that CPD improves their teaching and has an impact on pupils, but are
unable to provide real evidence of this impact. Planning for collecting evidence of impact is an
important part of effective evaluation and it is crucial to good CPD. Guskey (2000) proposes a
model of evaluation for CPD using five different levels:
Level 1: Participants’ reactions
Level 1 involves asking participants for their immediate reaction to the CPD.
Level 2: Participants’ learning from CPD
Level 2 ascertains the type and amount of the participants’ learning which has taken place –
this could be cognitive, affective or behavioural.
Level 3: Organisational support and change
Level 3 is about the impact the CPD has had on participants’ colleagues and institutions.
Organisations can help share and embed learning from CPD which can make it more
sustainable and have a motivational impact as well.
Level 4: Participants’ use of new knowledge and skills
Level 4 evaluates the impact that CPD has had over time (6 to 9 months) on the participants’
use of the new knowledge and skills and how this has impacted on classroom practice.
Evaluation at this level takes place over time, the length of which depends on the complexity
of the knowledge or skill to be acquired and the time participants have to develop their skills.
Level 5: Student outcomes
Level 5 is about the impact on pupils. Again this level of evaluation will take time but it is
important to ascertain how the teachers’ new knowledge or practices have improved pupils’
attainment and progress.
Guskey recommends working backwards, embedding pupil outcomes into planning the CPD activity
and its evaluation. This applies to all CPD whether within schools or colleges or externally provided
CPD to ensure that the final goal of improving pupil outcomes is central to the process.
Goodall et al (2005) suggest that, in addition to the levels described by Guskey, the cost-effectiveness of CPD should be part of its evaluation: CPD should not be undertaken if the cost to the system outweighs the benefits. The use of effect sizes of educational interventions in the meta-analysis undertaken by Hattie (2009), to assess the effectiveness of interventions on pupil outcomes, is an example of cost-benefit analysis in education. However, what we know about the cost-effectiveness of CPD is still fairly limited and not part of most ongoing evaluation processes.
What does the National Science Learning Network have to offer?
The purpose of any teacher CPD, particularly the subject-specific CPD provided by the National Science Learning Network, is to ensure a positive change in pupils’ outcomes, yet, as has been stated, the evaluative evidence to support this relationship is often weak or missing. The assessment often relies on qualitative data of teachers’ opinions rather than any data related to pupil progress. Based on the CPD programme for science teachers and support staff at the National Science Learning Centre, Kudenko and Hoyle (2013) investigated how embedding particular strategies into CPD increased the reported impact on students’ achievement, backed up by ‘hard’ and rigorous evidence.
Using the Guskey five levels of evaluation, the research analysed data from sources including impact
questionnaires, which teachers completed two or more months after each residential CPD session,
quantitative and qualitative accounts of teachers’ post-CPD actions and the evidence of outcomes
and impacts. Teachers tended to use two contrasting narratives to depict their post-CPD experience:
Narrative 1 was pupil-focused, structured and contained measurable evidence, while Narrative 2 described what teachers did and how, and what they thought about the CPD experience. It was found that teachers used structured measurable outcomes (Narrative 1) more
frequently when their CPD had contained specific training in ‘reflective practices’ and ‘action research’
methodologies. By including reflective practices in the CPD, teachers were more able to plan their
post-CPD actions and to collect evidence of pupil outcomes. This boosted their confidence and
motivation to continue to innovate, increased the sustainability of the changes from the CPD and
increased the dissemination to colleagues in school and beyond. This finding is consistent with the
studies that Curee (2012) noted on the value of ‘action research’ in CPD.
The National Science Learning Network has used this research to further develop its process for evaluating CPD, based on the Guskey (2000) model of evaluation.
The process map below outlines how the evaluation process is used in the Network’s ‘Impact toolkit’
to provide evidence of the impact of CPD over time. Documentation to support each step is available
as part of the Impact toolkit.
Essential to the CPD process is planning clear aims, objectives and outcomes, which are shared with participants prior to the CPD. Participants are given an opportunity to define their own learning objectives (Form B1), which helps tutors to tailor the CPD to their needs. During the CPD
participants are encouraged to record their learning on Form D1 and to agree actions or interventions
they are going to make as a result of the CPD. Participants are also encouraged to collect evidence of
their own knowledge, understanding or practice level as well as that of their colleagues and their
pupils. They identify their post-CPD actions or intervention and determine how they will collect
evidence of the impact of that intervention on themselves, their colleagues and their pupils (Form D2).
At the end of the CPD (or a series of CPD activities amounting to 0.5 day or more) participants
evaluate their immediate reaction (level 1 of Guskey’s model) through Form D3. About 6 to 8 weeks
after the CPD, participants are asked about their learning (Guskey’s level 2) using Form P1 and about
6 to 9 months after the CPD they are asked about the use of their learning (Guskey’s level 4) using
Form P2. The impact on colleagues (Guskey’s level 3) is evaluated through evidence collected on Forms P1 and P2, and through a longer-term evaluation at the end of the academic year (Form P3), which also includes evidence of impact on pupil outcomes.
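The evaluation timetable just described can be summarised as a simple data structure. The form names and timings are those given in the text; the representation itself is only an illustrative sketch, not part of the Impact toolkit:

```python
# Sketch of the Network's impact-evaluation timeline, pairing each Guskey
# level with the toolkit form used and when it is completed (per the text).
EVALUATION_TIMELINE = [
    {"guskey_level": 1, "focus": "participants' reactions",
     "form": "D3", "when": "at the end of the CPD"},
    {"guskey_level": 2, "focus": "participants' learning",
     "form": "P1", "when": "about 6-8 weeks after the CPD"},
    {"guskey_level": 3, "focus": "organisational support and change",
     "form": "P1, P2 and P3", "when": "ongoing, to end of academic year"},
    {"guskey_level": 4, "focus": "use of new knowledge and skills",
     "form": "P2", "when": "about 6-9 months after the CPD"},
    {"guskey_level": 5, "focus": "student outcomes",
     "form": "P3", "when": "at the end of the academic year"},
]

for step in EVALUATION_TIMELINE:
    print(f"Level {step['guskey_level']}: {step['focus']} "
          f"(Form {step['form']}, {step['when']})")
```

Laid out this way, the point of the process is visible at a glance: evidence for each Guskey level is planned for and collected at a specific point in time, rather than gathered in a single end-of-course questionnaire.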
The outcomes of this evaluation of impact process and the evaluation tools referred to on the process map can be found at: https://www.sciencelearningcentres.org.uk/impact-and-research/impact/impact-toolkit/ and https://www.sciencelearningcentres.org.uk/impact-and-research/impact/impact-toolkit-2013/
Discussion Points
• How can we encourage colleagues to have a positive, if critical, mindset about changing their attitudes and practices of evaluating the impact of CPD?
• How can Myscience consultants plan for the effective evaluation of the CPD they provide in schools or colleges?
• What are the challenges of supporting teachers to collect evidence of impact, particularly on student outcomes, as part of CPD?
• What adaptations might we need to evaluate different types of CPD, eg online or in-class support?
• How can Myscience consultants support schools to collect evidence of the impact of the CPD they are providing for schools or colleges?
• How does the need to evaluate the impact of professional development over time support the changing educational landscape in the UK?
changing educational landscape in the UK?
References
• Advisory Committee on Mathematics Education (ACME) (2007) Empowering teachers: success for learners. http://www.acme-uk.org/media/14054/acmepdreport2013.pdf
• Burns, T. and Schuller, T. (2007) Evidence in education: linking research and policy. OECD. http://www.oecd.org/edu/ceri/47435459.pdf
• Cordingley, P. (2000) Teacher perspectives on the accessibility and usability of research outputs. Paper presented at the British Educational Research Association Annual Meeting.
• Curee (2012) Understanding what enables high quality professional learning: a report on the research evidence. http://www.curee.co.uk/files/publication/[site-timestamp]/CUREE-Report.pdf
• Fullan, M. and Hargreaves, A. (1992) Teacher development and educational change. Falmer.
• Goodall, J., Day, C., Lindsay, G., Muijs, D. and Harris, A. (2005) Evaluating the impact of CPD. University of Warwick. http://www2.warwick.ac.uk/fac/soc/cedar/projects/completed05/contprofdev/cpdfinalreport05.pd
• Guskey, T. R. (2000) Evaluating professional development. Thousand Oaks, CA: Corwin Press.
• Hattie, J. (2009) Visible learning: a synthesis of over 800 meta-analyses relating to achievement. Routledge.
• Joyce, B. and Showers, B. (2002) Student achievement through staff development. ASCD.
• Kudenko, I. and Hoyle, P. (2013) How to enhance CPD to maximise the measurable impact on students’ achievement in science. ESERA conference and publication 2013.
• Ofsted (2006) The logical chain: continuing professional development in effective schools. Last accessed at http://www.ofsted.gov.uk/resources/logical-chain-continuing-professional-development-effective-schools on 25/09/2014.
• Schleicher, A. (2011) Building a high-quality teaching profession: lessons from around the world. OECD.
• TALIS (2013) Teaching and Learning International Survey 2013: main findings from the survey and implications for education and training policies in Europe. http://ec.europa.eu/education/library/reports/2014/talis_en.pdf
What philosophical assumptions drive the teacher/teaching standards movement ...
 
Panel Debate: An Uncertain Future - TEF, Retention, and Student Success
Panel Debate: An Uncertain Future - TEF, Retention, and Student SuccessPanel Debate: An Uncertain Future - TEF, Retention, and Student Success
Panel Debate: An Uncertain Future - TEF, Retention, and Student Success
 
Panel Debate: An Uncertain Future - TEF, Retention, and Student Success
Panel Debate: An Uncertain Future - TEF, Retention, and Student SuccessPanel Debate: An Uncertain Future - TEF, Retention, and Student Success
Panel Debate: An Uncertain Future - TEF, Retention, and Student Success
 
Learning and development activities
Learning and development activitiesLearning and development activities
Learning and development activities
 
Contemporary perspectives on continuing professional development
Contemporary perspectives on continuing professional developmentContemporary perspectives on continuing professional development
Contemporary perspectives on continuing professional development
 
8461135_Mass-communication.pdf
8461135_Mass-communication.pdf8461135_Mass-communication.pdf
8461135_Mass-communication.pdf
 
Presentation to ResearchED London Sept 9th 2017
Presentation to ResearchED London Sept 9th 2017Presentation to ResearchED London Sept 9th 2017
Presentation to ResearchED London Sept 9th 2017
 
Frances Raines In the past, I have worked with a pr
Frances Raines In the past, I have worked with a prFrances Raines In the past, I have worked with a pr
Frances Raines In the past, I have worked with a pr
 
Assessment For Learning Effects And Impact
Assessment For Learning  Effects And ImpactAssessment For Learning  Effects And Impact
Assessment For Learning Effects And Impact
 
Using a standards alignment model as a framework for doctoral candidate asses...
Using a standards alignment model as a framework for doctoral candidate asses...Using a standards alignment model as a framework for doctoral candidate asses...
Using a standards alignment model as a framework for doctoral candidate asses...
 
Determinants of Lecturers Assessment Practice in Higher Education in Somalia
Determinants of Lecturers Assessment Practice in Higher Education in SomaliaDeterminants of Lecturers Assessment Practice in Higher Education in Somalia
Determinants of Lecturers Assessment Practice in Higher Education in Somalia
 
Fs2
Fs2Fs2
Fs2
 
A
AA
A
 
article
articlearticle
article
 
Equity-Efficiency-Effectiveness through Assessment Levers - 1-23-15
Equity-Efficiency-Effectiveness through Assessment Levers - 1-23-15Equity-Efficiency-Effectiveness through Assessment Levers - 1-23-15
Equity-Efficiency-Effectiveness through Assessment Levers - 1-23-15
 
Computer Assisted Learning - 2008 - Swinglehurst - Peer observation of teachi...
Computer Assisted Learning - 2008 - Swinglehurst - Peer observation of teachi...Computer Assisted Learning - 2008 - Swinglehurst - Peer observation of teachi...
Computer Assisted Learning - 2008 - Swinglehurst - Peer observation of teachi...
 

3A - Effective evaluation of the impact of CPD - What does research tell us

We are starting with the premise that effective CPD should have outcomes for both teacher development and pupil learning, so what can research tell us about what needs to be included in CPD to support this process?

The CUREE report (2012) draws on published research and other evidence to address the question:

“what are the characteristics of high quality professional learning for practitioners in education?”

The main focus of the report is the features of professional learning, for teachers and their leaders, which lead to benefits for their pupils and students. They conclude that CPD for teachers is more likely to benefit students if it is:

 collaborative – involves staff working together, identifying starting points, sharing evidence about practice and trying out new approaches.
 supported by specialist expertise – usually drawn from beyond the learning setting.
 focused on aspirations for students – which provides the moral imperative and shared focus.
 sustained over time – professional development sustained over weeks or months had substantially more impact on practice benefiting students than shorter engagement.
 exploring evidence from trying new things – to connect practice to theory, enabling practitioners to transfer new approaches and practices, and the concepts underpinning them, to multiple contexts.

The CPD approaches which demonstrated the characteristics linked to effectiveness included:

 collaborative enquiry – peer-supported, collaborative, evidence-based learning activities taking place over an extended period, coupled with risk taking and structured professional dialogue about evidence.
 coaching and mentoring – a vehicle for contextualising CPD and for embedding enquiry-oriented learning in day-to-day practice.
 networks – collaborations within and between schools depending upon, and propelled to success by, CPD.
 structured dialogue and group work – practised in pairs and small groups, providing multiple opportunities for exploring beliefs and assumptions, trying out new approaches and giving and receiving structured feedback.

However they found that, in the main, CPD practice in England does not often reflect these characteristics. For example, despite the strength of evidence about the effectiveness of coaching, Ofsted found that approaches to coaching were generally very weak (Ofsted, 2006).
Additionally, CUREE (2012) found, across a number of international and UK-based studies which help to expand, build understanding of, or reinforce the knowledge base about effective CPD which impacts on student outcomes, that an average of 49 hours spent on staff CPD over a year boosted student achievement by 21 percentile points, whereas more limited time (5 to 14 hours) showed no statistically significant effect on student learning.
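To make the percentile-point metric concrete: if student outcomes are roughly normally distributed, an effect size d shifts the average treated student from the 50th percentile of the comparison distribution to the Φ(d)·100-th percentile. The sketch below is illustrative only; the d value chosen is an example that happens to reproduce a gain on the scale reported above, not a figure taken from the CUREE report.

```python
from statistics import NormalDist

def percentile_gain(d: float) -> float:
    """Percentile-point gain for the average treated student, assuming
    normally distributed outcomes: 100 * (Phi(d) - 0.5)."""
    return 100 * (NormalDist().cdf(d) - 0.5)

# An effect size of roughly d = 0.54 corresponds to a gain of about
# 21 percentile points (the value of d here is illustrative).
print(f"{percentile_gain(0.54):.1f}")  # -> 20.5
```

This kind of conversion is why effect sizes and percentile gains can be read as two views of the same result, under the normality assumption stated in the comment.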
Guskey (2000) developed imperatives for improving professional development. The CPD needs to:

 be an ongoing process that is necessary to support change;
 include a series of extended, job-embedded learning experiences to provide educators with the opportunity to discuss, think about, try out and develop new practices in an environment that values enquiry and experiment; and
 be part of a wider intentional process to make positive changes and improvements, rather than being mainly for those professionals doing an inadequate job.

The recent TALIS (2013) survey reports that teachers across 36 European countries indicated that CPD on different topics has had an impact on their teaching. For example, 66% of teachers reported a moderate or large positive impact of professional development from CPD about subject knowledge (50% in England); 59% reported a moderate or large positive impact from CPD on the pedagogy of their subject fields (45% in England); 47% from CPD focusing on student evaluation and assessment practices (50% in England); 47% from CPD on knowledge of the curriculum (42% in England); and 44% from CPD addressing ICT skills for teaching (25% in England). The survey does not give details of exactly how the CPD had impacted on their teaching.

What is effective practice in evaluating the impact of professional development?

Guskey (2000) describes three different types of evaluation:

1. Planning evaluation. Checking that the CPD is fit for purpose, has clearly identified goals and outcomes, and that these are achievable and situated in the existing research.
2. Formative evaluation. The purpose of this type of evaluation is to provide ongoing evidence of whether expected progress is being made, by considering intermediate benchmarks of success to determine what is working as expected and what difficulties must be overcome.
3. Summative evaluation. This evaluation is conducted at the completion of a programme or activity to make judgements about its overall success. It describes what was accomplished, what the consequences were (positive and negative), what the final results were (intended and unintended), and whether the benefits justified the costs. Unlike formative evaluation, which is used to guide improvements, summative evaluation provides decision-makers with the information they need to make crucial decisions about the long-term future of a programme or activity.
These different phases of evaluation require different data to be collected using a variety of methods. Test scores, for example, are often used in summative evaluations, while interviews and survey data may be used more often to guide formative evaluation (Goodall et al, 2005).

Burns (2007) acknowledges that educational research has been weak in its ability to continuously develop and refine a body of knowledge which is acknowledged as valid and reliable. So the evaluation of professional development needs to identify which CPD practices are most likely to impact on pupil outcomes, and this needs to guide reforms in CPD specifically and in educational programmes in general.

In England, accountability measures for schools and colleges require them to show the impact of their actions. For example, the revised Ofsted Framework (September 2013) requires schools to show the impact of professional development on the quality of teaching overall and also on specific teachers. To achieve an ‘outstanding’ judgement on leadership and management, schools need to focus relentlessly on improving teaching and learning and provide focused professional development for all staff which has a positive impact on pupil outcomes and teacher development. To do this effectively, senior leaders need to collect evidence of the impact of the professional development over a period of time, using tools such as staff and pupil voice, lesson observations, recording and sharing reflections as part of mentoring and coaching, and work scrutiny. Effective analysis of data, both formative and summative, can be used to monitor impact on short, medium and long term pupil outcomes.

How can we improve the impact of evaluations on CPD?

Guskey outlines three key reasons why evaluations may not be as effective as they could be:

1. A focus on documentation rather than evaluation. Many evaluations consist merely of summarising the activities undertaken as part of the CPD programme.
2. The evaluation is too shallow and does not address meaningfully the indicators of success. Where some evaluation does exist, it often takes the form of participant satisfaction questionnaires.
3. Evaluations are typically brief one-off events, often not planned for, that take place after the event. As most meaningful change will be long term, the evaluation of CPD should reflect this.

One key barrier to evaluation in education may be the perceived need to demonstrate a causal relationship between professional development and improvements in pupil outcomes. Most schools are engaged in systematic reform involving the simultaneous implementation of multiple innovations (Fullan, 1992). Unlike research and evaluation in healthcare or other fields, the ability to control other factors is limited. In most cases there are too many intervening variables to allow for simple causal
inferences (Guskey, 2000). However, it is possible to collect a strong body of evidence about whether or not CPD is contributing to specific gains in pupil outcomes. By clarifying goals and identifying the desired impact, collecting the necessary evidence becomes easier. Evaluation is often difficult because the collection of evidence of impact is not planned for, or gathered, early enough.

“Always seek proof but collect lots of evidence along the way” (Guskey, 2000)

To ensure the systematic gathering of evidence, Guskey suggests a clear set of evaluation guidelines:

 Planning guidelines
o Clarify intended goals.
o Assess the value of the goals.
o Analyse the context and assess how it might influence implementation.
o Estimate the programme’s potential to meet the desired goals. Include a thorough cost-benefit analysis.
o Determine how the goals can be assessed. Decide, upfront, what evidence you trust.
o Outline strategies for gathering evidence, including critical intermediate indicators that might be used to identify problems or forecast final results.
 Formative and summative guidelines
o Gather and analyse evidence on all five of the levels described by Guskey (see below), as well as an ongoing cost-benefit analysis.
o Prepare interim and final reports that are clear, meaningful, and comprehensible to those who will use the results.

Teachers often report that CPD improves their teaching and has an impact on pupils, but they are unable to provide real evidence of this impact. Planning for the collection of evidence of impact is an important part of effective evaluation and is crucial to good CPD. Guskey (2000) proposes a model of evaluation for CPD using five different levels:

Level 1: Participants’ reactions
Level 1 involves asking participants for their immediate reaction to the CPD.

Level 2: Participants’ learning from CPD
Level 2 ascertains the type and amount of the participants’ learning which has taken place – this could be cognitive, affective or behavioural.
Level 3: Organisational support and change
Level 3 is about the impact the CPD has had on participants’ colleagues and institutions. Organisations can help share and embed learning from CPD, which can make it more sustainable and have a motivational impact as well.

Level 4: Participants’ use of new knowledge and skills
Level 4 evaluates the impact that CPD has had over time (6 to 9 months) on the participants’ use of the new knowledge and skills, and how this has affected classroom practice. Evaluation at this level takes place over a period whose length depends on the complexity of the knowledge or skill to be acquired and the time participants have to develop their skills.

Level 5: Student outcomes
Level 5 is about the impact on pupils. Again, this level of evaluation will take time, but it is important to ascertain how the teachers’ new knowledge or practices have improved pupils’ attainment and progress.

Guskey recommends working backwards, embedding pupil outcomes into the planning of the CPD activity and its evaluation. This applies to all CPD, whether within schools or colleges or externally provided, to ensure that the final goal of improving pupil outcomes is central to the process.

Goodall et al (2005) suggest that, in addition to the levels described by Guskey, the cost effectiveness of CPD should be part of its evaluation: CPD should not be undertaken if the cost to the system outweighs the benefits. The use of effect sizes of educational interventions in the meta-analysis undertaken by Hattie (2009), to assess the effectiveness of interventions on pupil outcomes, is an example of cost-benefit analysis in education. However, what we know about the cost effectiveness of CPD is still fairly limited, and it is not part of most ongoing evaluation processes.

What does the National Science Learning Network have to offer?

The purpose of any teacher CPD, particularly the subject-specific CPD provided by the National Science Learning Network, is to ensure a positive change in pupils’ outcomes, yet, as has been stated, the evaluative evidence to support this relationship is often weak or missing. The assessment often relies on qualitative data of teachers’ opinions rather than any data related to pupil progress. Based on the CPD programme for science teachers and support staff at the National Science Learning Centre, Kudenko and Hoyle (2013) investigated how embedding particular strategies into CPD increased the reported impact on students’ achievement, backed up by ‘hard’ and rigorous evidence.
Using Guskey’s five levels of evaluation, the research analysed data from sources including impact questionnaires, which teachers completed two or more months after each residential CPD session, quantitative and qualitative accounts of teachers’ post-CPD actions, and the evidence of outcomes and impacts. Teachers tended to use two contrasting narratives to depict their post-CPD experience: Narrative 1 was pupil-focused, structured and contained measurable evidence, while Narrative 2 contained a description of what teachers did and how, and what they thought about the CPD experience. It was found that teachers used structured measurable outcomes (Narrative 1) more frequently when their CPD had contained specific training in ‘reflective practices’ and ‘action research’ methodologies. By including reflective practices in the CPD, teachers were better able to plan their post-CPD actions and to collect evidence of pupil outcomes. This boosted their confidence and motivation to continue to innovate, increased the sustainability of the changes from the CPD, and increased dissemination to colleagues in school and beyond. This finding is consistent with the studies that CUREE (2012) noted on the value of ‘action research’ in CPD.

The National Science Learning Network has used this research to further develop its process of evaluating CPD, based on the Guskey (2000) model of evaluation. The process map below outlines how the evaluation process is used in the Network’s ‘Impact toolkit’ to provide evidence of the impact of CPD over time. Documentation to support each step is available as part of the Impact toolkit.

Essential to the CPD process is planning clear aims, objectives and outcomes, which are shared with participants prior to the CPD; participants are also given an opportunity to define their own learning objectives (Form B1), which helps tutors to tailor the CPD to their needs.
During the CPD, participants are encouraged to record their learning on Form D1 and to agree the actions or interventions they are going to make as a result of the CPD. Participants are also encouraged to collect evidence of their own knowledge, understanding or practice level, as well as that of their colleagues and their pupils. They identify their post-CPD actions or intervention and determine how they will collect evidence of the impact of that intervention on themselves, their colleagues and their pupils (Form D2).

At the end of the CPD (or a series of CPD activities amounting to 0.5 day or more), participants evaluate their immediate reaction (level 1 of Guskey’s model) through Form D3. About 6 to 8 weeks after the CPD, participants are asked about their learning (Guskey’s level 2) using Form P1, and about 6 to 9 months after the CPD they are asked about the use of their learning (Guskey’s level 4) using Form P2. The impact on colleagues (Guskey’s level 3) is evaluated through evidence collected on Forms P1 and P2 and through a longer-term evaluation at the end of the academic year (Form P3), which also includes evidence of impact on pupil outcomes.

The outcomes of this evaluation of impact process, and the evaluation tools referred to on the process map, can be found at:
https://www.sciencelearningcentres.org.uk/impact-and-research/impact/impact-toolkit/ and
https://www.sciencelearningcentres.org.uk/impact-and-research/impact/impact-toolkit-2013/

Discussion Points

 How can we encourage colleagues to have a positive, if critical, mindset about changing their attitudes and practices in evaluating the impact of CPD?
 How can Myscience consultants plan for the effective evaluation of the CPD they provide in schools or colleges?
 What are the challenges of supporting teachers to collect evidence of impact, particularly on student outcomes, as part of CPD?
 What adaptations might we need to evaluate different types of CPD, eg online, in-class support etc?
 How can Myscience consultants support schools to collect evidence of the impact of the CPD they are providing for schools or colleges?
 How does the need to evaluate the impact of professional development over time support the changing educational landscape in the UK?
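The Impact toolkit schedule described above can be summarised as a simple lookup, which may help when planning a comparable evaluation calendar. This is an illustrative sketch only: the data structure is hypothetical, while the form names, Guskey levels and timings are taken from the toolkit description in the text.

```python
# Illustrative summary of the Network's impact-evaluation schedule.
# The list structure is hypothetical; form names, Guskey levels and
# timings come from the Impact toolkit description above.
SCHEDULE = [
    # (Guskey level, what is evaluated, instrument, timing)
    (1, "Participants' reactions", "Form D3", "end of the CPD"),
    (2, "Participants' learning", "Form P1", "about 6-8 weeks after"),
    (3, "Organisational support and change", "Forms P1, P2 and P3", "6 weeks to end of year"),
    (4, "Use of new knowledge and skills", "Form P2", "about 6-9 months after"),
    (5, "Student outcomes", "Form P3", "end of the academic year"),
]

for level, focus, form, timing in SCHEDULE:
    print(f"Level {level}: {focus} via {form} ({timing})")
```

Laying the schedule out this way makes the key design point visible: evidence collection is planned at the start and spread across the year, rather than being a single post-event questionnaire.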
References

 Advisory Committee on Mathematics Education (ACME) (2007) Empowering teachers: success for learners. http://www.acme-uk.org/media/14054/acmepdreport2013.pdf
 Burns, T. and Schuller, T. (2007) Evidence in Education: Linking Research and Policy. OECD. http://www.oecd.org/edu/ceri/47435459.pdf
 Cordingley, P. (2000) Teacher Perspectives on the Accessibility and Usability of Research Outputs. Paper presented at the British Educational Research Association Annual Meeting.
 CUREE (2012) Understanding What Enables High Quality Professional Learning: A report on the research evidence. http://www.curee.co.uk/files/publication/[site-timestamp]/CUREE-Report.pdf
 Fullan, M. and Hargreaves, A. (1992) Teacher Development and Educational Change. Falmer.
 Goodall, J., Day, C., Lindsay, G., Muijs, D. and Harris, A. (2005) Evaluating the Impact of CPD. University of Warwick. http://www2.warwick.ac.uk/fac/soc/cedar/projects/completed05/contprofdev/cpdfinalreport05.pdf
 Guskey, T.R. (2000) Evaluating Professional Development. Thousand Oaks, CA: Corwin Press.
 Hattie, J. (2009) Visible Learning: A Synthesis of Over 800 Meta-Analyses Relating to Achievement. Routledge.
 Joyce, B. and Showers, B. (2002) Student Achievement through Staff Development. ASCD.
 Kudenko, I. and Hoyle, P. (2013) How to enhance CPD to maximise the measurable impact on students’ achievement in Science. ESERA conference and publication 2013.
 Ofsted (2006) The logical chain: continuing professional development in effective schools. Last accessed at http://www.ofsted.gov.uk/resources/logical-chain-continuing-professional-development-effective-schools on 25/09/2014.
 Schleicher, A. (2011) Building a High Quality Teaching Profession: Lessons from Around the World. OECD.
 TALIS (2013) Teaching and Learning International Survey 2013: Main findings from the survey and implications for education and training policies in Europe. http://ec.europa.eu/education/library/reports/2014/talis_en.pdf