Roland - Evaluating Education

Education as an Intervention. Damian Roland highlights how patient outcomes should be a benchmark for the quality of our teaching.

Published in: Health & Medicine
  • Hello my name is Damian. And I am a fraud. Well, maybe an imposter… I am at a conference on critical care and emergency medicine but am at heart a humble paediatrician. I am talking on educational evaluation but am neither experienced enough nor wish to be known as an educationalist. And at an event which rivals TED lectures I can neither pull off Simon's professorial 'pauses' nor match the charm of Chris Nickson. But being an imposter is a good thing. Imposters aim to understand what people are expecting to see. They have to know a concept and make it translatable so that both the expert and non-expert are drawn in. I hope this gives you perhaps a different insight.
  • My hope then is to try and lay out a framework so that anyone can look at an initiative or practice-changing intervention they are undertaking and determine what, if any, benefit it has. Clearly education is a challenging area; if it had been cracked we would not have needed to hear the fantastic talks from the previous two speakers.
  • I want to hit you early on with my first big take-home point. Education is everywhere. Virtually every interaction you have in the workplace can lead to learning. "We are always learning" is a really nice mantra, but it doesn't half make life difficult for those of us trying to work out which learning had the greatest effect. In some respects we have become sloppy in not putting the immunisation of a child ("an intervention") on a par with an ultrasound teaching course (equally "an intervention"). I can feel eyes beginning to roll already, so I hope the fact that the mode of delivery of the intervention was reported for only 52% of educational studies (Pino 2012) makes the point.
    You need to think of everything that you do with a health care professional as an intervention
  • There are a number of models/frameworks/strategies for examining education or training interventions. The most popular, although very much debated, is that created by Kirkpatrick. It's really important to recognise this was not a healthcare model, although its obvious face validity led it to become very popular. It is still used in systematic reviews of educational initiatives to this day.
  • It's pretty simple. It starts with determining a participant's reaction to a training programme.
  • It then asks for a demonstration of learning.
  • Followed by a test of whether there has been a change of behaviour, and finally
  • The results, or more precisely the return on investment, should be described. The term Kirkpatrick 'levels' is commonly used, but to be fair to Donald Kirkpatrick he didn't use the term levels in his original paper. However, the concept of a hierarchy has become implicit in the use of the system.
  • Since Kirkpatrick many people have made modifications to make it more healthcare-centric. Most retain a stepwise approach and often don't really deviate from the underlying premise.
  • One of the divergent schemas was proposed by Hakkennes 2006, where the level structure is taken out and example measures are given, giving the system a bit more weight. I'd like to pause here to consider a vital aspect of educational evaluation, my second take-home point. Does it matter if the content is crap if the outcome is good? Ross Fisher, a paediatric surgical consultant with a strong interest in presentation skills, would look at this slide and weep. Is this a good slide? Not really – are you honestly taking anything home from this other than what I have said? This highlights a real issue in the challenge of delivery and content. It is almost certain that the two previous speakers' lectures will be rated highly – is that because of the speaker, their material, or a bit of both? I'm pretty sure some of the speakers could talk absolute rubbish and still be educational. Natalie May is going to give a talk with no text at all, but having seen her present I am sure that the learning from that lecture will be fantastic. The idea of learning styles is becoming increasingly discredited, so I am very glad Victoria gave such an expansive appraisal of the types of learning.
  • This very thought prompted Sarah Yardley 2012 to examine a number of educational papers which utilised an outcome-focused model, and to discredit them. Just concentrating on outcomes was, in her opinion, akin to performing research on clinical drugs and not assessing potential side effects.
  • Yardley wants us to move from a situation of 'did a specific outcome occur?'
  • To 'what were the outcomes of this intervention?'
    This emphasises why thinking about education as an intervention is important. Importantly, it helps to determine what you don't think is education (I think Dave Davis once said someone asked him: if you don't think lectures are effective, why do you lecture so much?).
  • And we know this is really important. There is evidence that when problem-based learning was introduced, patient-centred outcomes such as clinical performance increased but performance on more traditional measures fell (Philip Davies 2000).
  • Statistical Conclusion Validity is the extent to which the statistical analysis can correctly identify a change if it is present.
    Internal Validity is the extent to which the intervention can reliably be ascribed to have effected the change.
    External Validity is the extent to which the intervention is reliable in different populations (this may be in terms of participants, settings, times, etc.).
    Construct Validity relates to the association between the concept you are investigating and the measures you use to test it.
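    As a minimal sketch of the intervention-versus-control logic behind internal validity, assuming a simple pre/post design with hypothetical score data (the group sizes, means and the scipy-based t-test below are illustrative assumptions, not figures from the talk):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        # Hypothetical pre/post test scores for an intervention and a control group
        pre_int,  post_int  = rng.normal(60, 8, 40), rng.normal(68, 8, 40)
        pre_ctrl, post_ctrl = rng.normal(60, 8, 40), rng.normal(62, 8, 40)

        change_int  = post_int  - pre_int     # change seen alongside the intervention
        change_ctrl = post_ctrl - pre_ctrl    # change seen with time/history alone

        t, p = stats.ttest_ind(change_int, change_ctrl)
        print(f"Extra change attributable to the intervention: "
              f"{change_int.mean() - change_ctrl.mean():.1f} points (p = {p:.3f})")

    Only the change beyond what the control group shows can reasonably be ascribed to the intervention.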



  • (This does NOT mean the concept of the intervention is what has actually delivered the change or that the outcome measured is relevant to the intervention)
  • Medical Education research often has difficulty in comparing two exactly matched populations which differ only by case and control (as the environment of study often mixes participants). Demonstrating external validity may mean different power calculations are needed for a different setting.
    The need to ensure adequate pre-testing (but avoid confounding by that very testing improving performance) often results in sample sizes which are difficult to obtain
    Concepts in medical education are often based on research which may not have had adequate internal validity to begin with. Therefore demonstrating adequate construct validity (which requires a conceptual framework) while maintaining the integrity of internal validity can be difficult.
    Concepts underpinning construct validity may only be relevant in certain populations making subsequent external validation difficult
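    As a rough illustration of the power calculation point, here is a minimal sketch using statsmodels (the effect size, alpha and power values are illustrative assumptions, not numbers from the talk):

        from statsmodels.stats.power import TTestIndPower

        # How many participants per group would an educational study need to detect
        # a 'medium' effect with a two-sided independent-samples t-test?
        n_per_group = TTestIndPower().solve_power(
            effect_size=0.5,   # assumed effect size (Cohen's d)
            alpha=0.05,        # criterion for statistical significance
            power=0.8,         # 1 minus the acceptable risk of a type II error
        )
        print(f"Approximately {n_per_group:.0f} participants needed in each group")

    A different setting (a smaller expected effect, a noisier outcome measure) changes these inputs, which is why demonstrating external validity may need its own power calculation.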




  • Half of respondents were actually unable to access the videos in their place of work.
  • Cameron – Ottawa rules example

    1. Evaluating Education
    2. This talk may save Your life
    3. This talk may save you strife Your life
    4. Evaluating Education bit.ly/evalref
    5. Education is rarely described as an intervention
    6. Reaction
    7. Reaction Learning
    8. Reaction Learning Behaviour
    9. Reaction Learning Behaviour Results
    10. bit.ly/evalref
    11. Did a specific outcome occur?
    12. What were the outcomes of this intervention?
    13. Statistical Conclusion Validity Internal Validity Construct Validity External Validity The four criteria of methodological quality according to Shadish, Cook and Campbell bit.ly/mededvalidity
    14. Time Change Intervention
    15. Time Change Intervention Population Population Population
    16. Population (e.g. Under- or Postgraduates) •Sample Size •Power Calculation •Criterion for Statistical Significance Used Time Change Intervention Statistical Conclusion Validity is the extent to which the statistical analysis can correctly identify a change if it is present Population •Was there a risk of a type II error? Population •Were all outcomes correctly powered? Population
    17. Time (Which can be hours, e.g. pre and post test, or up to years) Change (Which can be measured in a variety of ways) Intervention Internal Validity is the extent to which the intervention can reliably be ascribed to have effected the change Population Population Population bit.ly/mededvalidity
    18. Threats to Internal Validity are best analysed by considering an intervention and control group •History Another event happens at the same time as the intervention •Selection There are pre-existing differences between the groups •Regression to the Mean Extreme pre-test baseline performance may mean change happens by chance (e.g. unusually good or bad at outset) •Maturation There is already a pre-existing trend for change •Instrumentation There are changes in the methods of measurement (e.g. data collectors become more experienced) •Testing Pre-testing changes the performance of participants (e.g. promotes information seeking) •Differential Attrition Drop out from the study is not equal between groups Intervention Group Control Group (a small simulation of regression to the mean follows after this transcript)
    19. Time (Which can be hours, e.g. pre and post test, or up to years) Change Y (Which can be measured in a variety of ways) Intervention A Population Construct Validity relates to the association between the concept you are investigating and the measures you use to test it
    20. Time (Which can be hours, e.g. pre and post test, or up to years) Change (Which can be measured in a variety of ways) Intervention Population A External Validity is the extent to which the intervention is reliable in different populations (this may be in terms of participants, settings, times, etc.)
    21. Time (Which can be hours, e.g. pre and post test, or up to years) Change (Which can be measured in a variety of ways) Intervention Population B
    22. Statistical Conclusion Validity Internal Validity Construct Validity External Validity
    23. Interaction
    24. Interaction Interface
    25. Interaction Instruction (The educational Intervention) Interface
    26. Interaction Instruction (The educational Intervention) Ideation Interface
    27. Interaction Instruction (The educational Intervention) Ideation Integration i) Knowledge ii) Behaviour Interface
    28. Interaction Instruction (The educational Intervention) Ideation Integration i) Knowledge ii) Behaviour Implementation Interface
    29. Interaction Instruction (The educational Intervention) Ideation Integration i) Knowledge ii) Behaviour Implementation Interface Improvement
    30. Interaction Instruction (The educational Intervention) Ideation Integration i) Knowledge ii) Behaviour Implementation Interface Improvement The 7I's Framework
    31. Thanks SMACC Prof. Tim Coats + Dr. David Matheson Dr. Ffion Davies Staff/Patients at Leicester Royal Infirmary
    32. @damian_roland
    33. Education is rarely described as an intervention
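Slide 18 lists regression to the mean among the threats to internal validity. A minimal simulation sketch of that threat, assuming noisy test scores around a stable underlying ability (all numbers below are illustrative assumptions):

    import numpy as np

    rng = np.random.default_rng(0)
    true_ability = rng.normal(50, 10, 10_000)          # stable underlying performance
    pre  = true_ability + rng.normal(0, 10, 10_000)    # noisy pre-test
    post = true_ability + rng.normal(0, 10, 10_000)    # noisy post-test, no teaching in between

    weakest = pre < np.percentile(pre, 25)             # select the 'worst' quartile for remediation
    print(f"Pre-test mean of selected group:  {pre[weakest].mean():.1f}")
    print(f"Post-test mean of selected group: {post[weakest].mean():.1f}")
    # The selected group appears to improve even though nothing was done,
    # because extreme scores drift back towards their true level on re-testing.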