'Don't Tell Me, Show Me!: Using OSCEs in Education'
Nicky Lambert, Laura Foley, Tina Moore & Marion Hinds
OSCEs: Objective Structured Clinical Examinations
Human Skills
Professional Knowledge
and Skills
• Interpersonal skills, empathy, respect, kindness, compassion, care.
• Recovery focus, mental health knowledge, sound evidence base, safety, risk assessment, case formulation, clinical reasoning in mental health.
Integration and application of skills and knowledge, resulting in safe, compassionate practice for satisfied service users and stakeholders.
Objective measurement: all students have a similar
assessment experience.
Structured: specific tasks are assessed, organised to draw on information from a wide range of the curriculum.
Clinical Examination: students demonstrate specific skills and safe practice within an allocated time span, often in response to a simulated clinical scenario.
OSCEs are marked according to transparent criteria and are designed to allow the evaluation of clinical and theoretical knowledge and professional skills.
Pros & Cons
• High 'face validity' for practitioners.
• Grades correlate with those achieved in practice areas, 'balancing out' variable practice experiences.
• Addresses public concerns around compassion and safety.
• Students are positive: a more egalitarian form of assessment.
• Allows students to demonstrate higher-order skills such as problem solving and critical thinking, and to evidence professional behaviours.
• Resource intensive.
• Needs planning and structure.
• OSCEs can isolate skills that need to be practised as a whole to be clinically relevant.
• Stress and anxiety in OSCE participants.
Ensuring Quality
Structure: clear, relevant to practice, linked to the curriculum.
Service User Involvement: get input from the experts!
Marking Rubric: checklist and global rating, with pass/fail criteria.
Standardised Patient: standardised guidance for assessors.
Rehearsal: a practice run; clarity around process and procedure.
Feed Forward: debrief and reflection; consider peer evaluation.
Quality: an ongoing process of revision, evaluation and improvement.
Viva Voce Research: Catalyst
Findings: Themes
• Value/Usefulness of a viva voce
as a mode of assessment
• Preparation of students
• Areas around validity and
reliability
Recommendations
• Introduce students to this mode of examination in year 1.
• Assessors to have an overview of all content in all modules.
• Criteria reflecting expected content and levels.
• Guidance on the amount and type of 'encouragement' given to students during assessment.
• Structured induction and support for new members.
• Database of clinicians.
• Recordings in vivas for moderation and feedback.
• Promotional video/use of periodic video recordings.
• Simulation of the event: mock vivas with peer feedback.
• Clear guidelines in the module handbook.
• Code of behaviour for assessors.
Marion – Use of OSCEs
• Rationale for OSCE
• Year 3 OSCE – a variation on the
traditional theme
• Assessment Criteria
• Preliminary Student Evaluation
Tina – Use of OSCEs
• Rationale for OSCE
• Skills station vs. scenario
• Preparation of students
• VOSCE
• Use of additional aids
• Assessment criteria
Group Exercise
• 'Would OSCEs enhance your students' interpersonal skills and employability?'
• 'Do you think OSCEs are valid to assess skills and competences in your field?'
• 'Do OSCEs advantage students who are less academic, and is this fair?'
• 'How could your field use or adapt the OSCE process to assess your students?'
• 'Do OSCEs offer any advantages/disadvantages over your current form of academic assessment?'
References
• El-Khoury, L.H., Saab, B., Musharrafieh, U. and Antoun, J. (2012). Impact of a 3-day OSCE workshop on Iraqi physicians. Family Medicine 44, 627–632.
• Harden, R.M., Stevenson, M., Downie, W.W. and Wilson, G.M. (1975). Assessment of clinical competence using objective structured examination. British Medical Journal 1, 447–451.
• Khan, K.Z., Gaunt, K., Ramachandran, S. and Pushkar, P. (2013b). The Objective Structured Clinical Examination (OSCE): AMEE Guide No. 81. Part II: Organisation and Administration. Medical Teacher 35(9), e1447–e1463.
• Lambert, N.A. and Watkins, L. (2013). "Meet Mohammed: Using Simulation and Technology to Support Learning". The Journal of Mental Health Training, Education and Practice 8(2), 66–75.
• Nestel, D., Kneebone, R., Nolan, C., Akhtar, K. and Darzi, A. (2011). Formative assessment of procedural skills: students' responses to the OSCE and the Integrated Performance Procedural Instrument. Assessment & Evaluation in Higher Education 36(2), 171–183.
• Parish, S.J., Ramaswamy, M., Stein, M.R., Kachur, E.K. and Arnsten, J.H. (2006). Teaching about substance abuse with Objective Structured Clinical Exams. Journal of General Internal Medicine 21, 453–459.
• Watson, A.R., Houston, I.B. and Close, G.C. (1982). Evaluation of an objective structured clinical examination. Archives of Disease in Childhood 57, 390–392.
Any Questions?


Editor's Notes

  • #5 Structure: The learning outcomes derive from the curriculum, and the assessment should fit its purpose (Setna et al., 2010). It should be valid and should encourage student engagement with learning. Simulations that feel authentic and follow real-life processes are more successful, and their complexity should be appropriate to the learner's level of experience (Byrne and Smyth, 2007). Structured reflection on formative OSCEs, with feed-forward identifying strengths and potential improvements, supports increased confidence and better performance (Taylor and Green, 2013).

Service user involvement: The Preregistration Standards for Nurse Education (NMC, 2010) require clear demonstration of service user and carer expertise in programme delivery and design. Speers et al. (2007) found that while service users were enthusiastic about the value of their input in practice assessment, nurses were more reserved. The value of such input in increasing validity was also highlighted, and was viewed as in line with the move towards greater collaboration and involvement. Kurz et al. (2009) highlighted barriers to objective evaluation when including real patients. However, developing standardised patient scenarios with the service users who will participate in the OSCE takes their experience into consideration while also helping them become familiar with the scenario. It likewise aids the development of grading criteria and the approach to marking, incorporating service user opinion on the importance of particular skills, attitudes and behaviours.

Marking: Traditional clinical evaluation can be marked by subjectivity (Kelim et al., 2012), and although OSCEs offer the potential for a structured approach and can address anxiety over grade inflation, mock assessments can reveal issues with marking rubrics and the process of grading itself. Rehearsal followed by discussion can strengthen marking criteria; for example, it is important to agree what cues assessors can provide at each station, and Brannick et al. (2011) highlight the importance of establishing a framework for rating communication skills. Checklists are useful for establishing safe practice and determining a pass mark (Rushforth, 2006); however, Bondy's (1983) seminal work allows for differentiation of student performance. Barry et al. (2012) recommend that a combination of checklists and global rating scales gives a more complete picture and aids constructive feedback.

The standardised patient (SP): This is a technical term intended to ensure parity of assessment experience. There is no agreement on the length and type of training SPs should have before an OSCE (McWilliam and Botwinski, 2012). MacDonald (2004) recommends a minimum of 10 hours of training per scenario; however, observing the level of accuracy in performance over a number of encounters with students can offer a measure of reliability (Adamo, 2003).

Rehearsal: Byrne and Smyth (2007) emphasise preparation and planning, especially with larger cohorts, as timings and systems often need revising. Practice runs can reduce the strain on assessors; they built in breaks every two hours and swapped roles because of strain, which also promotes inter-rater reliability. Rehearsal allows examiners to develop expertise and clarity around purpose and process. Assessors must be clear about the rationale and process of OSCEs; they should understand the scoring rubric, giving positive feedback, confidentiality and the management of safety issues (Khan et al., 2013b).

Quality as an ongoing process: Barry et al. (2013) held a quality review of their OSCE process with their external reviewer and a lecturer from another field, which led to stronger marking criteria and greater inter-marker consistency, improving reliability. It also exposed issues with the teaching of communication: on observation it was apparent that students followed the process, but the type and nature of the interactions (for example, the process of gaining consent) needed to be revisited. It is beneficial to consider OSCEs a work in progress, subject to continued revision, evaluation and improvement.
  • #8 Clear agreed criteria reflecting expected content and levels (to include knowledge, skill, attitude and professional behaviour). Clear guidance on the amount and type of 'encouragement' given to students during assessment. Structured induction/support programme for new members. Database of clinicians (with updates). Use of recordings in vivas for moderation and feedback. Promotional video/use of periodic video recordings. Simulation of the event: mock vivas with peer feedback. Clear guidelines in the module handbook. Code of behaviour for lecturers.