OSCE is an approach to the assessment of clinical competence in which the components of competence are assessed in a planned and structured way, with attention being paid to the objectivity of the examination (Harden 1988).
OSCE is an assessment tool in which the components of clinical competence, such as history taking, physical examination, simple procedures, interpretation of lab results, patient management problems, communication and attitude, are tested using agreed checklists, rotating the student round a number of stations, some of which have observers with checklists.
Objectivity - a state in which something is based only on facts and evidence, not influenced by personal feelings or opinions in considering and representing facts.
Standardisation - a level of consistency or uniformity in certain practices or operations within the selected environment; developing and maintaining uniformity.
The marking scheme for each station is structured and determined in advance.
Each station is manned by one or two examiners who assess the candidate's performance of the required task and award marks based on predetermined and documented criteria called checklists (marking guides).
Candidates rotate through the stations, completing all the stations on their circuit.
All the candidates take the same stations and are assessed on the same tasks.
The best OSCE stations are created from real clinical scenarios.
Miller’s pyramid model divides the development of clinical competence into four hierarchical processes. On the lowest level of the pyramid is ‘knowledge’, tested by written exams and traditional multiple-choice questions (MCQs). The next level stands for ‘application of knowledge’, assessed by essays, clinical problem-solving exercises and extended MCQs. The third tier of the pyramid represents ‘clinical skills competency’, assessed by standardized patient exercises, simulations and clinical exams. Finally, on top of the pyramid is ‘clinical performance’, assessed by direct observation in real clinical settings.
The lower-level processes account for the cognitive components of competence and involve classroom-based assessments, while the two higher tiers of the pyramid account for the behavioural components of clinical competence, which involve assessment in simulated and real clinical settings.
2. Preparation of stations for OSCE
• OSCE: an assessment tool built on objectivity (facts and evidence) and standardisation (a level of consistency and uniformity).
• A series of time-limited stations in a simulated environment, assessing professional performance.
• Introduced by Harden (1975) at Dundee.
4. Miller’s pyramid (1990)
• A way of ranking clinical competence both in educational settings and in the workplace.
• To truly know whether our learners are achieving what we want them to achieve, we should assess them in the setting in which we expect them to deliver care.
6. OSCE station categories
• Clinical examination - real or simulated patient.
• Communication skills - counselling, communicating with a ventilated patient, breaking bad news.
• History taking.
• Lab test interpretations - CBP, ABG analysis report, LFT, thyroid profile, CSF results, urine and stool reports.
9. Formative OSCE
• During the course, to monitor progress.
• At the end of each posting.
• Can be even one or two stations.
Example
• End of the vital signs unit in FON: stations on assessing TPR and BP.
• CVS in Anatomy and Physiology: mark the various sites of pulse checking.
Summative OSCE
• Used at the end of courses or programmes, or on completion of a module, to test students against set objectives and learning outcomes.
• Usually 14 to 18 stations.
Example
• B.Sc (N) 1st, 2nd, 3rd or 4th years.
• OSCE in FON, MSN-I, MSN-II etc.
10. Preparing stations for an OSCE
• Team approach.
Steps
1. Form an OSCE committee
2. Blueprinting the examination content - examination length (number of stations)
3. Develop a bank of OSCE stations
a. Choice of topics for new stations
b. Choice of station writers
c. Choice of station types
d. Choice of OSCE station writing template
e. Station writing - observed/unobserved/linked/technology enhanced
f. Marking guidance
g. Peer review workshops
h. Piloting
11. Preparing stations for an OSCE (contd.)
4. Choosing a scoring rubric and standard setting
• Analytical and holistic (global scoring)
• Standard setting
5. Developing a pool of trained examiners
• Identification of potential examiners
• Examiner training workshops and outcomes of training
6. Developing a pool of trained standardized patients
• Recruitment of standardized patients
• Standardized patient training
7. Choosing the OSCE venue
8. Setting up the OSCE circuit and equipment
• Circuit with rest stations
• Considerations for individual stations
• The equipment
• Examination day briefings
12. 1. Form an OSCE committee
• An organizing team to oversee the scenario selection process, the smooth running of the OSCE, standardized patients, the OSCE circuit and the examiners.
Example - Obstetrics and Gynecology Nursing committee
• HOD
• Professor
• Associate Professor
• Assistant Professors etc.
13. Blueprint - Medical Surgical Nursing (example)

Topic                 | Anatomy & Physiology | Clinical assessment | Procedure | Health Education | Total questions
GI                    | Q1                   | Q2                  | Q3        | Q4               | 4
CVS                   | Q5, Q6               | Q7, Q8              | Q9        | Q10              | 6
Respiratory           | Q11, Q12             | Q13, Q14            | Q15       | Q16              | 6
Communicable diseases | -                    | -                   | Q17       | Q18              | 2
Total questions       | 5                    | 5                   | 4         | 4                | 18

• A two-dimensional matrix:
• One axis - generic competencies to be tested (e.g. history taking, communication skills, physical examination, management planning, etc.).
• Other axis - problems or conditions upon which the competencies will be demonstrated.
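The two-dimensional matrix above can be represented as a simple data structure, which makes it easy to check that the blueprint is balanced. A minimal sketch (using the 18-question example from the table; the dictionary layout is one possible choice, not a prescribed format):

```python
# The blueprint as topic -> {competency -> number of questions}.
blueprint = {
    "GI":                    {"Anatomy & Physiology": 1, "Clinical assessment": 1, "Procedure": 1, "Health Education": 1},
    "CVS":                   {"Anatomy & Physiology": 2, "Clinical assessment": 2, "Procedure": 1, "Health Education": 1},
    "Respiratory":           {"Anatomy & Physiology": 2, "Clinical assessment": 2, "Procedure": 1, "Health Education": 1},
    "Communicable diseases": {"Anatomy & Physiology": 0, "Clinical assessment": 0, "Procedure": 1, "Health Education": 1},
}

# Row totals: questions per topic.
row_totals = {topic: sum(cells.values()) for topic, cells in blueprint.items()}

# Column totals: questions per competency.
col_totals = {}
for cells in blueprint.values():
    for competency, n in cells.items():
        col_totals[competency] = col_totals.get(competency, 0) + n

# The grand total must agree whichever way it is summed.
total = sum(row_totals.values())
assert total == sum(col_totals.values()) == 18
```

Checking the totals this way ("no more and no less") helps ensure the uniform, equal distribution of questions the blueprint is meant to guarantee.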
14. Blueprinting and mapping
• Determining and choosing the spread of content.
• Content mapping with the curriculum.
• Context of the examination - clinical examination, communication skills, lab test interpretations.
• No more and no less - ensuring uniformity and equal distribution of questions.
• Clarity on higher-order and lower-order questions.
Example
• 1st yr - vital signs
• 2nd yr - care of chest drain
• 3rd and 4th yr - care of mother in 2nd stage of labour, ACLS skills etc.
15. Blueprinting - examination length (number of stations)
• Time-limited tasks of 5 to 10 min (determine before developing the blueprint).
• Depends on the number of stations within each OSCE and the length of each station.
• Reliability and validity are influenced by the number of stations and the total length of the examination (Newble 2004).
• An appropriate and realistic time allocation improves test reliability.
• Content specificity contributes to validity.
• An acceptable reliability score is represented by a Cronbach’s alpha or Generalisability (G) coefficient value between 0.7 and 0.8.
• With well-constructed OSCE stations, adequate reliability can be achieved with 14–18 stations, each of 5–10 min duration (Epstein 2007).
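Cronbach’s alpha, mentioned above as one acceptable reliability index, can be computed directly from candidates’ per-station scores. A minimal sketch (the scores are hypothetical, and treating each station as one "item" is the usual simplification):

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for an OSCE, treating each station as an item.

    scores: one list per candidate, each containing that candidate's
    per-station scores (all candidates sit the same stations).
    """
    k = len(scores[0])  # number of stations

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [var([cand[i] for cand in scores]) for i in range(k)]
    total_var = var([sum(cand) for cand in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical scores: 5 candidates across a 4-station circuit.
alpha = cronbach_alpha([
    [7, 6, 8, 5],
    [5, 5, 6, 4],
    [9, 8, 9, 7],
    [4, 5, 5, 3],
    [6, 7, 7, 6],
])
```

A value of `alpha` in the 0.7–0.8 range would meet the threshold quoted above; in practice more stations (14–18) are needed to reach it reliably.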
16. 2. How to develop a bank of OSCE stations?
• A bank of robust, quality-assured stations gives better reliability and validity.
• If a pre-existing bank of stations exists, update it.
• Quality assurance follows the appropriate steps in the algorithm.
17. The station-development algorithm
Choice of topics for new stations → Choice of station writers → Choice of station types → Choice of an OSCE station writing template → Station writing → Marking guidance → Peer review workshops → Piloting → Psychometric analysis
18. Choice of topics for new stations
• Governed by the curriculum outcomes.
• If an OSCE bank already exists, review it and identify gaps.
• If the curriculum is modified or the learning objectives of the modules change, new stations are needed.
• Assessment of competencies should always be aligned to the teaching and learning that has taken place, as specified by the course curriculum.
19. Who should be chosen as station writers?
• The OSCE lead identifies experts.
• If a pool of trained examiners already exists, seek volunteers to contribute.
• Station writers must be familiar with the underlying principles of the OSCE for appropriate content development.
• Brief orientation sessions or written instructions can be developed for people new to this task.
20. Choice of station types
1. Observed station
• An examiner is present throughout the test.
Ex - communication skills, procedural skills, clinical examination.
2. Unobserved station
• No examiner is present during the test.
• Answers may be submitted on paper either after each station or following the completion of the examination.
Ex - interpretation of clinical information, e.g. X-rays, pathology specimens, blood results.
21. Choice of station types (contd.)
3. Technology-enhanced station
• Part-task trainers or high-fidelity manikins are used to assess skills that would otherwise be difficult to assess in the OSCE format.
Ex - intimate clinical examinations with part-task trainers, e.g. catheterisation.
4. Linked stations
• Two consecutive stations, each observed or unobserved.
Ex - observed examination of the respiratory system in the first station and unobserved documentation of findings and management plan in the second station.
22. Choice of OSCE station writing template
• Once the type of station is chosen, an appropriate template needs to be developed.
• A template helps to develop all stations in a similar format; this standardisation maintains the reliability of the scores.
Appendix 1
23. Choice of OSCE station writing template (contents)
Setting up the station
• Subject/topic
• Level of candidate
• Competencies to be assessed
• Station duration
• Information for the site organisers: standardised patient (SP) age and sex, resources and equipment needed
Instructions for candidates
• What is the scenario (briefly)?
• Who and where are they (the candidates)?
• What are the candidates expected to do?
• What are the candidates not expected to do?
Information for the examiner
• Brief background to the scenario, the examiner’s role, and the objectives of the station
• What information might they provide the candidate?
• What information should they not provide the candidate? etc.
24. Marking guidance
• Needs to be decided during the station-writing phase itself.
1. Analytical (checklist scale / checklist rating scale)
- A list of statements describing the actions expected of the candidates at the station.
2. Holistic (global rating scale)
- Allows the assessor to rate the whole process.
- Marking criteria for the individual station are not required.
25. Peer review workshops
• Quality-assure new OSCE stations.
• Invite authors to bring new OSCE stations to the workshops, where delegates can review stations written by others.
• The presence of the authors makes changes and clarifications easy.
Template of questionnaire
• Are the tasks in this station achievable in 8 minutes?
• Is this topic taught in the curriculum?
• Where is it taught in the curriculum?
• Are the attributes being tested expected of a first-year B.Sc Nursing student?
• Would this question discriminate between good and poor students?
• Would this station be able to recreate an atmosphere close to a real patient encounter? etc.
26. Piloting
• A small or miniature version of a larger-scale study or project.
• Helps to identify issues with the practicality of the tasks and the allocation of time for them.
• Changes can then be made to the stations to improve their quality (Whelan 1999).
• Redesign and re-pilot if there is any problem.
Psychometric analysis
• Content validity - judged by experts.
• An acceptable reliability score is represented by a Cronbach’s alpha or Generalisability (G) coefficient value between 0.7 and 0.8; with well-constructed OSCE stations, adequate reliability can be achieved with 14–18 stations, each of 5–10 min duration (Epstein 2007).
27. Choosing a scoring rubric and standard setting
• A rubric is ‘an assessment tool that delineates the expectations for a task or an assignment’ (Stevens and Levi 2005).
1. Analytical (checklist scale)
- A list of statements describing the actions expected of the candidates at the station.
2. Holistic (global rating scale)
- Allows the assessor to rate the whole process.
28. Choosing a scoring rubric and standard setting (contd.)
Analytical scoring (checklist scale)
• Can be ‘binary’, yes/no (performed/not performed): candidates are marked based on whether or not an action was performed, with no discrimination between lower and higher levels of performance.
• Can use a 5–7 point rating scale, which allows the examiners to mark candidates based upon the quality of the actions.
• Key strength - the ability to provide an objective assessment, giving greater inter-rater reliability (the degree of agreement between two assessors).
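Inter-rater reliability can be quantified beyond raw percentage agreement. Cohen’s kappa, a standard agreement statistic not named in the slides but commonly used for exactly this purpose, corrects for the agreement two examiners would reach by chance. A minimal sketch for two examiners marking the same binary (Y/N) checklist:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two examiners' parallel 'Y'/'N' checklist marks."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement, from each examiner's marginal proportion of 'Y' marks.
    pa_y = rater_a.count("Y") / n
    pb_y = rater_b.count("Y") / n
    expected = pa_y * pb_y + (1 - pa_y) * (1 - pb_y)
    return (observed - expected) / (1 - expected)

# Hypothetical marks: two examiners agree on 4 of 5 checklist items.
kappa = cohens_kappa(["Y", "Y", "Y", "N", "Y"], ["Y", "Y", "N", "N", "Y"])
```

Kappa is 1 for perfect agreement and 0 when agreement is no better than chance, which makes it a stricter check on examiner consistency than percentage agreement alone.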
29. Binary checklist
Candidate performs an examination of the chest:
1. Introduction Y/N
2. Obtaining consent Y/N
3. Appropriate exposure Y/N
4. Professional approach Y/N
5. General physical examination Y/N
6. Inspection Y/N
7. Palpation Y/N
8. Percussion Y/N
9. Auscultation Y/N
10. Advising the patient at the end to cover up the exposed areas and thanking them for cooperation Y/N
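Scoring a binary checklist is simply counting the items performed, one mark each. A minimal sketch using the chest-examination items above (the example marks are hypothetical):

```python
# Items from the binary chest-examination checklist above.
chest_exam_checklist = [
    "Introduction", "Obtaining consent", "Appropriate exposure",
    "Professional approach", "General physical examination",
    "Inspection", "Palpation", "Percussion", "Auscultation",
    "Advising patient to cover up and thanking for cooperation",
]

def score_checklist(marks):
    """marks: one 'Y'/'N' answer per checklist item; returns the total score."""
    assert len(marks) == len(chest_exam_checklist)
    return sum(m == "Y" for m in marks)

# A hypothetical candidate who missed 'Professional approach' and 'Percussion'.
marks = ["Y", "Y", "Y", "N", "Y", "Y", "Y", "N", "Y", "Y"]
score = score_checklist(marks)  # 8 out of 10
```

This binary scoring illustrates the limitation noted earlier: an action done poorly and an action done well both earn the same single mark.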
30. Checklist using rating scales
Candidate performs an examination of the chest: 1 [ ] 2 [ ] 3 [ ] 4 [ ] 5 [ ]
1. Unstructured approach.
2. Structured approach but completes less than 50% of key steps.
3. Structured approach but completes more than 50% of key steps.
4. Structured approach and completes a majority of key steps.
5. Structured approach and completes all key steps.
Such a checklist should then be accompanied by a list of key steps:
• Uses alcohol rub before and after examination and, when appropriate, uses gloves.
• Seeks permission to examine, and explains the nature of the examination.
• Offers/asks for a chaperone where appropriate.
• Asks the patient if any areas to be palpated or moved are painful.
• Positions the patient correctly and comfortably, then uses a methodical, fluent and correct technique.
• Does not distress, embarrass or hurt the patient unduly.
• Examines, or suggests examining, all the relevant areas.
• Completes the task, covers up the exposed areas and thanks the patient.
31. 5. Choosing a scoring rubric and standard setting (contd.)
Holistic scoring (global rating scale)
• Allows the assessor to rate the whole process (vs. task-specific checklists).
• Candidates may not follow a pre-determined sequence of steps but may still perform the task to a high standard with ease.
• Determines not only whether an action was performed, but also how well it was performed.
Example - the ability to empathize with patients in communication skills stations.
32. 5a. Standard setting - defining the score at which a candidate will pass or fail.
Two approaches: norm-referenced methods and criterion-referenced methods.
Norm-referenced methods - the standard set is based upon peer performance.
• In a ‘poor’ cohort, a candidate may pass an examination that they would otherwise have failed had they taken the examination with a ‘stronger’ cohort.
• Norm referencing is usually deemed unacceptable for clinical competency licensing tests, which aim to ensure that candidates are safe to practice.
Criterion-referenced methods
• A clear standard below which a nursing student would not be judged fit to practice, without reference to the achievement of others.
Ex - a pass score of 50 marks: marks below 50 mean the student fails the examination.
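The contrast between the two approaches can be made concrete. A minimal sketch (the cohort scores are hypothetical, and "one standard deviation below the mean" is just one common norm-referencing convention, not one prescribed here; the criterion cut of 50 follows the slide’s example):

```python
import statistics

# Hypothetical cohort scores out of 100.
cohort = [42, 48, 55, 61, 58, 47, 70, 52]

# Norm referenced: the pass mark is derived from peer performance,
# here one standard deviation below the cohort mean.
norm_cut = statistics.mean(cohort) - statistics.stdev(cohort)

# Criterion referenced: a fixed standard, independent of the cohort.
criterion_cut = 50

passes_criterion = [s for s in cohort if s >= criterion_cut]
```

With a weaker cohort the norm-referenced cut drifts downward while the criterion cut stays at 50, which is exactly why norm referencing is considered unsafe for licensing decisions.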
33. 5. Developing a pool of trained examiners
• Consistency in marking is pivotal.
• New examiners are added to the pool.
• Existing examiners receive refresher training.
Identification of potential examiners
• The reliability of the scores generated by the examiners depends not only upon consistent marking but also upon their clinical experience relevant to the OSCE station.
34. Examiner training workshops
• Should take place well in advance of the examinations.
• The level of training depends on the background and ability of the examiners (Newble et al. 1980; van der Vleuten et al. 1989).
• Group discussions about some of the above topics, followed by the opportunity for the examiners to mark a mock OSCE or videos of a real OSCE.
Training all examiners will
• Reduce examiner variation (Newble et al. 1980; van der Vleuten et al. 1989).
• Improve consistency in behaviour and thereby exam reliability (Tan & Azila 2007).
35. 6. Developing a pool of trained standardised patients
• There is a continuum of patients used in clinical examinations, from the real patient with clinical signs who receives no training to the rigorously trained simulated patient (Collins & Harden 1998).
Recruitment of standardised patients
• If the candidate needs to elicit a specific clinical sign, e.g. a heart murmur, a real patient with the murmur must be used.
• If the candidate needs to competently examine the CVS (regardless of any clinical abnormality), a ‘healthy’ volunteer can be used.
36. Patient recruitment
• Real patients with clinical signs can be accessed through physicians/nurses.
• A doctor/nurse previously known to the patient and responsible for their care may be the most appropriate person to make initial contact (Collins & Harden 1998).
• Recruiting stable patients with common conditions is easier than finding patients with rare and unstable disease.
• In large institutions, a co-ordinator is employed to undertake the selection of standardised/simulated patients.
37. Standardised patient training
All patients (simulated, real and standardised) need training on:
• The importance of portraying the clinical conditions in question reliably and repeatedly, and the need for standardisation between candidates.
• Pre-examination briefing.
• Dedicated training for role play in more complex scenarios.
• After training: performance appraisal for quality assurance before being used in a high-stakes examination.
38. Administrative tasks
• Choose the OSCE venue well in advance, based on the number of stations and candidates. It should provide:
• Briefing rooms
• Administrative offices
• Waiting rooms for patients and examiners
• Quarantine facilities
• Refreshment areas
• Individual rooms - increased confidentiality and low noise levels.
39. Setting up the OSCE circuit and equipment
OSCE circuit
• Set up the stations for a seamless flow of candidates through the examination.
• Each candidate individually visits every station within the circuit during the course of the examination.
• The number of candidates in each sitting equals the number of stations, unless rest stations are used.
• Each candidate is allocated a start station and moves from station to station in the direction of the circuit until all stations have been completed.
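The rotation described above can be sketched as a simple schedule: with as many candidates as stations, each candidate starts at a different station and advances one station per round, so after as many rounds as there are stations everyone has visited everything exactly once. A minimal sketch with a hypothetical six-station circuit:

```python
n_stations = 6
candidates = [f"C{i + 1}" for i in range(n_stations)]

# schedule[rnd][station] gives the candidate at that station in that round;
# candidate i starts at station i and moves forward one station each round,
# so station s in round rnd hosts candidate (s - rnd) mod n_stations.
schedule = [
    [candidates[(station - rnd) % n_stations] for station in range(n_stations)]
    for rnd in range(n_stations)
]
```

Each round is a permutation of the candidates (every station is occupied by exactly one person), which is what keeps the circuit flowing without queues; a rest station simply becomes one more slot in the same rotation.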
40. Circuit with rest stations
• Rest stations allow a break for the candidates.
• Keep this station private, so that the candidate at this station cannot overhear what is being said at the other stations.
• It should be clearly marked, and candidates should be informed of its presence.
• Rest stations should be interspersed within the live stations.
41. The equipment
• All equipment should be sourced well in advance of the OSCE and checked to ensure that it is in good working order.
• Spare equipment and batteries should be available on the day in case of breakages or breakdowns.
• If candidates are expected to bring their own stethoscopes, for instance, they should be informed of this.
• More advanced equipment/high-fidelity simulators require personnel to operate them.
43. OSCE sample stations for M.Sc Psychiatric Nursing

Station | Type         | Task / question
1       | Procedure    | Provide medication adherence counselling to Mrs Gita
2       | Response     | Listen to the audio and identify the type of lung sounds
3       | Procedure    | Demonstrate the BLS technique
4       | Response     | Identify the drugs - their action, indication, dose and side effects
5       | Procedure    | Teach the incentive spirometry technique to the patient
6       | Rest station |
7       | Response     | Read the patient verbatim and identify the psychopathology
8       | Procedure    | Provide disulfiram counselling to Mr Prakash
9       | Response     | Read the ECG strip and identify the abnormality
10      | Response     | Identify the medication-induced movement disorder from the video clip
44. References
• Kamran Z. Khan, Sankaranarayanan Ramachandran, Kathryn Gaunt & Piyush Pushkar (2013). The Objective Structured Clinical Examination (OSCE): AMEE Guide No. 81. Part I: An historical and theoretical perspective. Medical Teacher, 35:9, e1437-e1446. DOI: 10.3109/0142159X.2013.818634
• Kamran Z. Khan, Kathryn Gaunt, Sankaranarayanan Ramachandran & Piyush Pushkar (2013). The Objective Structured Clinical Examination (OSCE): AMEE Guide No. 81. Part II: Organisation & Administration. Medical Teacher, 35:9, e1447-e1463. DOI: 10.3109/0142159X.2013.818635
• OSCE manual, Tamil Nadu Nurses and Midwives Council.