© 2013 Springer Publishing Company, LLC.
Chapter 14
Clinical Evaluation
Methods
Oermann & Gaberson
Evaluation and Testing in Nursing Education
4th edition
Steps in Clinical Evaluation Process
♦ Decide on purpose of evaluation: why are you
evaluating the student?
♦ Formative or summative?
♦ Establish standards for evaluation.
– Letter grade or P/F?
♦ Select evaluator.
– Faculty? Preceptor? Self? Multiple?
Steps in Clinical Evaluation Process
♦ Decide on evaluation method based on
competencies to be achieved
– Use a variety of appropriate methods
♦ Collect and interpret data
♦ Use results to make decisions about students
♦ Evaluate your process—need for changes?
Selecting Clinical Evaluation
Methods
♦ Select based on clinical competencies
♦ Vary the methods
♦ Select realistic evaluation methods to use
♦ Decide if formative or summative
♦ Review purpose and number required of each
assignment
♦ Consider your time
Clinical Evaluation Methods
• Observation
• Notes about
performance
• Checklists
• Rating scales
• Simulations
• Standardized patients
• Objective Structured
Clinical Examination (OSCE)
• Media clips
• Written assignments
• Journal
Clinical Evaluation Methods (cont’d)
• Nursing care plans
• Concept maps
• Case method,
unfolding cases,
case study
• Papers
• Portfolios
• Conferences
• Group projects
• Self-assessment
Observation
♦ Predominant method of assessing clinical
performance
♦ Should be guided by clinical outcomes or
competencies
– Help the teacher focus the observation
Observation: Threats to Validity
and Reliability
♦ Teacher’s values and biases may influence:
– What is observed
– Inferences and judgments about quality of
performance
♦ Overreliance on first impressions
– Should observe over time and in different
situations before drawing conclusions
Observation: Threats to Validity
and Reliability (cont’d)
♦ Focus of observation may differ with each
instructor and time period
♦ Teacher may make incorrect judgments
about the observation
– Discuss observations with students, obtain
their perceptions, and modify inferences
when indicated
Observation: Threats to Validity
and Reliability (cont’d)
♦ Every observation in the clinical setting is only
a sample of the student’s performance
– Observation of the same student at another time
may reveal a different level of performance
– Same is true for observations of teacher
performance
Documenting Observations
♦ Methods for recording observations and
their context
– Notes about performance
– Checklists
– Rating scales
Notes About Performance
♦ Descriptions of observed student performance
– May include a judgment about the quality of
the performance
♦ Should be recorded as soon after the
observation as possible
♦ Methods of recording
– Handwritten notes, flow sheets, narratives
– Using technology such as tablet computer or iPad
Notes About Performance (cont’d)
♦ Share with students frequently to give them
feedback about their performance
– Incorporate student input when indicated
♦ Use conferences with students to review
pattern of performance over time
♦ Serve as documentation for ratings on the
clinical evaluation tool
Checklist
♦ List of specific behaviors or activities to be
observed and means of indicating whether or
not they were present
♦ Often lists steps of a procedure or skill
♦ Facilitate teacher’s observation of
performance
♦ Facilitate learners’ self-assessment prior to
assessment by the teacher
Developing a Checklist
♦ List steps or actions in order
– Focus on critical items and sequencing
♦ List errors that learners often make
– Alerts the assessor to observe for these
♦ Develop into a form to check steps or
behaviors as they are performed
– Allow flexibility in sequencing if there are multiple
ways of performing a procedure safely
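A checklist of this kind can be sketched as a small data structure. The step names and the pass rule below are hypothetical illustrations, not taken from the text; recording performed steps as a set (rather than an ordered list) is one way to allow flexible sequencing when a procedure can be done safely in more than one order.

```python
# Hypothetical skill checklist: each entry is a step plus a flag for
# whether the step is critical. Steps and the pass rule are illustrative.
checklist = [
    {"step": "Perform hand hygiene", "critical": True},
    {"step": "Verify patient identity", "critical": True},
    {"step": "Gather supplies", "critical": False},
    {"step": "Don gloves", "critical": True},
]

def score_checklist(checklist, performed_steps):
    """Pass only if every critical step was performed (illustrative rule).
    performed_steps is a set, so step order is not enforced."""
    missed = [item["step"] for item in checklist
              if item["critical"] and item["step"] not in performed_steps]
    return ("pass" if not missed else "fail", missed)

print(score_checklist(checklist, {"Perform hand hygiene",
                                  "Verify patient identity", "Don gloves"}))
```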
Rating Scales
♦ Also called clinical evaluation tools or
instruments
♦ Means of recording judgments about
observed performance of learners
♦ Two parts
– List of outcomes or competencies learner is to
demonstrate in clinical practice
– Scale for rating students’ performance of them
Rating Scales (cont’d)
♦ Most useful for summative evaluation of
performance
♦ Also may be used to evaluate specific clinical
activities
– e.g., student’s presentation of a case in
postclinical conference
♦ Useful for giving specific feedback to students
about their performance
Types of Rating Scales
♦ Two-point scales
– Pass-fail
– Satisfactory-unsatisfactory
♦ Multiple-level scales
– Letters (e.g., A, B, C, D, F)
– Numbers (e.g., 5, 4, 3, 2, 1)
Types of Rating Scales (cont’d)
♦ Multiple-level scales
– Qualitative labels (e.g., “excellent,” “very good,”
“good,” “fair,” “poor”)
– Frequency labels (e.g., “always,” “often,”
“sometimes,” “never”)
♦ Matrix combining different qualities of the
performance
Issues With Rating Scales
♦ Interrater consistency (reliability)
– Do all users agree on meaning of ratings?
• e.g., difference between “good” and “excellent”
– A problem even when descriptors accompany the
letters, numbers, or labels used to rate outcomes or
competencies
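Interrater consistency can also be checked quantitatively. As an illustrative sketch (not part of the text), Cohen's kappa compares two raters' observed agreement on categorical ratings, such as pass–fail, against the agreement expected by chance:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical ratings (e.g., 'pass'/'fail')."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Proportion of students the two raters scored identically
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's category frequencies
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

a = ["pass", "pass", "fail", "pass", "fail", "pass"]
b = ["pass", "fail", "fail", "pass", "fail", "pass"]
print(round(cohens_kappa(a, b), 2))  # → 0.67
```

Values near 1 indicate strong agreement; values near 0 suggest the raters agree no more often than chance would predict.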
Issues With Rating Scales (cont’d)
♦ Scales based on frequency labels
– e.g., “always,” “sometimes”
– Limited opportunities for students to practice and
demonstrate a high level of some skills
– How to rate a student who has had only one
opportunity to practice a skill?
Common Errors With Rating Scales
♦ Leniency error
– Tendency to rate all students toward the high end of
the scale
♦ Severity error
– Tendency to rate all students toward the low end of
the scale
♦ Central tendency error
– Hesitancy to mark either end of the rating scale
– Tendency to use only the midpoint of the scale
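These errors show up as patterns in a rater's score distribution. A minimal screening sketch, assuming a 1–5 scale and illustrative cutoffs (the thresholds are assumptions, not from the text):

```python
from statistics import mean, pstdev

def flag_rating_errors(ratings, scale_min=1, scale_max=5):
    """Heuristic screen for leniency, severity, and central tendency in one
    rater's scores. Thresholds are illustrative, not validated cutoffs."""
    midpoint = (scale_min + scale_max) / 2
    m, sd = mean(ratings), pstdev(ratings)
    flags = []
    if m > midpoint + 1:                       # scores bunched at the high end
        flags.append("possible leniency error")
    if m < midpoint - 1:                       # scores bunched at the low end
        flags.append("possible severity error")
    if sd < 0.5 and abs(m - midpoint) < 0.5:   # scores stuck at the midpoint
        flags.append("possible central tendency error")
    return flags

print(flag_rating_errors([5, 5, 4, 5, 5, 4]))  # → ['possible leniency error']
```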
Common Errors With
Rating Scales (cont’d)
♦ Halo effect
– Basing judgment on a general impression of the student
♦ Personal bias
– Teacher’s personal values or preferences influence ratings
♦ Logical error
– Giving similar ratings to items that are logically related to
one another
Common Errors With
Rating Scales (cont’d)
♦ Rater drift
– Definition or interpretation of competencies to be
observed and assessed changes over time
♦ Reliability decay
– Over time, teachers may become less consistent
in their ratings
Improving Use of Rating Scales
♦ Teachers should have regular discussion of
outcomes or competencies to be rated
– Meaning of each outcome or competency
– Describe what student performance would look
like at each rating level
– May use simulation to facilitate discussion
• Observe a performance, assess it with the rating scale,
and discuss rationale for the ratings
Clinical Evaluation Tools
♦ Same tool for all courses or course-specific
tool?
– Most faculties use one tool for all courses
• Competencies adapted to each course
♦ Two-level or multilevel scales?
– Most faculties use pass-fail or satisfactory-
unsatisfactory rating scales
Simulation
♦ Activity that allows learners to experience a
clinical situation without the risks and
constraints of a real-life situation
♦ Does not replace actual experiences with
patients
♦ Useful for clinical evaluation as well as
instruction
Use of Simulation for
Clinical Evaluation
♦ Case scenarios that students analyze
♦ Computer simulations
♦ Models and manikins
♦ Standardized patients
♦ Human patient simulators
Incorporating Simulation Into
Evaluation Protocols
♦ Identify clinical outcomes to be assessed with
simulation
♦ Identify types of simulations needed to assess the
designated outcomes
♦ Determine if simulations are available or need to be
developed by the faculty
♦ Determine if simulation is for formative or
summative evaluation or both
Incorporating Simulation Into
Evaluation Protocols (cont’d)
♦ Determine need to develop or obtain
checklists or other methods for rating
performance
♦ Decide when the simulations will be used in
the course
♦ Decide how the effectiveness of the
simulations will be evaluated
Standardized Patients
♦ Individuals trained to accurately portray the role of a
patient with a specific diagnosis or condition
♦ Provide consistency in performance evaluation
– Recreate the same patient condition and clinical situation
with each student
♦ Provide written and oral feedback to the students on
their performance
Objective Structured Clinical
Examination (OSCE)
♦ Students rotate through a series of stations
♦ At each station they may complete an activity
or perform a task
♦ Performance evaluated with checklists or
rating scales
♦ Usually used for summative evaluation
Games
♦ Involve competition, rules (structure), and
collaboration
♦ Individual games and games played against
other students either individually or in teams
♦ Many require props or equipment
♦ Not appropriate for grading; use only for
instruction and formative evaluation
Media Clips
♦ Short segments of a digital recording, DVD, YouTube
video, or other forms of multimedia
♦ May be viewed by students as a basis for:
– Discussions (e.g., postclinical conference, online discussion
boards)
– Small-group activities
– Critique in a written assignment
♦ Students can visualize the patient and context
Written Assignments
♦ Use: Assess problem solving, higher level
thinking, understanding content relevant to
clinical practice
♦ Misuse: Students complete the same
assignments repetitively throughout a course
after outcomes have been met
Written Assignments: Journals
♦ Opportunity for students to describe and reflect on
events and experiences in clinical practice
♦ Means of engaging in dialogue with the teacher
♦ Print or electronic format
♦ Students need to be clear about the purpose of the
journal activity
Written Assignments:
Journals (cont’d)
♦ Written guidelines for journal entries
– How many entries?
– What types of entries?
– How frequently?
♦ Teacher gives prompt and meaningful feedback
♦ Decide if graded or ungraded
– Grading inappropriate if journal is used to express feelings,
values, beliefs
Written Assignments:
Nursing Care Plans
♦ Learn components of the nursing process
♦ Weaknesses when used for clinical evaluation
– Linear process—does not facilitate assessment of
complex thinking necessary for clinical practice
– Students often paraphrase from textbook
– Teachers should be cautious about the number of
care plans required in a course
Written Assignments:
Concept Maps
♦ Tool for visually showing relationships among
concepts
♦ Help students organize knowledge as they
plan for clinical activities
– Can “see” how aspects of care relate
♦ Best used for formative evaluation
– Can be graded with criteria established for
evaluation
Written Assignments: Case Method,
Unfolding Cases, and Case Study
♦ Methods for assessing higher-level learning
♦ Individual or group activities
♦ Useful for formative evaluation and student
self-assessment
♦ Short cases, unfolding cases, and case studies
can be graded
– Establish scoring criteria for responses to the
questions with the case
Written Assignments: Papers
♦ Short papers about clinical practice
– Useful for assessing higher level thinking and
other cognitive skills
– Often better than long papers in which students
may summarize textbook and other literature
– Can be assessed formatively or summatively
Written Assignments:
Papers (cont’d)
♦ Term papers
– Critique and synthesize relevant literature and
relate it to patient care
– Assess drafts formatively to improve writing skills
– Grade final product
Portfolio
♦ Collections of products that document
student achievement
– Demonstrate clinical competencies
– Show work completed over period of time
♦ Types—can be combined for clinical
evaluation
– Best work (graded)
– Growth and development (formative)
Using Portfolios for
Clinical Evaluation
♦ Identify purpose of the portfolio
– Best work or growth and learning
– Formative, summative, or both
♦ Identify the type of entries and content to be included
in the portfolio
– Content to be chosen by student, faculty member or both?
♦ Decide how portfolio will be evaluated
– Individual entries and the portfolio overall
– Scoring rubric?
Electronic Portfolio
♦ Developed and stored electronically
– Facilitates updating and revision
♦ Can be easily sent to others for feedback
or scoring
♦ Limited or no cost
♦ Can be modified for job search
♦ Can include a variety of multimedia
Conferences
♦ Method for evaluating competency in oral
communication skills
♦ Can be used to assess higher-level thinking
and problem solving
♦ Can be used for formative or summative
evaluation or both
– If summative, need specific criteria and
scoring rubric
Conferences (cont’d)
♦ Types of conferences
– Preclinical
– Postclinical
– Interprofessional
♦ Format
– Face-to-face
– Online
Criteria for Evaluating Conferences
♦ Ability of students to:
– Present ideas clearly and logically
– Participate actively in group discussion
– Offer relevant ideas
– Demonstrate knowledge of content
– Offer different perspectives or share reflections
– Assume a leadership role, if relevant, in group
process
Group Projects
♦ Short- or long-term work
♦ Can assess products developed by the group
and ability of students to work cooperatively
♦ Rubrics should be used for assessment
– Should be geared to the specific project
– Should reflect the goals of group work
• Students can assess participation and contributions
of peers
Grading Group Projects
♦ Group grade
– Each member of the group receives the same
grade
– Does not account for individual student effort and
contribution to the group product
♦ Combination of individual and group grades
– Students identify their contributions to the overall
product; teacher or self-assessment
– Students prepare individual and group projects
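The combination approach can be as simple as a weighted average of the shared group score and an individual-contribution score. A sketch with a 50/50 default weighting (an illustrative choice, not from the text):

```python
def combined_grade(group_score, individual_score, group_weight=0.5):
    """Blend a shared group-project score with an individual-contribution
    score. The default 50/50 weighting is an illustrative assumption;
    faculty would set the weights to reflect the goals of the group work."""
    assert 0 <= group_weight <= 1
    return group_weight * group_score + (1 - group_weight) * individual_score

print(combined_grade(88, 94))  # → 91.0
```

Raising `group_weight` rewards the shared product; lowering it rewards documented individual contribution.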
Self-assessment
♦ Ability of students to evaluate their own
clinical competencies and identify need for
further learning
♦ Difficult for some students to assess their own
performance
– Need supportive environment, guidance, practice
♦ Appropriate only for formative evaluation
Clinical Evaluation in
Distance Education
♦ Preceptors may observe and evaluate individual
students
♦ Adjunct or part-time faculty members may evaluate
a group of students at remote site
♦ Intensive on-campus or regional site skill acquisition
workshop
♦ Students may independently demonstrate clinical
skills and procedures in simulation setting
– Record performance for evaluation by teacher
Clinical Evaluation in
Distance Education (cont’d)
♦ Critical decision: Which clinical skills and
competencies need to be observed and the
performance rated?
– Determines evaluation methods to use
♦ Should use consistent methods across all clinical sites
♦ Need to orient all evaluators of student performance
– Promotes reliability