AGENDA: Learning Collaborative Session 7
March 8, 3:00-4:30pm (EST)
 Welcome and Review Moodle/Assignments
 Questions on NP finances
 Residency Program Policies and Procedures
 Curriculum: Evaluation of the Learner
 How and Why to Assess Your Residents
 Assessment Tools and Process
 Action Period Items
 Begin working on policy and procedures
 Continue Curriculum development
 Progress Checklist and MONTHLY REPORTS!
Monthly Reports due EVERY MONTH!
• 1) Is the Program set up as a separate Cost Center? If so, what is
costed directly to the Residency Program Cost Center?
– Are the revenues from the residents credited to this cost
center?
– Are the salaries of the mentors and continuity clinics charged to
the cost center?
• 2) Is the program considered fully in scope, or was it separately
added to your scope? How has this been handled with respect to the
UDS, FFR, and SAC 330 budget?
• 3) Is the work of the residents at the off-site specialty
rotations covered by FTCA or Gap policies?
Creating policy and procedures for your program
• Policies vs. Procedures
• What policies do you need to create for the
residency?
• What procedures may you need to adapt to fit the
program? (e.g., PTO)
• What policies does your organization already
have?
Policies and Procedures
• Residency Specific Policies
1. Ramp Up Policy
2. Precepting Policy
3. Patient Panel Transfer Policy
• Accreditation Policies
• Check accreditation standards on what policies are
required
Policies and Procedures
• Start putting pen to paper to develop Policies and
Procedures
• Create program manuals
• Program staff
• Residents
• Important to have these established
for training new staff
Curriculum Development
Assessment of Resident
Performance
[Diagram: curriculum drivers. Mission, program goals/objectives, and
learner outcomes/competencies drive the curriculum; leadership/board/
finances, marketing, and recruitment support it. The didactic and
clinical arms (domains/subdomains, clinical topics/KSAs, schedule,
space/equipment, policies, patients, preceptors, faculty) each flow
through "assess learner" and "remediation of learner" toward
accreditation and graduates who fulfill your mission.]
Learning Objectives
Knowledge:
– Understand the purpose of assessment
– Know the characteristics of good assessment
– Understand how assessment builds trainee and
programmatic performance
Attitude:
– Appreciate the importance of good assessment
– Embrace the challenge
Skills:
– To be gained by independent / group work building on
information provided in the presentation
Overview of the Session
• Defining terms: difference between evaluation and
assessment
• How assessment/evaluation fits in the bigger picture of
curriculum and program development
– Integrated throughout the program
– Creates explicit expectations for trainee
– Building blocks for program evaluation
– Engine for trainee and program improvement
• Characteristics of effective assessment and evaluation
• Examples of techniques/methods
• Discussion
Definitions
• Assessment
 Process of measuring learning (describing, collecting, recording, and scoring
information), generally focusing on observable KSAs
 Gathering of information about learner performance that is relevant to stated
competencies/outcomes
 The goal of assessment: performance improvement, as opposed to simply being
judged.
 Provides information for changes/interventions that improve learner performance
 Formative
• Evaluation
 Process of making judgments; of comparing assessment data against established
criteria, evidence or standards to determine the extent to which learner
competencies/outcomes and program goals have been met
 Provides information for changes/interventions that improve learner/program
performance
 Summative
Definitions (cont.)
• Program Goals
 General and ‘fuzzy’, they are aspirational.
 Overall outline of what the program will accomplish.
• Program Objectives
 Measurable and specific.
 Introduce the curricular domains of the program, e.g., Patient-Centered Care,
Professionalism, Clinical Practice.
 Within the domains are sub-domains which contain specific learner outcomes.
• Learner outcomes
 Measurable benchmarks, the intended results of the curriculum.
 Describe what the learner will actually do, and often use Bloom’s taxonomy of action
verbs.
 Summative (final) data describing learner performance is compared to the benchmarks.
It is an indicator of achieving outcomes. It is your evidence that your residents are
learning and doing what you said they would learn and do.
The Relationship between
Assessment and Evaluation
Formative Assessment
for Learner Feedback
Summative Evaluation for
Improvement
Summative Evaluation for
Programmatic Improvement
APA guidelines
Domain E: Resident–Supervisor Relations
At least semiannually, written feedback re: meeting performance
requirements:
(a) Initial written evaluation provided early enough for self-correction;
(b) Second written evaluation early enough to provide time for
continued correction or development;
(c) Discussions / signing of evaluation by resident and supervisor;
(d) Timely written notification of problems, opportunity to discuss
them, and guidance re: remediation; and
(e) Substantive written feedback on extent to which corrective
actions are or are not successful.
NNPRFTC Standard 3: Evaluation
Evaluation components
• Institutional performance
• Programmatic performance
• Trainee performance
• Instructor and staff performance
• Assessment based on Program’s core
elements, competencies, and
curriculum components
• Assess performance of each trainee
through periodic and objective
assessment (formative and
summative)
• Include identification of any
deficiencies or performance concerns
• Process for trainee performance
concerns, incl. improvement plan
with measurable goals.
Models of Learner Assessment
 Learner assessment is anchored in the learning theory or model used to
create the curriculum;
 Measure important milestones specified by the learning theory in the
context of the curriculum.
_____________________________________________________
– Malcolm Knowles – Andragogy: “Adult learning”
• Involve learner in the planning and evaluation of their instruction.
• Experience (including mistakes) provides the basis for the learning activities.
• Adult learning is problem-centered rather than content-oriented. (Kearsley, 2010)
– Englander et al. (2013): 8 clinical competencies
– Dreyfus/Benner
• Novice to expert
• Assessment tailored to each level of proficiency
Dreyfus/Benner
Types of Assessment/Evaluation
• Formative – formal and informal – ongoing, periodic
• Summative – formal – “final”
• Personal, peer, expert
• Surveys, Simulations, Criterion/Standard referenced
observation
• Journals
• 360° feedback
• Portfolio
• Project
Characteristics of Effective Assessment
 Reliable (replicable)
• Multiple observers
• Reproducible observations/outcomes
 Valid (meaningful)
• Useful indicator of performance, competency
• Relevant to professional practice
 Measurable/observable
• Verifiable
[Diagram repeated, now labeled "Impact – Feedback Loop": learner
assessment and remediation results feed back into the curriculum,
its drivers, and program improvement.]
Examples: APA Accreditation
• Competency/domain: Professionalism
• Learner Outcome: Demonstrates in behavior
and comportment the professional values and
attitudes of the discipline of psychology.
• Subdomains: Professional Values and
Attitudes, Cultural diversity, Ethics, Reflective
Practice/Self-Assessment
• Measurable outcome for subdomains:
– CHCI: Dreyfus Novice to Expert
Example: APA Accreditation w/CHCI outcomes
• Subdomain: Professional Values and Attitudes
• Components of subdomain: Integrity,
Accountability, Concern for welfare of others
• Outcome for Integrity: Monitors and
independently resolves situations that
challenge professional values and integrity
• Outcome for Accountability: Independently
accepts personal responsibility across settings
and contexts
CHCI Rating Scale for Post-doc Psychologists
1) Novice: entry-level skills, knowledge, attitudes
2) Advanced Beginner: developing skills, knowledge, and attitudes
3) Competent: developed skills, knowledge, and attitudes
4) Proficient: advanced skills, knowledge, and attitudes
5) Expert: authority for skills, knowledge, and attitudes
0) No interaction
Example: NNPRFTC Accreditation
• Competency/domain: Patient Care/ Knowledge
for practice
• Learner Outcome: Provide effective evidence-based
patient-centered care for the treatment of
health problems and the promotion of health
• Subdomains: diagnostic tests, history & physical,
prescribing, plan of care
• CHCI’s Model for assessment measurement:
Dreyfus/Benner Novice to Expert
NNPRFTC Accreditation w/CHCI outcomes
• Subdomain: History & physical
• Outcome for History & physical: Perform
comprehensive history and physical exam
• Outcome for diagnostic tests: Order
appropriate screening and diagnostic tests
• Outcome for prescribing: Order appropriate
medications
CHCI NP Residency rating scale
1  Novice             Observes task only: entry-level skills, knowledge, attitudes
2  Advanced Beginner  Needs direct supervision: developing skills, knowledge, attitudes
3  Competent          Needs supervision periodically: developed skills, knowledge, attitudes
4  Proficient         Able to perform without supervision: advanced skills, knowledge, attitudes
5  Expert             Able to supervise others: authority for skills, knowledge, attitudes
0  N/A                Not applicable, not observed, or not performed
CHCI Assessment Protocol
• Residents assessed in 8 competency domain areas
(based on NNPRFTC accreditation curriculum standards)
• Residents complete a self-assessment at baseline,
6 months and 12 months
• Preceptors complete assessment at 6 and 12 months
• Preceptor team develops 1 final assessment for each
resident
Creating Your Assessment Process
• Anchor in the curriculum and program objectives
• What is the evidence/documentation?
• What methods do you want to use?
• Use reliable and valid techniques
• When are you going to collect data?
• Conduct systematic formative (on-going) and
summative (final) data collection
• Create feedback loop – remediation and using the
information
• Measuring the impact
Resources
• Pell Institute: a user-friendly toolbox that steps through every
point in the evaluation process: designing a plan, data collection
and analysis, dissemination and communication, and program improvement.
• CDC: an evaluation workbook for obesity programs; its concepts and
detailed work products can be readily adapted to NP postgraduate programs.
• The Community Tool Box (Work Group for Community Health at the
U of Kansas): a remarkably complete and understandable resource that
provides theoretical overviews, practical suggestions, a tool box,
checklists, and an extensive bibliography.
Resources (cont.)
• Another wonderful resource, Designing Your Program Evaluation
Plans, provides a self-study approach to evaluation for nonprofit organizations
and is easily adapted to training programs. There are checklists and suggested
activities, as well as recommended readings.
• http://edglossary.org/assessment/
• NNPRFTC website – blogs: http://www.nppostgradtraining.com/Education-
Knowledge/Blog/ArtMID/593/ArticleID/2026/Accreditation-Standard-3-
Evaluation
Creating an Evaluation
Process
Kathryn Rugen, PhD, FNP-BC, FAAN, FAANP
VETERANS HEALTH ADMINISTRATION
Objectives
• Explain the development of the NP Residency competency tool
• Describe the validation of the NP Residency competency tool
NP Competency Tool
• Demonstrate program effectiveness
• Standardization across 5 sites
• Document competence in 7 domains
• Prepare for site accreditation
Development
– AACN/CCNE Masters and DNP Essentials
– AACN/NONPF Adult-Gerontology Nurse Practitioner Core Competencies
– NCQA PCMH Standards
– Core Competencies for Interprofessional Collaborative Practice (IPEC)
– ACGME competencies
– VA top outpatient diagnoses
– COE education core domains
– Entrustable Professional Activities
Content validity
• Iterative process
– VA NP experts at each site and an MD education consultant
– Post-graduate NP trainees reviewed and offered suggestions
– Solicited input from experienced and new NPs throughout
VA Primary Care
Domains
• Clinical competency in planning and managing care
• Leadership
• Interprofessional team collaboration
• Patient-centered care
• Shared decision making
• Sustain relationships
• Quality improvement and population management
Methods
• NP resident and mentor complete the competency tool at 1, 6,
and 12 months (69 items total)
• Rate on a 0-5 scale
– 0 = not observed or not performed
– 1 = observes task only
– 2 = needs direct supervision
– 3 = needs supervision periodically
– 4 = able to perform without supervision
– 5 = able to supervise others (aspirational!)
• NP resident responds to open-ended questions
Analysis
• Evaluation questions:
– Identify the items and domains in which NP residents are
strongest and weakest
– Determine how NP residents progress over time
– Determine agreement between trainee and mentor ratings
• Descriptive statistics to evaluate the distributional
characteristics of each item and domain and the effect of time
on trainee and mentor ratings
• T-tests and general linear models to assess the relationship
between NP resident and mentor ratings over time
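As a sketch of the agreement question, a t-test can compare trainee and mentor ratings at a single time point. The numbers below are made-up illustrations, not the study's data; `scipy.stats.ttest_ind` with `equal_var=False` gives Welch's unequal-variance form:

```python
import numpy as np
from scipy import stats

# Illustrative ratings only (not the study's data): subscale means for
# six residents at one time point, as rated by trainees and by mentors.
trainee = np.array([3.2, 3.8, 4.1, 3.5, 4.0, 3.7])
mentor = np.array([4.0, 4.5, 4.3, 4.2, 4.6, 4.4])

# Welch's t-test (unequal variances) for trainee-vs-mentor agreement
t_stat, p_val = stats.ttest_ind(trainee, mentor, equal_var=False)

# A small p-value indicates the two rater groups disagree on average
print(f"t = {t_stat:.2f}, p = {p_val:.4f}")
```

In this toy sample the trainee self-ratings sit below the mentor ratings, so the t statistic is negative; the real analysis would repeat this per domain and time point, or use the general linear models named above for the longitudinal comparison.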
Clinical Competency
Subscale: Clinical Competency in Planning/Managing Care

         Trainee Ratings                              Faculty Ratings
         1 month     6 months    12 months  p-value   1 month     6 months    12 months  p-value
n        37          34          35                   37          34          36
Mean     2.75        3.41        3.75       <.0001    2.94        3.68        4.42       <.0001
SD       .56         .46         1.43                 .60         .49         .50
Range    1.71-3.85   2.28-4.25   0-5.00               1.86-4.59   2.89-5.00   3.50-5.00
[Bar chart: Clinical Competency, mean item ratings (0-5 scale) for
Mentor and Trainee at 1, 6, and 12 months]
Leadership Competency
Subscale: Leadership

         Trainee Ratings                              Faculty Ratings
         1 month     6 months    12 months  p-value   1 month     6 months    12 months  p-value
n        37          34          35                   28          29          36
Mean     1.45        2.41        3.13       <.0001    2.64        3.63        4.44       <.0001
SD       1.35        1.58        1.56                 1.23        .67         .55
Range    0-4.85      0-5.00      0-5.00               1.00-4.33   2.00-5.00   3.20-5.00
[Bar chart: Leadership Competency, mean ratings (0-5 scale) for
Mentor and Trainee at 1, 6, and 12 months across items:
2.1 Lead PACT team huddle; 2.2 Lead case conference; 2.3 Lead team
meeting using conflict mgmt/resolution; 2.4 Lead group educ
activities for pts/fam, PACT team, peers; 2.5 Lead PACT team quality
improvement project; 2.6 Lead shared/group medical appts; 2.7 Apply
leadership strategies to support collaborative practice/team
effectiveness]
Item Analysis: Clinical Competence
• At 1 month, 24 of 28 items were rated between 2 and 3 by the NP
residents (2 = needs direct supervision; 3 = needs supervision
periodically); only four items were rated greater than 3.
• The four items rated higher than 3 were "perform comprehensive
history and physical exam" (3.48), "perform medication
reconciliation" (3.54), "management of hypertension" (3.13), and
"management of obesity" (3.35).
• At the 12-month time point, all items were rated higher than 3,
and seven of the 28 items were rated 4 or higher (able to perform
without supervision) by the NP residents.
• The seven items rated 4 or higher were "perform comprehensive
history and physical exam" (4.17), "order appropriate consults"
(4.11), "perform medication reconciliation" (4.14), "management of
hypertension" (4.08), "management of obesity" (4.11), "management
of gastroesophageal reflux" (4.02), and "management of
osteoarthritis" (4.00).
• At the 12-month time point, the mentors' ratings were all above 4
(4 = able to perform without supervision) except for two items,
"management of military sexual trauma" (3.58) and "management of
traumatic brain injury" (3.66).
Psychometric Analysis
• Internal consistency: the degree to which the items measure the
same attribute
• Cronbach's (coefficient) alpha ranges from 0.00 to 1.00; the
higher the value, the higher the internal consistency
• Internal consistency was calculated by NP resident and mentor for
each domain at each time point; α = 0.82-0.96
• Triangulating quantitative data, qualitative data, and the
end-of-program evaluation further enhances content validity
• Factor analysis will be used for construct validation; it
identifies clusters of related variables
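Cronbach's alpha can be computed directly from an item-response matrix. A minimal numpy sketch of the standard formula (function name and the example ratings are illustrative, not from the study):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (respondents x items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # per-item sample variance
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative: three items that rise and fall together across five
# respondents yield high internal consistency (~0.96 here)
ratings = [[2, 2, 3], [3, 3, 3], [4, 4, 5], [3, 4, 4], [5, 5, 5]]
alpha = cronbach_alpha(ratings)
```

Computed per domain and time point, as the slide describes, this would reproduce the reported α range; values near 1.0 indicate the items within a domain are measuring the same underlying attribute.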
Questions
Kathryn.Rugen@va.gov
Please complete the survey after the session!
NCA Residency Session 7 March 8 2017
Editor's Notes

• #31: Is the trainee performing based on stated outcomes, competencies, and expectations?
– Assessment needs to be anchored in the program objectives
– Outcomes/expectations need to be very clear up front; assessment/evaluation flows from these
• How do we know that the trainee is performing?
– What tools/methods will we use?
– When/how often will we use these tools?
– How does the assessment occur? Who does it?
• What do we do with this information?
– What if the trainee is NOT performing as expected?
– What does it mean for program effectiveness?