This document provides the agenda for the eighth session of a learning collaborative. It includes time for team reports on successes, challenges, and recruitment updates. It also covers position offers, contracts, onboarding, licensing, credentialing, program evaluation, and accreditation preparation. The next session is scheduled for May 3rd. Action items include monthly reports, drafting contracts and agreements, and preparing questions for a precepting panel.
This presentation covers programme evaluation: planning an evaluation, its requirements and purpose, the steps involved, its uses, stakeholders and their roles in evaluation, finding and analysing evaluation results, standards of effective evaluation, and utilization of findings.
Presentation by Terri Manning, Associate Vice President for Institutional Research/Director of the Center for Applied Research, Central Piedmont Community College; LACCD AtD Liaison at the 2nd Annual LACCD AtD Retreat
A textbook must provide, first and foremost, information to assist the reader in better understanding the topic. Second, it ought to provide the information in a way that can be easily accessed and digested, and it needs to be credible. Textbooks
that have gone through multiple editions continue to improve as a result of reviewers’ comments and readers’ feedback, and this one is no exception. Looking back over the efforts associated with this Fifth Edition, the old wedding custom of “something old, something new, something borrowed, something blue” comes to
mind. We have built upon the solid foundation of previous editions, but then added “something new.” It almost goes without saying that we have “borrowed” from others in that we both cite and quote examples of program evaluation studies
from the literature. “Something blue” . . . well, we’re not sure about that. Those who have used the Fourth Edition might be interested in knowing what has changed in this new edition. Based on reviewers’ comments we have:
• Created a new chapter to explain sampling.
• Incorporated new material on designing questionnaires.
• Overhauled the chapter on qualitative evaluation. It is now “Qualitative and Mixed Methods in Evaluation.”
• Reworked the “Formative and Process Evaluation” chapter with expanded coverage on developing logic models.
• Added new studies and references; new Internet sources of information.
• Included new examples of measurement instruments (scales) with a macro
focus.
• Inserted new checklists and guides (such as ways to minimize and monitor for potential fidelity problems—Chapter 13).
• Revised the chapter “Writing Evaluation Proposals, Reports, and Journal Articles” to give it less of an academic slant. There’s new material on writing
executive summaries and considerations in planning and writing evaluation
reports for agencies.
• Deleted the chapter on Goal Attainment Scaling.
Evaluation is a critical component of public policy and other forms of policy. These slides give a short overview of the relevance of evaluation in every capacity.
Information may be time-sensitive; use it at your own risk. Please check the latest information with Dr. A by emailing bugdoctor@auburn.edu.
The presentation is a systematic and comprehensive formative evaluation plan to investigate the implementation of social studies education for democratic citizenship (SSEDC) in its mature stage. The lead evaluator will select a team to guide and conduct key actions throughout the evaluation process. The plan begins with the Grades K-6 program description, followed by the theoretical framework, including the research questions that will guide the project over a 12-week period. The methodology will be a mixed-methods survey design, using multiple methods to collect quantitative and qualitative data. The sampled target group will include various stakeholders in the school community, including the implementers and others as the need arises. Content and descriptive data analyses are the suggested methods to extract themes and concepts and highlight possible findings influenced by (a) teachers’ understanding of the SSEDC goal; (b) methods used by teachers; and (c) problems the teachers are experiencing during the implementation process. The evidence will form the basis for findings and conclusions, and for recommending strategies for improvement of SSEDC. The evaluation team will put measures in place to promote accurate results and efficient reporting procedures respected by the internal stakeholders – the designers and implementers.
This video is meant for Extension educators, to demonstrate the feasibility and usefulness of an Extension program and evaluation strategy based on specific goals. This presentation is a basic version, and the slideshow contains much more information that is part of continuous improvement. We will share other presentations with more information - so this is just the beginning! For program evaluation/monitoring questions, call 251-331-8416 or email bugdoctor@auburn.edu. To see some of my IPM Program evaluation publications, visit www.aces.edu/go/87 and click on 'IPM Evaluation Toolkit' in the menu. Thank you.
Organizational Capacity-Building Series - Session 6: Program Evaluation (INGENAES)
This session describes different kinds of program evaluations and key evaluation considerations. These presentations are part of a workshop series implemented in Nepal in 2016 as part of the INGENAES initiative.
Presented by:
Dr. Lisa D’Adamo-Weinstein, Director of Academic Support , SUNY Empire State College
Dr. Tacy Holliday, Governance Coordinator, Montgomery College, NCLCA Learning Center Leadership Level
Description: Measuring and evaluating student success is crucial to retention efforts and program development. Join us as we talk about the key elements necessary to measure student success in your tutoring and learning centers. We will assist you in developing an assessment plan for your own center.
Webinar - How to improve the evaluation of complex systems? (Leonardo ENERGY)
Evaluation is intended to provide policy makers and practitioners with feedback and recommendations to improve policy making and implementation. The success of an evaluation and the impact of its findings hinges on the way policy makers and practitioners are involved in and perceive the evaluation process. The more complex the policy, the more challenging this relationship becomes.
Based on the practical example of analysing the evaluation of Defra’s Reward & Recognition Scheme (RRF), this webinar will explore research insights around these topics:
* Evaluation experiences and challenges;
* Relationship between evaluation and policy-making especially looking at the policy cycle; and
* The understanding and potential use of complexity in policymaking.
The end goal is to unpack how to better conduct evaluations of complex systems and improve evidence-backed policymaking, focusing on the UK context but, hopefully, offering lessons that can be extrapolated to other European contexts and to various policy fields.
The COVID-19 pandemic has created several challenges for our country’s health care infrastructure, and the community health center workforce is no exception. Join us as we describe strategies to get patients back into dental care. Along with these strategies, participants will learn how to recognize challenges in dental practices, as well as how to engage the interdisciplinary care team through role redesign and integration to increase access to comprehensive care.
NTTAP Webinar Series - June 7, 2023: Integrating HIV Care into Training and E... (CHC Connecticut)
In order for health centers to provide compassionate and respectful HIV prevention, care, and treatment in comprehensive primary care settings, the clinical workforce must be knowledgeable, confident, and competent in their ability to do so.
We’ll explore the need to integrate HIV care into training and education for the clinical care team, as well as educational models to train the next generation. Using Community Health Center Inc.’s Center for Key Populations Fellowship for Nurse Practitioners (NPs) as a framework for best practices, experts will discuss how to implement specialty care for key populations in your training programs. Additionally, participants will gain awareness of the importance of training the clinical workforce on key population competencies in HIV programs (e.g. HCV, MOUD, LGBTQI+ health, homelessness, and harm reduction).
Utilizing the Readiness to Train Assessment Tool (RTAT™) To Assess Your Capac... (CHC Connecticut)
Improve educational training experiences at your health center by assessing your capacity and infrastructure to host health professions students.
Join the upcoming hands-on interactive activity session to learn how to utilize the Readiness to Train Assessment Tool (RTAT™). This tool was developed by HRSA-funded National Training and Technical Assistance Partners (NTTAP) at Community Health Center, Inc. (CHC) to understand organizational readiness to host health professions student training programs.
NTTAP Webinar Series - May 18, 2023: The Changing Landscape of Behavioral Hea... (CHC Connecticut)
The COVID-19 pandemic has resulted in significant shifts in the mode of care from face-to-face to virtual interactions. Join us as we discuss the challenges currently facing behavioral health care and at least one strategy for addressing each. Along with these strategies, panelists will go over what integrated behavioral health care looked like before COVID-19 and what it looks like now, as well as what actions should be taken going forward to increase access to comprehensive care.
Panelists:
• Dr. Tim Kearney, PhD, Chief Behavioral Health Officer, Community Health Center, Inc.
• Melinda Gladden, LCSW, PMHC, Behavioral Health Clinician, Community Health Center, Inc.
• Jodi Anderson, LMFT, Virtual Telehealth Group Coordinator, Community Health Center, Inc.
NTTAP Webinar Series - April 13, 2023: Quality Improvement Strategies in a Te... (CHC Connecticut)
Join us for a webinar on quality improvement in team-based care!
Building a quality improvement (QI) infrastructure within team-based care is an organizational strategy that will establish a culture of continuous improvement across departments and improve quality in all domains of performance.
Participants will learn about:
• QI infrastructure
• Facilitating QI committees
• Coach training within health centers
Faculty will also provide an example of how trained coaches use QI tools to test and implement changes within an organization.
Implementation of Timely and Effective Transitional Care Management Processes (CHC Connecticut)
Join us to discuss best practices for integrating daily follow-ups for patients recently hospitalized for health emergencies. Effectively following up with patients is a critical responsibility for integrated care teams.
Experts will share how their teams respond to patients to identify care gaps and support the transition of care. Workflow descriptions will provide participants with the tools to support their work to adapt specific steps into their model of team-based care.
Panelists:
• Mary Blankson, DNP, APRN, FNP-C, FAAN, Chief Nursing Officer, Community Health Center, Inc.
• Veena Channamsetty, MD, FAAFP, Chief Medical Officer, Community Health Center, Inc.
• Bibian Ladino-Davis, Behavioral Health Coordinator, Weitzman Institute
Implement Behavioral Health Training Programs to Address a Crucial National S... (CHC Connecticut)
Health centers are uniquely positioned to address the unprecedented need for behavioral health services but are challenged by the workforce shortage. Participants will gain the knowledge needed to begin conceptualization of a training pathway.
Join us to discuss the considerations of sponsoring an in-house training program across all educational levels, including the benefits, program structure, design, curriculum, supervisors' role, and required resources.
Experts will provide participants with examples from practicum and postdoctoral level training programs to help them gain confidence in developing a behavioral health training pathway.
HIV Prevention: Combating PrEP Implementation Challenges (CHC Connecticut)
Expert faculty present case-based scenarios illustrating common challenges to integrating HIV PrEP in primary care. As part of improving clinical workforce development, this session will delve into a variety of specific PrEP implementation challenges. Participants will leave with strategies to overcome these obstacles to establish or strengthen their PrEP program.
Panelists:
• Marwan Haddad, MD, MPH, AAHIVS, Medical Director, Center for Key Populations, Community Health Center, Inc.
• Jeannie McIntosh, APRN, FNP-C, AAHIVS, Family Nurse Practitioner, Center for Key Populations, Community Health Center, Inc.
NTTAP Webinar Series - December 7, 2022: Advancing Team-Based Care: Enhancing... (CHC Connecticut)
Join us as expert faculty outline the differences between case management, care coordination, and complex care management to frame a discussion on strategies to leverage effective models for both in-person and remote services.
Expert faculty will discuss the role of the medical assistant and the nurse in care management, as well as how standing orders and delegated orders support this work. This session will discuss how telehealth and remote patient monitoring enhancements can support complex care management for patients with chronic conditions.
Participants will leave this session with the knowledge and tools to begin or enhance implementation of chronic care management by enhancing the role of the medical assistant, nurse and the technology that supports the clinical care.
Panelists:
• Mary Blankson, DNP, APRN, FNP-C, Chief Nursing Officer, Community Health Center, Inc.
• Tierney Giannotti, MPA, Senior Program Manager, Population Health, Community Health Center Inc.
NTTAP Webinar: Postgraduate NP/PA Residency: Discussing your Key Program Staf... (CHC Connecticut)
Expert faculty will discuss the drivers, benefits, and processes of implementing a postgraduate residency training program at your health center. This session will dive deeper into a discussion on the responsibilities of key program staff, preceptors, mentors, and faculty for successful implementation. This webinar will equip participants with a road map to go from planning to implementation and offer an opportunity for coaching support.
Panelists:
• Program Director of the Nurse Practitioner Residency Program, Charise Corsino, MA
• Clinical Program Director of the Nurse Practitioner Residency Program, Nicole Seagriff, DNP, APRN, FNP-BC
Training the Next Generation within Primary Care (CHC Connecticut)
This webinar discussed the various avenues of workforce development including:
• training non-clinical roles
• the value of an administrative fellowship
• the key questions to ask before establishing a fellowship at your agency
The discussion referenced CHC Chief Operating Officer Meredith Johnson and CHC Project Manager Megan Coffinbargar’s publication “Establishing an Administrative Fellowship Program: A Practical Toolkit to Support and Develop Future Community Health Center Leaders” for the National Association of Community Health Centers (NACHC).
Panelists:
• April Joy Damian, PhD, MSc, CHPM, PMP, Vice President and Director of the Weitzman Institute, Community Health Center, Inc.
• Megan Coffinbargar, MHA, Project Manager, Optimizing Virtual Care Initiative, Community Health Center, Inc.
Global launch of the Healthy Ageing and Prevention Index 2nd wave – alongside... (ILC-UK)
The Healthy Ageing and Prevention Index is an online tool created by ILC that ranks countries on six metrics: life span, health span, work span, income, environmental performance, and happiness. The Index helps us understand how well countries have adapted to longevity and informs decision makers about what must be done to maximise the economic benefits that come with living well for longer.
Alongside the 77th World Health Assembly in Geneva on 28 May 2024, we launched the second version of our Index, allowing us to track progress and give new insights into what needs to be done to keep populations healthier for longer.
The speakers included:
Professor Orazio Schillaci, Minister of Health, Italy
Dr Hans Groth, Chairman of the Board, World Demographic & Ageing Forum
Professor Ilona Kickbusch, Founder and Chair, Global Health Centre, Geneva Graduate Institute and co-chair, World Health Summit Council
Dr Natasha Azzopardi Muscat, Director, Country Health Policies and Systems Division, World Health Organisation EURO
Dr Marta Lomazzi, Executive Manager, World Federation of Public Health Associations
Dr Shyam Bishen, Head, Centre for Health and Healthcare and Member of the Executive Committee, World Economic Forum
Dr Karin Tegmark Wisell, Director General, Public Health Agency of Sweden
Defecation
Normal defecation begins with movement in the left colon, moving stool toward the anus. When stool reaches the rectum, the distention causes relaxation of the internal sphincter and an awareness of the need to defecate. At the time of defecation, the external sphincter relaxes, and abdominal muscles contract, increasing intrarectal pressure and forcing the stool out.
The Valsalva maneuver exerts pressure to expel feces through a voluntary contraction of the abdominal muscles while maintaining forced expiration against a closed airway. Patients with cardiovascular disease, glaucoma, increased intracranial pressure, or a new surgical wound are at greater risk for cardiac dysrhythmias and elevated blood pressure with the Valsalva maneuver and need to avoid straining to pass stool.
Normal defecation is painless, resulting in passage of soft, formed stool.
CONSTIPATION
Constipation is a symptom, not a disease. Improper diet, reduced fluid intake, lack of exercise, and certain medications can cause constipation. For example, patients receiving opiates for pain after surgery often require a stool softener or laxative to prevent constipation. The signs of constipation include infrequent bowel movements (less than every 3 days), difficulty passing stools, excessive straining, inability to defecate at will, and hard feces.
IMPACTION
Fecal impaction results from unrelieved constipation. It is a collection of hardened feces wedged in the rectum that a person cannot expel. In cases of severe impaction the mass extends up into the sigmoid colon.
DIARRHEA
Diarrhea is an increase in the number of stools and the passage of liquid, unformed feces. It is associated with disorders affecting digestion, absorption, and secretion in the GI tract. Intestinal contents pass through the small and large intestine too quickly to allow for the usual absorption of fluid and nutrients. Irritation within the colon results in increased mucus secretion. As a result, feces become watery, and the patient is unable to control the urge to defecate. An anal bag is normally safe and effective for long-term management of fecal incontinence at home, in hospice, or in the hospital. Fecal incontinence is an expensive and potentially dangerous condition in terms of contamination and risk of skin ulceration.
HEMORRHOIDS
Hemorrhoids are dilated, engorged veins in the lining of the rectum. They are either external or internal.
FLATULENCE
As gas accumulates in the lumen of the intestines, the bowel wall stretches and distends (flatulence). It is a common cause of abdominal fullness, pain, and cramping. Normally intestinal gas escapes through the mouth (belching) or the anus (passing of flatus)
FECAL INCONTINENCE
Fecal incontinence is the inability to control passage of feces and gas from the anus. Incontinence harms a patient’s body image.
PREPARATION AND GIVING OF LAXATIVES (ACCORDING TO POTTER AND PERRY)
An enema is the instillation of a solution into the rectum and sigmoid colon.
Leading the Way in Nephrology: Dr. David Greene's Work with Stem Cells for Ki... (Dr. David Greene Arizona)
As we watch Dr. Greene's continued efforts and research in Arizona, it's clear that stem cell therapy holds a promising key to unlocking new doors in the treatment of kidney disease. With each study and trial, we step closer to a world where kidney disease is no longer a life sentence but a treatable condition, thanks to pioneers like Dr. David Greene.
CHAPTER 1 SEMESTER V - ROLE OF PEADIATRIC NURSE.pdf (Sachin Sharma)
Pediatric nurses play a vital role in the health and well-being of children. Their responsibilities are wide-ranging, and their objectives can be categorized into several key areas:
1. Direct Patient Care:
Objective: Provide comprehensive and compassionate care to infants, children, and adolescents in various healthcare settings (hospitals, clinics, etc.).
This includes tasks like:
Monitoring vital signs and physical condition.
Administering medications and treatments.
Performing procedures as directed by doctors.
Assisting with daily living activities (bathing, feeding).
Providing emotional support and pain management.
2. Health Promotion and Education:
Objective: Promote healthy behaviors and educate children, families, and communities about preventive healthcare.
This includes tasks like:
Administering vaccinations.
Providing education on nutrition, hygiene, and development.
Offering breastfeeding and childbirth support.
Counseling families on safety and injury prevention.
3. Collaboration and Advocacy:
Objective: Collaborate effectively with doctors, social workers, therapists, and other healthcare professionals to ensure coordinated care for children.
Objective: Advocate for the rights and best interests of their patients, especially when children cannot speak for themselves.
This includes tasks like:
Communicating effectively with healthcare teams.
Identifying and addressing potential risks to child welfare.
Educating families about their child's condition and treatment options.
4. Professional Development and Research:
Objective: Stay up-to-date on the latest advancements in pediatric healthcare through continuing education and research.
Objective: Contribute to improving the quality of care for children by participating in research initiatives.
This includes tasks like:
Attending workshops and conferences on pediatric nursing.
Participating in clinical trials related to child health.
Implementing evidence-based practices into their daily routines.
By fulfilling these objectives, pediatric nurses play a crucial role in ensuring the optimal health and well-being of children throughout all stages of their development.
Navigating the Health Insurance Market_ Understanding Trends and Options.pdf (Enterprise Wired)
From navigating policy options to staying informed about industry trends, this comprehensive guide explores everything you need to know about the health insurance market.
1. AGENDA- Learning Collaborative Session 8
April 6, 3:00-4:30pm (EST)
Team Report Outs
Successes and Challenges, Recruitment updates, Questions for Faculty
Position Offers/Contracts/Agreements
Onboarding/Licensing/Credentialing
Evaluation of the Program
Preparing for Accreditation
Action Period Items
Monthly Reports
Draft Contracts/Agreements
Precepting Panel questions
Next Session:
May 3
3. Offers, Contracts and Agreements
OFFERS:
• Determine how and when to communicate offers. Offers and
declinations are communicated using the ranking log
• Determine length of time for a decision – at CHC this is 48 hours.
• In the case of a “tie”, interviewers must discuss candidates and
choose.
• Prepare for “back up offers” and a waiting list
4. Contracts and Agreements
• A formal employment contract should immediately follow the offer.
• Determine method of delivery (electronic or direct mail) and length of time to
return signed contract
• The contract can be a modified version of your organization’s existing
employment contract. Items that may differ in the contract include:
• Term of the contract – 12-month residency program
• Practice location
• Salary
• PTO
• CME
• Employment requirement post residency year- determine length of
commitment and subsequent year salaries.
6. Licensing and Credentialing
• Offers have been made and accepted – start immediately!
• The process is a domino effect and timelines are short
• Follow your organization’s general policy – adjust as needed
• Be prepared for delays based on the states candidates come from
• Guide your candidates through the process and keep track of
their status
7. Licensing and Credentialing
NP Residents
1. Sit for and pass boards
2. Apply for state RN license
3. Apply for state APRN license
4. Apply for state controlled substance license
5. Apply for federal DEA license
Post Doc Residents
1. Post docs are unlicensed and work under the
supervisor’s license.
2. Verify that work under another’s license is a billable
service in your state; there is wide variability. In
CT, Husky (Medicaid) is billable but most private
insurances are not
3. Be aware of licensing requirements in your state or in
the state where the post doc wishes to seek licensure, and
provide appropriate supervision and documentation
8. Onboarding
• In addition to the licensing and credentialing process, Residents
must be onboarded
• Leverage your HR department to help apply the organization’s
process for onboarding all new staff
• HR connects with Residents prior to start date and is also
invited to orientation
• Residents are employees and their onboarding should look
very similar
• We will cover orientation in more detail later!
10. Overview of the Session
Definitions and Process of Good Program Evaluation
How to Design Meaningful Evaluation
– Integrated Throughout the Program – Recruitment to
Graduation
– Creates explicit expectations for trainee
– Documents programmatic success
– Fosters improvement, positive growth, creativity and innovation
Characteristics of Useful Evidence
11. Learning Objectives
Knowledge:
– Understand the purpose of evaluation
– Know the characteristics of good evaluation
– Understand the process of evaluation
– Understand the connection with curriculum
Attitude:
– Embrace the challenge
– Value the outcomes
Skills:
– To be gained by independent / group work focused
on local training program
12. Definitions:
Evaluation: systematic investigation of merit, worth, or significance of effort;
Program evaluation: evaluate specific projects and activities that target
audiences may take part in;
Stakeholders: those who care about the program or effort.
Approach: practical, ongoing evaluation involving program participants, community
members, and other stakeholders.
Importance:
1. Helping to clarify program plans;
2. Improving communication among participants and partners;
3. Gathering the feedback needed to improve and be accountable for program
outcomes/effectiveness;
4. Gaining insight about best practices and innovation;
5. Determining the impact of the program;
6. Empowering program participants and contributing to organizational growth.
13. 1. Develop a Written Plan Linked to Curriculum
2. Collect Data
3. Analyze Data
4. Communicate and Improve
4 Basic Steps to Program Evaluation
14. Fitting the Pieces Together: Program Evaluation
Program Curriculum
Preceptor / Faculty / Staff
Trainee
Institution
Overall Program
15. Program Evaluation Feedback Loops
Trainee performance
Instructor and staff performance
Program curriculum performance
Programmatic and institutional performance
16. Evaluation Process: How Do You Do It?
Steps in Evaluation:
Engage stakeholders
Describe the program
Focus the evaluation design
Gather credible evidence
Justify conclusions: analyze, synthesize, and interpret
findings; provide alternate explanations
Ensure use and share lessons learned: feedback,
follow up, and disseminate
17. Level 1: Reaction (satisfaction surveys) Was it worth the time? Was it
successful? What were the biggest strengths/weaknesses? Did they like
the physical plant?
Level 2: Learning (observations/interviews) Observable, measurable
behavior change before, during, and after the program.
Level 3: Behavior (observations/interviews) New or changed behavior
on the job? Can they teach others? Are trainees aware of the change?
Level 4: Results (program goals/institutional goals) Improved
employee retention? Increased productivity for new employees?
Higher morale?
Kirkpatrick Model of Evaluation
18. • What will be evaluated?
• What criteria will be used to judge program
performance?
• What standards of performance on the criteria must be
reached for the program to be considered successful?
• What evidence will indicate performance on the criteria
relative to the standards?
• What conclusions about program performance are
justified based on the available evidence?
Questions Guiding the Evaluation Process
19. Basic Questions – Administrative Example
What? Postgraduate Training Program
Criteria?
# of qualified applicants; # of trainees who remain with the
program; ROI
Standards of Performance?
# applicants; Half of trainees hired at conclusion of year;
Onboarding costs reduced; Billable hours increase with ramp-up
Evidence?
HR data / reports; Financials
Conclusions? Is the investment worthwhile?
20. Accuracy, Utility, Feasibility, Propriety
Anchored in the goals and objectives of the curriculum
Formative and summative
Use measurable and observable criteria of acceptable performance
Multiple, expert ratings/raters: Multiple observations give confidence in findings and
provide an estimate of reliability (reproducibility or consistency in ratings).
Conclusions need to be relevant and meaningful. Validity is based on a synthesis of
measurements that are commonly accepted, meaningful, and accurate (to the extent
that expert judgments are accurate).
Goals of Good Evaluation
21. Credible evidence -- Raw material of a good evaluation.
Believable, trustworthy, and relevant answers to evaluation questions
Indicators (evidence)
Translate general concepts about program and expected effects into specific, measurable parts (e.g.,
increase in patient panel / billable hours over 1 year).
Sources
People, documents, or observations (e.g., trainees, faculty, patients, billable hours, reflective journals).
Use multiple sources -- enhances the evaluation's credibility.
Integrate qualitative and quantitative information -- more complete and more useful for needs and
expectations of a wider range of stakeholders.
Quantity
Determine how much evidence will be gathered in an evaluation.
All evidence collected should have a clear, anticipated use.
Logistics
Written Plan: Methods, timing (formative and summative), physical infrastructure to gather/handle
evidence.
Must be consistent with cultural norms of the community, must ensure confidentiality is protected.
22. Learning Objectives
Knowledge:
– Understand the goals and purpose of evaluation
– Know the characteristics of good evaluation
– Understand the process of evaluation
– Understand the connection with curriculum
Attitude:
– Embrace the challenge
– Value the outcomes
Skills:
– To be gained by independent / group work focused on local
training program
23. The Community Tool Box, (Work Group for Community Health at the U of Kansas):
incredibly complete and understandable resource, provides theoretical overviews,
practical suggestions, a tool box, checklists, and an extensive bibliography.
Pell Institute: user-friendly toolbox that steps through every point in the evaluation
process: designing a plan, data collection and analysis, dissemination and
communication, program improvement.
CDC has an evaluation workbook for obesity programs; concepts and detailed work
products can be readily adapted to NP postgraduate programs.
Another wonderful resource, Designing Your Program Evaluation Plans, provides a self-
study approach to evaluation for nonprofit organizations and is easily adapted to training
programs. There are checklists and suggested activities, as well as recommended
readings.
NNPRFTC website – blogs: http://www.nppostgradtraining.com/Education-
Knowledge/Blog/ArtMID/593/ArticleID/2026/Accreditation-Standard-3-Evaluation
Resources:
24. Action Items
Action Period Items
Monthly Reports
Draft Contracts/Agreements
Precepting Panel questions
Next Session:
May 3
Editor's Notes
Hello – I am so pleased to join you today. I want to thank CHC, the Weitzman Institute, and the NCA project team for inviting me here to discuss program evaluation and its relevance for postgraduate training programs. I am Candice Rettie, the Executive Director of the National Nurse Residency and Training Consortium, called the Consortium or NNPRFTC, for short.
Next slide please.
To gain insight. This happens, for example, when deciding whether to use a new approach (e.g., would a neighborhood watch program work for our community?). Knowledge from such an evaluation will provide information about its practicality. For a developing program, information from evaluations of similar programs can provide the insight needed to clarify how its activities should be designed.
To improve how things get done. This is appropriate in the implementation stage, when an established program tries to describe what it has done. This information can be used to describe program processes, to improve how the program operates, and to fine-tune the overall strategy. Evaluations done for this purpose include efforts to improve the quality, effectiveness, or efficiency of program activities.
To determine what the effects of the program are. Evaluations done for this purpose examine the relationship between program activities and observed consequences. For example, are more students finishing high school as a result of the program? Programs most appropriate for this type of evaluation are mature programs that are able to state clearly what happened and who it happened to. Such evaluations should provide evidence about what the program's contribution was to reaching longer-term goals such as a decrease in child abuse or crime in the area. This type of evaluation helps establish the accountability, and thus, the credibility, of a program to funders and to the community.
To affect those who participate in it. The logic and reflection required of evaluation participants can itself be a catalyst for self-directed change. And so, one of the purposes of evaluating a program is for the process and results to have a positive influence. Such influences may:
Empower program participants (for example, being part of an evaluation can increase community members' sense of control over the program);
Supplement the program (for example, using a follow-up questionnaire can reinforce the main messages of the program);
Promote staff development (for example, by teaching staff how to collect, analyze, and interpret evidence); or
Contribute to organizational growth (for example, the evaluation may clarify how the program relates to the organization's mission).
Develop a Plan
Collect Data
Analyze Data
Communicate & Improve
5 Components:
Program curriculum;
Trainee -- performance, feedback, and remediation as necessary;
Clinical faculty/instructor and support staff -- performance, feedback, and remediation as necessary;
Organizational -- Adequacy of support including operations and finances;
Overall programmatic self-evaluation -- including outcome measures and corresponding action plans.
Throughout the steps in evaluation, always keep in mind the Standards for good evaluation:
Engage stakeholders
Describe the program
Focus the evaluation design
Gather credible evidence
Justify conclusions -- Analyze, synthesize and interpret findings – provide alternate explanations
Ensure use and share lessons learned – Feedback, follow up and disseminate
The 4 Standards are used to assess the quality of the evaluation -- described in more detail in the Community Tool Box resource prepared by the University of Kansas -- and include the accuracy and meaningfulness of the data.
As you design your evaluation process, these are important questions to ask the evaluation group. The answers will determine how effective and how meaningful your evaluation is. You'll want to revisit these questions periodically. You may find that a specific criterion -- the number of children who receive vaccinations -- is difficult to gather, because they get vaccinated at other sites.
And the last question -- what conclusions can be drawn? Put on the hat of someone who doesn't know your catchment area or what you do -- are there other explanations for the outcomes? What is the context of the outcomes?
What – entity
Criteria --
This requires thinking broadly about what counts as "evidence." Such decisions are always situational; they depend on the question being posed and the motives for asking it
Indicators
The goal is to collect information that will convey a credible, well-rounded picture of the program and its efforts.
Having credible evidence strengthens the evaluation results as well as the recommendations that follow from them. One way to enhance the evaluation's overall credibility is by using multiple procedures for gathering, analyzing, and interpreting data. Encouraging participation by stakeholders can also enhance perceived credibility. When stakeholders help define questions and gather data, they will be more likely to accept the evaluation's conclusions and to act on its recommendations.
Sources
The criteria used to select sources should be clearly stated so that users and other stakeholders can interpret the evidence accurately and assess whether it may be biased. Using multiple sources provides an opportunity to include different perspectives about the program and enhances the evaluation's credibility. In addition, some sources provide information in narrative form (for example, a person's experience when taking part in the program) and others are numerical (for example, how many people were involved in the program).
Logistics
By logistics, we mean the methods, timing, and physical infrastructure for gathering and handling evidence, all captured in a written plan.
Techniques for gathering evidence in an evaluation must be in keeping with the cultural norms of the community. Data collection procedures should also ensure that confidentiality is protected.