This chapter discusses using a program's logic model to focus an evaluation. A logic model shows the relationships among a program's inputs, activities, outputs, and outcomes. Because evaluating an entire program at once is impractical, an evaluation must select which specific elements to examine.
The chapter outlines three types of evaluation questions: effort, effectiveness, and efficiency. Effort questions relate to inputs and activities, effectiveness questions relate to outputs and outcomes, and efficiency questions relate to costs and benefits. It also distinguishes process evaluations, which focus on inputs and activities, from outcome evaluations, which focus on outcomes and goals, and provides guidance on using a logic model to develop the questions that focus an evaluation.
Workbook for Designing a Process Evaluation
Produced for the Georgia Department of Human Resources, Division of Public Health
By Melanie J. Bliss, M.A., and James G. Emshoff, Ph.D.
Department of Psychology, Georgia State University
July 2002
Evaluation Expert Session, July 16, 2002
What is process evaluation?
Process evaluation uses empirical data to assess the delivery of programs. In contrast to outcome evaluation, which assesses the impact of the program, process evaluation verifies what the program is and whether it is being implemented as designed. Thus, process evaluation asks "what," and outcome evaluation asks "so what?"
When conducting a process evaluation, keep in mind these three questions:
1. What is the program intended to be?
2. What is delivered, in reality?
3. Where are the gaps between program design and delivery?
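The three questions above can be sketched as a simple comparison of planned versus actual delivery. This is only an illustrative sketch; the program components and session counts below are invented, not drawn from the workbook.

```python
# Hypothetical sketch of the three process-evaluation questions:
# what was intended, what was delivered, and where the gaps are.
# All component names and counts are invented for illustration.

intended = {"intake screening": 12, "group sessions": 24, "home visits": 6}
delivered = {"intake screening": 12, "group sessions": 15}

# Q3: components delivered less often than the design called for
gaps = {
    component: intended[component] - delivered.get(component, 0)
    for component in intended
    if delivered.get(component, 0) < intended[component]
}
print(gaps)  # shortfall per under-delivered component
```

In practice the "delivered" side would come from service records or a management information system, as discussed later in the workbook.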
This workbook will serve as a guide for designing your own process evaluation for a program of your choosing. Implementing a process evaluation involves many steps, and this workbook will direct you through some of the main stages. It will be helpful to think of a service delivery program that you can use as your example as you complete these activities.
Why is process evaluation important?
1. To determine the extent to which the program is being implemented according to plan
2. To assess and document the degree of fidelity and variability in program implementation, expected or unexpected, planned or unplanned
3. To compare multiple sites with respect to fidelity
4. To provide validity for the relationship between the intervention and the outcomes
5. To provide information on which components of the intervention are responsible for outcomes
6. To understand the relationship between program context (i.e., setting characteristics) and program processes (i.e., levels of implementation)
7. To provide managers with feedback on the quality of implementation
8. To refine delivery components
9. To provide program accountability to sponsors, the public, clients, and funders
10. To improve the quality of the program, as the act of evaluating is itself an intervention
Stages of Process Evaluation
1. Form Collaborative Relationships
2. Determine Program Components
3. Develop Logic Model
4. Determine Evaluation Questions
5. Determine Methodology
6. Consider a Management Information System
7. Implement Data Collection and Analysis
8. Write Report
Also included in this workbook:
a. Logic Model Template
b. Pitfalls to avoid ...
Handout #5 - QIAMay4
Chapter 4

Using Your Logic Model To Focus the Evaluation

In Chapter 3, you and your evaluation team worked together to create or review a logic model that depicts the different components of your program and their relationship to one another. In this chapter, you will be using your completed logic model to focus your evaluation effort. As you have seen, even fairly straightforward programs have many different activities, outputs, and outcomes associated with them. Evaluating your entire program at the same time would be an impossible task. Therefore, you and your evaluation team must decide which specific elements of your program you will evaluate at this time. This chapter will guide you in using your logic model to select the specific aspects of your program you will evaluate.

Key terms in this chapter:
• Evaluation questions
• Effort questions
• Effectiveness questions
• Efficiency questions
• Process evaluation
• Outcome evaluation
Reviewing Your Logic Model
Remember that the major components of the logic model are:
• Inputs
• Activities
• Outputs
• Outcomes (immediate, intermediate, and long-term)
• Goals
Looking at different parts of your logic model will allow you to answer different kinds of questions about your program (evaluation questions). An evaluation can answer three basic types of questions: effort, effectiveness, and efficiency questions. Each type of question corresponds to a particular portion of the logic model.
SELF-EVALUATION WORKBOOK
Measures at Each Program Level
Effort questions. What services did we actually provide and to whom? For example, How many youth did we match with mentors? Effort questions most often relate to the inputs and activities sections of the logic model.
Effectiveness questions. Did we achieve the immediate results and/or long-term outcomes that we wanted? For example, Did the kids in our mentoring program miss fewer days of school this semester than nonmentored kids? These types of questions address the outputs, outcomes, and goals sections of your logic model.
As you consider the potential effectiveness questions you might want to ask in your
evaluation, it is important to think about the differences between outcomes and goals.
Remember the discussion in Chapter 3 of the differences between the various program
components. Goals were defined as broad statements of purpose for your community
and agency. Most often, they are long-term changes achieved by several programs
working together. As you think about your evaluation questions, it is tempting to try to
design your evaluation to show that your mentoring program achieves its stated goals,
for example, reducing the dropout level in your community. However, it is highly
unlikely that a local evaluation of a particular program will reveal goal achievement. In
other words, it is unlikely that an evaluation of your mentoring program will
demonstrate goal attainment: that your community’s dropout rates have decreased.
It is far better for programs to design evaluations around program components where they are more likely to be able to see that they have effected a change. For example, though you most likely cannot show that your program has had an effect on your community's dropout rates, you can design an evaluation that will allow you to measure whether your mentoring program has had an impact on students' grades, attitudes toward school, self-esteem, or other outcomes that are related to the overall goal of reducing dropout rates. In other words, it is much more practical, and this workbook strongly recommends, that you design your effectiveness questions around outcomes rather than goals.
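To make the school-attendance example concrete, answering an effectiveness question often comes down to a simple comparison between participants and nonparticipants. The sketch below uses entirely hypothetical attendance figures; substitute the data your own team collects.

```python
# A minimal sketch of answering an effectiveness question: did mentored
# students miss fewer days of school than nonmentored students?
# All numbers below are hypothetical and for illustration only.
from statistics import mean

mentored_absences = [2, 0, 3, 1, 2, 1]      # days missed this semester
nonmentored_absences = [4, 6, 3, 5, 2, 4]

difference = mean(nonmentored_absences) - mean(mentored_absences)
print(f"Mentored students missed {difference:.1f} fewer days on average")
# prints: Mentored students missed 2.5 fewer days on average
```

A raw difference in averages like this is only a starting point; a real evaluation would also consider group sizes and how the comparison group was selected.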
Efficiency questions. These questions address the cost per unit of service related to the
benefits achieved. For example, How much did it cost to provide mentoring to all the youth
enrolled in our program? Is this more or less costly based on the benefits attained than other types of
interventions we could have provided? Most efficiency questions can be answered
meaningfully only after long-term outcomes and goals have been assessed. You may be
able to calculate the cost of providing mentoring to each youth in your program, and
this information may have a variety of uses. However, you cannot weigh the cost
against the benefit of this intervention without assessing other long-term outcomes and
goals and without knowing the cost of other possible alternatives.
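The cost-per-unit arithmetic behind an efficiency question can be sketched in a few lines. The figures and the alternative intervention below are hypothetical placeholders, not data from any real program.

```python
# Cost per unit of service: total program cost divided by units served.
# All dollar amounts and enrollment counts are hypothetical.
def cost_per_unit(total_cost, units_served):
    """Cost of providing one unit of service (here, mentoring one youth)."""
    return total_cost / units_served

# Suppose the mentoring program spent $45,000 to serve 30 youth, and a
# hypothetical alternative (tutoring) spent $24,000 to serve 20 youth.
mentoring = cost_per_unit(45_000, 30)
tutoring = cost_per_unit(24_000, 20)
print(f"Mentoring: ${mentoring:,.0f} per youth; tutoring: ${tutoring:,.0f} per youth")
# prints: Mentoring: $1,500 per youth; tutoring: $1,200 per youth
```

As the chapter notes, the per-unit cost alone does not answer the efficiency question; you must also weigh the benefits each intervention produces before comparing them.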
Thus far, we have discussed evaluation in terms of effort, effectiveness, and efficiency
questions. Each of these types of questions provides a particular kind of information
that can be useful to agencies in a variety of areas.
[Figure: Measures at Each Program Level. Effort questions span the inputs and activities portions of the logic model; effectiveness questions span the outputs, outcomes, and goal portions; efficiency questions span the entire model, from inputs through outcomes and goal.]
Process versus Outcome Evaluation

Two other terms that are common in evaluation are process evaluation and outcome evaluation. These terms refer to the types of questions that your evaluation is asking. Process evaluations focus on inputs and activities. They look at whether your program is doing what it set out to do in terms of inputs, activities, and outputs. Process evaluations are concerned with effort questions. Outcome evaluations, on the other hand, are concerned with whether your program is achieving what it set out to achieve in terms of immediate, intermediate, or long-term outcomes or in terms of goals. Outcome evaluations are concerned with effectiveness questions. Both types of evaluations have merit and can provide useful information to programs.

It is important to remember that you and your evaluation team can design a self-evaluation of your mentoring program that has both process and outcome elements. While one part of your evaluation might focus on whether you are recruiting as many mentors as you had hoped to recruit, another aspect may focus on how mentoring is affecting school performance. There is no reason why you cannot ask effort and effectiveness questions in the same evaluation.

You and your evaluation team do, however, want to avoid falling into the trap of asking too many evaluation questions at the same time. Often, evaluation teams want to answer all of the questions they have about their programs the first time they design and implement an evaluation. It is far better to focus on only one or two aspects of your program than to try to answer too many questions at once. For your first evaluation effort, select one or two evaluation questions. Remember that evaluation is an ongoing process; any questions that you and your evaluation team do not address in this evaluation can be answered in the next.
What you will need: Worksheet 4.1: Developing Your Evaluation Questions

Worksheet 4.1: Developing Your Evaluation Questions will help you begin to think about specific evaluation questions you might want to ask. Try to think of at least one evaluation question for each level of the logic model. If your list of questions is still too long, how will you know which questions to pursue? One way is to identify which questions will provide you with the information that you actually can use. This may sound simple, but often program staff have never actually considered what they will do with the information they gather or how the lessons learned might be used to change the direction of their program. Refer back to the work you and your evaluation team completed in Chapter 1.
• What did each member of the team hope to gain from the evaluation?
• What did they want to do with the information?
• Which of the questions you have listed on Worksheet 4.1 best supports those purposes?
Other questions that can help you and your evaluation team members develop the
evaluation questions follow:
• What do we need to know about our program to solve a problem we have identified?
• When we have the information, who will be interested in knowing it?
• Why do we need to know this?
• What can and will we do differently based on this information?
• If we wanted to make a convincing marketing presentation about the program, what
would we like to be able to say?
Conclusion
In this chapter, you reviewed the components of the logic model that were introduced
in Chapter 3. It is possible to measure components at each level (input, activity, output,
outcome). Additionally, measurement at each level tends to produce answers to one of
three basic questions: effort, efficiency, and effectiveness. Finally, you and your team
developed a preliminary list of different evaluation questions and selected evaluation
questions that are more useful to your organization at this time. Now that you and your
team have developed your evaluation questions, it is time to begin thinking about when
and whom to measure and what type of design to use.
BEFORE YOU LEAVE THIS CHAPTER, MAKE SURE YOU—
have completed Worksheet 4.1: Developing Your Evaluation Questions;
understand the different components of your program’s logic
model and recognize that it is possible to measure at each level; and
have made a preliminary selection of evaluation questions to be
pursued at this time.
PROGRAM EXAMPLE: THE CAN SELF-EVALUATION
After reviewing their completed logic model, the CAN evaluation team used
Worksheet 4.1: Developing Your Evaluation Questions to create a list of potential
effort, effectiveness, and efficiency questions that their evaluation could answer.
Keeping in mind that one of their primary purposes was to discover information
that could support their grant applications for continued funding, the CAN focused
on elements of their logic model that concerned academic performance and ATOD
resistance. After listing various types of evaluation questions (see Appendix B for a
copy of their completed Worksheet 4.1), the CAN evaluation team decided to focus
on three effectiveness questions:
• Are students more likely to complete homework on time and correctly after participating in the mentoring program?
• Do students' attitudes toward ATOD use change after participation in the mentoring program?
• Do students receive higher grades after participating in the mentoring program?