EVALUATION 2.0
EVALUATION 2.0 ≠ STATS!
KIRKPATRICK
“In the guidelines… no information is given
on how to use statistics. The subject is too
complex to be included here.” pg 42
5 LEVELS OF EVALUATION
I. Reaction
II. Knowledge
III. Behavior
IV. Society
V. ROI
BASED ON:
Kirkpatrick, D. L., & Kirkpatrick, J. D. (2006). Evaluating training programs: The four levels (3rd ed.). San Francisco: Tata McGraw-Hill. pp. ix-3.
AND:
Phillips, J. J. (2012). Return on investment
in training and performance improvement
programs. Routledge.
AND JOURNAL OF EXTENSION (JOE) ARTICLES
Chazdon, S., Horntvedt, J., & Templin, E. (2016). From knowledge to action: Tips for encouraging and measuring program-related behavior change. Journal of Extension, 54(2).
Clements, J. (1999). Results? Behavior change. Journal of Extension, 37(2).
Martin, E., & Warner, L. A. (2015). Using commitment as a tool to promote behavior change in Extension programming. Journal of Extension, 53(4).
Pratt, C., & Bowman, S. (2008). Principles of effective behavior change: Application to Extension family educational programming. Journal of Extension, 46(5).
THE GOAL IS CHANGE
Desire to change.
Knowledge of what to do and how to do it.
The right climate.
Rewards (benefits) for changing.
CHALLENGES
• Each level becomes more difficult and time-consuming.
• Each level builds on the success of the previous level.
• It is difficult to achieve success at each level, but even more difficult to
measure success at each level.
• Forces your program to strengthen itself or retire.
• Requires planning and long-term commitment.
• You shouldn’t run the entire process for every program.
OPPORTUNITIES
• They like it! Word-of-mouth support from clientele.
• Improved confidence, knowledge, and skills.
• Lives actually improved in a clear and measurable way.
• Society actually improved in a clear and measurable way.
• Systematic long-term administrative, financial, and moral support to continue
your programs.
EXPERIMENTAL DESIGN
Pre-post Tests
Experimental and Control Groups
Random Assignment
Anonymity
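A minimal sketch (hypothetical, for illustration only): randomly assign registrants either to this session (experimental group) or to a waiting list for a later session (control group), give both groups the same pre-test and post-test, and collect responses without names so results stay anonymous.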
THERE IS SUCH A THING AS OVERKILL
Experimental design is not always necessary.
Make sure the final use of the data justifies the time and cost of rigorous data controls.
In Extension we don’t have the luxury of
wasting time collecting data that we don’t
really need.
RETROSPECTIVE SELF-EVALUATION
1 - REACTION
AKA Customer satisfaction
What did you like?
What didn’t you like?
Easiest form of evaluation.
DID YOU LIKE…
Topic
Instructor
Facilities
Schedule
Meals
Activities
Audio/visual
Handouts
Temperature
Etc.
QUANTIFY: LIKERT SCALE
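A sample item (hypothetical, for illustration): “The workshop schedule worked well for me.” 1 = Strongly disagree, 2 = Disagree, 3 = Neutral, 4 = Agree, 5 = Strongly agree. Averaging the responses gives a satisfaction score you can compare across sessions and over time.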
IF SATISFACTION NEEDS TO IMPROVE
Make a change: leader, instructor, location, etc.
Modify the situation: train the leader or instructor, improve the audiovisuals, etc.
Live with it.
Change the standard.
100% RESPONSE RATE
Pencil and paper.
Show of hands.
No anonymity.
DRAFT YOUR OWN QUESTION
USING A LIKERT SCALE
Group Discussion
Group Share
SOMETIMES LIKING THE PROGRAM
ISN’T ENOUGH!
2 - KNOWLEDGE
AKA Knowledge, Skills, and Attitude
Prerequisite for Level 3 behavior change.
More difficult and time-consuming than
Level 1 evaluation.
MAIN POINTS
Use experimental design when practical.
Use content-based tests.
Use a paper-and-pencil test for a 100% response rate.
Use a live performance test for skills.
SKILL VS BEHAVIOR
Skill: What you are able to do. The ability to perform a specific task/action. Short-term.
Behavior: What you actually do. The
cumulative set of specific tasks/actions.
Long-term.
SKILLS
Teach someone how to use Google.
Measurement: Can they find a specific
web page?
Time how long it takes to find a
specific web page.
PROS AND CONS
Self-assessment vs. knowledge/skill-based assessment.
Retrospective vs. pre/post.
RETROSPECTIVE
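A sample retrospective item (hypothetical, for illustration): “Before this program, my knowledge of [topic] was…” and “Now, my knowledge of [topic] is…”, both rated on the same 1-5 scale in a single post-program questionnaire.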
DRAFT YOUR OWN QUESTION
USING A LIKERT SCALE
Group Discussion
Group Share
100% RESPONSE?
50-70% is a more realistic goal.
3 - BEHAVIOR
Level 2 changes are prerequisite to Level 3 behavior change.
Much more complex and time-consuming than Levels 1 and 2.
NIFA
Most important, perhaps, is that the National Institute of Food and Agriculture continues to push for impacts that affect conditions rather than simply knowledge changes. Its effort to collect impacts from across the country encourages evaluation specialists to look beyond knowledge change (National Institute of Food and Agriculture, 2015).
BEHAVIOR CHANGE AND EXTENSION
In 1975, Claude Bennett noted in the
Journal of Extension that behavior change
was among the highest levels of evidence
for evaluation of Extension education
(Bennett, 1975).
Workman and Scheer's (2012) meta-analysis of evaluation articles published in
the Journal of Extension found that about
27% of articles focused on behavior
change.
OPPORTUNITY TO CHANGE
BEHAVIOR
1-3 days after program?
1 month after program?
3 months after program?
6 months?
1 year?
3 years?
TIMING OF EVALUATION
“…it is impossible to predict when a
change in behavior will occur....change in
behavior may occur at any time or it may
not happen at all.” pg 52
EVALUATE BEHAVIOR
When?
Too soon…
Too late…
How frequent?
Different stages of behavior change will be manifested at different points in time.
How to get a good response?
1-3 assessments?
INCENTIVES/REWARDS
Intrinsic – Participants know and respect
you on some personal level. Rapport.
Satisfaction, pride, sense of achievement
in program or behavior itself.
Extrinsic – Praise, recognition, gift cards,
payment, raise.
100% RESPONSE RATE?
Very difficult.
Always consider the costs and benefits.
Something is better than nothing.
Evaluate behavior for about 25-35% of programs.
COSTS AND BENEFITS
“Another important consideration is how
many times the program will be taught. If it
is run only once and it will not be repeated,
there is little justification for spending time
and money to evaluate possible changes
in behavior. However if program is going to
be repeated, the time and money spent
evaluating it can be justified by the
possible improvements in future
programs.” Kirkpatrick, pg 60
“...something beats nothing, and I
encourage trainers to do some evaluating
of behavior even if it isn't elaborate or
scientific.” Kirkpatrick, pg 61
SOMETHING IS BETTER THAN
NOTHING
Sample your participants, even with low response rates (there are pros and cons).
Intentionally single out certain participants for a case study on their behavior.
CHAZDON, HORNTVEDT, TEMPLIN
(JOE)
Participants define their own action plan.
Set their own specific action goals.
Follow-up is expected and more personal.
Requires more class time to implement.
Post-program education/social media
groups.
Social networking, learning “buddies.”
CHAZDON, HORNTVEDT, TEMPLIN
The intensity or length of a program is not
an impediment to measuring behavior
change. Even a one-time, 1-hr workshop
can produce behavior change.
TTM
The move from one stage of change to
another indicates we have made an
impact, even though the practice has not
been completely internalized.
TRANSTHEORETICAL MODEL (TTM)
OF BEHAVIOR CHANGE
Precontemplation
Contemplation
Preparation
Action
Maintenance
MARTIN, WARNER (JOE) 2015
By asking target audience members to
commit to something, Extension
professionals can increase the likelihood
of audience members adopting a new
behavior.
4 - SOCIETY
Workman and Scheer (2012) noted, "Too
often, Extension personnel fail to
document impact of programs by
collecting real evidence of behavior
change or greater end results that benefit
society" (Problem Statement, Purpose,
and Objectives section, para. 1).
CHALLENGES
Clearly identify what your program is
supposed to do.
Clearly measure whether your program accomplishes what you want it to do.
WHAT SHOULD YOUR PROGRAM DO?
What should participants learn and do?
Cumulative behavior changes across all
participants?
Quality of life
For participants specifically.
For society in general.
HOW TO MEASURE
Third-party research!
You don’t have to do it yourself.
OPR (Other People’s Research): there is usually some existing research that helps connect individual behavior change to larger societal benefits.
COSTS
Because of OPR, this is usually not as complex or time-consuming as Level 3.
FAITH
Direct evidence usually does not exist, and collecting it is almost always prohibitively costly.
The ability to extrapolate meaning, believe in it, and sell it is where the real value is found.
TIME AND EFFORT
Level 4 evaluation should not happen on
every program.
10% of all programs (Phillips, 2012).
“HOW TO EVALUATE AT LEVEL 4?”
Culmination of Levels 1-3.
5 – ROI (RETURN ON INVESTMENT)
$$$ input vs $$$ output
MOST DIFFICULT FOR TWO REASONS
Not only because Levels 1-4 must be accomplished first.
Results also have the potential to be either very supportive of, or very threatening to, what we are doing. You won’t know until you actually do it.
FORMULAS
Benefit/Cost Ratio:
BCR = Program Benefits / Program Costs
ROI (%) = Net Program Benefits / Program Costs × 100
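A worked illustration (hypothetical numbers, not from the source): Net Program Benefits = Program Benefits - Program Costs, so ROI (%) = (BCR - 1) × 100. A program that costs $10,000 and produces $25,000 in measurable benefits has BCR = 25,000 / 10,000 = 2.5 and ROI = (25,000 - 10,000) / 10,000 × 100 = 150%.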
CRITERIA
Simple
Economical (3-5% of the total program budget)
Credible
Part of the program from the beginning
5% of your programs (Phillips, 2012).
CHALLENGES
Nobody cares?
May set expectations high for future
programs.
Fear of failure.
Fear of the unknown.
Discipline and planning.
BENEFITS
Indisputable evidence of your program’s worth.
“You will lose money/value if you
choose not to support my program.”
Helps your program compete for support and resources.
Helps you prioritize your own programs.
Forces you to improve long-term
programs.
LEVEL 5 WORKSHEET
ROI (%) = (47,000 - 13,000) / 13,000 × 100 ≈ 262%
BCR = 47,000 / 13,000 ≈ 3.62 (about $3.62 returned for every $1 invested)
CONTACT
Luke Erickson
Erickson@uidaho.edu
208-359-6215
Rexburg, Idaho
