AFCPE 2016 Symposium Workshop: Measuring & Reporting the Impact of Financial Education
1. Measuring and Reporting the Impact of Financial Education
Barbara O’Neill, Ph.D., CFP®, CRPC®
Distinguished Professor, Rutgers University
oneill@aesop.rutgers.edu
3. The Million Dollar “So What?” Question…
At the end of the day…did your educational program make a difference?
How do you know?
4. We Are in an “Accountability Era”
• What gets measured gets funded
• With evaluation results, you can…
– assess the impact of programs on learners
– see if you accomplished what you planned
– know if a program was “worth it”
– celebrate successes and learn from failures
– make informed decisions to improve, hold, or fold programs
– promote your program and win public support
5. Option #1: Hire a Third-Party Program Evaluator
• Some funders require this (e.g., the FINRA Investor Education Foundation)
• Avoids the potential conflict of interest that arises when one entity performs both the education and evaluation roles
• Takes the evaluation “monkey” off busy educators’ backs
• Build the cost into grant funding requests
6. Option #2: “Do It Yourself”: Conduct Evaluations Alone or With Colleagues
• Use evaluation methods that you understand and feel comfortable with
• Learn associated skills (e.g., how to set up an online survey)
• Partner with a researcher for statistical analyses
• Begin with the end in mind: evaluation metrics should determine program content and methods
8. Building a Strong Financial Education Program (Logic Model)
• INPUTS (What We Invest): program investments
• OUTPUTS (What We Do, Who We Reach): activities and participation
• OUTCOMES (What Results): short-term, medium-term, and long-term results
• SO WHAT?? What is the VALUE?
10. OUTPUTS: What We Do and Who We Reach
ACTIVITIES (What We Do):
• Assess needs and assets
• Design curriculum
• Educate students
• Conduct workshops
• Facilitate learning groups
• Sponsor conferences
• Work with the media
• Partner and collaborate
PARTICIPATION (Who We Reach):
• Participants, clients, customers, users, and groups
• Reactions and satisfaction
11. OUTCOMES: What Results for Individuals, Organizations, and Communities
SHORT-TERM (Learning):
• Awareness, knowledge, attitudes, skills, opinions, aspirations, motivation
MEDIUM-TERM (Action):
• Behavior, practices, decisions, policies, social action
LONG-TERM (Conditions):
• Human, economic, civic, environmental
12. Cost-Benefit Analysis
• Compares the cost of inputs to the benefit of outcomes
• The larger the dollar value of benefits relative to program costs (for example, a 20:1 multiple versus a 5:1 multiple), the better (see the sketch below)
• The multiple often grows over time because many program costs are incurred “up front”
• Resource: https://joe.org/joe/1999august/tt3.php
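A minimal sketch of the benefit-cost arithmetic described on this slide; the dollar amounts are hypothetical, chosen only to reproduce the 20:1 and 5:1 multiples mentioned above.

```python
# Benefit-cost multiple: dollar value of program benefits relative to costs.
# All dollar amounts below are hypothetical, for illustration only.

def benefit_cost_multiple(benefits: float, costs: float) -> float:
    """Return the ratio of program benefits to program costs."""
    return benefits / costs

# $500,000 in benefits on $25,000 of costs is a 20:1 multiple;
# $125,000 in benefits on the same costs is a 5:1 multiple.
print(benefit_cost_multiple(500_000, 25_000))  # 20.0
print(benefit_cost_multiple(125_000, 25_000))  # 5.0
```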
13. Return on Investment (ROI) Calculations
• Follows the methodology businesses use to make decisions
• The higher the multiple, the better
• Resource: https://joe.org/joe/2008february/tt4.php
Formula: ROI (%) = (Benefits − Cost) ÷ Cost × 100
Example: $1.1 million of economic impact from a $200,000 program
($1,100,000 − $200,000 = $900,000) ÷ $200,000 × 100 = 450%
The program generated $4.50 in net benefits for every $1 spent.
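The slide’s formula and worked example translate directly into a short Python sketch:

```python
# ROI formula from the slide: (Benefits - Cost) / Cost x 100

def roi_percent(benefits: float, cost: float) -> float:
    """Return on investment as a percentage of program cost."""
    return (benefits - cost) / cost * 100

# Worked example from the slide: $1.1 million impact on a $200,000 program.
print(roi_percent(1_100_000, 200_000))  # 450.0 -> $4.50 net per $1 spent
```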
14. Commonly Measured Items That Are Not Outcomes
• Participant satisfaction
• Number of people taught
• Units of education completed
• Number of events held
• Time and money spent
• Level of effort
16. Impact Evaluation Data Collection Methods
• Surveys (Paper or Online)
– Post-evaluation only (short programs)
– Pre- and post-evaluation
– Follow-up (e.g., 3 months later)
• Focus groups
• Interviews
• Observations
• Tests of knowledge/ability
• RARE: Control groups and longitudinal studies
17. Typical Survey Questions
• General reactions to the program
• Changes in knowledge
• Changes in motivation, confidence, and abilities
• Intended changes in behavior
• Actual changes in behavior
• Future programming needs and preferences
• Demographics of participants
• Qualitative/open-ended responses
18. Post-Then-Pre (Retrospective) Evaluation Method
• Compares knowledge and attitudes before and after a financial education intervention
• Administered once, at the end of the intervention
• Helps avoid response-shift bias: before a program, people often think they know more than they actually do
19. Case Example: Humpty Dumpty
“Humpty Dumpty sat on a wall,
Humpty Dumpty had a great fall,
All the king’s horses and all the king’s men
Couldn’t put Humpty together again”
21. Post-Then-Pre (Retrospective) Evaluation Method
• Why? Because people don’t know what they don’t know!
• Who knew what Humpty Dumpty really was?
• Five-point rating scale
– 1 = Strongly Disagree
– 5 = Strongly Agree
• Post: After listening to Barb, I know the history of the Humpty Dumpty nursery rhyme
• Pre: Before listening to Barb, I knew the history of the Humpty Dumpty nursery rhyme
22. Post-Then-Pre Evaluation
• Also known as a “retrospective” evaluation
• Helps identify changes in knowledge, attitudes, and behavior
24. Likert-Type Scales
• Scaled responses used frequently in survey research
• Assumes the distances between response options are equal
• Typically five or seven response levels
• Example: 1 = Strongly Disagree, 2 = Disagree, 3 = Neither Agree nor Disagree (Neutral), 4 = Agree, 5 = Strongly Agree
• Items can be analyzed separately or summed to create a scale (a score for a group of items)
• Can compare means for items or use t-tests for statistical significance (see the sketch below)
• Resource: https://www.clemson.edu/centers-institutes/tourism/documents/sample-scales.pdf
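A minimal sketch of the paired t-test mentioned above, applied to post-then-pre ratings like the Humpty Dumpty items; the eight ratings below are made up for illustration.

```python
# Paired t-test on hypothetical post-then-pre ratings from a retrospective
# survey (1 = Strongly Disagree ... 5 = Strongly Agree).
from scipy import stats

pre_ratings  = [2, 1, 3, 2, 2, 1, 3, 2]  # "Before the program, I knew..."
post_ratings = [4, 4, 5, 3, 4, 4, 5, 4]  # "After the program, I know..."

t_stat, p_value = stats.ttest_rel(post_ratings, pre_ratings)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # a small p suggests real gains
```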
25. Rubrics
• Scoring guide used to evaluate responses to a question
• Often used by teachers to communicate expectations for assignments
Resources:
http://rubistar.4teachers.org/
http://nextgenpersonalfinance.org/forum/rubrics/
http://rci.rutgers.edu/~boneill/assignments/Assignment-Grading-Rubric-Table.pdf
26. Action Items Method
• Requires program participants to define their own action plans using a worksheet that includes their signature
– What I will do
– Where I will do it (family, work, community)
– When and how I will do it (timeline and action steps)
– Who will help me
• Follow-up evaluation asks learners about goal attainment several months after program completion
• Participants themselves define “success”
• Resource: https://joe.org/joe/2016april/tt1.php
27. Critical Incident Technique
• A qualitative evaluation research method in which program participants tell personal stories
• Often used with “train the trainer” programs for professionals (e.g., teachers, librarians)
• Incidents are categorized and deemed successful (positive results) or unsuccessful (negative results)
• Resource: http://www.joe.org/joe/2013june/tt2.php (Journal of Extension article)
28. More About the Critical Incident Technique
• Resource: http://www.adb.org/documents/information/knowledge-solutions/the-critical-incident-technique.pdf
• Ask subjects to describe, through interviews, incidents that they handled well or poorly (these need not be spectacular events)
– Example: How did library staff handle patrons’ personal finance questions?
• Provides rich personal perspectives
• Allows pre-intervention and post-intervention comparison of the number and type of incidents
30. CIT Questions in the NYPL Financial Education Evaluation
• Think about your experiences helping patrons with personal finance questions. Remember a time when you had a successful experience helping someone with these types of questions. Please write down what happened.
• What made this a successful, positive experience?
• Think about your experiences helping patrons with personal finance questions. Remember a time when you had an unsuccessful experience helping someone with these types of questions. Please write down what happened.
• What made this an unsuccessful or challenging experience?
31. CIT Summary in the Final NYPL Project Evaluation Report
“The change in the percentages of the types of incidents is indicative of change in attitudes and abilities. Also, a new category appeared in the successful incidents, Increased Knowledge/Confidence/Satisfaction, which was not present in the Pre-training Survey. Furthermore, for the unsuccessful critical incidents, the category Lack of Training/Knowledge which was the most frequently seen theme for the Pre-training Survey was not found at all in the Post-training Survey.
These findings resonate with quantitative findings and provide confirming evidence that the Money Matters training has been successful in improving participants’ knowledge of personal finance and their ability to be successful in handling patron inquiries. Additionally, staff members provided eloquent testament in their qualitative responses that their skills have been enhanced and their attitudes and behaviors became more positive for ably handling personal finance queries” (Radford, 2013).
33. Triangulation (Multiple Methods) Evaluation Approach
• Unique Twitter hashtag: #eXasw
• Follow-up follower/friend survey
• Follow-up project participant survey
• bit.ly analytics to determine the number of clicks on unique embedded links
• Pre- and post-America Saves Week (ASW) Twitter influence metrics (Klout score)
34. Klout Score Progression
• Klout is a measure of a person’s “influence” on Twitter
• Based on an algorithm with factors such as the number of followers and number of retweets
• Go to www.klout.com and log in with Twitter or Facebook
– 2011: 11.22 to 19.68
– 2012: 20.3 to 29.3
– 2013: 32.6 to 39.6
– 2014: 29.76 to 38.84
– 2015: 23.8 to 30.7
35. Twitter Chat Evaluation Methods
• Online survey (e.g., Qualtrics) link embedded into final tweets (with prizes as an incentive to complete):
Dr. Barbara O'Neill @moneytalk1 · Apr 29
Please let us know if U found #SSHWchat helpful & take this brief survey:
https://rutgers.qualtrics.com/SE/?SID=SV_9X0V77kUKhOqlWB … Will pull winners at 1:30 pm #sshwchat
• TweetReach (http://tweetreach.com/) or Hashtracking (https://www.hashtracking.com/): type in a hashtag to pull a report
• Follow-up contact from participants (Twitter direct messages, e-mail, etc.)
• Traffic to a website during and after a chat
• Other?
37. Archive a Twitter Chat With Storify and Track Use
• Storify (https://storify.com/) is an organized collection of tweets with links, photos, etc. that “tells a story” from Twitter content
• Video: https://www.youtube.com/watch?v=l9iHniFjiVc
• Storify samples:
https://storify.com/RutgersNJAES/small-steps-to-health-and-wealth
https://storify.com/RutgersSEBS/cook-douglass-community-day-2014-at-rutgers
https://storify.com/wisebread/how-are-you-saving-for-retirement
https://storify.com/JerryBuchko/mcpd2014-twitter-chat-archive-1
39. Who Needs to Know About the Results of Your Educational Programs? Why?
40. Impact Statements: Intentions
As a result of participating in this financial education program, X% of participants reported that they…
• plan to do/use/adopt…
• are more knowledgeable about…
• are more confident in their ability to…
• are more likely than before to do/use/adopt…
• will do/use/adopt…
…a particular attitude, piece of information, or behavior.
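The X% in these statements is simply the share of respondents endorsing an intention item; a minimal sketch, with made-up responses:

```python
# Hypothetical follow-up survey: 1 = participant plans to adopt the behavior,
# 0 = participant does not. Data are made up for illustration.
responses = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]

pct = 100 * sum(responses) / len(responses)
print(f"{pct:.0f}% of participants reported that they plan to ...")  # 80%
```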
41. Social Media Impact Reports
• Number of participants
• TweetReach report outreach numbers
• Participant survey data
• Data from other feedback methods
• Sample evaluation report: http://www.slideshare.net/BarbaraONeill/sshw-twitter-chat-impact-statement-0414
43. Public Value Statements
• Focus on an outcome that matters to stakeholder(s)
• Use the stakeholders’ language
• Avoid jargon and empty words
• Keep them short and believable
49. Aggregating Data From Multiple Sources
• Common curriculum and evaluation tools (easiest)
• Different curricula on the same topic with common indicators such as:
– Saving money
– Reducing debt
– Enrolling in a 401(k) plan
• There needs to be a “cat herder” to manage the process and a convenient automated data collection system; a minimal aggregation sketch follows below
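A minimal sketch of aggregating common indicators across programs, assuming each program exports its evaluation data to a CSV with the same column names; all file and column names here are hypothetical.

```python
# Combine evaluation data from several programs that share common indicators.
# File and column names are hypothetical placeholders.
import pandas as pd

files = ["program_a.csv", "program_b.csv", "program_c.csv"]
combined = pd.concat([pd.read_csv(f) for f in files], ignore_index=True)

# Share of all participants reporting each behavior (1 = yes, 0 = no coding).
indicators = ["saved_money", "reduced_debt", "enrolled_401k"]
print(combined[indicators].mean())
```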
53. NEFE White Paper: Perspectives on Evaluation in Financial Education: Landscape, Issues, and Studies
• Children
• Youth
• Young Adults
• Working Adults
• Military Personnel
• Low-Income Consumers
• Student Loans
• Homeownership
• Retirement Planning
• Financial Advising
54. Questions and Comments?
Barbara O'Neill, Ph.D., CFP®, CRPC®
Extension Specialist in Financial Resource Management and Distinguished Professor, Rutgers University
Phone: 848-932-9126
E-mail: oneill@aesop.rutgers.edu
Internet: http://njaes.rutgers.edu/money/
Twitter: http://twitter.com/moneytalk1