Measuring Financial Education Success:
Evaluation Methods, Assessment Tools,
        and Impact Statements


      Barbara O’Neill, Ph.D., CFP®, CRPC®
        Professor II, Rutgers University
           oneill@aesop.rutgers.edu
Are You Evaluating Your Financial
      Education Programs?

        If You Are, How?
The Million Dollar “So What?” Question…

  At the end of the day…did your financial education
  program make a difference?




How do you know?
The Current State of Financial Education
  Program Evaluation
• Current evaluation efforts are still far from satisfactory

• General lack of evaluation capacity

• Evaluation is often treated as an afterthought

• Outcomes (e.g., measures of changed behavior) are often confused with
  program outputs (e.g., # of participants)

• References:

    – Lyons: http://www.ipfp.k-state.edu/documents/jpf/04/04/education.pdf

    – Federal Reserve Bulletin: http://heinonline.org/HOL/Page?handle=hein.journals/fedred88&div=158&g_sent=1&collection=journals
We are in an “Accountability Era”
• What gets measured gets funded


• With evaluation results, you can
   – assess impact of programs on learners
   – see if you accomplished what you planned
   – know if a program was “worth it”
   – celebrate success and learn from failure
   – make informed decisions to improve, hold, or fold programs
   – promote your program and win public support
Introducing the Logic Model
• See http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html
• Begins with the end in mind
• Explains what a program is and what it will accomplish
• Shows relationships between inputs, outputs, and outcomes
Building a Strong Financial Education Program

The logic model flow: INPUTS → OUTPUTS → OUTCOMES

  INPUTS: program investments (what we invest)
  OUTPUTS: activities (what we do) and participation (who we reach)
  OUTCOMES: short-, medium-, and long-term results (what results)

SO WHAT?? What is the VALUE?
INPUTS


       Staff
       Money
       Time
     Volunteers
      Partners
Equipment/Technology
      Policies
      Research
OUTPUTS

ACTIVITIES (What We Do):
  Assess needs and assets
  Design curriculum
  Educate students
  Conduct workshops
  Facilitate learning groups
  Sponsor conferences
  Work with the media
  Partner and collaborate

PARTICIPATION (Who We Reach):
  Participants
  Clients
  Customers
  Users
  Groups
  Reactions and satisfaction
OUTCOMES

What Results for Individuals, Organizations, Communities…

SHORT-TERM (Learning):
  Awareness
  Knowledge
  Attitudes
  Skills
  Opinions
  Aspirations
  Motivation

MEDIUM-TERM (Action):
  Behavior
  Practice
  Decisions
  Policies
  Social action

LONG-TERM (Conditions):
  Human
  Economic
  Civic
  Environmental
Commonly Measured Items
              That Are Not Outcomes

• Participant satisfaction

• Number of people taught

• Units of education completed
• Number of events held

• Time and money spent

• Level of effort
Impact Evaluation Data Collection Methods
 • Surveys
    – Post-evaluation only (short programs)
    – Pre- and post-evaluation
    – Follow-up (e.g., 3 months later)

 • Focus groups
 • Interviews
 • Observations
 • Tests of knowledge/ability
 • RARE: Control groups and longitudinal studies
Typical Survey Questions

• General reactions to the program
• Changes in knowledge
• Changes in motivation, confidence, and abilities
• Intended changes in behavior
• Actual changes in behavior
• Future programming needs and preferences
• Demographics
• Qualitative/open-ended responses
Post-Then-Pre Evaluation Method
• Often BETTER than the traditional pre-test/post-test method

• First ask subjects about knowledge and/or behavior after
  an educational intervention

• Then ask subjects about knowledge and/or behavior
  before the educational intervention in the PAST TENSE

• Avoids pre-test sensitivity and response shift bias

   – People often don’t know what they don’t know before an
     educational intervention
Case Example: Humpty Dumpty

“Humpty Dumpty sat on a wall,
Humpty Dumpty had a great fall,
All the king’s horses and all the king’s men
Couldn’t put Humpty together again”
So Who/What Was Humpty Dumpty?

Probably NOT what you think!
Post-Then-Pre Evaluation Survey
• Why? Because people don’t know what they don’t know!
• Who knew what Humpty Dumpty really was?

• Five-point rating scale
   – 1 = Strongly Disagree
   – 5 = Strongly Agree

• Post: After listening to Barb, I know the history of the
  Humpty Dumpty nursery rhyme

• Pre: Before listening to Barb, I knew the history of the
  Humpty Dumpty nursery rhyme
Post-Then-Pre Evaluation Surveys
• Also known as a “Retrospective Pre-Test”
• Helps identify changes in knowledge, attitudes, and
  behavior
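
For illustration, here is a minimal Python sketch (the response data are
hypothetical, not from an actual program) of tabulating post-then-pre
ratings on the five-point scale from the previous example:

    # Each tuple is one respondent: (post rating, retrospective pre rating),
    # both on the 5-point scale (1 = Strongly Disagree, 5 = Strongly Agree).
    responses = [(5, 2), (4, 2), (5, 3), (3, 1), (4, 2)]  # hypothetical data

    post_mean = sum(post for post, pre in responses) / len(responses)
    pre_mean = sum(pre for post, pre in responses) / len(responses)
    gains = [post - pre for post, pre in responses]
    mean_gain = sum(gains) / len(gains)
    pct_improved = 100 * sum(g > 0 for g in gains) / len(gains)

    print(f"Retrospective pre mean: {pre_mean:.2f}")      # 2.00
    print(f"Post mean: {post_mean:.2f}")                  # 4.20
    print(f"Mean self-reported gain: {mean_gain:.2f}")    # 2.20
    print(f"Reporting improvement: {pct_improved:.0f}%")  # 100%

The post-minus-retrospective-pre difference is the self-reported change a
retrospective pre-test is designed to capture without pre-test sensitivity
or response shift bias.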
Another Post-Then-Pre Example

Big Advantage: Avoids the problem of learners underestimating
what they don’t know before a program
NEFE Evaluation Toolkit
http://toolkit.nefe.org
Downloadable Evaluation Manual
Build Your Own Evaluations Online
Quantitative Evaluation Methods
• Cost-Benefit Analysis
  – Reported economic benefits divided by program costs
  – The larger the multiple, the better (e.g., 15:1 vs. 3:1)
  – Example: $3,046,975 ÷ $120,000 = $25.39 in benefits per $1 of cost (Money 2000)
  – See http://www.joe.org/joe/1999august/tt3.php


• Return on Investment (ROI)
  – ROI (%) = (Benefits − Costs) ÷ Costs × 100
  – Example: ($1.1 million − $200,000) ÷ $200,000 = $900,000 ÷ $200,000 = 4.5
  – 4.5 × 100 = 450% ROI
  – “Even after all program costs were subtracted, the program
    generated $4.50 in net benefits for every $1 invested.”
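
A minimal Python sketch of both calculations (the function names are
illustrative, not from any standard library):

    def cost_benefit_ratio(benefits, costs):
        """Reported economic benefits per dollar of program cost."""
        return benefits / costs

    def roi_percent(benefits, costs):
        """Net benefits as a percentage of program costs."""
        return (benefits - costs) / costs * 100

    # Cost-benefit example above (Money 2000): $3,046,975 in benefits
    # on $120,000 of program costs.
    print(f"${cost_benefit_ratio(3_046_975, 120_000):.2f} per $1")  # $25.39 per $1

    # ROI example above: $1.1 million in benefits on $200,000 of costs.
    print(f"{roi_percent(1_100_000, 200_000):.0f}% ROI")            # 450% ROI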
More Quantitative Evaluation Methods
• Time Value of Money Analyses and/or Plain Math
  –   Use conservative estimates (e.g., of potential future savings)
  –   Assume positive results for a small number of students
  –   Assume modest interest rates
  –   Best if supported with actual follow-up behavior change data
  –   Example: If just 100 of the 300 students reached saved $10 per
      week, each would save about $500 in a year ($50,000 total);
      saving $500 per year for 30 years at 4% interest grows to about
      $28,000 per person ($2.8 million total); see the sketch after
      this list

• Extrapolation From Published Cost Estimates
  – Find a research-based data source related to program
  – Example: money saved by weight loss, lower recidivism, etc.
  – See http://www.joe.org/joe/2008february/tt4p.shtml
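
The savings example above is a future-value-of-an-annuity calculation.
A minimal Python sketch, assuming end-of-year deposits and annual
compounding:

    def future_value_of_annuity(payment, rate, years):
        """Future value of saving `payment` at the end of each year."""
        return payment * ((1 + rate) ** years - 1) / rate

    per_person = future_value_of_annuity(500, 0.04, 30)
    print(f"Per person after 30 years: ${per_person:,.0f}")  # ~$28,042
    print(f"For 100 students: ${100 * per_person:,.0f}")     # ~$2.8 million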
Qualitative Evaluation: The Critical Incident Technique
• http://www.apa.org/pubs/databases/psycinfo/cit-article.pdf (Flanagan)

• Qualitative research technique

• Currently used in a NY Public Library staff training grant project

• Through interviews, ask subjects to describe incidents that they
  handled well or poorly (these need not be spectacular events)

• Provides rich personal perspectives and good quotes

• Respondents like to tell stories and feel their experiences
  are important
Creative Evaluation: Xtranormal Video Scenario Knowledge Gain Assessment
• Creating an educational video is easy...
  –   Choose from hundreds of actors (avatars or stikz)
  –   Choose their voices and other sounds
  –   Choose their gestures
  –   Select your background
  –   Type or record your dialogue
• http://www.xtranormal.com/


Examples:
http://www.xtranormal.com/watch/13646046/nypl-money-matters-identity-theft
http://www.xtranormal.com/watch/13645983/nypl-money-matters-credit-debt-10
Social Media Evaluation Methods
• Impact STILL matters!!!

• eXtension Financial Security for All Community of Practice: one of
  the FIRST social media financial education evaluation studies ever

• 2011 and 2012 America Saves Week (ASW) social media
  project with a focused evaluation methodology

• Read more in the 2011 Journal of NEAFCS (O’Neill et al.):
  http://www.neafcs.org/assets/Journal/NEAFCS-2011-Journal.pdf
Triangulation (Multiple Methods)
      Evaluation Approach
• Unique Twitter hashtag: #eXasw
   –   Save loose change. Saving $1 + change/day allows u 2 save $500/yr. Small changes=big savings!
       For more info, http://bit.ly/ASaves #eXasw

   –   Do u track your spending? America Saves says 2 review monthly purchases & put $ into savings. For
       more info, http://bit.ly/ASaves #eXasw


• Follow-up follower/friend survey

• Follow-up project participant survey

• bit.ly analytics to determine the number of clicks on
  unique embedded links

• Pre- and post-ASW Twitter influence metrics
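
As one illustration of combining these data sources, here is a minimal
Python sketch (the link identifiers, click counts, and influence scores
are hypothetical, not the project’s actual data):

    # Hypothetical bit.ly export: clicks on each unique embedded link
    clicks_per_link = {"savings-tip-1": 132, "savings-tip-2": 87}

    # Hypothetical pre-/post-ASW influence scores (e.g., Klout-style)
    influence = {"pre_asw": 41.0, "post_asw": 46.5}

    total_clicks = sum(clicks_per_link.values())
    score_change = influence["post_asw"] - influence["pre_asw"]
    print(f"Total clicks on unique links: {total_clicks}")          # 219
    print(f"Influence score change over ASW: {score_change:+.1f}")  # +5.5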
Monitoring Twitter Impact
http://klout.com
Monitoring Twitter Impact
http://www.peerindex.net
2011 Follower/Friend Survey Comments
• “Keep up the good work.”
• “These tips are timely and beneficial. I appreciate the effort to
  help us help ourselves.”
• “The tweets made me think about the ways we are managing our money.”
• “I welcome any and all suggestions for increasing my financial
  well-being.”
• One respondent complained about a lack of money to
  save
• There was one solicitation by a commercial firm
One Last Step:
Share Your Program Evaluation Results



             WHAT?
             SO WHAT?
             NOW WHAT?
Tips for “Telling Your Story”

• Use simple descriptive statistics (e.g., counts,
  percentages, and averages)
• Don’t use jargon
• Don’t overstate your results (e.g., causality)
• Blend quantitative and qualitative data
• Clearly describe who the results represent (i.e.,
  demographic characteristics of participants)
• Be honest about your program’s strengths and
  weaknesses, while highlighting the positive
Impact Statements: Intentions

    As a result of participating in this financial education
    program, X% of participants reported that they…

•   plan to do/use/adopt…
•   are more knowledgeable about…
•   are more confident in their ability to…
•   are more likely than before to do/use/adopt…
•   will do/use/adopt…

…a particular attitude, piece of information, or behavior.
Impact Statements: Actions

    As a result of participating in this financial education
    program, X% of participants reported that they…

•   are now doing…
•   did…
•   used…
•   increased their knowledge of…
•   adopted…
•   changed…

… a particular attitude, piece of information, or behavior.
Questions and Comments?
Barbara O'Neill, Ph.D., CFP®, CRPC®

Extension Specialist in Financial Resource Management and
Professor II
Rutgers University

Phone: 848-932-9126

E-mail: oneill@aesop.rutgers.edu

Internet: http://njaes.rutgers.edu/money2000/

Twitter: http://twitter.com/moneytalk1
