NEAFCS 2012 Financial Ed eval-assessment-impact-o'neill-09-12
NEAFCS 2012 workshop presentation on evaluating financial education programs

Presentation Transcript

    • Measuring Financial Education Success: Evaluation Methods, Assessment Tools, and Impact Statements
      Barbara O’Neill, Ph.D., CFP®, CRPC®
      Professor II, Rutgers University
    • Are You Evaluating Your Financial Education Programs? If You Are, How?
    • The Million Dollar “So What?” Question…
      At the end of the day… did your financial education program make a difference? How do you know?
    • The Current State of Financial Education Program Evaluation
      • Current evaluation efforts are still far from satisfactory
      • General lack of evaluation capacity
      • Evaluation is often treated as an afterthought
      • Outcomes (e.g., measures of changed behavior) are often confused with program outputs (e.g., number of participants)
      • References:
        – Lyons: http://www.ipfp.k-
        – Federal Reserve Bulletin: =158&g_sent=1&collection=journals
    • We Are in an “Accountability Era”
      • What gets measured gets funded
      • With evaluation results, you can
        – assess the impact of programs on learners
        – see if you accomplished what you planned
        – know if a program was “worth it”
        – celebrate success and learn from failure
        – make informed decisions to improve, hold, or fold programs
        – promote your program and win public support
    • Introducing the Logic Model
      • Begins with the end in mind
      • Explains what a program is and what it will accomplish
      • Shows relationships between inputs, outputs, and outcomes
    • Building a Strong Financial Education Program
      INPUTS → OUTPUTS → OUTCOMES
      – INPUTS: program investments (what we invest)
      – OUTPUTS: activities (what we do) and participation (who we reach)
      – OUTCOMES: short-, medium-, and long-term results (what results)
      SO WHAT?? What is the VALUE?
    • INPUTS
      Staff, money, time, volunteers, partners, equipment/technology, policies, research
    • OUTPUTS: What We Do, Who We Reach
      – ACTIVITIES (what we do): assess needs and assets, design curriculum, educate students, conduct workshops, facilitate learning groups, sponsor conferences, work with the media, partner and collaborate
      – PARTICIPATION (who we reach): participants, clients, customers, users, groups; reactions and satisfaction
    • OUTCOMES: What Results for Individuals, Organizations, Communities…
      – SHORT-TERM (Learning): awareness, knowledge, attitudes, skills, opinions, aspirations, motivation
      – MEDIUM-TERM (Action): behavior, practices, decisions, policies, social action
      – LONG-TERM (Conditions): human, economic, civic, environmental
    • Commonly Measured Items That Are NOT Outcomes
      • Participant satisfaction
      • Number of people taught
      • Units of education completed
      • Number of events held
      • Time and money spent
      • Level of effort
    • Impact Evaluation Data Collection Methods
      • Surveys
        – Post-evaluation only (short programs)
        – Pre- and post-evaluation
        – Follow-up (e.g., 3 months later)
      • Focus groups
      • Interviews
      • Observations
      • Tests of knowledge/ability
      • RARE: control groups and longitudinal studies
    • Typical Survey Questions
      • General reactions to the program
      • Changes in knowledge
      • Changes in motivation, confidence, and abilities
      • Intended changes in behavior
      • Actual changes in behavior
      • Future programming needs and preferences
      • Demographics
      • Qualitative/open-ended responses
    • Post-Then-Pre Evaluation Method
      • Often BETTER than the traditional pre-test/post-test method
      • First ask subjects about knowledge and/or behavior after an educational intervention
      • Then ask subjects, in the PAST TENSE, about their knowledge and/or behavior before the educational intervention
      • Avoids pre-test sensitivity and response-shift bias
        – People often don’t know what they don’t know before an educational intervention
    • Case Example: Humpty Dumpty
      “Humpty Dumpty sat on a wall,
      Humpty Dumpty had a great fall,
      All the king’s horses and all the king’s men
      Couldn’t put Humpty together again”
    • So Who/What Was Humpty Dumpty? Probably NOT what you think!
    • Post-Then-Pre Evaluation Survey
      • Why? Because people don’t know what they don’t know!
      • Who knew what Humpty Dumpty really was?
      • Five-point rating scale
        – 1 = Strongly Disagree
        – 5 = Strongly Agree
      • Post: “After listening to Barb, I know the history of the Humpty Dumpty nursery rhyme”
      • Pre: “Before listening to Barb, I knew the history of the Humpty Dumpty nursery rhyme”
    • Post-Then-Pre Evaluation Surveys
      • Also known as a “retrospective pre-test”
      • Helps identify changes in knowledge, attitudes, and behavior
    • Another Post-Then-Pre Example
      • Big advantage: avoids the problem of learners under-estimating what they don’t know before a program
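    A post-then-pre survey yields paired ratings from a single administration, so scoring is simple arithmetic on matched pairs. A minimal sketch (the responses are hypothetical illustration data, not results from the presentation):

    ```python
    # Score a post-then-pre (retrospective pre-test) survey.
    # Each tuple is one respondent's (post, pre) rating on a 5-point scale,
    # where 1 = Strongly Disagree and 5 = Strongly Agree.
    # These responses are hypothetical illustration data.
    responses = [(5, 2), (4, 1), (5, 3), (4, 2), (3, 1)]

    post_mean = sum(post for post, pre in responses) / len(responses)
    pre_mean = sum(pre for post, pre in responses) / len(responses)

    # Mean paired change: positive values indicate self-reported gains.
    mean_change = sum(post - pre for post, pre in responses) / len(responses)

    print(f"Retrospective pre mean: {pre_mean:.2f}")
    print(f"Post mean:              {post_mean:.2f}")
    print(f"Mean change:            {mean_change:+.2f}")
    ```

    Because both ratings come from the same sitting, the pairs stay matched automatically, which is part of what makes the retrospective design practical for short workshops.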
    • NEFE Evaluation Toolkit
    • Downloadable Evaluation Manual
    • Build Your Own Evaluations Online
    • Quantitative Evaluation Methods
      • Cost-Benefit Analysis
        – Reported economic benefits divided by program costs
        – The larger the multiple, the better (e.g., 15:1 vs. 3:1)
        – Example: $3,046,975 ÷ $120,000 = $25.39 in benefits per $1 spent (Money 2000)
      • Return on Investment (ROI)
        – ROI = (Benefit − Cost) ÷ Cost × 100
        – Example: ($1.1 million − $200,000) ÷ $200,000 = $900,000 ÷ $200,000 = 4.5, i.e., a 450% ROI
        – “Even after all program costs were subtracted, the program generated $4.50 in net benefits for every $1 invested.”
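    The two calculations above can be checked with a few lines of arithmetic, using the figures from the slide:

    ```python
    # Cost-benefit ratio: reported economic benefits divided by program costs
    # (the Money 2000 example from the slide).
    benefits = 3_046_975
    costs = 120_000
    cb_ratio = benefits / costs
    print(f"Cost-benefit ratio: {cb_ratio:.2f} : 1")  # about 25.39 : 1

    # ROI: net benefit divided by cost, expressed as a percentage
    # ($1.1 million in benefits against $200,000 in costs).
    roi_benefits = 1_100_000
    roi_costs = 200_000
    roi_pct = (roi_benefits - roi_costs) / roi_costs * 100
    print(f"ROI: {roi_pct:.0f}%")  # 450%, i.e., $4.50 net benefit per $1 invested
    ```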
    • More Quantitative Evaluation Methods
      • Time Value of Money Analyses and/or Plain Math
        – Use conservative estimates (e.g., of potential future savings)
        – Assume positive results for only a small number of students
        – Assume modest interest rates
        – Best if supported with actual follow-up behavior-change data
        – Example: If just 100 of the 300 students who were reached saved $10 per week, each would save about $500 in a year ($50,000 total); saving $500 per year for 30 years at 4% interest would grow to about $28,000 per student ($2.8 million total)
      • Extrapolation From Published Cost Estimates
        – Find a research-based data source related to the program
        – Example: money saved by weight loss, lower recidivism, etc.
    • Qualitative Evaluation: The Critical Incident Technique
      • article.pdf (Flanagan)
      • A qualitative research technique
      • Used in a NY Public Library staff training grant project
      • Ask subjects to describe, through interviews, incidents that they handled well or poorly (these need not be spectacular events)
      • Provides rich personal perspectives and good quotes
      • Respondents like to tell stories and feel their experiences are important
    • Creative Evaluation: Xtranormal Video Scenario Knowledge-Gain Assessment
      • Creating an educational video is easy...
        – Choose from hundreds of actors (avatars or stikz)
        – Choose their voices and other sounds
        – Choose their gestures
        – Select your background
        – Type or record your dialogue
      • Examples:
        – 6/nypl-money-matters-identity-theft
        – 3/nypl-money-matters-credit-debt-10
    • Social Media Evaluation Methods
      • Impact STILL matters!!!
      • eXtension Financial Security for All Community of Practice: one of the FIRST social media financial education evaluation studies ever
      • 2011 and 2012 America Saves Week (ASW) social media project with a focused evaluation methodology
      • Read more in the 2011 Journal of NEAFCS (O’Neill et al.): Journal.pdf
    • Triangulation (Multiple Methods) Evaluation Approach
      • Unique Twitter hashtag: #eXasw
        – “Save loose change. Saving $1 + change/day allows u 2 save $500/yr. Small changes = big savings! For more info, #eXasw”
        – “Do u track your spending? America Saves says 2 review monthly purchases & put $ into savings. For more info, #eXasw”
      • Follow-up follower/friend survey
      • Follow-up project participant survey
      • Analytics to determine the number of clicks on unique embedded links
      • Pre- and post-ASW Twitter influence metrics
    • Monitoring Twitter Impact
    • 2011 Follower/Friend Survey Comments
      • “Keep up the good work”
      • “These tips are timely and beneficial. I appreciate the effort to help us help ourselves”
      • “The tweets made me think about the ways we are managing our money”
      • “I welcome any and all suggestions for increasing my financial well-being”
      • One respondent complained about a lack of money to save
      • There was one solicitation by a commercial firm
    • One Last Step: Share Your Program Evaluation Results
      WHAT? SO WHAT? NOW WHAT?
    • Tips for “Telling Your Story”
      • Use simple descriptive statistics (e.g., counts, percentages, and averages)
      • Don’t use jargon
      • Don’t overstate your results (e.g., by implying causality)
      • Blend quantitative and qualitative data
      • Clearly describe who the results represent (i.e., the demographic characteristics of participants)
      • Be honest about your program’s strengths and weaknesses, while highlighting the positive
    • Impact Statements: Intentions
      As a result of participating in this financial education program, X% of participants reported that they…
      • plan to do/use/adopt…
      • are more knowledgeable about…
      • are more confident in their ability to…
      • are more likely than before to do/use/adopt…
      • will do/use/adopt…
      …a particular attitude, piece of information, or behavior.
    • Impact Statements: Actions
      As a result of participating in this financial education program, X% of participants reported that they…
      • are now doing…
      • did…
      • used…
      • increased their knowledge of…
      • adopted…
      • changed…
      …a particular attitude, piece of information, or behavior.
    • Questions and Comments?
      Barbara O’Neill, Ph.D., CFP®, CRPC®
      Extension Specialist in Financial Resource Management and Professor II
      Rutgers University
      Phone: 848-932-9126
      E-mail: oneill@aesop.rutgers.edu
      Internet: