Role of Evaluation in Decision Making
and Program Improvement:
Case Study of a Volunteer Stream Monitoring
Program
SWCS 2017, Madison
Amulya Rao, University of Wisconsin-Extension
Vikram Koundinya, University of Wisconsin-Extension
Peggy Compton, University of Wisconsin-Extension
Evaluation
Evaluation: “The systematic determination of the merit, worth (or value) of an object.” 1
Program evaluation: “Systematic collection of information about the activities,
characteristics, and results of programs to (1) make judgments about the program, (2)
improve or further develop program effectiveness, (3) inform decisions, and/or (4) increase
understanding.” 2
Evaluation typologies 3:
• Formative evaluation: Information to improve programs; Is the program running
smoothly? What are the problems? How can the program be more effective/efficient?
• Summative evaluation: Accountability-oriented assessments; Has the program worked or
not? Have the stated goals been met?
Water Action Volunteers (WAV)
• Wisconsin-wide citizen-science stream monitoring program
• Partnership between UW–Extension, the Wisconsin Department of Natural Resources (WDNR),
and Wisconsin citizen volunteers
• Piloted in 1996
• In 2016: nearly 600 adult volunteers, 650 unique stream sites, and 3,900+ site visits across
59 of 72 Wisconsin counties; more than 2,000 students also participated
• Goals are to preserve and protect Wisconsin’s 86,000 miles of rivers by:
o Educating and empowering citizens
o Obtaining high-quality data useful for WDNR decision-making
o Sharing data and knowledge
• Admin structure: Statewide Coordinators – Local Coordinators – Volunteers
WAV Evaluation
Drivers for evaluation:
• In 2015, both WAV coordinators left
and new coordinators were hired
• A new advisory team was formed to
review WAV’s administrative model
• Although WAV has grown, resources
have not expanded
Approaches to Evaluation
1. Management-oriented approach 4:
• Primary focus is to serve the decision-makers
• The needs of the decision-makers guide the direction of evaluation
• Stresses utility of evaluation: Who will use the results? How will the results be used?
• Evaluators work closely with the administrators and collect information that will allow decisions
to be made
2. Participant-oriented approach 5:
• Stakeholder needs are the primary focus; evaluation is conducted in response to those
needs
• Useful when there are questions about program implementation difficulties
Evaluation Questions
1. How can WAV be sustained in the
future?
• What is going well with the current
administrative structure of WAV?
• What improvements can be made to the
current administrative structure of WAV?
• How can the different types of data be
used most effectively?
• Are volunteers collecting data that WDNR
needs?
Evaluation Questions
2. How can WAV provide better service to
volunteers?
• Is the current training structure adequate to meet
the educational and data quality goals of the WAV
program?
• How does the current administrative and training
structure influence volunteer motivation and
retention?
• If WAV is not equally accessible throughout the
state, what structural and administrative changes
are needed to make it more accessible?
Methodology
• Purposive sampling was used
• 11 interviewees (volunteers, coordinators,
and WDNR-affiliated staff) were chosen
because of their familiarity with the
program and its audience
• Phone interviews were conducted by
evaluators with no role in WAV
• Data analysis: partial transcription
followed by coding and theming (see the
sketch after this list)
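To make the coding-and-theming step concrete, below is a minimal Python sketch of how coded interview excerpts can be rolled up into themes and tallied. It is an illustration under stated assumptions only: the codebook, code labels, and sample excerpts are hypothetical and are not the actual WAV codebook or interview data.

    from collections import defaultdict

    # Hypothetical codebook: each analyst-assigned code maps to a broader theme.
    # These labels are illustrative, not the actual WAV codebook.
    CODEBOOK = {
        "feels_supported": "administrative structure",
        "responsive_admin": "administrative structure",
        "wants_data_feedback": "data use and communication",
        "unsure_how_data_used": "data use and communication",
        "needs_data_entry_training": "training",
    }

    # Coded excerpts as (interviewee_id, code) pairs. In practice these come
    # from partially transcribing each phone interview and coding it line by line.
    coded_excerpts = [
        ("V01", "feels_supported"),
        ("V01", "wants_data_feedback"),
        ("C02", "responsive_admin"),
        ("C02", "needs_data_entry_training"),
        ("D03", "unsure_how_data_used"),
    ]

    # Theming: roll codes up into themes and count the distinct interviewees
    # who mention each theme, a common way to summarize qualitative data.
    theme_mentions = defaultdict(set)
    for interviewee, code in coded_excerpts:
        theme_mentions[CODEBOOK[code]].add(interviewee)

    total = len({interviewee for interviewee, _ in coded_excerpts})
    for theme, mentioners in sorted(theme_mentions.items()):
        print(f"{theme}: {len(mentioners)} of {total} interviewees")

Run on the sample data, this prints lines such as "administrative structure: 2 of 3 interviewees", mirroring how the results slides report how widely each finding was shared across interviewees.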
Results: How can WAV be sustained in the future?
What is going well with the current administrative structure of WAV?
• Interviewees agreed that their needs were being met, were satisfied with the responsiveness of
administrators, and felt supported by them
What improvements can be made to the current administrative structure of WAV?
• Increase in-person interaction with administrators; improve communication and sharing among
coordinators
How can the collected data be used most effectively?
• Increase communication with DNR specialists; Regularly communicate uses of data; Provide
feedback on collected data
“This sort of communication
would validate WAV’s importance
and purpose to us…we want to
receive feedback on the quality of
the data so that issues can be
resolved.”
Results: How can WAV provide better service to volunteers?
Is the current training structure adequate to meet the educational and data quality goals
of the WAV program?
• Most interviewees agreed trainings were meeting demand; training for volunteers and
coordinators on entering data into the database and on habitat assessment was identified
as a pressing need
How does the current administrative and training structure influence volunteer
motivation and retention?
• Interviewees noted that administrative and training structures might not influence
volunteer motivation; they recommended sharing information on how the data are used
Impact of Evaluation Results on Decisions
Based on the evaluation results, WAV administrators made the following decisions:
• Additional local coordinators will be recruited, potential for regional coordinators will
be explored, and avenues for financial compensation for each will be considered.
• New communication channels between WDNR and WAV volunteers will be explored.
• Additional training, especially for data entry and habitat assessment, will be offered to
volunteers and coordinators.
• The evaluation also helped validate what administrators already knew
Takeaways
• This case study demonstrates how evaluation can effectively inform decisions and
improve programs
• Think critically about when to pause and collect data from stakeholders, especially
in a time of transition
• The case highlights the value of a participatory approach in planning and designing
the evaluation
References
1. Scriven, M. (1991). Evaluation thesaurus (4th ed.). Beverly Hills, CA: Sage.
2. Patton, M. Q. (1986). Utilization-focused evaluation. Beverly Hills, CA: Sage.
3. Scriven, M. (1967). The methodology of evaluation. In R. E. Stake (Ed.), AERA
Monograph Series on Curriculum Evaluation (No. 1). Chicago: Rand McNally.
4. Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2004). Program evaluation:
Alternative approaches and practical guidelines (3rd ed.). Boston, MA: Pearson.
5. Stake, R. E. (1975). Evaluating the arts in education: A responsive approach.
Columbus, OH: Merrill.
Contact Information
Email: amulya.vishweshwer@wisc.edu
Evaluation Unit, Environmental Resources Center, University of Wisconsin-Extension:
erc.cals.wisc.edu
