This document discusses evaluation of deliberative and democratic engagement initiatives. It addresses why evaluation is important, some of the challenges involved, and different approaches to evaluation including process evaluation and impact/outcome evaluation. Small group activities are used to get participant input on what they want to know about projects and how they would go about evaluating them. Data collection methods like archival records, observations, interviews and surveys are presented. Additional online resources on evaluation are also listed.
Strategies for Gathering Data to Support Evaluation of Dialogue and Deliberation Initiatives (Nabatchi and Scully, 2014)
1. Patrick Scully, Clearview Consulting
pscully@clearviewconsultingllc.com
Tina Nabatchi, Maxwell School of Syracuse University
tnabatch@syr.edu
Supporting the Evaluation of D&D Initiatives
2. Group Discussion
Why are you here? What do you really want to know?
Why do you evaluate/assess?
What have been your challenges with evaluating/assessing your work?
3. The Importance of Evaluation
Identifying what’s working and where things can be improved
Improving program management
Ensuring and improving accountability
Using finances and resources efficiently
Upholding ethics and values
Building a sense of ownership
Contributing to research, theory, and practice
4. The Challenges of Evaluation
What to evaluate? Management, process, and/or impacts?
What constitutes “success” and “effectiveness”?
No agreed-upon methods or tools
HUGE process variations
Different audiences have different needs and interests
Evaluation can be scary.
5. Small Group Discussion:
What do YOU want to know?
Think of a program or project you’ve heard about at NCDD.
With that project in mind, identify 1 or 2 things you want to know more about. Why is this information important to you?
Select the top two questions that your group finds most interesting. (Don’t worry about word-smithing precise language.)
7. Process Evaluation
Process Evaluation: assesses the management and administration of a program
Focuses on inputs and outputs
Answers the “What” questions:
What was the program intended to be, and what is it in reality?
What worked and what did not in program design, delivery, and management?
What areas of the program can be developed and improved?
What is the potential for replication of the program?
8. Impact/Outcome Evaluation
Impact Evaluation: assesses the impacts and outcomes of a program
Focuses on outcomes and results
Answers the “So what” questions:
What are the outcomes or results of the program? (for individuals, groups, organizations, communities, policies, etc.)
Do outcomes vary across participants and/or over time?
Is the program effective and efficient in comparison to alternatives?
Should the program be continued, expanded, modified, or eliminated?
9. Collecting Data
Archival Data (program reports and records; meeting minutes; news coverage)
Observation of event
Event design data
Interviews
Surveys
10. Small Group Discussion
Think back to the top 2 questions your small group identified.
How would you go about answering those questions?
What are 2-3 kinds of information you could collect to answer these questions? How would you collect that information?
11. Additional Resources
Evaluating Deliberative Processes
www.ncchpp.ca/docs/DeliberativeFS3_evaluation_EN.pdf
Evaluation Toolkit
http://toolkit.pellinstitute.org/evaluation-guide/collect-data/
Quantitative and Qualitative Data Collection
http://mypeer.org.au/monitoring-evaluation/data-collection-methods/
Data Collection Tools
http://serve.mt.gov/wp-content/uploads/2010/02/Data-Collection-Methods-cbi.pdf
Participedia
www.participedia.net
Evaluation Tools for Racial Equity
www.racialequitytools.org