Innovation Network's Veena Pankaj and ORS Impact's Mel Howlett share dataviz products that can be used throughout the evaluation lifecycle, including theory of change, social network analysis, data placemat, strategic debrief deck, H-form, visual report deck, visual executive summary, and timeline.
Innovation Network's own workbook on evaluation planning. Can be used alone or in conjunction with the Evaluation Plan Builder at the Point K Learning Center.
Coalition Assessment: Approaches for Measuring Capacity and Impact
Innovation Network
by Veena Pankaj, Kat Athanasiades, and Ann Emery
February 2014
Download the paper here: www.innonet.org/research
Why assess coalition capacity? How should a coalition be assessed? How can coalition assessment data be analyzed and used?
Coalitions are an important tool in the advocacy and policy change toolbox. They can be used to promote an issue, increase visibility, and eventually propel an issue to the forefront of a political or social agenda. They can provide a lot of horsepower—harnessing the combined power and expertise of many entities all at once. And they are a valuable technique for crafting more durable solutions generated by a broad constituency. For all of these reasons, developing and strengthening coalitions is a common strategy among advocates and advocacy funders.
For evaluators, coalition assessment is a growing field of experimentation and learning. Innovation Network has been evaluating coalitions since 2006, beginning with the Coalition for Comprehensive Immigration Reform, a national effort to secure passage of comprehensive immigration reform by the U.S. Congress. Over the years, we have evaluated many different types of coalitions throughout the United States. Our coalition partners have worked at national, state, regional, and local levels on a variety of advocacy and policy change goals, such as healthy community design or childhood nutrition. This white paper provides practitioners and funders with insights into the coalition assessment process along with concrete examples and lessons we’ve learned from our own work.
Innovation Network's own workbook (revised in 2010), offering an introduction to the processes and concepts of the logic model. This workbook can be used alone or in conjunction with the Logic Model Builder at the Point K Learning Center.
Jennifer Kuschner, Program Development and Evaluation Specialist, UW-Extension
Kerry Zaleski, Monitoring and Evaluation Project Coordinator, UW-Extension
This interactive session provided participants with an overview of what a logic model is and how to use one for planning, implementing, evaluating, or communicating about co-curricular community service activities. The session also provided an opportunity to work in teams to create participants' own logic models.
CSU Extension, Engagement and the Logic Model
Steven Newman
Presentation delivered to graduate class Principles of Extension.
Much of the material in this lecture on extension, logic models, and the scholarship of engagement was taken from the University of Wisconsin-Extension Program Development and Evaluation program.
http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html
Federal, state, provincial and foundation grant applications in both the United States and Canada are increasingly requiring the use of logic models in their grant applications. Depending on the level of complexity required, these can prove a major stumbling block, especially with looming deadlines. The purpose of this seminar is to unlock the mystery surrounding their development and use. At the conclusion, we will not promise that you will like them any better, just understand them and fear them less.
2. Grantseeking: Creating a Program Logic Model
Rebecca White
Grants for beginners. Module 2 of the grant-seeking series. Covers how to develop a program logic model for grant development. Basic program logic models include highlighting the situation and priorities; developing an overall program goal; determining program outcomes, outputs, and inputs; identifying any assumptions and external factors that are in play; and developing a program evaluation plan.
Evaluating Communication Programmes, Products and Campaigns: Training Workshop
Glenn O'Neil
A one-day workshop on evaluating communication programmes, products, and campaigns. The main steps and methods are covered, with real-life examples given. This workshop was originally conducted by Glenn O'Neil of Owl RE for Gellis Communications in Brussels in October.
Two Examples of Program Planning, Monitoring and Evaluation
MEASURE Evaluation
Presented by Laili Irani, Senior Policy Analyst for the Population Reference Bureau, as part of the Measuring Success Toolkit webinar in September 2012.
Make Your Data Count: Tips & Tools for Visual Reporting
Innovation Network
In their session at YNPNdc's 2015 Annual Conference in Washington, DC, Johanna Morariu and Katherine Haugh presented on helpful tips and tools for visual reporting. This handout includes their suggestions, great examples of visual reports, and tools anyone can access to create powerful visual reports.
Putting Data in Context: Timelining for Evaluators (HANDOUT)
Innovation Network
Creating a timeline is a method for visualizing events as they take place over time. The full PowerPoint slides of this presentation are also available on SlideShare; search for the title "Putting Data in Context: Timelining for Evaluators".
[Link: http://www.slideshare.net/InnoNet_Eval/putting-data-in-context-timelining-for-evaluators ]
Do-It-Yourself Logic Models: Examples, Templates, and Checklists
Innovation Network
Logic models are nonprofit road maps: they help you diagram where you are now and where you hope to be in the future. They are used for program planning, program management, fundraising, communications, consensus-building, and evaluation planning.
Want to make a logic model, but not sure where to start? In this 90-minute webinar, Johanna Morariu and Ann Emery taught the nuts and bolts of logic models: what they are, how to make them, who should be involved in the process, and how often to update them. They provided tools such as a logic model template, a free online logic model builder, and a logic model checklist, and shared several examples from real nonprofits so that attendees were ready to hit the ground running.
To learn more, please visit www.innonet.org.
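As a thought sketch (not a tool from the webinar), the columns of a basic logic model can be represented as plain data, which makes the template-and-checklist idea concrete. The example program below is invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Minimal logic model: the standard columns as lists of statements."""
    inputs: list = field(default_factory=list)
    activities: list = field(default_factory=list)
    outputs: list = field(default_factory=list)
    outcomes: list = field(default_factory=list)

    def checklist(self):
        """A tiny completeness check: every column should have an entry."""
        return {name: bool(getattr(self, name))
                for name in ("inputs", "activities", "outputs", "outcomes")}

# Invented example program (hypothetical, for illustration only)
model = LogicModel(
    inputs=["2 staff", "grant funding"],
    activities=["run monthly workshops"],
    outputs=["120 participants trained"],
)
print(model.checklist())  # 'outcomes' is still empty -> flagged False
```

A real checklist would ask much more (are outcomes measurable? are assumptions stated?), but even this skeleton shows why updating the model regularly matters: gaps are visible at a glance.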
#JAGUnity2014: DataViz for Philanthropists! Tips, Tools, and How-Tos for Comm...
Innovation Network
Presentation by Innovation Network's Johanna Morariu and Ann K. Emery at the Joint Affinity Groups Unity Conference, held June 6, 2014 in Washington, DC.
Evaluation is a systematic process to understand what a program does and how well the program does it. Evaluation results can be used to maintain or improve program quality and to ensure that future planning can be more evidence-based.
This topic covers SWOT analysis, milestones, Gantt charts, PERT, CPM, Bennett's hierarchy of evaluation, and the logical framework approach.
How to write a development project evaluation report: format and principal guidelines for mid-term and completed projects. This format can be used for any kind of development project.
Similar to Dataviz Products throughout the Evaluation Lifecycle
Refreshing Evaluation in Support of the Social Movements Revival
Innovation Network
There is a growing social consciousness in America and a revival of social movements as a vehicle for social change, with increasing nonprofit involvement and philanthropic funding support. Since the mid-2000s, several notable movements have taken hold of the public consciousness: the immigration reform movement and DREAMers, the Occupy movement, gay marriage, the climate change movement, Black Lives Matter, and a nascent, potential movement developing in protest of the Trump Administration. While evaluating movements has some parallels to established evaluation practice, it also presents some thorny challenges. In a session presented at the American Evaluation Association Conference on November 10, 2017, we explored and shared what we are learning about evaluating social movements, including: what we know about social movements, their components, characteristics, and types; what aspects of social movements are ripe for evaluation; and what existing evaluation approaches are well suited to evaluating social movements.
During the 2015 American Evaluation Association Annual Conference in Chicago, Katherine Haugh and Deborah Grodzicki conducted a real-time data mini-study to see which evaluation approaches evaluators at #eval15 use most frequently in their work. Basing their mini-study on Marvin C. Alkin's "Evaluation Roots: A Wider Perspective of Theorists' Views and Influences," they asked evaluators to vote for the top two approaches they used most often. This handout accompanied the real-time data mini-study to provide more information about the formation of the evaluation theory tree, its three branches, and definitions of the evaluation approaches associated with each branch.
Creating a timeline is a method for visualizing events as they take place over time. By documenting major occurrences in chronological order, evaluators are able to identify patterns, themes, or trends that they may not have seen otherwise. A timeline allows evaluators to "zoom out" and look at the broader landscape, so that they are better positioned to think through and understand the context in which events occur. A timeline is especially useful for complex, multi-year evaluation projects with several threads of evaluation, where documenting the process is just as important as measuring the outcome itself. Creating a timeline has three key components: planning, populating, and revising. This presentation shows how to incorporate a timeline into a report, how to use a timeline to track progress internally, and how to utilize data visualization principles to create a visual timeline.
Highlights of this presentation are also available in our handout titled "Putting Data in Context: Timelining for Evaluators (HANDOUT)".
[Link: http://www.slideshare.net/InnoNet_Eval/putting-data-in-context-timelining-for-evaluators-handout ]
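The "populating" step above, documenting events in chronological order across several threads, can be prototyped in a few lines. A minimal sketch (the event dates, threads, and descriptions are invented for illustration):

```python
from datetime import date

# Invented example events: (date, thread/category, description)
events = [
    (date(2014, 6, 1), "Policy", "Bill introduced in committee"),
    (date(2014, 3, 15), "Evaluation", "Baseline survey fielded"),
    (date(2014, 9, 30), "Coalition", "New partner organizations join"),
]

def render_timeline(events):
    """Return a chronological, plain-text timeline, one line per event."""
    lines = []
    for when, thread, what in sorted(events):  # tuples sort by date first
        lines.append(f"{when.isoformat()}  [{thread}] {what}")
    return "\n".join(lines)

print(render_timeline(events))
```

Keeping the thread label on each line preserves the "multiple threads of evaluation" view while the sort exposes the chronological patterns the presentation describes.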
Data Placemats: Construction and Practical Design Tips
Innovation Network
Increasing stakeholder involvement throughout the evaluation lifecycle not only enhances stakeholder buy-in to the final evaluation results, but also ensures that the evaluator takes multiple viewpoints into consideration and can provide a more comprehensive picture of a program or initiative. Data placemats, a dataviz technique to improve stakeholder understanding of data, can be used to communicate preliminary evaluation results during the analysis phase of the evaluation lifecycle. When done correctly, they offer stakeholders an opportunity to form their own judgments about the data and weigh in prior to the final report. In this session, the presenter will review the concept of data placemats, focusing specifically on the nuts and bolts of constructing a data placemat.
Real Time Evaluation: Tips, Tools, and Tricks of the Trade
Innovation Network
How can an evaluator meaningfully convey findings to stakeholders based on data collected that same day? How can real time evaluation really be done in real time? This Ignite talk is based on Innovation Network's experiences with facilitating real time evaluation in health policy settings, and will introduce AEA participants to three tools that can be used as part of any evaluator's real time evaluation toolbox: surveys, H-forms, and timelines. Yuqi Wang from Innovation Network will provide an overview of each tool; show how these tools can aid data collection, analysis, and communication of findings in real time; and share lessons learned from Innovation Network's experiences with these three tools during the evaluation process.
Collecting and analyzing data in real time doesn't have to be as stressful or hard as it sounds, especially if you want to collect real time data using surveys. There is a short way and a long way to collect real time survey data. The short way is to use software that can collect and analyze survey data when embedded into PowerPoint presentations or webinars. The long way is to use hard copies of surveys to collect data, and Excel to analyze them. This document shows, step by step, how to collect and analyze survey data the long way.
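The "long way" above, keying hard-copy responses into a spreadsheet and tallying them, can also be scripted. A hedged sketch (the question names and the 1-5 scale are illustrative assumptions, not from the original document):

```python
from collections import Counter

# Hypothetical hard-copy responses keyed in by hand: one dict per survey,
# each question rated on an assumed 1-5 scale.
responses = [
    {"usefulness": 5, "pacing": 4},
    {"usefulness": 4, "pacing": 3},
    {"usefulness": 5, "pacing": 5},
]

def tally(responses, question):
    """Frequency count and mean for one question, like a spreadsheet pivot."""
    ratings = [r[question] for r in responses if question in r]
    counts = Counter(ratings)
    mean = sum(ratings) / len(ratings)
    return counts, round(mean, 2)

counts, mean = tally(responses, "usefulness")
print(dict(counts), mean)  # -> {5: 2, 4: 1} 4.67
```

The same frequency-and-mean summary is what an Excel COUNTIF/AVERAGE pair would produce; scripting it just makes re-tallying instant as new paper surveys come in.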
Make Your Data Count: New, Visual Approaches to Evaluation Reporting
Innovation Network
Charts and graphs built on dataviz principles are transforming evaluation reporting and increasing the evaluator's communication power. At YNPNdc's 2015 Annual Conference in Washington DC, Johanna Morariu and Katherine Haugh will provide principles and case studies of new, visual approaches to evaluation reporting. The session will provide examples, such as incorporating a wealth of visuals in text reports, using large format hand-outs to emphasize findings, and designing slide reports.
Success and Failure in the Evaluation Process
What do the terms “success” and “failure” really mean in the philanthropic world? Funders have taken different approaches to learning from initiatives that haven’t gone quite as they had hoped. Some funders want to learn from their mistakes, some provide technical assistance to lagging grantees, and some want to focus their light on “bright spots” and grantee successes. In this session, Kat Athanasiades from Innovation Network will discuss how and when her organization uses grant reports in evaluation; how and why getting good evaluation data from grant reports is difficult; and potential ways to make it easier for grantees to report on failure in a way that could be useful to evaluators.
Session participants will:
•Know how funders can embed “failure reporting” into grant reports in ways that are useful to evaluators.
•Learn ways a foundation can combat some of the "structural" impediments, e.g., trust and communication, that may prevent proper reporting on failure.
•Gain ideas from fellow participants on how to understand and appreciate grantmaking "failures" as well as successes.
Since its inception in 2000, the Missouri Foundation for Health (MFH) has invested in policy change in Missouri. Recognizing a dearth of organizations with the capacity to advocate for Missouri health consumers, MFH broadened its grantmaking vision to building a field of consumer health advocates. Using the Framework for Evaluating Advocacy Field Building, Innovation Network and the Center for Evaluation Innovation are currently gauging how MFH shapes this field through its grantmaking. This presentation will focus on evaluating two dimensions of the Framework of this field: Adaptive Capacity and Skills & Resources. A discussion of data collection activities will give the audience ideas about how to evaluate these dimensions, lessons learned from the process, and what has been revealed through the evaluation about Skills & Resources and Adaptive Capacity in a field.
Handout for the session, "Data Placemats: A DataViz Technique to Improve Stakeholder Understanding of Evaluation Results" presented at the American Evaluation Association's 2014 conference.
#JAGUnity2014: Innovations in Evaluating Social Movements
Innovation Network
Today, social movement organizers are grappling with big questions: What is the long-term impact we are hoping to make? How can we measure the progress we've made thus far? How can we learn from past practice?
On June 7, 2014, Innovation Network's William Fenn spoke on a panel with Deepak Pateriya and Sian O'Faolain of the Center for Community Change and Hillary Klein of Make the Road New York to try to answer some of these questions. The presentation highlighted specific ways in which social movement organizers can evaluate the impact of their work.
Evaluation Essentials for Nonprofits: Terms, Tips, and Trends
Innovation Network
These slides are an excerpt from an evaluation session for the Young Nonprofit Professionals Network (YNPN), which was held in June 2014 in Washington, DC.
#YNPNdc14: DataViz! Tips, Tools, and How-tos for Visualizing Your Data (Slides)
Innovation Network
Innovation Network's Johanna Morariu and Ann K. Emery presented at the Young Nonprofit Professionals Network 2014 Annual Leadership Conference, which was held on May 9, 2014 in Washington, DC.
#YNPNdc14: DataViz! Tips, Tools, and How-tos for Visualizing Your Data (Handout)
Innovation Network
Innovation Network's Johanna Morariu and Ann K. Emery presented at the Young Nonprofit Professionals Network 2014 Annual Leadership Conference, which was held on May 9, 2014 in Washington, DC.
Assessing the Capacity of Community Coalitions to Advocate for Change (Presen...
Innovation Network
Research has shown that high-capacity coalitions are more successful in effecting community change. But what does “high capacity” mean? Evaluators have developed tools to provide an answer, but documentation is scarce regarding how they are implemented, how the results are used, and whether they predict coalition success in collaborative community change efforts. This breakfast talk will focus on a coalition assessment tool designed by Innovation Network to assess changes in coalition capacity over time.
Developed for a health promotion initiative of the Kansas Health Foundation, this tool is designed to assess coalition progress in seven key areas across twelve different community coalitions, over the course of a four-year initiative. The Innovation Network team will share lessons learned from the first year of the initiative about developing and deploying the assessment tool, as well as what these tools can—and can’t—tell us about a coalition’s capacity to conduct community change work. They will also present some data visualization techniques for effectively communicating results back to coalitions.
Dataviz Products throughout the Evaluation Lifecycle
Products throughout the Evaluation Lifecycle

Articulating Theory/Strategy

Theory of Change
Purpose: To create a visual that articulates the linkages between activities and hoped-for outcomes for arriving at social impact.
How Used: To help align clients and stakeholders around expectations for change, particularly realistic timing and order of outcomes.
Considerations:
•Can be developed or revisited at any point during the lifecycle, but is particularly helpful during strategy development or mid-course review/correction.
•Outcome maps allow for complexity that can be limited by the logic model format.
•Important to fit the level of detail and visualization to the client's purpose/needs (e.g., not too in the weeds).

Social Network Analysis (SNA)
Purpose: To provide a visual of how different individuals or organizations within a field are connected. Can be used as a vehicle to quickly understand the characteristics of the field and identify central "hub" individuals or organizations.
How Used: To illustrate the connections and relationships between players within a field.
Considerations:
•Can be used in conjunction with other frameworks (e.g., the Center for Evaluation Innovation's Advocacy Strategy Framework) to identify skills gaps and potential areas for targeted capacity building.
•Can be used by funders to inform strategic grantmaking decisions.

Facilitated Processes for Sense-Making

Data Placemat
Purpose: To engage stakeholders in interpreting the data.
How Used: By the evaluator to facilitate a debrief of evaluation data with the client and/or stakeholders.
Considerations:
•Intended to be an interim product that includes stakeholders in sense-making to build ownership.
•Context, interpretations, and implications generated by debrief participants should be used to inform final reporting.
•Particularly useful when there is a high amount of quantitative data that needs interpretation.
•Important to include only graphs/charts or other visualizations, not narrative or evaluator interpretations.

Strategic Debrief Deck
Purpose: To facilitate a debrief/discussion of key findings and implications.
How Used: With a client and/or stakeholders to discuss implications and to think critically about, revise, and improve the strategy being evaluated.
Considerations:
•Highlights a prioritized subset of evaluation findings that would benefit from discussion.
•Can occur mid-point or at the end of a strategy to lift up the decision points most critical for the stage of work.
•The evaluator plays an expert role vis-à-vis the data and facilitates the discussion about what the data/findings mean for the work moving forward.
•The discussion can be used to inform a final deliverable.

H-Forms
Purpose: To capture the degree to which stakeholders agree or disagree with a selected concept, and why. The process of publicly rating each concept promotes discussion, identifies areas of consensus and divergence among stakeholders, and allows for the interpretation of information.
How Used: As a facilitative tool to gather and document stakeholder opinions and capture in-the-moment reflections.
Considerations:
•The evaluator plays a facilitative role during this process.
•H-Forms can be used to gather perspectives on multiple concepts simultaneously.
•The evaluator creates an H-Form display for each concept being rated.
•H-Forms should include a rating scale (preferably a 5-point scale) with anchor points clearly labeled.
•Dot stickers can be used to enable stakeholders to rate each concept.

Communication and Reporting to Stakeholders

Visual Report Deck
Purpose: To share findings from an evaluation using data visualization to enhance understanding and promote evaluation use.
How Used: As a communication tool to share evaluation findings.
Considerations:
•Visual reports take time to create.
•The evaluator needs to follow basic dataviz guidelines when creating a visual report.
•The evaluator needs to ensure that the report is well organized.
•Visual reports need to be a stand-alone product.

Visual Executive Summary
Purpose: To complement a traditional evaluation report.
How Used: As a communication tool to share evaluation findings with audiences that have a limited appetite for data or limited time.
Considerations:
•Data are highly synthesized; the summary should be no more than 1-2 pages.
•The product may take on a life of its own and should tell a story of the highest-priority data/findings.

Timeline
Purpose: To chronologically document events related to an initiative or program, helping stakeholders identify patterns, themes, and trends that may otherwise be difficult to see; promotes a shared understanding of the contextual factors that may have contributed to or detracted from a desired outcome.
How Used: As a communication tool to help stakeholders digest multiple levels of information across a specified timeline.
Considerations:
•Important to identify the purpose, audience, and time horizon of the timeline at the beginning of the process.
•The evaluator needs to be selective about the categories of information included in the timeline.
•Important to be purposeful in selecting the format and visualization style for the timeline.
•Timelines are especially relevant when evaluating advocacy and policy change initiatives, where contextual elements play a role in shaping policy outcomes.
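To make the "hub" idea in the social network analysis product above concrete, here is a small sketch that computes degree centrality with plain dictionaries, no graph library required. The organization names and ties are invented for illustration:

```python
# Invented coalition ties: who works with whom (undirected pairs)
ties = [
    ("Org A", "Org B"), ("Org A", "Org C"),
    ("Org A", "Org D"), ("Org B", "Org C"),
]

def degree_centrality(ties):
    """Count each node's connections; high counts suggest 'hub' players."""
    degree = {}
    for a, b in ties:
        degree[a] = degree.get(a, 0) + 1
        degree[b] = degree.get(b, 0) + 1
    return degree

# Rank organizations by number of connections, most-connected first
hubs = sorted(degree_centrality(ties).items(), key=lambda kv: -kv[1])
print(hubs[0])  # -> ('Org A', 3)
```

Degree is only the simplest centrality measure; a fuller SNA would also look at betweenness or brokerage, but even this count is often enough to spot the hub organizations the table describes.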