Change Management Made Easier - Know Your Stakeholders: Create advocates within your organization by understanding the motivations of your internal customers.
During the 2015 American Evaluation Association Annual Conference in Chicago, Katherine Haugh and Deborah Grodzicki conducted a real-time data mini-study to see which evaluation approaches evaluators at #eval15 use most frequently in their work. Basing their mini-study on Marvin C. Alkin's "Evaluation Roots: A Wider Perspective of Theorists' Views and Influences," they asked evaluators to vote for the top two approaches they used most often. This handout accompanied the real-time data mini-study to provide more information about the formation of the evaluation theory tree, its three branches, and definitions of the evaluation approaches associated with each branch.
Community Building Begins with Community Organizing (Debra Askanase)
Building a great online community relies on the principles of community organizing. This presentation covers tactics for community-building, case studies of long-term online communities, and how to build communities around campaigns. Presented at NCVS 2011.
This is a communication plan explaining the organizational change for the organization in the UPOX AET 560 Organizational Change Process Learning Team project.
The economic performance of a country depends mainly on the labour of its youth population. Energetic, courageous, and qualified youth can drive social and economic development if they are well utilized and managed. Investing in youth (ages 14 to 29) now will lay the groundwork for Ethiopia's future. Strategies to continue progress toward harnessing the potential of its youth will help Ethiopia attain a demographic dividend and foster sustainable development. However, migration, unemployment, drug addiction, an unfavourable policy environment, and high population growth are the major problems facing youth in the country. The overall objective of this paper is to review the current key challenges facing youth in Ethiopia. In particular, the paper reviews youth migration, youth unemployment, and youth health and addiction, and finally suggests possible solutions to these challenges. The data collected, interpreted, and evaluated came from secondary sources: the country's Central Statistical Agency, empirical studies, country profiles, the work of various authors and researchers on youth issues, and other reports on youth in Ethiopia. Finally, suggestions are made to overcome the challenges.
This presentation is one of the best from the study material for the weekly workshops that ADMEC conducts at the center. It contains very useful information on the "Use of Shapes in Graphic Design".
Presenting this set of slides, titled Weekly Project Status Updates. The topics discussed in these slides are Weekly Progress Reports, Weekly Performance Reports, and Weekly Progress Tracking. This is a completely editable PowerPoint presentation and is available for immediate download. Download now and impress your audience. https://bit.ly/35OssgJ
The field of program evaluation presents a diversity of images and claims about the nature and role of evaluation that confounds any attempt to construct a coherent account of its methods or confidently identify important new developments. We take the view that the overarching goal of the program evaluation enterprise is to contribute to the improvement of social conditions by providing scientifically credible information and balanced judgment to legitimate social agents about the effectiveness of interventions intended to produce social benefits. Because of its centrality in this perspective, this review focuses on outcome evaluation, that is, the assessment of the effects of interventions upon the populations they are intended to benefit. The coverage of this topic is concentrated on literature published within the last decade, with particular attention to the period subsequent to the related reviews by Cook and Shadish (1994) on social experiments and Sechrest and Figueredo (1993) on program evaluation.
The word 'evaluation' has become increasingly common in the language of community, health, and social services and programs. The growth of evaluation talk and practice in these fields has often been promoted and encouraged by funders and commissioners of services and programs. Following the interest of funders, there has been a growth in the study and practice of evaluation by community, health, and social service practitioners and academics. When we consider why this shift in evaluative thinking and practice has occurred, we can assume the position of the funder and simply answer, '...because we want to know if this program or service works'. Practitioners, specialists, and academics in these fields have been called upon by governments and philanthropists to aid the development of effective evaluation; over time, they have also led their own thinking and practice independently. Evaluation in its simplest form is about understanding the effect and impact of a program, a service, or indeed a whole organization. Evaluation as a practice is not so simple, however, largely because in order to assess impact, we need to be very clear at the beginning about what effect or difference we are trying to achieve.
The literature review begins with an overview of qualitative and quantitative research methods, followed by a description of key forms of evaluation. Health promotion evaluation and advocacy and policy evaluation will then be explored as two specific domains. These domains are not evaluation methodologies, but forms of evaluation that present unique requirements for effective community development evaluation. Following this discussion, the review will explore eight key evaluation methodologies: appreciative enquiry, empowerment evaluation, social capital, social return on investment, outcomes-based evaluation, performance dashboards and scorecards, and developmental evaluation. Each of these sections will include specific methods, the values base of each methodo ...
Evaluating community projects
These guidelines were initially developed as part of the JRF Neighbourhood Programme. This programme is made up of 20 community or voluntary organisations, all wanting to exercise a more strategic influence in their neighbourhood. The guidelines were originally written to help these organisations evaluate their work. They provide step-by-step advice on how to evaluate a community project, which will be of interest to a wider audience.
What is evaluation?
Put simply, evaluation by members of a project or organisation will help people to learn from their day-to-day work. It can be used by a group of people, or by individuals working alone. It assesses the effectiveness of a piece of work, a project or a programme. It can also highlight whether your project is moving steadily and successfully towards achieving what it set out to do, or whether it is moving in a different direction. You can then celebrate and build on successes as well as learn from what has not worked so well.
Why evaluate?
Although evaluation may seem like an unnecessary additional task if you are already short of time and resources, it can save you both time and resources by keeping participants focused on, and working towards, the ultimate goal of the project. If necessary, it can refocus activity away from unproductive or unnecessary work.
Milestone 4
Student’s Name
University Affiliation
Southern New Hampshire University
Milestone 4
Description of the Initiative Evaluation Plan
Initiative evaluation involves systematic mechanisms for gathering, reviewing, and using information to answer questions about an initiative, its policies, and its programs, specifically about their effectiveness and efficiency. Initiative evaluation can draw on both qualitative and quantitative techniques of social research. The initiative evaluation plan also describes the intended use of the evaluation outcomes for program enhancement and decision making. The evaluation plan serves to clarify the initiative's purpose and expected results (Dudley, 2020). It provides the direction that monitoring should take, based on the initiative's priorities and on the resources, time, and skills required to complete the evaluation.
The initiative will have a well-documented plan to foster transparency and to ensure that stakeholders share a common understanding of the purpose, use, and beneficiaries of the evaluation outcomes. Use of the evaluation outcomes cannot simply be wished into being when implementing an initiative; it must be planned, directed, and pursued intentionally (Dudley, 2020). The evaluation plan for this initiative will have many benefits, including building the capacity to establish strong connections with partners and stakeholders. The plan is also essential for making the initiative transparent to stakeholders and decision-makers, and it serves as a means of advocating for evaluation resources based on negotiated priorities. The evaluation procedure is also critical for identifying whether there are enough intervention resources and time to carry out the desired evaluation activities and answer the prioritized evaluation questions.
When developing the plan for evaluating an initiative that aims to promote health and wellbeing in the community, the key step is to develop an effective strategy. The steps to follow when creating the evaluation plan differ depending on the type of project to be evaluated. The first step entails engaging the stakeholders. When establishing the evaluation procedure, it is crucial to determine its purpose and the stakeholders involved in implementing the intervention. Identifying the purpose of the evaluation and the stakeholders involved is critical because these two components serve as the basis for evaluation planning, targeting, design, and interpretation of the outcomes. Stakeholder engagement is necessary to secure support for the evaluation process, and involving stakeholders can have many advantages. Stakeholders comprise the people who use the evaluation outcomes, those who support and sustain the initiative, and those impacted by the intervention activities or evalu ...
CHAPTER SIXTEEN
Understanding Context: Evaluation and Measurement in Not-for-Profit Sectors
Dale C. Brandenburg
Many individuals associated with community agencies, health care, public workforce development, and similar not-for-profit organizations view program evaluation as akin to a visit to the dentist's office: it's painful, but at some point it cannot be avoided. A major reason for this perspective is that evaluation is seen as taking money away from program activities that do good for others, that is, intruding on valuable resources that are intended for delivering the "real" services of the organization (Kopczynski & Pritchard, 2004). The logic runs: since there are limited funds available to serve the public good, why must a portion of program delivery be allocated to something other than serving people in need? This is not an unreasonable point, and one that program managers in not-for-profits face on a continuing basis.
The focus of evaluation in not-for-profit organizations has shifted in recent years from administrative data to outcome measurement, impact evaluation, and sustainability (Aspen Institute, 2000), a shift from the short-term to the long-term effects of interventions. Evaluators in the not-for-profit sector view their world as a combination of technical knowledge, communication skills, and political savvy that can make or break the utility and value of the program under consideration. Evaluation in not-for-profit settings tends to value teamwork, collaboration, and generally working together. This chapter is meant to provide a glimpse at a small portion of the evaluation efforts that take place in the not-for-profit sector. It excludes, for example, efforts in public education, but does provide some context for workforce development efforts.
CONTRAST OF CONTEXTS
Evaluation in not-for-profit settings tends to have different criteria for the judgment of its worth than is typically found in corporate and similar settings. Such criteria are likely to include the following:
How useful is the evaluation?
Is the evaluation feasible and practical?
Does the evaluation hold high ethical principles?
Does the evaluation measure the right things, and is it accurate?
Using criteria such as these seems a far cry from the concepts of return on investment that are of vital importance in the for-profit sector. Even the question of transfer of training can sometimes be secondary to assuring that the program is described accurately. Another difference is the pressure of time. Programs offered by not-for-profit organizations, such as an alcohol recovery program, take a long time to show their effects, and by the time results are visible, the organization has moved on to the next program. Instead, we often see that evaluation is relegated to measuring the countable, such as the number of people who have completed the program, rather than the life-changing impact that decreased alcohol abuse has on ...
Community Building Begins with Community OrganizingDebra Askanase
Building a great online community relies on the principles of community organizing. Tactics for community-building, case studies of how to build long-term online communities, and build communities around campaigns. Presented at NCVS 2011.
This is a communication plan explaining the organizational change for the organization in the UPOX AET 560 Organizational Change Process Learning Team project
The economic performance of a country is mainly depending on the labour of youth population. Energetic, courageous and qualified youth can make changes to the social economic development if they are well utilized and managed. Investing in youth (ages 14 to 29) now will lay the groundwork for Ethiopia’s future. Strategies to continue progress toward harnessing the potential of its youth will help Ethiopia attain a demographic dividend and foster sustainable development. However, migration, unemployment, drug addiction, unfavorable policy environment and high population growth are the major problems of youth in the country. The overall objective of this paper is to review the current key challenges of youth in Ethiopia. Particularly the paper tries to: Review youth migration, youth unemployment and health and addiction related to youth and finally it suggested the possible solution for the challenges. The data collected, interpreted and evaluated all came from secondary data sources from country Central statistical agency, empirical study, country profile, different authors and researchers have written on the issue of youth; and other reports on youth related reports in Ethiopia. Finally, suggestions are made to overcome the challenges.
This presentation is one of the best presentations from our study material for our weekly workshops which ADMEC conducts every week at the center. This presentation contains very good information for “Use of Shapes in Graphic Design”.
Presenting this set of slides with name Weekly Project Status Updates. The topics discussed in these slides are Weekly Progress Reports, Weekly Performance Reports, Weekly Progress Tracking. This is a completely editable PowerPoint presentation and is available for immediate download. Download now and impress your audience. https://bit.ly/35OssgJ
The field of program evaluation presents a diversity of images a.docxcherry686017
The field of program evaluation presents a diversity of images and claims about the nature and role of evaluation that confounds any attempt to construct a coher- ent account of its methods or confidently identify important new developments. We take the view that the overarching goal of the program evaluation enterprise is to contribute to the improvement of social conditions by providing scientifically credible information and balanced judgment to legitimate social agents about the effectiveness of interventions intended to produce social benefits. Because of its centrality in this perspective, this review focuses on outcome evaluation, that is, the assessment of the effects of interventions upon the populations they are intended to benefit. The coverage of this topic is concentrated on literature published within the last decade with particular attention to the period subsequent to the related reviews by Cook and Shadish (1994) on social experiments and Sechrest & Figueredo (1993) on program evaluation.
The word ‘evaluation’ has become increasingly used in the language of community, health and social services and programs. The growth of talk and practice of evaluation in these fields has often been promoted and encouraged by funders and commissioners of services and programs. Following the interest of funders, has been a growth in the study and practice of evaluation by community, health and social service practitioners and academics. When we consider why this move in evaluative thinking and practice has occurred, we can assume the position of the funder and simply answer, ‘...because we want to know if this program or service works’. Practitioners, specialists and academics in these fields have been called upon by governments and philanthropists to aid the development of effective evaluation. Over time, they have led their own thinking and practice independently. Evaluation in its simplest form is about understanding the effect and impact of a program, service, or indeed a whole organization. Evaluation as a practice is not so simple however, largely because in order to assess impact, we need to be very clear at the beginning what effect or difference we are trying to achieve.
The literature review begins with an overview of qualitative and quantitative research methods, followed by a description of key forms of evaluation. Health promotion evaluation and advocacy and policy evaluation will then be explored as two specific domains. These domains are not evaluation methodologies, but forms of evaluation that present unique requirements for effective community development evaluation. Following this discussion, the review will explore eight key evaluation methodologies: appreciative enquiry, empowerment evaluation, social capital,
social return on investment, outcomes based evaluation, performance dashboards and scorecards and developmental evaluation. Each of these sections will include specific methods, the values base of each methodo ...
Evaluating community projects
These guidelines were initially developed as part of the JRF Neighbourhood Programme. This programme is made up of 20 community or voluntary organisations all wanting to exercise a more strategic influence in their neighbourhood. The guidelines were originally written to help these organisations evaluate their work. They provide step-by-step advice on how to evaluate a community project which will be of interest to a wider audience.
What is evaluation?
Put simply, evaluation by members of a project or organisation will help people to learn from their day-to-day work. It can be used by a group of people, or by individuals working alone. It assesses the effectiveness of a piece of work, a project or a programme. It can also highlight whether your project is moving steadily and successfully towards achieving what it set out to do, or whether it is moving in a different direction. You can then celebrate and build on successes as well as learn from what has not worked so well.
Why evaluate?
Although evaluation may seem like an unnecessary additional task if you are already short of time and resources, it can save you both time and resources by keeping participants focused on, and working towards, the ultimate goal of the project. If necessary, it can refocus activity away from unproductive or unnecessary work.
1
4
Milestone 4
Student’s Name
University Affiliation
Southern New Hampshire University
Milestone 4
Description of the Initiative Evaluation Plan
Initiative evaluation involves systematic mechanisms for gathering, reviewing, and utilizing information to answer questions concerning the initiative, policies, and programs, specifically about their effectiveness and efficiency. Initiative evaluation can entail both qualitative as well as qualitative techniques of social research. The initiative evaluation plan also contains the intended use of the evaluation outcomes for the program enhancement and decision making. The evaluation plan serves to clarify the initiative’s purpose and expected results (Dudley, 2020). The evaluation plan provides the direction that the monitoring should take based on the initiative priorities, the available resources, time, and skills required to complete the evaluation.
The initiative will have a well-documented plan to foster transparency as well as ensure that stakeholders are on a similar page with concerns about the purpose, use, and also the beneficiaries of the evaluation outcomes. Utilization of the evaluation outcomes is not a thing that can be wished when implementing an initiative. Instead, it must be planned, directed, and ensured to have intentions (Dudley, 2020). The evaluation plan for this initiative will have many benefits, including facilitating the capacity to establish strong connections with partners and stakeholders. The program is also essential for creating the initiative transparency to the stakeholders and decision-makers. The plan also serves as advocacy means for evaluation resources based on negotiated priorities. The procedure for evaluation initiative is also critical for helping in identifying whether there are enough intervention resources and time to realize the desired evaluation exercises and provide answers to prioritize evaluation questions.
When developing the plan for evaluating the initiative targeting to promote health and wellbeing in the community, the key steps must be to develop an effective strategy. The key steps to be followed when creating the evaluation plan differ depending on the project type to be evaluated. The first step entails engaging the stakeholders. When finding the purpose of the evaluation procedures, it is crucial to determine its purpose and the stakeholders involved in the implementation process of the intervention. Identifying the purpose of the evaluation process and stakeholders involved is critical because the two components serve as the basis for evaluation planning, target, design, and comprehension of the outcomes. Stakeholders' engagement is necessary to enable the support of the evaluation process. Involving stakeholders in the evaluation process can have many advantages. Stakeholders comprise the people who use the evaluation outcomes, support and keep the initiative or those impacted by the intervention activities or evalu ...
1
4
Milestone 4
Student’s Name
University Affiliation
Southern New Hampshire University
Milestone 4
Description of the Initiative Evaluation Plan
Initiative evaluation involves systematic mechanisms for gathering, reviewing, and utilizing information to answer questions concerning the initiative, policies, and programs, specifically about their effectiveness and efficiency. Initiative evaluation can entail both qualitative as well as qualitative techniques of social research. The initiative evaluation plan also contains the intended use of the evaluation outcomes for the program enhancement and decision making. The evaluation plan serves to clarify the initiative’s purpose and expected results (Dudley, 2020). The evaluation plan provides the direction that the monitoring should take based on the initiative priorities, the available resources, time, and skills required to complete the evaluation.
The initiative will have a well-documented plan to foster transparency as well as ensure that stakeholders are on a similar page with concerns about the purpose, use, and also the beneficiaries of the evaluation outcomes. Utilization of the evaluation outcomes is not a thing that can be wished when implementing an initiative. Instead, it must be planned, directed, and ensured to have intentions (Dudley, 2020). The evaluation plan for this initiative will have many benefits, including facilitating the capacity to establish strong connections with partners and stakeholders. The program is also essential for creating the initiative transparency to the stakeholders and decision-makers. The plan also serves as advocacy means for evaluation resources based on negotiated priorities. The procedure for evaluation initiative is also critical for helping in identifying whether there are enough intervention resources and time to realize the desired evaluation exercises and provide answers to prioritize evaluation questions.
When developing the plan for evaluating the initiative targeting to promote health and wellbeing in the community, the key steps must be to develop an effective strategy. The key steps to be followed when creating the evaluation plan differ depending on the project type to be evaluated. The first step entails engaging the stakeholders. When finding the purpose of the evaluation procedures, it is crucial to determine its purpose and the stakeholders involved in the implementation process of the intervention. Identifying the purpose of the evaluation process and stakeholders involved is critical because the two components serve as the basis for evaluation planning, target, design, and comprehension of the outcomes. Stakeholders' engagement is necessary to enable the support of the evaluation process. Involving stakeholders in the evaluation process can have many advantages. Stakeholders comprise the people who use the evaluation outcomes, support and keep the initiative or those impacted by the intervention activities or evalu ...
CHAPTER SIXTEENUnderstanding Context Evaluation and MeasuremeJinElias52
CHAPTER SIXTEEN
Understanding Context: Evaluation and Measurement in Not-for-Profit Sectors
Dale C. Brandenburg
Many individuals associated with community agencies, health care, public workforce development, and similar not-for-profit organizations view program evaluation akin to a visit to the dentist’s office. It’s painful, but at some point it cannot be avoided. A major reason for this perspective is that evaluation is seen as taking money away from program activities that perform good for others, that is, intruding on valuable resources that are intended for delivering the “real” services of the organization (Kopczynski & Pritchard, 2004). A major reason for this logic is that since there are limited funds available to serve the public good, why must a portion of program delivery be allocated to something other than serving people in need? This is not an unreasonable point and one that program managers in not-for-profits face on a continuing basis.
The focus of evaluation in not-for-profit organization has shifted in recent years from administrative data to outcome measurement, impact evaluation, and sustainability (Aspen Institute, 2000), thus a shift from short-term to long-term effects of interventions. Evaluators in the not-for-profit sector view their world as the combination of technical knowledge, communication skills, and political savvy that can make or break the utility and value of the program under consideration. Evaluation in not-for-profit settings tends to value the importance of teamwork, collaboration, and generally working together. This chapter is meant to provide a glimpse at a minor portion of the evaluation efforts that take place in the not-for-profit sector. It excludes, for example, the efforts in public education, but does provide some context for workforce development efforts.
CONTRAST OF CONTEXTS
Evaluation in not-for-profit settings tends to have different criteria for the judgment of its worth than is typically found in corporate and similar settings. Such criteria are likely to include the following:
How useful is the evaluation?
Is the evaluation feasible and practical?
Does the evaluation hold high ethical principles?
Does the evaluation measure the right things, and is it accurate?
Using criteria such as the above seems a far cry from the concepts of return on investment that are of vital importance in the profit sector. Even the question of transfer of training can sometimes be of secondary importance to assuring that the program is described accurately. Another difference is the pressure of time. Programs offered by not-for-profit organizations, such as an alcohol recovery program, take a long time to show their effects, and by the time results are visible, the organization has moved on to the next program. Instead we often see that evaluation is relegated to measuring the countable, the number of people who have completed the program, rather than the life-changing impact that decreased alcohol abuse has on ...
A Good Program Can Improve Educational Outcomes.pdf
We hope this guide helps practitioners and others strengthen programs designed to increase academic achievement, ultimately broadening access to higher education for youth and adults.
We believe that evaluation is a critical part of program design and is necessary for ongoing program improvement. Evaluation requires collecting reliable, current and compelling information to empower stakeholders to make better decisions about programs and organizational practices that directly affect students. A good evaluation is an effective way of gathering information that strengthens programs, identifies problems, and assesses the extent of change over time. A sound evaluation that prompts program improvement is also a positive sign to funders and other stakeholders, and can help to sustain their commitment to your program.
Theories of change are conceptual maps that show how and why program activities will achieve short-term, interim, and long-term outcomes. The underlying assumptions that promote, support, and sustain a program often seem self-evident to program planners. Consequently, they spend too little time clarifying those assumptions for implementers and participants. Explicit theories of change provoke continuous reflection and shared ownership of the work to be accomplished. Even the most experienced program planners sometimes make the mistake of thinking an innovative design will accomplish goals without checking the linkages among assumptions and plans.
Developing a theory of change is a team effort. The collective knowledge and experience of program staff, stakeholders, and participants contribute to formulating a clear, precise statement about how and why a program will work. Using a theory-based approach, program collaborators state what they are doing and why by working backwards from the outcomes they seek to the interventions they plan, and forward from interventions to desired outcomes. When defining a theory of change, program planners usually begin by deciding expected outcomes, aligning outcomes with goals, deciding on the best indicators to evaluate progress toward desired outcomes, and developing specific measures for evaluating results. The end product is a statement of the expected change that specifies how implementation, resources, and evaluation translate into desired outcomes.
Continuously evaluating a theory of change encourages program planners to keep an eye on their goals. Statements about how and why a program will work must be established using the knowledge of program staff, stakeholders, and participants. This statement represents the theory underlying the program plan and shows planners how resources and activities translate to desired improvements and outcomes. It also becomes a framework for program implementation and evaluation.
Source: https://ebookscheaper.com/2022/04/06/a-good-program-can-improve-educational-outcomes/
Connecticut Civic Ambassadors are everyday people who care about and engage others in their communities by creating opportunities for civic participation that strengthens our state’s “Civic Health.” Civic Health is determined by how well diverse groups of residents work together and with government to solve public problems to strengthen their communities. Read more below on how you can be an agent of change in your own community by joining the team.
A comprehensive guide designed to help you recruit people to your community change effort, work with the media, master social media, and tell your story in many different formats along the way.
Ripple Effects Mapping Tip Sheet for Evaluating Community Engagement Everyday Democracy
Community Engagement and Dialogue to Change strategies can lead to many positive changes in your community. However, direct impacts can be tough to track. Ripple Effects Mapping (REM) allows you, along with local leaders and others in your community, to assess impacts from your Dialogue to
Change efforts. It allows you to visually document the impacts your efforts have had on individuals, on your community, and on institutions and systems over time. These are tips for rolling out a Ripple Effects Mapping process:
Evaluation Guide Toolkit (Companion to Evaluating Community Engagement Guide), Everyday Democracy
Includes an Evaluation Capacity Self-Assessment Tool,
Sample Community Engagement Logic Model, Logic Model Template, Data Collection and Planning Template and Ripple Mapping Tip Sheet
The Wondertwins, "Black"- September 27th, West Hartford, CT Everyday Democracy
The Wondertwins, famed veteran hip-hop dance duo from Boston, perform their newest piece, BLACK. BLACK explores the traumatizing effects of police violence towards the black community by incorporating dance with historic and contemporary audio and video clips. Post-show dialogue will be facilitated by Everyday Democracy and the Connecticut Collaborative on Poverty, Criminal Justice and Race.
The practice of treating everyone fairly and justly regardless of age, with special consideration to the structural factors that privilege some age groups over others.
This is a brief guide developed for Stand Against Racism Day, 2019. The guide helps communities discuss immigration and how it connects to racial equity.
“American citizenship brings legal rights, protections, and responsibilities. But its meaning goes deeper. To be a citizen is to be accepted, to feel safe, to be ‘one of us.’ ”
Racism is rooted in our country's history and is embedded in our culture, and yet the history of structural racism is rarely taught or portrayed. Racism is still one of the greatest barriers to fulfilling the promise of our democracy. That is why Everyday Democracy uses a racial equity lens in all the work we do.
Unfortunately, most people in the U.S. have not had the chance to study and understand how racism has evolved and how it continues to affect every area of our lives. We don’t usually learn about it in school, except in cursory ways. Even then, it is often portrayed as a part of a distant past that stopped with the fight for civil rights in the 60s. That, in itself, is part of the “invisible” power of structural racism.
There are many people who don’t realize that, as a country, we still have work to do to create equal opportunities for all. And many aren’t aware that all of us – of every region of the country, of every color and ethnic background – are still dealing with the impact of slavery, Jim Crow, and other policies that have perpetuated unfair advantages based on color. All of us need to deepen our understanding of our full history, so that we can move beyond “us vs. them” to “us.” Only as we understand the forces that have shaped our lives can we begin to imagine and create a democracy that supports voice and belonging for all.
To share an important part of this history, the New York Historical Society (NYHS) has developed a curriculum to help students and communities explore the legacy of racism. It includes three comprehensive units and printable resources. This curriculum was developed as part of NYHS’s current exhibit, Black Citizenship in the Age of Jim Crow, that explores the struggle for full citizenship and racial equity. This powerful exhibit uncovers not only the overt and hidden racism that marked a pivotal era in our history, it highlights the day-to-day acts of courage that so many people took to claim citizenship as belonging. It is impossible to see this exhibit without thinking about the parallels for today.
We invite you to use and share this curriculum with students, coworkers, family members, and community members. And then we invite you to work with us at Everyday Democracy to use your learning as a catalyst for expanding the dialogue and creating equitable change in your community and our country.
An Evaluation Guide for Community Engagement
Evaluating Community Engagement: An Evaluation Guide and Toolkit for Practical Use
Evaluating Community Engagement | everyday-democracy.org

Table of Contents
This is a mini-guide that offers tips for evaluating community engagement. It is not meant to be fully prescriptive on evaluation; rather, it is a practical, hands-on tool to help you determine whether you are ready to evaluate your community engagement work and, if so, to offer considerations for implementing an evaluation.
Part I — Evaluation Basics: Getting Ready
    Introduction
    Readiness
    Key Principles
Part II — Choosing the Right Evaluation
    Evaluation Approaches
    Process Evaluation
    Outcome Evaluation
    Impact Evaluation
    Participatory Approach
Part III — Planning the Evaluation
    Developing an Evaluation Framework
    Develop a Logic Model
    Seek Focus
    Identify Measures
    Identify Data
Part IV — Collecting and Analyzing Data
    Data Collection Methods
    Data Analysis
    Sample Survey Questions
References
Companion Toolkit
Evaluation Basics — Getting Ready
“Evaluation helps us to learn, to do our work better, to be accountable to community and funders, and to support funder-allies in advocating on our behalf.” — Building the Field of Community Engagement, Community Partners, 2015
Introduction
Evaluation is a critical aspect of any
community engagement process. This
resource guide was created to provide
some guidance in developing an evaluation
framework for your community engagement
work. This is not a comprehensive evaluation
tool but more of a primer that offers some
guiding principles and basic instruction when
evaluating community engagement.
Readiness
Determining your evaluation capacity is an
important step in community engagement
planning. This entails answering questions
like: “Is there money in the budget to pay
for an evaluation?”; “Do we have evaluation
expertise internally?”; or “Will we need to hire
an external evaluator?” One of the primary
benefits of assessing your evaluation capacity
is that it will help you make informed decisions
about choosing the ‘right size’ evaluation,
one that fits your current organizational
capacity or the capacity you aim to
build toward.
In Your Toolkit: Everyday Democracy’s
Evaluation Capacity Self-Assessment
Matrix
Key Principles
Evaluation is the systematic collection of
information about activities and outcomes of a
program or initiative. Often evaluation is used
for both learning and accountability. When
it comes to community engagement there is
no one-size-fits-all evaluation approach. The
scale and scope of the evaluation should
align with the community engagement plan.
For instance, if you are conducting a one-
time, single community engagement event,
you would evaluate the level of participation
and participant reactions and would collect
that information or data at the end of the
event. If you are implementing a community
engagement program that includes multiple
activities delivered over time, then you would
evaluate changes. You would collect data
throughout the process to assess changes in
individual knowledge, attitudes and behaviors
to see how those changes affect the way
people engage with each other and lead to
school and community change.
It is important to be intentional about including
evaluation in your community engagement
planning. When evaluation is overlooked,
you are limited to informal judgments and
anecdotal evidence when talking about the
success of your engagement effort.
In Short…
1) An evaluation strategy should be integral to the community engagement plan.
2) Evaluation capacity is important to determine.
3) The scale and scope of the evaluation should fit the community engagement effort.
Choosing the Right Evaluation
Evaluation Approaches
What do you and your stakeholders want to learn from
evaluating community engagement? This question is
an important one to ask because your answer will help
you determine the appropriate evaluation design and
approach to use. There are three evaluation design
options introduced here: Process, Outcome and Impact
Evaluation. Each one offers an opportunity to answer
important questions about community engagement. In
addition, they all allow for using a participatory approach to evaluation, a promising and more inclusive practice for you to consider.
Process Evaluation
A process evaluation involves collecting data in the planning and implementation phases of
community engagement as shown below:
Planning: Process & Structure
-- Frequency and content of planning meetings
-- Inclusiveness of process
-- Diversity/representativeness of planners
-- Collaboration (shared decision-making)
-- Equity (all voices heard and valued)
-- Outreach to community

Implementation: Delivery & Quality
-- Demographic characteristics of participants
-- Context in which engagement activities are delivered
-- Frequency and intensity of engagement activities
-- Participant satisfaction with engagement activities
It’s important to conduct a process evaluation because it provides a way to examine your strategy and see what you need to do to strengthen the effectiveness of community engagement. Process evaluations are done at the beginning of and throughout the engagement process. They allow you to answer questions such as:
1. How inclusive is the engagement planning process? How equitable is it?
2. How do participants rate the quality of the engagement activities?
While results matter, without a process evaluation you won’t know why the engagement
process succeeded or failed.
[Diagram: Process, Outcome, and Impact evaluation designs, all of which can incorporate a Participatory Approach]
Outcome Evaluation
Outcome evaluation focuses on change. It
is used to assess change resulting from
community engagement, such as change
in the way people engage with each other
and change resulting from their engagement.
This evaluation collects data that pertain
to individual level changes in knowledge,
attitudes, beliefs and behaviors that affect
how people engage with each other. It also
collects data on community and organizational
level changes resulting from the engagement.
Outcome evaluation is conducted at the end
of an engagement process. It is the
evaluation most often required by funders.
Here is a sample evaluation question:
Q. To what extent are people in the
community engaged?
Impact Evaluation
This evaluation is considered the gold
standard of designs because it sets out to
establish evidence of causality. It requires random
assignment of participants and the use of
an intervention group and a control or
comparison group. In this design people are
randomly assigned to be in an intervention or
control group and depending on which group
the individual is placed, they receive either
the intervention or something else believed
to be less effective than the intervention.
This evaluation can be more challenging to
implement and costly to do because of the
prerequisites needed to be able to conduct
it effectively.
The prerequisites for evaluating community engagement for impact would include: 1) a long-standing (minimum 5-10 years) community engagement program with a substantial amount of data already collected on it; 2) previous evaluations that have been conducted, with results readily available; and 3) significant time, financial, and human capacity to conduct the evaluation.
If these conditions are met then conducting
an impact evaluation may be the way to go.
This type of evaluation answers the following
question:
Q. To what extent can community
change be attributed to community
engagement?
Participatory Approach
This approach is about making evaluation
inclusive and it aligns well with community
engagement because it supports the
involvement or participation of stakeholders.
With this approach participants (community
groups, youth and public service providers etc.)
are partners with the evaluator(s) and
participate in each phase of evaluation from
planning, data collection and analysis to
reporting on results.
Three key benefits of a participatory approach are: 1) promoting buy-in to the evaluation; 2) empowering participants by building evaluation capacity within the community; and 3) creating space to obtain input from all participants while acknowledging and addressing “asymmetrical levels of power and voice among stakeholders” (Sufian, Grunbaum, Akintobi, Dozier, Eder, Jones, Mullan, Weir & White-Cooper, 2011, p. 171).
Planning the Evaluation
Developing an Evaluation Framework
There are four steps to developing a framework or plan for evaluating community engagement.
Step One: Develop A Logic Model, Step Two: Seek Focus, Step Three: Identify Measures,
Step Four: Identify Data
Step One | Develop A Logic Model
Describe the community engagement
program explicitly.
A logic model is a visual tool that allows you to
lay out your plan for achieving the goal(s) of
community engagement. It is best practice to
develop a logic model with a group of people
directly involved in planning and implementing
the community engagement program. The logic
model requires limited and explicit descriptions
of what is needed, activities to do, results and
outcomes expected to achieve the goal.
Most logic models have five primary
components: inputs, activities, outputs,
outcomes, and goal(s)/impact.
Another way to think of outcomes is as changes.
In the logic model the flow of thinking is that
if activities are implemented effectively and
produce anticipated outputs then expected
changes that are listed as outcomes would
occur at different intervals short-term,
intermediate, and long-term. An example of a
logic model framework is below.
Logic Model (framework): Inputs → Outputs (Activities, Participation Levels) → Outcomes (Short-term, Intermediate, Long-term) → Impact, all in service of a Goal Statement.

In Your Toolkit: Sample Community Engagement Logic Model
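The five components just described can be sketched as a small data structure. This is an illustrative sketch only; the program, activities, and outcomes named below are hypothetical and not drawn from the guide or its toolkit.

```python
# A hypothetical community engagement logic model expressed as plain data.
# Every value below is an invented example, not from the Everyday Democracy guide.
logic_model = {
    "inputs": ["facilitators", "meeting space", "small planning grant"],
    "activities": ["dialogue circles", "community forums"],
    "outputs": ["6 dialogue sessions held", "120 residents participated"],
    "outcomes": {
        "short_term": ["increased knowledge of local issues"],
        "intermediate": ["new resident-led working groups"],
        "long_term": ["policies that reflect resident input"],
    },
    "impact": "a more engaged and equitable community",
}

# The if-then flow: effective activities produce outputs, which lead to
# outcomes at different intervals, which add up to the intended impact.
flow = ["inputs", "activities", "outputs", "outcomes", "impact"]
print(" -> ".join(flow))  # inputs -> activities -> outputs -> outcomes -> impact
```

Writing the model down this way makes the "if activities produce outputs, then outcomes follow" logic explicit and easy to review with the planning group.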
Step Two | Seek Focus
Be clear about the purpose of the evaluation
and what you want to learn
Determine why you want to evaluate your
community engagement program and state the
reasons in the plan. By focusing the evaluation,
you avoid collecting data that are not relevant.
It is important to explore what people want
to learn from the evaluation with as many
stakeholders as possible so that the evaluation
asks the right questions and identifies the
appropriate data to collect.
Step Three | Identify Measures
Articulate indicators of authentic community
engagement and measures
An important question to ask when developing
the evaluation plan is: what does successful
community engagement look like and how
will we know it when we see it? This is where
knowledge of some of the principles of
authentic engagement is essential.
A framework for principles of engagement
is provided below that includes a list of potential
indicators and measures to help guide thinking
about community engagement through an
equity lens. Each principle is listed and there
are indicators of success for each one.
There is also a list of possible measures of
the indicators.
Step Four | Identify Data
Identify the type of data you will need
to collect
There are basically two types of data collected
in evaluation: quantitative and qualitative.
Quantitative or numerical data are collected
from surveys, census reports, tests and other
sources. Results from these data require
calculations or statistical analyses to make
meaning of the numbers by addressing ‘how
many, how much, or how often.’
Qualitative or narrative data are collected from
stories, interviews, survey comments or quotes
from focus groups. This type of data allows
for gaining a more in-depth understanding of
people’s experiences. These data address ‘why’
and ‘what’ has made a difference.
Principles of Authentic Engagement: Indicators & Measures

Equitable
Indicators of success (how you know that engagement is working):
-- Understanding of inequity is used to create more equitable opportunities
-- Shared knowledge, resources and power are evident
-- Structural racism, socio-economic disparity, unequal educational opportunity and other factors that have shaped community and nation are examined
As measured by:
-- How often equity is covered in the content of Community Engagement (CE) plans and agendas
-- Participant feedback on surveys
-- Participant interviews

Inclusive
Indicators of success:
-- Involves diverse people representative of community demographics
-- Equitable opportunity for all people to participate
-- Opportunities for diverse voices and perspectives to be shared and heard are plentiful
As measured by:
-- Number of people from diverse groups represented in CE planning and implementation
-- Number of positive self-reports from participants about their experience

Connected to decision-making
Indicators of success:
-- Decisions and policies reflect everyone’s voice
-- Decisions communicate the needs, interests and values of everyone
-- Decision-making process is transparent
As measured by:
-- Self-reports of participants on inclusiveness of the decision-making process
-- Content of decisions

Connected to change
Indicators of success:
-- Connects local change to national movements
-- More cohesive communities and governance
As measured by:
-- Number of changes in policies, practices, & public participation in education
Collecting and Analyzing Data
“I never guess. It is a capital mistake to theorize before one has data. Insensibly one
begins to twist facts to suit theories, instead of theories to suit facts.”
— Sir Arthur Conan Doyle, Author of Sherlock Holmes stories
Data Collection Methods
The methods used to collect data in evaluation
will depend on the type of data, whether it
is quantitative or qualitative. Common data
collection methods include: surveys, interviews,
focus groups, facilitated conversations, and
tests. To evaluate community engagement,
any one of these methods listed may be used.
However, decisions about which methods are
appropriate for your community context should
be made during planning.
Surveys are popular to use because people
can take them online or in-person. They allow
you to collect information from a large number
of participants at one point in time or at
different points in the community engagement
process. They also afford a degree of
anonymity to respondents which can increase
the likelihood that people will respond more
honestly and openly to the questions. A
survey can be long and complicated or short
and simple. A shorter survey is more likely to
be completed.
-- The types of items on a survey will vary from closed-ended questions, multiple-choice and fill-in-the-blank items to open-ended questions.
-- Designing a survey that will give you the
data you need to answer your evaluation
questions requires thoughtful consideration of
the items that are proposed and should be a
collaborative team process.
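The item types above can also be represented as simple data during survey design. The questions and options below are hypothetical examples, invented only to illustrate the structure, not items from the guide.

```python
# Hypothetical survey items illustrating the item types described above.
survey_items = [
    {"type": "closed-ended", "question": "Did you attend a planning meeting?",
     "options": ["Yes", "No"]},
    {"type": "multiple-choice", "question": "How did you hear about the event?",
     "options": ["Flyer", "Friend or neighbor", "Social media", "Other"]},
    {"type": "fill-in-the-blank", "question": "I attended ___ sessions this year.",
     "options": None},
    {"type": "open-ended", "question": "What would make future events more welcoming?",
     "options": None},
]

# Items with a fixed option list can be tallied directly; free-response
# items feed qualitative analysis instead.
fixed_choice = [item["type"] for item in survey_items if item["options"] is not None]
print(fixed_choice)  # ['closed-ended', 'multiple-choice']
```

Laying items out like this during the collaborative design step makes it easy to check that each question maps to an evaluation question before the survey goes out.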
Interviews are used when you want to gather
in-depth information from participants to
capture their perspectives and perceptions of
the community engagement experience.
-- Interviews can be structured — where the
interviewers are expected to follow a rigid
protocol with specific questions that are asked
the same way for each interview. Or they can
be semi-structured — where there is a protocol
but the way questions are asked may vary
depending on who is being interviewed.
Tips for Designing Survey Questions:
-- Know the goal and objectives of the survey so you can ask the right questions and have good data for analysis.
-- Be mindful of your wording and avoid leading questions like, “Experts agree that one-off community engagement is the least effective. Do you agree?”
-- Avoid using too many “catch-all” responses like “don’t know,” “none of the above” or “other.”
-- Use language everyone can understand. Do not use jargon or colloquial words or phrases.
-- Avoid general/non-directed questions such as, “What did you think of the event?”
-- Avoid “double-barreled” questions like, “What was the most fun and most informative activity during the event?”
Focus Groups and Facilitated
Conversations are structured group
interviews used to collect information from
people about a shared group experience that
could be part of a community engagement
program. These methods are useful when it is
not practical to conduct individual interviews
because of time and resource limitations.
-- The size for these types of group interviews may vary; however, eight to 12 people are recommended for focus groups.
-- These methods also require a protocol with specific questions that the group is asked to respond to.
Ripple Effect Mapping (REM) is a method
used in evaluation to engage key
stakeholders in a participatory process to
assess the impact of community
engagement. It allows participants to look
back over a period of time and create a visual
map of direct or indirect impacts of
community engagement they have seen at
group, community, institutional or system
levels. REM is led by two facilitators who
guide the group in their discussions. A
session may have from 8 to 20 participants
made up of a diverse cross section of
stakeholders. It can be two hours or longer
depending on the size of the group. At the
end of REM, the data is collected by the
session leaders and sent to an evaluator for
analysis and reporting.
REM has shown high efficacy as a method
for evaluating impacts from community
engagement when the Dialogue to Change
approach has been used. There is a REM tip
sheet in the toolkit that is part of this guide.
Tests are used to collect data from participants
to assess changes in knowledge and skills.
They are typically administered before and
after a program or event in order to measure
gains or losses in knowledge or mastery of
skills resulting from participation.
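To make the pre/post idea concrete, here is a minimal sketch (in Python, with made-up scores) of how gain scores might be computed from matched pre- and post-test results:

```python
# Hypothetical pre/post test scores for five participants (0-100 scale).
pre_scores = [55, 60, 72, 48, 80]
post_scores = [70, 68, 85, 60, 82]

# Gain for each participant, and the average gain across the group.
gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
average_gain = sum(gains) / len(gains)

print(gains)         # individual gains in score points
print(average_gain)  # mean gain for the group
```

The same calculation can be done just as easily in a spreadsheet; the point is simply that each participant's pre-score is subtracted from their post-score before averaging.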
Data Analysis
When you have gathered data using the
various methods described, you will have
both quantitative and qualitative data.
Quantitative analysis involves working with
numbers so that the results are presented in
some statistical form such as percentages,
frequencies, or averages. If you have a large
amount of quantitative data to analyze on a
number of different variables, you may want
to use a statistical software program such as
Statistical Package for the Social Sciences
(SPSS) or Statistical Analysis System (SAS).
These programs can handle large volumes
of data and allow you to conduct complex
statistical calculations on the data. They can
be cost-prohibitive, however, so an
alternative would be to use Microsoft Excel.
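As an illustration of the kinds of summaries described above, here is a small sketch, using hypothetical responses, of computing frequencies and percentages without any specialized statistical package:

```python
from collections import Counter

# Hypothetical responses to "How did you learn about the event?"
responses = ["Email", "Flyer", "Email", "Friend",
             "Email", "Flyer", "Newspaper", "Email"]

frequencies = Counter(responses)  # raw counts per answer
total = len(responses)
percentages = {answer: round(100 * count / total, 1)
               for answer, count in frequencies.items()}

print(frequencies)   # counts per answer
print(percentages)   # each answer as a percentage of all responses
```

SPSS, SAS, or Excel would produce the same frequency table; this only shows that the underlying arithmetic is a count divided by the total number of responses.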
Qualitative analysis involves taking survey
participant comments, or the results of
individual and/or group interviews, and finding
themes. In order to identify themes, the data
has to be coded and categorized, and this
can be done manually or through computer
software. There are several qualitative
software programs available; two popular
ones are NVivo and ATLAS.ti, both of which
can be costly. An inexpensive program available
online that is very helpful for coding qualitative
data is Dedoose.
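As a simplified illustration of what coding and categorizing look like, the sketch below tags hypothetical comments with themes from a made-up codebook and counts how often each theme appears. Dedicated tools such as NVivo, ATLAS.ti, and Dedoose support far richer workflows than this keyword matching:

```python
# Hypothetical codebook: each theme is defined by a few keywords.
codebook = {
    "connection": ["neighbor", "met people", "community"],
    "learning":   ["learned", "understand", "new idea"],
}

# Invented open-ended survey comments.
comments = [
    "I learned so much and met people from my neighborhood.",
    "Great way to understand issues facing our community.",
]

# Count how many comments touch each theme.
theme_counts = {theme: 0 for theme in codebook}
for comment in comments:
    text = comment.lower()
    for theme, keywords in codebook.items():
        if any(keyword in text for keyword in keywords):
            theme_counts[theme] += 1

print(theme_counts)
```

In practice the codebook emerges from reading the data itself, and a human coder judges each comment rather than relying on exact keyword matches.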
Links to the websites for these software
programs are included on the Reference page.
When in doubt about the appropriate data analysis tools to use, ask an evaluator.
Sample Survey
Demographic Questions
Used to collect information on representation.
Examples:
1) What racial/ethnic group do you identify with?
2) What gender do you identify with?
3) What is your age? (Under 18) (18-29) (30-49) (50-64) (65 and over)
Multiple Choice Questions
Used to collect quantitative data; participants can select from multiple response options.
Examples:
1) How did you learn about the event? (Please select all that apply)
(Local newspaper) (Email invitation) (Friend/Colleague recruited me)
(Flyer posted around community) (Other — please specify)
Likert Type Rating Scales
Used to collect quantitative data by allowing participants to give a rating response.
Examples:
1) How likely are you to recommend this event to a friend?
(Very Likely) (Likely) (Somewhat Likely) (Probably Not Likely) (Definitely Not Likely)
2) How likely are you to play a larger role in your local community after participating in this event?
(Very Likely) (Likely) (Somewhat Likely) (Probably Not Likely) (Definitely Not Likely)
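To show how rating responses become quantitative data, here is a small sketch (with invented responses) that maps the labels above onto a 1-5 scale and computes an average rating:

```python
# Map each Likert label to a number so responses can be summarized.
scale = {
    "Definitely Not Likely": 1,
    "Probably Not Likely": 2,
    "Somewhat Likely": 3,
    "Likely": 4,
    "Very Likely": 5,
}

# Hypothetical responses from four participants.
responses = ["Very Likely", "Likely", "Somewhat Likely", "Very Likely"]

scores = [scale[r] for r in responses]
average_rating = sum(scores) / len(scores)

print(average_rating)
```

Note that Likert data is ordinal, so many evaluators prefer reporting the percentage giving each rating alongside (or instead of) an average.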
Open-Ended Questions
These are questions designed to capture qualitative information so that you can get a better
understanding of the participant’s experience in the event in their own words.
Example:
What is the most valuable thing you learned from this community engagement activity?
References
“An Overview of Quantitative and Qualitative Data Collection Methods,” in The 2002 User Friendly
Handbook for Project Evaluation. http://www.nsf.gov/pubs/2002/nsf02057/nsf02057_4.pdf
Evaluation and Community Engagement (2015). Building the Field of Community Engagement partners
& Tracy Babler. http://nexuscp.org/wp-content/uploads/2015/05/BTF-EvalofCommEng.pdf
Engaging Queenslanders: Evaluating community engagement (2004). Anna L. Johnson.
https://www.qld.gov.au/web/community-engagement/guides-factsheets/documents/engaging-
queenslanders-evaluating.pdf
Community Engagement (CE) Self-Assessment Matrix for Achieving the Dream (ATD) Colleges (2006).
http://www.publicagenda.org/files/ATD-self-assessment.pdf
Data Analysis Software Links:
SPSS - https://www.ibm.com/products/spss-statistics
SAS - https://www.sas.com
NVIVO - http://www.qsrinternational.com/nvivo/nvivo-products
Atlas-ti - http://atlasti.com
Dedoose - http://www.dedoose.com/
Evaluation Guide Author: Deloris Vaughn, PhD — Everyday Democracy
Director of Evaluation and Learning
Dr. Vaughn has extensively trained and coached professionals on evaluation planning and
implementation in the nonprofit sector, taught undergraduate and graduate courses, and has operated
her evaluation consulting practice for several years. She received her PhD in Curriculum and Instruction
from the University of South Florida. Deloris can be reached at: dvaughn@everyday-democracy.org
About Everyday Democracy
Everyday Democracy works to strengthen democracy by making authentic engagement and public
participation a permanent part of the way we work as a country. Since our founding in 1989, we have
worked with hundreds of communities throughout the US, first by offering small, structured dialogues
that led to positive and lasting change, and now offering an array of flexible resources and discussion
guides, technical assistance and coaching, and information about our Dialogue to Change process.
Our process uses solid engagement principles with a racial equity lens, and leads from personal
connection to sustained action. We also work with Anchor Partners throughout the country to
expand our impact and create a democracy movement. Everyday Democracy is a project of the
Paul J. Aicher Foundation.
July 2018 – Version 2 | everyday-democracy.org
75 Charter Oak Avenue – Hartford, CT 06103