This guide has been produced for Our Place areas that are implementing their Operational Plans, to support you to explore the reasons and uses for evaluation, and why it might help to add value to your work. It explores the principles that underpin robust (but realistic) evaluation, presenting guidelines that you can use to inform the development of your own evaluation plan.
If you want to effectively tell people your story, or convince a community to support your plans for action or change, this presentation will help you on your mission.
How to set up, run and sustain a community hub to transform local service provision
This presentation contains:
An overview of Community Hubs
What they are and the benefits they bring
Examples of hubs in practice
Tips for setting up and sustaining community hubs.
Whether you’re taking over a community asset, drawing up your neighbourhood plan or helping to reshape public services, drawing attention to your project through your local newspaper, radio station or television network can help you gain support from your community and influence decision-makers.
This guide and press release template will help you engage local journalists so they cover your story, giving your project a publicity boost.
Direction of Health and Social Care in Norfolk (CANorfolk)
Jon Clemo (Chief Executive, Community Action Norfolk) facilitates a conversation with Melanie Craig (Chief Officer, Norfolk & Waveney Clinical Commissioning Group) and James Bullion (Executive Director, Adult Social Services, Norfolk County Council) on the direction of Health and Social Care in Norfolk based on questions received from the VCSE sector.
Seth Reynolds (Principal Consultant for Systems Change at NPC) and Katie Turner (Deputy Head of Research at the Institute for Voluntary Action Research (IVAR)) share their insights and inspiration on how we can build on the adaptations and innovation shown so far this year, to influence and shape a better future for people and communities in Norfolk.
Developing & sustaining community-based voluntary action (CANorfolk)
Co-ordinators from North Walsham Good Neighbour Scheme, Mattishall Volunteer Hub and Great Hockham Good Neighbour Scheme share their experiences of helping and supporting vulnerable residents in their communities. As these groups are at different stages of development, the session provides an opportunity to understand the factors involved in successfully developing and sustaining grassroots community-based voluntary action.
Embedding learning from cooperative projects (Noel Hatch)
How can we collaborate with people to help them build their resilience? Get under the skin of the culture and the lives people live. Identify people’s feelings and experiences of community, and understand how what people think is shaped by different values and by the environment and infrastructure around them. The future of collaboration could bring many opportunities, but people find it more difficult to live and act together than before. How can we help people and communities build their resilience? Understand people’s different situations and capabilities to develop pathways that help them build resilient relationships. Help people experience and practise change together. Help people grow everyday practices into sustainable projects. Turn people’s everyday motivations into design principles. Support infrastructure that connects different cultures of collaboration. Build relationships with people, designing in collaboration for the future, now.
Gary Kent of NewKey and Jacqui Hendra of Devon County Council describe how the use of Individual Service Funds has promoted trust, flexibility and a focus on outcomes in health and social care.
Chris Watson of the Centre for Welfare Reform explains why Individual Service Funds are so important and how they can be used to help people live lives of citizenship and transform local communities. This talk was given as a Centre for Welfare Reform webinar.
From ‘what’s the matter with you’ to ‘what matters to you’: the assets approach (Iriss)
IRISS has, for a long time, been interested in the way that asset-based approaches can redress the balance in favour of doing things with people rather than doing things to people. We set out with our partners in East Dunbartonshire to explore how to implement an assets approach in action.
Contributor: IRISS
This presentation is about project management in the development/social/non-profit sector, covering how to gauge the level of success of a project, monitoring & evaluation, and the Logical Framework Approach (LFA).
Note: any feedback from industry experts will always be appreciated.
Evaluation of SME and entrepreneurship programme - Jonathan Potter & Stuart Thompson (OECD CFE)
Presentation by Jonathan Potter, OECD LEED Senior Policy Analyst, and Stuart Thompson, OECD LEED Policy Analyst, at the seminar organised by the OECD LEED Trento Centre for the Officers of the Autonomous Province of Trento on 13 November 2015.
https://www.trento.oecd.org
A Good Program Can Improve Educational Outcomes (noblex1)
We hope this guide helps practitioners and others strengthen programs designed to increase academic achievement, ultimately broadening access to higher education for youth and adults.
We believe that evaluation is a critical part of program design and is necessary for ongoing program improvement. Evaluation requires collecting reliable, current and compelling information to empower stakeholders to make better decisions about programs and organizational practices that directly affect students. A good evaluation is an effective way of gathering information that strengthens programs, identifies problems, and assesses the extent of change over time. A sound evaluation that prompts program improvement is also a positive sign to funders and other stakeholders, and can help to sustain their commitment to your program.
Theories of change are conceptual maps that show how and why program activities will achieve short-term, interim, and long-term outcomes. The underlying assumptions that promote, support, and sustain a program often seem self-evident to program planners. Consequently, they spend too little time clarifying those assumptions for implementers and participants. Explicit theories of change provoke continuous reflection and shared ownership of the work to be accomplished. Even the most experienced program planners sometimes make the mistake of thinking an innovative design will accomplish goals without checking the linkages among assumptions and plans.
Developing a theory of change is a team effort. The collective knowledge and experience of program staff, stakeholders, and participants contribute to formulating a clear, precise statement about how and why a program will work. Using a theory-based approach, program collaborators state what they are doing and why by working backwards from the outcomes they seek to the interventions they plan, and forward from interventions to desired outcomes. When defining a theory of change, program planners usually begin by deciding expected outcomes, aligning outcomes with goals, deciding on the best indicators to evaluate progress toward desired outcomes, and developing specific measures for evaluating results. The end product is a statement of the expected change that specifies how implementation, resources, and evaluation translate into desired outcomes.
Continuously evaluating a theory of change encourages program planners to keep an eye on their goals. Statements about how and why a program will work must be established using the knowledge of program staff, stakeholders, and participants. This statement represents the theory underlying the program plan and shows planners how resources and activities translate to desired improvements and outcomes. It also becomes a framework for program implementation and evaluation.
Source: https://ebookscheaper.com/2022/04/06/a-good-program-can-improve-educational-outcomes/
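The backward-mapping process described above can be sketched in code. The following is a purely illustrative sketch, not taken from the guide: it represents a theory of change as a simple data structure and checks that every expected interim outcome has an indicator attached. All programme names, outcomes and indicators here are hypothetical.

```python
# Illustrative only: a theory of change as a plain data structure,
# working backwards from a desired outcome to planned interventions.
theory_of_change = {
    "long_term_outcome": "More youth enrol in higher education",
    "interim_outcomes": [
        "Improved academic achievement",
        "Greater awareness of options",
    ],
    "interventions": ["Weekly tutoring sessions", "College-visit programme"],
    "indicators": {
        "Improved academic achievement": "Average grade change over the school year",
        "Greater awareness of options": "Pre/post survey on knowledge of pathways",
    },
    "assumptions": ["Students attend regularly", "Tutors are adequately trained"],
}

def check_linkages(toc):
    """Return interim outcomes with no indicator attached, so every
    claimed outcome has a way of being measured."""
    return [o for o in toc["interim_outcomes"] if o not in toc["indicators"]]

print(check_linkages(theory_of_change))  # an empty list means every outcome is measurable
```

A check like this mirrors the text's point about clarifying assumptions: any outcome that appears in the list has been claimed without a way to evaluate progress toward it.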
The implementation 'black box' and evaluation as a driver for change. Presentation by Katie Burke and Claire Hickey of the Centre for Effective Services.
The field of program evaluation presents a diversity of images... (cherry686017)
The field of program evaluation presents a diversity of images and claims about the nature and role of evaluation that confounds any attempt to construct a coherent account of its methods or confidently identify important new developments. We take the view that the overarching goal of the program evaluation enterprise is to contribute to the improvement of social conditions by providing scientifically credible information and balanced judgment to legitimate social agents about the effectiveness of interventions intended to produce social benefits. Because of its centrality in this perspective, this review focuses on outcome evaluation, that is, the assessment of the effects of interventions upon the populations they are intended to benefit. The coverage of this topic is concentrated on literature published within the last decade with particular attention to the period subsequent to the related reviews by Cook and Shadish (1994) on social experiments and Sechrest & Figueredo (1993) on program evaluation.
The word ‘evaluation’ has become increasingly common in the language of community, health and social services and programs. The growth of talk and practice of evaluation in these fields has often been promoted and encouraged by funders and commissioners of services and programs. Following the interest of funders has been a growth in the study and practice of evaluation by community, health and social service practitioners and academics. When we consider why this move in evaluative thinking and practice has occurred, we can assume the position of the funder and simply answer, ‘...because we want to know if this program or service works’. Practitioners, specialists and academics in these fields have been called upon by governments and philanthropists to aid the development of effective evaluation. Over time, they have also led their own thinking and practice independently. Evaluation in its simplest form is about understanding the effect and impact of a program, service, or indeed a whole organization. Evaluation as a practice is not so simple, however, largely because in order to assess impact, we need to be very clear at the beginning about what effect or difference we are trying to achieve.
The literature review begins with an overview of qualitative and quantitative research methods, followed by a description of key forms of evaluation. Health promotion evaluation and advocacy and policy evaluation will then be explored as two specific domains. These domains are not evaluation methodologies, but forms of evaluation that present unique requirements for effective community development evaluation. Following this discussion, the review will explore eight key evaluation methodologies: appreciative enquiry, empowerment evaluation, social capital, social return on investment, outcomes based evaluation, performance dashboards and scorecards and developmental evaluation. Each of these sections will include specific methods, the values base of each methodology ...
This presentation gives a vivid description of the basics of doing a program evaluation, with a detailed explanation of the Logical Framework Approach (LFA) and a practical example from the CLICS project. It also includes the CDC framework for program evaluation.
N.B.: kindly open the presentation in slideshow mode to make full use of the animations.
Understanding the Challenges of Street Children (SERUDS INDIA)
By raising awareness, providing support, advocating for change, and offering assistance to children in need, individuals can play a crucial role in improving the lives of street children and helping them realize their full potential.
Donate to us:
https://serudsindia.org/how-individuals-can-support-street-children-in-india/
Up the Ratios Bylaws - a Comprehensive Process of Our Organization (uptheratios)
Up the Ratios is a non-profit organization dedicated to bridging the gap in STEM education for underprivileged students by providing free, high-quality learning opportunities in robotics and other STEM fields. Our mission is to empower the next generation of innovators, thinkers, and problem-solvers by offering a range of educational programs that foster curiosity, creativity, and critical thinking.
At Up the Ratios, we believe that every student, regardless of their socio-economic background, should have access to the tools and knowledge needed to succeed in today's technology-driven world. To achieve this, we host a variety of free classes, workshops, summer camps, and live lectures tailored to students from underserved communities. Our programs are designed to be engaging and hands-on, allowing students to explore the exciting world of robotics and STEM through practical, real-world applications.
Our free classes cover fundamental concepts in robotics, coding, and engineering, providing students with a strong foundation in these critical areas. Through our interactive workshops, students can dive deeper into specific topics, working on projects that challenge them to apply what they've learned and think creatively. Our summer camps offer an immersive experience where students can collaborate on larger projects, develop their teamwork skills, and gain confidence in their abilities.
In addition to our local programs, Up the Ratios is committed to making a global impact. We take donations of new and gently used robotics parts, which we then distribute to students and educational institutions in other countries. These donations help ensure that young learners worldwide have the resources they need to explore and excel in STEM fields. By supporting education in this way, we aim to nurture a global community of future leaders and innovators.
Our live lectures feature guest speakers from various STEM disciplines, including engineers, scientists, and industry professionals who share their knowledge and experiences with our students. These lectures provide valuable insights into potential career paths and inspire students to pursue their passions in STEM.
Up the Ratios relies on the generosity of donors and volunteers to continue our work. Contributions of time, expertise, and financial support are crucial to sustaining our programs and expanding our reach. Whether you're an individual passionate about education, a professional in the STEM field, or a company looking to give back to the community, there are many ways to get involved and make a difference.
We are proud of the positive impact we've had on the lives of countless students, many of whom have gone on to pursue higher education and careers in STEM. By providing these young minds with the tools and opportunities they need to succeed, we are not only changing their futures but also contributing to the advancement of technology and innovation on a broader scale.
Canadian Immigration Tracker March 2024 - Key Slides (Andrew Griffith)
Highlights
Permanent resident numbers decrease, with the TR2PR share declining to 52 percent of all permanent residents.
March asylum claim data not issued as of May 27 (unusually late). Irregular arrivals remain very small.
Study permit applications are experiencing a sharp decrease as a result of the announced caps: down over 50 percent compared to February.
Citizenship numbers remain stable.
Slide 3 has the overall numbers and change.
A process server is a person authorized to deliver legal documents, such as summonses, complaints, subpoenas, and other court papers, to people involved in legal proceedings.
Presentation by Jared Jageler, David Adler, Noelia Duchovny, and Evan Herrnstadt, analysts in CBO’s Microeconomic Studies and Health Analysis Divisions, at the Association of Environmental and Resource Economists Summer Conference.
Russian anarchist and anti-war movement in the third year of full-scale war (Antti Rautiainen)
Anarchist group ANA Regensburg hosted my online presentation on 16 May 2024, in which I discussed tactics of anti-war activism in Russia, and reasons why the anti-war movement has not yet been able to change the course of events. Cases of anarchists repressed for anti-war activities are presented, as well as strategies of support for political prisoners, and modest successes in supporting their struggles.
The thumbnail picture is by MediaZona; you can read their report on anti-war arson attacks in Russia here: https://en.zona.media/article/2022/10/13/burn-map
Links:
Autonomous Action
http://Avtonom.org
Anarchist Black Cross Moscow
http://Avtonom.org/abc
Solidarity Zone
https://t.me/solidarity_zone
Memorial
https://memopzk.org/, https://t.me/pzk_memorial
OVD-Info
https://en.ovdinfo.org/antiwar-ovd-info-guide
RosUznik
https://rosuznik.org/
Uznik Online
http://uznikonline.tilda.ws/
Russian Reader
https://therussianreader.com/
ABC Irkutsk
https://abc38.noblogs.org/
Send mail to prisoners from abroad:
http://Prisonmail.online
YouTube: https://youtu.be/c5nSOdU48O8
Spotify: https://podcasters.spotify.com/pod/show/libertarianlifecoach/episodes/Russian-anarchist-and-anti-war-movement-in-the-third-year-of-full-scale-war-e2k8ai4
What is the point of small housing associations? (Paul Smith)
Given the small scale of housing associations and their relatively high cost per home, what is the point of them, and how do we justify their continued existence?
2. Welcome to My Community
mycommunity.org.uk
My Community helps communities take control over their neighbourhoods through advice, direct support and grants to enable communities to have influence and control over local assets, services and development.
3. Background to the guide
• The guide has been developed by OPM, an independent, not-for-profit research and consultancy organisation that has supported community initiatives forming part of the Our Place programme.
• It has been developed based on insights gained from our work with Our Place projects, and a series of regional workshops we delivered to support project leads to build their knowledge and confidence in evaluation and communicating with funders.
• The guide is also informed by OPM’s wider experience in research and evaluation of many different public service initiatives and programmes over the past 25 years.
4. This guide is for:
This guide has been produced for Our Place areas that are implementing their Operational Plans, to support you to explore the reasons and uses for evaluation, and why it might help to add value to your work. It explores the principles that underpin robust (but realistic) evaluation, presenting guidelines that you can use to inform the development of your own evaluation plan.
The guide provides information about different types of evaluation and when you might want to consider using each one. It goes on to explore how to develop an evaluation plan, taking you through a step-by-step process.
Finally, the guide considers some of the processes through which you might capture and report your evidence - in ways that speak to commissioners, potential supporters and funders.
5. What you will get from using this guide
An understanding of how and why good evaluation can prove useful
An overview of the different types of evaluation
A step-by-step guide through the process for developing the ‘theory of change’ for your initiative
Practical tips and tools to help you feel more confident about evidence, and how to identify, obtain and use it
Guidance on mapping your stakeholders and identifying those you might need to inform versus those you need to influence.
7. What is evaluation?
• It’s not just a measurement issue, it is about demonstrating impact & value:
• Knowing that you are making a difference to the people/community you serve
• Communicating your evidence in an impactful way
• Influencing commissioners/funders
• Accountability and doing better.
Evaluation comprises association, attribution and performance.
Association:
• Establishing a baseline of logic and process
• Tracking against the baseline
• Is what is happening on the ground actually what was intended?
Attribution:
• How sure are you that what you did brought about the changes?
• What evaluation design will best enable you to ‘attribute’ robustly?
• Which indicators are most suitable?
Performance:
• Are you meeting set standards?
• Performance alone doesn’t tell you about coherence or the impacts on different groups.
8. Different forms of evaluation
Summative and formative
Summative evaluation tests the impact of what you’ve done:
• Tells you after the event
• Focuses on understanding what has happened as a result of the programme: impact
• Some summative evaluations can answer ‘how much does it cost?’
Formative evaluation tests the process you’ve gone through:
• Helps answer ‘what works’, ‘how things work’, ‘why things work’, and/or ‘who needs to be involved’
• Uses emerging findings to shape and improve practice as you go along
• Can help to develop ownership and build consensus
• Maximises the chances of success
• Focuses on understanding the activities that have underpinned the programme.
Formative evaluation focuses on the process. This can be just as important as understanding and learning about the impact, depending on what specifically you need to get out of the evaluation. The two can be mutually dependent, or you can focus on one or the other.
9. Case studies
Some examples of different types of evaluation
Formative evaluation – Making it Better
OPM were commissioned to undertake a process evaluation of the Making it Better reconfiguration of children’s, neonatal and maternity services in Greater Manchester. The work was originally commissioned by NHS North.
This process evaluation involved over 50 in-depth interviews with politicians, clinicians, Trust managers and policy makers, to understand the processes undertaken, what worked well and less well, who was involved in each aspect (and why), and identify learning and advice for others. Document reviews were also undertaken to enable us to understand the processes and strategic plan, and workshops were facilitated with commissioners and programme leads.
The evaluation led to the production of a guidance toolkit, to help others planning a major NHS service reconfiguration to learn from the Making it Better experience. The toolkit and report can be accessed here.
10. Formative & summative evaluation – Early Language and Development programme
OPM were commissioned by the Department for Education to evaluate the brand new Early Language Development Programme, a training and development programme aimed at increasing practitioner and parental awareness and confidence in supporting young children’s speech, language and communication.
OPM generated both formative (process) learning about the pilot, including learning about the cascade model, training materials and recruitment processes. OPM also generated summative (impact) evidence regarding the impact of the programme on practitioners, those providing the training, and on parents and carers. The evaluation involved surveys carried out before (baseline), immediately after training, and follow up (6-9 months after training), to assess impact. OPM also carried out site visits, telephone interviews, and developed case studies to explore the impacts of the programme within different early years settings. The findings can be accessed here.
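The baseline / post-training / follow-up survey design described in this case study can be illustrated with a small calculation. This is a hedged sketch only: the scores below are invented for illustration and are not data from the actual evaluation.

```python
# Illustrative pre/post survey analysis: compare mean confidence scores
# at baseline, immediately after training, and at 6-9 month follow-up.
# All scores are hypothetical, on a 1-5 self-rated confidence scale.

def mean(scores):
    return sum(scores) / len(scores)

baseline  = [2.1, 2.4, 1.9, 2.6, 2.2]   # before training
post      = [3.8, 4.0, 3.5, 4.1, 3.7]   # immediately after training
follow_up = [3.6, 3.9, 3.4, 4.0, 3.5]   # 6-9 months later

immediate_gain = mean(post) - mean(baseline)       # change right after training
sustained_gain = mean(follow_up) - mean(baseline)  # change still present months later

print(f"immediate gain: {immediate_gain:.2f}")
print(f"sustained gain: {sustained_gain:.2f}")
```

Comparing the sustained gain with the immediate gain is what the follow-up wave is for: it shows how much of the initial effect persists once the training is months in the past.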
11. Different forms of evaluation
• If you are building up an evaluation plan to run alongside a programme, to understand processes and what works well or less well, you should be working to a formative evaluation
• If you are looking at outcome indicators to demonstrate impact, you are working to an impact evaluation
The two are not mutually exclusive, so an evaluation can include elements of more than one of these.
Co-produced, internally conducted or externally led?
• You may wish to ‘co-produce’ your evaluation with service users, which means they would help you design the evaluation, collect the data and make sense of it. You may conduct the evaluation in-house, or you could commission an external provider (e.g. a university or consultancy)
• The boundary between the ‘evaluator’ and the ‘evaluated’ can be slightly blurred. Collaboration and capacity building can help to enhance the likelihood of sustainable improvements, through skill/knowledge transfer.
Again, it’s about identifying your priorities and what you hope to get from the evaluation.
12. Formative & summative evaluation – Early Language and Development
programme
OPM were commissioned by Macc and North, Central and South Manchester Clinical Commissioning Groups (CCGs) to
undertake a 2.5 year evaluation of the Reducing Social Isolation and Loneliness Grant Programme. The evaluation was
designed to capture formative and summative evidence, to show the impact achieved and also to inform future
programmes funded or supported by the CCGs and Macc, whilst also informing the support provided as part of this
programme.
OPM worked alongside the CCGs, Macc and the larger grant funded projects to co-produce the evaluation tools.
Following a desk-based review of validated tools aimed at measuring social isolation and loneliness, OPM ran workshops
with the programme leads and undertook 1-1 discussions with the leads from all large grant funded projects, to design
common and bespoke indicators for measuring the impact and capturing emerging learning. This ensured the tools
generated comparable learning across the programme, whilst also capturing insights specific to the aims and client
group of each funded project. This co-production approach helped to secure buy-in to the monitoring processes, and
helped to build understanding about evaluation across the programme leads and project staff.
The tools agreed upon to capture data included quarterly monitoring forms, service user ‘before’ and ‘after’
self-completion questionnaires, telephone calls / meetings to discuss impact and emerging learning, and two
programme-wide workshops.
13. Logic modelling
• Logic models are also known as ‘pathways to outcomes models’ or ‘theory of change
models’
• They aim to articulate the inputs, outputs and outcomes emerging as a result of a
particular programme or initiative
• These are extremely useful underpinning tools for any evaluation
• Some logic models also show the linkages between different factors – e.g. which inputs
are intended to lead to specific outputs and outcomes: presenting the logic
underpinning the programme or initiative
• There are different templates or models you can use: some examples are provided on
the following pages, along with prompt questions for you to think about when
developing your model
• You may want to keep your logic model ‘live’ as your evaluation progresses, amending
and adding to it as needed
• Don’t forget to validate your model with key stakeholders, to ensure it resonates with
them and you’ve not missed anything.
14. Why create a logic model?
• It can help you and others understand how a project or initiative is seeking to
achieve its desired effects
• It can help you to learn from and reflect on your project as it takes shape, as it
sets out very clearly the assumptions you have made about how you will achieve
impact
• It provides a framework that will guide the development of research tools for
use during evaluation fieldwork, as well as an analytical framework to guide the
analysis of findings
• It allows evaluators to build a clear narrative of how the programme is intended
to work and make a difference, then monitor the extent to which these
intentions and assumptions are borne out in practice.
15. Example of a logic model template
The template runs: Inputs → Activities → Outputs → Intermediate outcomes → Long-term outcomes.

Inputs – in these boxes note where your resources are coming from:
• Funding and funding environment – who’s funding this and what money is going in?
• Internal expertise and capacity – where is this coming from?
• External expertise, capacity and experience – where is this coming from?
• Physical resources – what are these? Equipment, a venue etc.
• Intangible resources – what are these? Ideas from your service users? Networks with local partners which you can tap into?

Activities – in these boxes write the activities you plan to undertake as part of your project, e.g. setting up an older people’s lunch club or setting up a befriending scheme. You could do this at a more detailed level, e.g. for the lunch club this could include recruiting volunteers, getting local businesses involved, inviting older people to attend etc.

Outputs – in these boxes write the tangible things you’re going to create – e.g. an older people’s lunch club which caters for 50 people once a week, or a befriending scheme which reaches 25 people every week etc.

Intermediate outcomes – in these boxes write the outcomes you would expect to see in the short term as a result of the outputs you’ve created – e.g. fewer older people report feeling lonely or isolated, fewer GP visits made by older people accessing the new activities. Also ask yourself: could there be any unintended outcomes or negative consequences?

Long-term outcome – the long-term outcome you want to achieve, e.g. ‘Older people in our community live happier and healthier lives’.
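If you keep your project documentation in a spreadsheet or simple script, the chain above can also be captured as a small data structure. The sketch below is purely illustrative – the class and field names are our own, not a standard, and the example content is drawn from the lunch-club example used in this guide:

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """A minimal logic model: inputs feed activities, which produce
    outputs, which lead to intermediate and long-term outcomes.
    Field names are illustrative only."""
    inputs: list = field(default_factory=list)
    activities: list = field(default_factory=list)
    outputs: list = field(default_factory=list)
    intermediate_outcomes: list = field(default_factory=list)
    long_term_outcome: str = ""

# Example content drawn from the lunch-club example in this guide
model = LogicModel(
    inputs=["grant funding", "volunteers", "a venue"],
    activities=["set up an older people's lunch club", "recruit volunteers"],
    outputs=["weekly lunch club catering for 50 people"],
    intermediate_outcomes=["fewer older people report feeling lonely or isolated"],
    long_term_outcome="Older people in our community live happier and healthier lives",
)

print(model.long_term_outcome)
```

Keeping the model in a single structured place like this makes it easy to revisit and amend as the evaluation progresses.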
16. Other uses for your logic model or
‘Pathways to Outcomes’ model
Not just for evaluation, logic models can also be useful for:
• Service design or re-design
• Service review
• Generating consensus and clarity internally
• Generating buy-in and clarity regarding your initiative or programme with
external partners
• Helping to communicate your initiative or programme.
17. Exercise
1. If you don’t have a logic model for your initiative, programme or
project, start mapping one out following the examples given. This is
something the project lead should direct – perhaps by arranging a
discussion with colleagues, funders or partners involved in the
programme or project. If you have time, you could then test this out
with a wider group of stakeholders to see if they agree, e.g. in a
workshop session.
2. If you do have a logic model, things to start thinking about include:
• Who benefits from each outcome?
• Which outcomes are likely to be realised within 6 months; 12 months;
18 months; longer term?
• What data will help you to evidence the outcomes listed in the logic
model? Who holds this evidence? How frequently is it collected? Is
there a time-lag?
19. Telling the story
• Your evaluation should ‘tell the story’ of your initiative or
project.
• Assuming you are undertaking both a formative and summative
evaluation, readers should be able to understand the rationale
behind the particular approach or project, the activities
delivered, who was involved, where it was undertaken, and
what the impacts were. This means exploring the who, what,
where, why and how regarding your project or initiative.
20. Key components of your story
At the centre sits your initiative or project, underpinned by its logic, rationale and theory of change. Around it sit five kinds of knowing:
• Knowing how: how are things to be done? How can it be done effectively?
• Knowing who: who do you target? Who else needs to be involved, at what stage, and why?
• Knowing why: why do you think that doing things that way brings about the intended outcomes?
• Knowing what: what are the desired outcomes, and how can we demonstrate this? Are there unintended or negative outcomes?
• Knowing where: just because something ‘works’ in one place does not mean it will be sustained or rolled out elsewhere.
21. Building your evaluation plan
• It is important to systematically plan out your evaluation. This does
not necessarily have to be a hugely time-consuming or bureaucratic
exercise. Building an evaluation plan will enable you to be clear
on:
- The key questions you’re trying to address: e.g. what are you
trying to show via the evaluation?
- The type of evaluation you will be undertaking: e.g. Formative,
summative, and / or economic?
- What ‘success’ looks like for your initiative: e.g. what outcomes
are you seeking to achieve?
- What you will be measuring, how and when?
- Who will need to be involved in this process?
22. Building your evaluation plan
As well as the above, it is important to also think about the following
when building your evaluation plan:
• Who are you communicating to?
• What are they interested in?
• What type of evidence will resonate with them?
• What resources have you got to deliver the evaluation? Who will do the
work? How much time do they have? What skills do they have?
• What data can you access?
• When do you need to start communicating findings? When is evidence
likely to begin emerging?
• Do you want to generate process learning as well as impact evidence?
What evaluation questions do you want to explore?
24. Starting your Evaluation Plan – breaking it down
Columns: Stakeholder group | Intended outcomes | Outcome indicators | Measurement tool/s (who and how) | Data collection points | Responsibility

Example rows:
• Stakeholder group: residents aged 50+ who take part in the project
  Intended outcome: community networks will expand
  Outcome indicators: individuals no longer feel isolated; made new friends; know who to turn to for advice
  Measurement tool/s: measures of ‘inclusion’
  Responsibility: internal / LA and health service / other partners
• Stakeholder group: as above
  Intended outcome: increase in resilience
  Outcome indicators: individuals have increased confidence
  Measurement tool/s: qualitative research, ad hoc feedback, SDQ
  Data collection points: ongoing feedback / use of measures pre- and post-support
  Responsibility: internal / LA
• Stakeholder group: residents aged 50+ with one or more long-term condition(s)
  Intended outcome: increase in capacity
  Outcome indicators: increased awareness of own condition; increased self-care / taking more responsibility for care; increased involvement in support planning; agreed reduced contact with statutory services
  Measurement tool/s: care plans; standardised outcome measures; self-reported knowledge/confidence about condition; service activity data / Client Service Receipt Inventories
  Data collection points: ongoing feedback and interviews during programme; surveys post support & programme
  Responsibility: internal / LA / NHS
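For teams that manage their plan digitally, each row of a table like the one above can be held as a simple record. This is a hypothetical sketch only – the key names mirror the column headings but are not a standard format:

```python
# One evaluation-plan row as a dictionary; the keys mirror the column
# headings in the guide and are illustrative, not a standard.
plan_row = {
    "stakeholder_group": "residents aged 50+ who take part in the project",
    "intended_outcome": "community networks will expand",
    "outcome_indicators": [
        "individuals no longer feel isolated",
        "made new friends, know who to turn to for advice",
    ],
    "measurement_tools": ["measures of 'inclusion'"],
    "data_collection_points": ["pre- and post-support"],
    "responsibility": "internal / LA and health service / other partners",
}

# A complete plan is then just a list of such rows, which can be
# exported to a spreadsheet (e.g. via csv.DictWriter) for partners.
evaluation_plan = [plan_row]
print(len(evaluation_plan))
```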
25. Step 1: Revisit your outcomes
• Who are you hoping will benefit from your project?
• Individuals / citizens in targeted groups
• The organisation / service
• Wider community
• Relationships between groups
• Others?
• What benefits do you expect for each of these stakeholder groups?
• When do you think benefits will manifest themselves?
• Short term
• Intermediate
• Long term
• The more specific you are, the more nuanced your evidence will be.
• Being specific and ‘unpacking’ your outcomes into tangible things will also enable you to quantify - and then possibly
monetise - outcomes and added value
26. Why is it important to revisit the outcomes
you’re trying to achieve?
• It is all about where you are making a difference
• It is about being realistic
• It is about helping you think through whether what you are doing,
who you are targeting, etc. is likely to bring about desired outcomes
• It is about understanding indicators – it’s not just an outcome, it’s
how you break it down into tangible things
• It’s about understanding the unintended or wider impact you have.
27. Note down the outcomes
For example, for clients in a high priority or targeted group.
Columns: Intended outcome (service users) | Indicator/s | Measurement tool/s (who and how) | Data collection points | Responsibility

Example intended outcomes to fill in:
• Families on the edge of crisis
• e.g. shared decision making
• e.g. self-management of condition and care
28. Step 2: Moving on to indicators
• What indicators can help you measure against these outcomes?
• Ask yourself:
• How will it look and feel if the intended outcome happens?
• How will the target group behave or appear different?
• Are there any tangible actions/outcomes that would indicate that positive change has
taken place?
• This is where outcome indicators need to be used. Indicators need to be quantifiable in some
way, to indicate a change (or where we are trying to prevent something, that no change has
taken place), and be appropriate to the outcome
• Using multiple indicators for the same outcome can help build confidence in your data (e.g. a self-reported
increase in confidence, and an increase in levels of self-managed care, can help
demonstrate that the project has increased self-confidence)
• Tracking changes over time will build up the robustness of your data.
29. Breaking down outcomes by indicators
Think about what indicators you will use to track the progress of your project or initiative.
You could start by looking for tried and tested indicators already used for similar projects, or you may need to
develop bespoke indicators that better match up with your project and the outcomes you’re trying to achieve.
Columns: Intended outcome (service users) | Indicator/s | Measurement tool/s (who and how) | Data collection points | Responsibility

Example rows (indicator column filled in):
• E.g. shared decision making – know more about their condition; influence their care plans; know who to turn to for advice
• E.g. self-management – undertake agreed care procedures for themselves; greater sense of control
• E.g. increased self-confidence – mentors other service users; presents own ideas in care reviews
• What else?
30. Step 3: Measurement
What are the most effective ways to gather data against each of the
indicators?
• Think about quantitative and qualitative approaches
• Think beyond ‘traditional approaches’ such as interviews and questionnaires
• Will you use existing tools or design your own?
What is the most effective approach?
• Think about how data collection can be built into the design of your project
so that it is done routinely
• Are there opportunities for peer or citizen involvement?
• This is not just about the tools, but also the questions asked: what has
happened (the impact) but also what led to that change (‘what worked’).
31. Relevant measurement tools
Columns: Intended outcome | Indicator/s | Measurement tool/s (who and how) | Data collection points | Responsibility

• E.g. shared decision making – indicators: know more about their condition; influence their care plans; know who to turn to for advice. Measurement tools: qualitative interviews with users; review copies of care plans; use of journals.
• E.g. self-management – indicators: undertake agreed care procedures for themselves; greater sense of control. Measurement tools: qualitative interviews with users; review of care plans; use of journals.
• E.g. increased self-confidence – indicators: mentors other young people; presents own ideas in reviews. Measurement tools: qualitative interviews with users; use of outcome rating scale; participant surveys of peers and staff; review meeting minutes.
• What else?
32. Step 4: When is the best time to collect data?
You are likely to need to capture a baseline position (a starting point, as in slide 6), as well as follow-on data to
evidence a change (or no change).
1. Think about the time points at which you would expect changes to manifest
2. Think about longer-term impacts that will happen after service users have left the project:
• Can these be captured somehow?
• Is there evidence in wider literature which you can draw on?
For instance, you may want to collect data immediately after a particular intervention. If it's a training programme
or awareness raising scheme, you would expect both immediate impacts and longer-term impacts, so should assess
impacts at different milestone points.
Design the evaluation so you capture evidence of 'quick wins' wherever possible, to provide evidence to funders and
other stakeholders that you're on track, as well as to generate some formative learning (i.e. learning about the
process) wherever possible. This can then inform any revisions to the project that might be needed.
33. Data collection points
Columns: Intended outcome | Indicator/s | Measurement tool/s (who and how) | Data collection points | Responsibility

• E.g. shared decision making – indicators: know more about their condition; influence their care plans; know who to turn to for advice. Measurement tools: qualitative interviews with service users; review copies of care plans; journals. Data collection points: interviews at start and end of project; journals filled out throughout project.
• E.g. self-management – indicators: undertake agreed care procedures for themselves; greater sense of control. Measurement tools: qualitative interviews with service users; review of care plans; journals. Data collection points: interviews and care plan data at end of project; journals filled out throughout project.
• E.g. increased self-confidence – indicators: mentors other service users; presents own ideas in care reviews. Measurement tools: qualitative interviews with service users; use of outcome rating scale; participant surveys of PNs and staff; review meeting minutes. Data collection points: interviews at end of the programme; rating scale administered at start and end of programme; surveys post programme; review minutes throughout.
• What else?
34. Who is responsible for collecting data?
It is unlikely to be a single individual that will hold key data for your evaluation.
For example, finance colleagues, frontline practitioners and partner
organisations are all likely to have a role to play.
Think about what data is already available and who will have access to this data:
• Journals, action plans, attendance figures etc.
• What data do you already have?
• What new data will have to be collected and who is best placed to collect this data?
35. Responsibilities
Columns: Intended outcome | Indicator/s | Measurement tool/s (who and how) | Data collection points | Responsibility

• E.g. shared decision making – indicators: know more about their condition; influence their care plans; know who to turn to for advice. Measurement tools: qualitative interviews with service users; review copies of care plans; use of journals. Data collection points: interviews at start and end of project; journals filled out throughout project. Responsibility: lead designs the journals with intern support; project lead completes interviews.
• E.g. self-management – indicators: undertake agreed care procedures for themselves; greater sense of control. Measurement tools: qualitative interviews with service users; review of care plans; use of journals. Data collection points: interviews and care plan data at end of project; journals filled out throughout project. Responsibility: lead designs the journals; project lead completes interviews and monitors plans.
• E.g. increased self-confidence – indicators: mentors other service users; presents own ideas in care reviews. Measurement tools: qualitative interviews with users; use of outcome rating scale; participant surveys of volunteers and staff; review meeting minutes. Data collection points: interviews at end of the programme; rating scale administered at start and end of programme; surveys post programme; review minutes throughout. Responsibility: lead and volunteers design the rating scale and post-project surveys; project lead completes interviews and reviews care plans and meeting minutes, with intern/volunteer support.
• What else?
36. Reflecting on this process
This is about:
• Demonstrating impact and how you are making a difference
• Using resources wisely
• Being sustainable
• Keeping people motivated.
Being clear about your priority outcomes:
• Have you defined them clearly?
• How are you measuring progress / achievement?
• What data are you already collecting, or planning to collect?
• What else needs to be collected?
• Are there any gaps?
37. Taking your evaluation forward: practical
steps
• Engage others: continuing the dialogue, including key stakeholders identified via
the mapping exercise
• Clarify the priorities and resources for evaluation
• Keep referring to your logic model and evaluation framework
• Develop tools and agree timing
• Data collection
• Gaps, access, ethics
• Analysis
• Methods, skills, support
• Reporting
• Format, QA and sign-off, consultation, recommendations
38. Top tips
1. Keep sight of your logic model as the guiding framework
2. Your evaluation plan should be a ‘live’ document, to be re-visited
and refreshed on a regular basis
3. Encourage joint ownership by continuing to involve key partners
in developing the plan and in collating, analysing and/ or
interpreting data and conclusions
4. Use findings to shape and improve what you’re doing on an
ongoing basis
5. Share emerging findings with service users, colleagues and other
stakeholders to communicate your impact and what you are doing
well.
40. Why map your stakeholders?
• Stakeholder mapping can help to identify who is, and who
needs to be, involved in or informed about your initiative or
project.
• It can help to identify who the ‘core’ stakeholders are that are
critical to delivery, as well as those who might be affected by
the initiative, and those you need to inform about it. It can
also help to identify those stakeholders that you need to
influence.
• It can also help you to identify key people to consult about
what they need from the evaluation, as well as informing your
communication plan.
41. Exercise: Stakeholder mapping
• For your initiative or project, identify all the stakeholders associated with it
• Think about direct and indirect; internal and external
• Identify:
• The ones you are trying to influence with the results of your evaluation
• The ones whom you think are making the (direct and indirect) inputs
• The ones whom you think experience (direct and indirect) benefits
[template overleaf]
43. Stakeholder mapping expanded
Influencing versus impacted
For those you are hoping to influence with your evaluation, clarify:
• Whether you have an existing relationship with them
• If not, identify who you may need to help broker a relationship.
For those whom you think are impacted by the project (beneficiaries
of direct and indirect benefits):
• Write down the specific types of benefits experienced by each of
these stakeholders
• Do you have data on these benefits? Who holds it?
• Are some stakeholders impacted in other ways – e.g. not
necessarily beneficially?
44. Stakeholder mapping expanded
Influence versus level of engagement
• Another way of mapping your initiative or project’s
stakeholders is by the level of influence and engagement
they have.
• This can help to determine where you should invest your
efforts and focus when it comes to communicating the
messages from your evaluation (i.e. if some stakeholders have
high levels of influence but are not very engaged, you need to
focus more energy on reaching them)
45. Stakeholder mapping matrix
Influence versus level of engagement: a 2x2 grid with influence (high / low) on one axis and current engagement (high / low) on the other.
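Read as four quadrants, the matrix can suggest where to focus communication effort. The rules below are an illustrative sketch only – the guide itself only notes that high-influence, low-engagement stakeholders need the most energy; the other quadrant labels are our assumption:

```python
def quadrant(influence: str, engagement: str) -> str:
    """Place a stakeholder in the influence vs engagement matrix and
    suggest a communication priority. The high-influence / low-engagement
    rule follows the guide; the other three labels are illustrative."""
    if influence == "high" and engagement == "low":
        return "priority: focus energy on reaching them"
    if influence == "high" and engagement == "high":
        return "keep closely involved"
    if influence == "low" and engagement == "high":
        return "keep informed and motivated"
    return "monitor with light-touch updates"

print(quadrant("high", "low"))
```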
46. Exercise: Stakeholder engagement
From your mapping exercise:
• identify who you need to engage/involve more
• What processes are used to make decisions about key funding in your area / sector?
• Whose preferred outcomes are considered legitimate, or are most valued?
You may also want to consider…
• Mapping the web of relationships: who knows who? Who is influential within the relationship?
• Systemic thinking – understanding how broader issues in the wider system can play out in micro-interactions
• Paying attention to language and mindsets
• Group dynamics; power
• Valuing and working with difference
• Considering different levels of client – who makes decisions about different things?
• ‘Generative governance’ – working to create the conditions for conversations about the issues
that matter the most.
48. Planning your communications
In order to effectively plan your communication strategy, it may be useful to work through the
following questions, to ensure your communications are proportionate, well targeted and
timely.
1. Review the stakeholder mapping: Who do you need to inform? Who do you need to
influence?
a) How well engaged are these stakeholders already?
b) Who do you need to prioritise for engagement? To what extent do you know the key
people to target within each organisation? Do you know their contact details?
c) What communication channels might work well in reaching them? What is likely to
resonate for them?
2. Review the evaluation plan – what do you expect to be able to communicate? When? How
widely do you expect to want to disseminate the findings (e.g. might some findings be
sensitive, or need to be shared with internal stakeholders first?)
3. What key milestones do commissioners work to? E.g. when do they do their budget setting
/ business planning?
49. Different communication channels
Consider the following communication channels. All have different advantages and disadvantages, and you’ll want to select the approach that best suits:
• your needs
• the type of message you’re communicating
• who the audiences are (different audiences are likely to require different forms of communication, whilst individuals all differ in their personal preferences)
• the resources available to you.
Channels to consider: website; email; delivered letter; newsletter; door knock; pop-up engagement hub; local press; toolkits; journal article; speaking at events; face-to-face meetings.
50. Exercise: Commissioner or funder
resistance
Exercise – it might be useful to consider:
• Why might commissioners challenge your findings?
• What happens if commissioners challenge your findings? What
are the risks / implications associated with this?
• How could you address or mitigate this?
51. Tips for addressing resistance
Some tips for addressing commissioner push-back:
• Sense check your evaluation plan at the outset
• Identify potential barriers or concerns – classify them, explore how to overcome them
• Triangulate findings against other studies or sources of evidence. Ground findings in the
local context.
• Develop case studies, to richly illustrate the impact your project is having on individuals
or at community / organisational level. This can help to ‘make it real’ for funders, and
enable them to really understand the difference your project or initiative is making.
• Pull out key headlines – what are the funders likely to be most interested in? Align findings
with their key priorities. Are there any national priorities you can align the findings with?
• What makes your project different or particularly effective? How is it different to what
would have otherwise happened?
52. Resourcing communication activities
• It might be useful to consider the following key questions
when planning your communications. This will help to ensure
your plan is pragmatic and realistic, and also identify any
support you might need.
• Who would take forward these activities? When?
• What is this dependent on?
• Do you / they need any additional support?
• If so, in what areas? How much? Who could provide this?
• Are there any gaps in capacity?