
Evaluating Systems Change

Whole systems change across a neighbourhood

How can we collaborate with people to help them build their resilience? Get under the skin of the culture and the lives people live. Identify people’s feelings and experiences of community, and understand how what people think is shaped by different values and by the environment and infrastructure around them. The future of collaboration could bring many opportunities, but people find it more difficult to live and act together than before. How can we help people…and communities build their resilience? Understand people’s different situations and capabilities to develop pathways that help them build resilient relationships. Help people experience and practice change together. Help people grow everyday practices into sustainable projects. Turn people’s everyday motivations into design principles. Support infrastructure that connects different cultures of collaboration. Build relationships with people designing in collaboration for the future…now.

  1. Evaluating Systems Change
  2. Overview
     • Feedback from SE on evaluating systems change
     • Types of evaluation
     • Generic evaluation timetable
     • Steps for generic evaluation
     • What are our responsibilities toward the evaluation partner?
     • Questions for bidders
     • How do we sample?
     • How do we measure the impact & quality of the evaluation itself?
     • How should we procure?
  3. Feedback from SE on evaluating systems change
     • Consider the uniqueness of systems change, but still be able to baseline it and evaluate change over time and what has caused it
     • Articulate how the programme expects to bring about change so that the evaluation partner can evaluate whether & how it's occurring
     • Define the initial outcomes & target cohorts you want to baseline & evaluate
     • Provide the evaluation partner with access to the people & organisations who most influence change, and to the residents
     • Ensure that the evaluation is part of the process of change and that it supports & informs any changes to the programme as it develops
     • Build in ways that the evaluation approach itself can be adapted to suit the changing needs of the programme
  4. Applying insight to test ways to help people get more active & be more self-sufficient
     We will use an iterative approach to make sure that we can move quickly from using research & analysis to testing opportunities for behaviour change with residents & staff, and to supporting residents to self-report their behaviour digitally to track impact.
  5. 1. Develop evaluation framework, baseline & deep dive to develop hypotheses
     Development of evaluation framework & tools & systems change tracking
     • Develop a framework to manage risk, finance, evaluation & escalation; develop tools for measuring & tracking with the Board and the SE evaluation partner; baseline systems change indicators to co-design a maturity model for the programme; map how stakeholders influence & support each other, identify influencers and assess leverage points to test interventions on; social network analysis of partners to demonstrate the effectiveness of the system (see the sketch below); develop a maturity model for neighbourhood-level systems change; baseline & regularly review practitioners' readiness to deliver systems change; align performance monitoring for the programme to the borough outcomes framework
     • Outcome: Evaluation framework, systems change indicators, a maturity model for systems change, and social network analysis of partners to identify how to intervene in the system
     Data analysis to develop baseline & analyse data on cohorts
     • Baseline, systematic review, mapping & analysing datasets, predictive modelling to identify those at risk of physical inactivity, linking of relevant data to produce the baseline, and a framework developed to enable service providers to collect, integrate & analyse data to help residents be more active
     • Outcome: Produce baseline & analysis of cohorts at risk of inactivity & models to enable partners to analyse data at the point of contact
     Qualitative research to dig deeper from the baseline to surface hypotheses to test interventions
     • Design survey; map & analyse user journeys & touchpoints, service demand, effectiveness of service provision and cost drivers; map factors that influence physical inactivity; model potential interventions; segmentation & stratification of residents; identification of hypotheses, priority cohorts & geographic areas to test interventions; use research to start testing interventions
     • Outcome: Produce segmentation of residents & hypotheses to test interventions, based on understanding of needs, motivations & opportunities for change amongst residents and assessment of service demand, effectiveness of provision & cost drivers
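The social network analysis of partners mentioned above could be baselined with standard graph measures. Below is a minimal sketch in Python using the networkx library, assuming an undirected graph of working relationships; the partner names and edges are invented placeholders, not programme data.

    import networkx as nx

    # Each edge represents a working / influence relationship between two
    # partner organisations (hypothetical examples only).
    partnership_edges = [
        ("Council", "GP Federation"),
        ("Council", "Leisure Trust"),
        ("GP Federation", "Community Centre"),
        ("Leisure Trust", "Community Centre"),
        ("Community Centre", "Residents Panel"),
        ("Council", "Residents Panel"),
    ]

    graph = nx.Graph(partnership_edges)

    # Degree centrality: how connected a partner is. Betweenness centrality:
    # how often a partner sits on the shortest paths between others, a rough
    # proxy for brokerage and potential leverage points in the system.
    degree = nx.degree_centrality(graph)
    betweenness = nx.betweenness_centrality(graph)

    for partner in graph.nodes:
        print(f"{partner}: degree={degree[partner]:.2f}, "
              f"betweenness={betweenness[partner]:.2f}")

Repeating the same measures at each review point would show whether new relationships have formed and whether influence has become more or less concentrated over the life of the programme.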
  6. 2. Support residents & communities to track their own behaviour
     Development of peer research
     • Develop criteria for selecting the characteristics of residents to be identified & trained to carry out research amongst their social networks; residents identified & trained to carry out research; support developed & provided for residents carrying out research; trained researchers carry out research and support others to carry out research in their networks
     • Outcome: Residents supported to carry out research and to support others to carry out research in their networks
     Development & delivery of digital self-reporting
     • Identify & test what methods & tools work best to enable people to self-report their activity, motivations & needs; development of a digital tool; testing of tools with residents; use of data produced by the tool to recommend how to get more active (a rough sketch follows below)
     • Outcome: A digital tool, co-designed with target audiences & service providers so it meets their needs, that enables residents & practitioners to better understand the impact of their behaviour and become more active as a result
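As a rough illustration of the digital self-reporting strand, the sketch below shows one way self-reported activity could be structured and summarised per resident per week. The field names and the 150-minute weekly threshold (the commonly cited adult guideline for moderate activity) are assumptions for illustration, not the programme's agreed data model.

    from dataclasses import dataclass
    from collections import defaultdict

    @dataclass
    class ActivityEntry:
        resident_id: str
        week: str          # e.g. ISO week label such as "2019-W14"
        activity: str      # e.g. "walking", "swimming"
        minutes: int

    def weekly_minutes(entries):
        """Total self-reported active minutes per resident per week."""
        totals = defaultdict(int)
        for entry in entries:
            totals[(entry.resident_id, entry.week)] += entry.minutes
        return totals

    # Hypothetical self-reported entries.
    entries = [
        ActivityEntry("r001", "2019-W14", "walking", 60),
        ActivityEntry("r001", "2019-W14", "swimming", 45),
        ActivityEntry("r002", "2019-W14", "walking", 20),
    ]

    for (resident, week), minutes in weekly_minutes(entries).items():
        status = "meets" if minutes >= 150 else "below"
        print(f"{resident} {week}: {minutes} min ({status} 150-minute guideline)")

A summary like this could feed back to residents and practitioners to show the impact of their behaviour over time.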
  7. Survey Questions for Residents
     • Demographics
     • Overall needs, motivations & behaviours
     • Levels & types of physical activity and motivation to get people active
     • Select standardised questions from relevant national surveys (Values Mode Analysis, Understanding Population & Active Lives)
     • Test survey with Residents Panel to validate
     • Peer review with other local areas and IFF
  8. Types of evaluation
     • Iterative evaluation: when working on complex situations or early-stage innovation; where there is a degree of uncertainty about what will work and how; where new challenges, opportunities & activities continue to emerge.
     • Formative evaluation: when the core elements of the programme are taking shape; where the outcomes are becoming predictable; where the context is increasingly well known and understood.
     • Summative evaluation: where the programme's activities are well established; where the people involved are increasingly confident about "what works"; where the interventions are significant enough for more intensive scrutiny.
  9. Generic Evaluation Timetable
     • Design & Methodology phase. Major output: Evaluation Specification. Activities: purpose, scope & objectives; context; target cohorts; questions; type of evaluation; criteria; level of rigour; data sources; sampling; data collection; data quality.
     • Planning phase. Major outputs: Terms of Reference and Evaluation Plan. Activities: evaluation partner selected; roles & responsibilities; design of baseline & review of Year 1; test data collection methods & tools with cohorts & partners; sampling; training.
     • Implementation phase. Major output: Evaluation Report. Activities: involvement of cohorts & partners; collection of data; analysis of data; conclusions; recommendations.
     • Review & Follow Up phase. Major output: Action Plan. Activities: learning; publication & sharing; response by the programme to recommendations & action plan.
  10. Steps for Evaluation
     1. Orientation
        • Learn about the systems change initiative’s goals and strategies. Discover the basics of the collaborative implementing the systems change initiative. Collect basic information about the system’s programmes and services.
     2. Evaluation planning
        • Decide on priorities for the evaluation. Identify research questions. Develop a data collection plan.
     3. Develop data collection instruments
        • Develop instruments for baseline data collection. Develop instruments for collecting data at follow-up periods.
     4. Collect baseline data
        • Collect data on key systems change indicators (i.e. effectiveness of programmes & services, collaboration)
  11. Steps for Evaluation (continued)
     5. Collect follow-up data
        • At a follow-up period, collect data on systems change. Ask stakeholders about their perceptions of the systems changes and the changes that have occurred between baseline and follow-up.
     6. Describe change between baseline and follow-up
        • Compare the follow-up and baseline data to assess the extent to which systems changes have taken place (see the sketch below). Account for stakeholder arguments about the extent and type of change.
     7. Analyse how the programme contributed to change between baseline and follow-up
        • Assess how the initiative was able to improve on the systems change indicators.
     8. Develop recommendations
        • Identify ways that programmes and services can continue to improve. Identify ways that the structure might be reconfigured to facilitate systems change. Identify ways that the system might contribute to a better structure and to improved programmes and services. Identify ways to improve structures and processes to facilitate implementation.
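Step 6 above amounts to a straightforward comparison of indicator scores at baseline and follow-up. A minimal sketch, assuming indicators are scored on a shared scale (for example a 1 to 5 maturity rating); the indicator names and values are hypothetical, and in practice would come from the agreed maturity model and data collection instruments.

    # Hypothetical baseline and follow-up scores for systems change indicators.
    baseline = {
        "partner collaboration": 2.1,
        "use of shared data": 1.5,
        "resident involvement": 2.8,
    }
    follow_up = {
        "partner collaboration": 3.0,
        "use of shared data": 2.6,
        "resident involvement": 2.7,
    }

    # Report the direction and size of change for each indicator.
    for indicator in baseline:
        change = follow_up[indicator] - baseline[indicator]
        direction = ("improved" if change > 0
                     else "declined" if change < 0
                     else "no change")
        print(f"{indicator}: {baseline[indicator]:.1f} -> "
              f"{follow_up[indicator]:.1f} ({change:+.1f}, {direction})")

The numeric comparison only describes change; attributing it to the programme (step 7) still depends on stakeholder evidence about what caused the movement.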
  12. What are our responsibilities toward the evaluation partner?
     Outline:
     • How we expect to bring about systems change
     • Purpose, geographical scope, context, target cohorts, governance & objectives of the programme
     • Scope & objectives of the evaluation
     • What we want to evaluate
        - Assess outcomes/impact, document good practice, understand causal relations and what works and why, inform the formation of future programmes
        - Evaluation/baseline questions for measuring impact on residents
        - Evaluation/baseline questions for measuring the impact of the process of systems change
     • Why
     • How it should be organised
     • How the evaluation process needs to inform improvements in the programme
     • Governance of the evaluation
     • Reporting requirements, information governance (including GDPR) & data quality requirements, and any generic requirements for supplying to Ealing Council
     • Deliverables needed, milestones, dependencies, budget
     Provide access to:
     • Decision makers, practitioners & target cohorts
     • Relevant insights & data sources on the target cohorts that can be shared
     • Pre-existing insights by ourselves and partners on the context and conditions for systems change in the local area
  13. Questions for bidders
     • What is the most effective evaluation design for the experimental context of this programme?
     • What data collection & analysis tools & methods would you plan to use?
     • What sampling method would you plan to use?
     • How would you make sure that target cohorts & the people carrying out the research have been involved in the design of the evaluation?
     • How would you assure the quality & ethics of the data collection & analysis?
     • How would you plan to share & review emerging lessons learned from the evaluation so that the key influencers of the system can learn and apply these lessons to make systems change happen?
     • How would you be able to adapt your evaluation methods to adjust to the nature & scale of the activities that emerge from the programme?
     • How would you evaluate how the system has changed over time and explain what is contributing to this change?
     • What additional support from our organisation & partners would help you deliver an effective evaluation?
     • What costs & resources are required for each activity within your plan?
     • How would you ensure that you develop expertise on the context of the local area in scope?
  14. How do we sample?
     • What sampling methods do we use? Or do we ask the bidders?
        1. Randomised sample with a boost on target cohorts to enable specific analysis (a sketch follows below)
        2. Stratified sample to dig deeper into issues / cohorts identified in the randomised sample
     • What is the group of cohorts from which we want to draw a sample? Target cohorts agreed in the Corporate Board presentation
     • How many do we need in our sample? Dependent on budget and how specific we make the cohorts; the more specific the cohorts, the bigger the sample
     • How will these be selected? Use providers with an existing national database
     • Baseline sample of residents to evaluate the future impact of the programme on residents’ outcomes
     • Baseline sample of staff to evaluate the future impact of the programme on systems change
     See Overview of Sampling in Annexes
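As a rough illustration of a randomised sample with a boost on target cohorts, the sketch below over-samples two hypothetical cohorts on top of a base random sample. The cohort labels, rates and frame size are invented; in practice the sampling frame and rates would come from the chosen survey provider and the budget, and design weights would be applied at the analysis stage to correct for the boost.

    import random

    random.seed(42)

    # Hypothetical sampling frame of residents: (resident_id, cohort label).
    frame = [
        (f"r{i:04d}",
         random.choice(["general", "inactive_over_65", "young_parents"]))
        for i in range(10000)
    ]

    base_rate = 0.02          # base random sample across all residents
    boost_rates = {           # extra sampling rate for target cohorts
        "inactive_over_65": 0.06,
        "young_parents": 0.04,
    }

    def draw_sample(frame, base_rate, boost_rates):
        """Select each resident with the base rate plus any cohort boost."""
        sample = []
        for resident_id, cohort in frame:
            rate = base_rate + boost_rates.get(cohort, 0.0)
            if random.random() < rate:
                sample.append((resident_id, cohort))
        return sample

    sample = draw_sample(frame, base_rate, boost_rates)
    counts = {}
    for _, cohort in sample:
        counts[cohort] = counts.get(cohort, 0) + 1
    print(f"sample size: {len(sample)}", counts)

The boost gives the target cohorts enough respondents for cohort-specific analysis while keeping the overall sample representative once weighted.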
  15. How do we measure the impact & quality of the evaluation itself?
     • Relevance of the evaluation design & methods to the purpose & objectives of the programme
     • Effectiveness of the delivery of the evaluation
     • Credibility of the findings
     • Impact of the recommendations from the evaluation on systems change
     • Understanding of the conditions of systems change in the local area and the factors & patterns influencing it
     • Understanding of the factors influencing inactivity and the opportunities for behaviour change amongst the target cohorts
  16. How should we procure?
     Expertise
     • Expertise in iterative evaluation, in particular of systems change
     • Understanding of the dynamics of the system of the local area in scope
     Types of contract
     • As part of the Analysis & Evaluation lot
     • Retainer free contract
     • Stepwise funding (to enable extension of the contract)
     • Prescribe a proportion of funding for specific outcomes and allow the rest for contingencies
  17. Actions
     • Develop the specification for the evaluation according to the overall commissioning timetable in the Activity Plan
     • Research Team to draft the specification for Evaluation and for the other Analysis & Evaluation work streams, with support from Procurement, to include evaluating (inc. baselining) residents & systems change
     • Recommend a randomised sample to baseline residents, with a boost of target cohorts to ensure specific analysis of these groups
     • Followed by deep-dive qualitative & quantitative analysis & service design to start testing behaviour change
     • Task & Finish Group to review & agree
