Think systematically - Serious Social Investing 2013

Effective partnership with government can help scale impact.

Caitlin Baron from The Michael & Susan Dell Foundation and Chinezi Chijioke from McKinsey & Company speak at the Tshikululu Social Investments Serious Social Investing 2013 workshop.

Presentation Transcript

    • District management through data: When M&E _is_ the intervention (March 20, 2013)
    • We have drawn on lessons from the Michael & Susan Dell Foundation's global experience in performance management in education. The foundation has invested $160M+ in grants across multiple districts in 9 US states for educational performance management. In addition, it has embarked on ambitious partnerships with government in India to measurably improve learner results across major metros.
    • Key success factors for education data systems identified through the experience of the Dell family foundation:
      ▪ Easing the data collection burden must be a central driver of developing the new application; it must make people's lives easier if it is to be successfully adopted
      ▪ Thinking beyond accountability alone, to actual value for the end user, is essential for success
      ▪ Tools must enable a shift in the district's role with regard to schools, from compliance-focused to support-focused
      ▪ Stakeholder engagement, including extensive market research with users, is essential to get the application right and to ensure uptake upon rollout
      ▪ Building an ecosystem of training and other support organizations from the earliest pilot is vital to achieve scale
      Proven dashboards, data standards and other protocols from the foundation's work in the US could be leveraged in South Africa.
    • Why should South Africa engage with education data systems now?
      ▪ A wealth of new learner performance data is now available, but there are limited systems to actually enable schools and districts to use it
      ▪ District directors manage organizations the size of medium to large businesses with little to no information to guide them
      ▪ Without an in-built system to monitor performance, the education system has no way to know which interventions work and which don't
    • Together with the DBE, in 2012 we conducted a diagnostic study of data at every level of the school system across all 9 provinces:
      ▪ Deep-dive visits (2 weeks) and light-touch visits (1-2 days), with over 250 interviews at national, province, district, school and class level
      ▪ School locations spanned all nine provinces (Limpopo, Gauteng, North West, Mpumalanga, Free State, KwaZulu-Natal, Northern Cape, Eastern Cape, Western Cape)
      ▪ A diverse and representative mix of educational environments: urban and rural schools, a distribution of wealth quintiles, a spectrum of PC and internet access, and a range of performance bands
      Note on infrastructure descriptions: Poor IT = fewer than 2 computers, no phones, no internet; Fair IT = 2-5 computers used by staff, possibly internet-connected, with phone lines; Good IT = more than 5 computers, phone lines and internet, printers, copiers etc. Below average, average and above average were determined using the reported pass rate, with 40-60% considered average, across both primary and secondary schools.
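The classification rubric in the note above is concrete enough to express in code. A minimal sketch follows; the functions and field names are illustrative, not the study's actual tooling.

```python
# Illustrative sketch of the slide's classification rubric; the functions
# and field names are hypothetical, not the study's actual tooling.

def classify_it(computers: int, has_phone: bool, has_internet: bool) -> str:
    """Apply the note's IT-infrastructure bands."""
    # Poor IT: fewer than 2 computers, no phones, no internet
    if computers < 2 and not has_phone and not has_internet:
        return "Poor IT"
    # Good IT: more than 5 computers, phone lines and internet
    if computers > 5 and has_phone and has_internet:
        return "Good IT"
    # Fair IT: the middle band (2-5 staff computers, maybe phone/internet)
    return "Fair IT"

def classify_performance(pass_rate_pct: float) -> str:
    """Band schools by reported pass rate; 40-60% counts as average."""
    if pass_rate_pct < 40:
        return "Below average"
    if pass_rate_pct <= 60:
        return "Average"
    return "Above average"

# Example: a school with 3 staff computers, a phone line, no internet,
# and a 55% reported pass rate.
print(classify_it(3, True, False))   # Fair IT
print(classify_performance(55.0))    # Average
```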
    • What did we learn?
    • Despite the collection and re-collection of data, schools, circuits and districts receive little to no insight or feedback on submissions. Data flows upwards to satisfy compliance requirements: from schools (which also field other data requests) through the district EMIS, district director and SMGD, to the provincial EMIS and directors, and on to the national EMIS, which serves queries from national directors and the public. Little beyond generic reports, if anything, flows back down.
    • Schools are overburdened by data requests, with heavy duplication across instruments. EXAMPLE: number of survey questions asked by one district per quarter:
      ▪ 572 questions asked by district tools
      ▪ 245 questions with answers already in SA-SAMS
      ▪ 183 questions duplicated across the set of surveys
      ▪ 144 "net new" questions (the slide annotates the drop as -68%)
      How does this district analyse 500+ questions per quarter per school?
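The decomposition on this slide can be checked with simple arithmetic. A minimal sketch using the slide's figures (the variable names are mine):

```python
# Decomposing the example district's quarterly survey questions,
# using the figures reported on the slide.
asked = 572        # questions asked by district tools per quarter
in_sa_sams = 245   # answers already available in SA-SAMS
duplicated = 183   # duplicated across the set of surveys

net_new = asked - in_sa_sams - duplicated
print(net_new)                                # 144 "net new" questions
print(f"{net_new / asked:.0%} of the total")  # ~25% of what schools are asked
```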
    • Outside of planning, few management routines use data to guide decision-making; there is system-wide difficulty in translating collected data into insights and actions. [Slide matrix: metrics requested at province level, mapped against the stages of the data cycle: create shared focus on common goals, capture data to measure progress, input data into a shareable system, aggregate and analyse data, create actionable outputs, establish skills and routines.] The metrics requested at province level include: learner demographic and ID info; learner registration and promotion; learner daily attendance; learner subjects and timetable; learner marks (daily, weekly, quarterly); learner grade performance and pass quality; learner scores for standard provincial and national assessments; learner social needs and discipline records; educator ID, qualifications and assignment; educator curriculum coverage; educator development and training; educator attendance and leave; educator teaching and learning results and trends; school ID and contact information; school infrastructure and facilities; school finances; school LTSM ordering and delivery; school improvement plans; school post vacancies and appointments; school support services (transport and feeding); school teaching and learning results and trends; school management meetings; circuit and district management visits; and province-, district- and/or circuit-level goals and targets.
    • Our biggest surprise… Government’s response
    • What might the solution be?
    • Districts are a critical point of leverage in the school system. Primary levers for improving learner performance, by level:
      ▪ National DBE: strategy, policy and direction; design and delivery of instructional support material; educational system resourcing
      ▪ Provincial DoE (9): strategy, policy and direction; human resources and hiring; district and school resourcing
      ▪ Districts (86): instructional and administrative support; performance management; quality assurance and compliance; professional development; school and community engagement
      ▪ Schools (~27 000): local and operational support; enrolment and progression; delivery of facilities and instructional material
      ▪ Classrooms: teaching and learning
      Performance management tools and improvements at the district level can put learner performance at the forefront of thought in classrooms and provincial offices alike. Districts are also historically the most ignored and atrophied part of the delivery chain across South Africa.
    • What will it take to establish 86 data-driven school districts?
      1. Metrics: agree on a small set of student outcome metrics across grades R-12, aligned at province, district and school level
      2. Action / routines: script data-driven management routines for districts to support and review schools' performance
      3. Outputs / dashboard: put data in the hands of people who can act on it by creating data dashboards that are visually easy to interpret, trigger action where it is needed, display analyses for root-cause identification, are designed into management routines, and are user-tested (see the sketch after this list)
      4. Input and aggregate / collection: streamline and strengthen data collection processes to [1] improve accuracy and [2] reduce duplication and burden
      5. Identify and build the capabilities and mindsets needed to implement data-driven action
      6. Scale nationally, pioneering first across 3 districts that represent a wide range of district contexts, to 'meet districts where they are'
      7. Work in close partnership with government
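As an illustration of item 3's "trigger action where it is needed", here is a hypothetical exception rule of the kind such a dashboard might encode; the 10-point threshold and the function name are invented for illustration, not the project's design.

```python
# Hypothetical dashboard exception rule; the 10-point threshold and the
# function name are invented for illustration, not the project's design.
def schools_needing_support(pass_rates, district_target):
    """Flag schools whose pass rate trails the district target by more
    than 10 percentage points, so a support visit can be triggered."""
    return [school for school, rate in pass_rates.items()
            if rate < district_target - 10]

rates = {"School A": 72.0, "School B": 44.0, "School C": 58.0}
print(schools_needing_support(rates, district_target=60.0))  # ['School B']
```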
    • We have selected districts of three different types and have begun work with them to map processes and design a dashboard:
      ▪ "Builder" pioneer - Waterberg (Limpopo): an overstretched district in a rural area with a large number of schools per circuit manager; limited technology in place; matric results the only real indicator managed
      ▪ "Architect" pioneer - Thabo Mofutsanyana (Free State): a similarly rural setting, but with more technology infrastructure and support in place; a management-for-results change process under way
      ▪ "Experimenter" pioneer - Ekudibeng (Gauteng): a functioning dashboard developed in Excel, tracking a full range of indicators; an established practice of using data to manage for results, in place for 2 years
      ▪ "Twin" districts - one in each of the other 6 provinces: shadow the process and reality-check the work to ensure generalizability
      SI and design: Frog Design, Double Line Partners. Training and coaching organizations: New Leaders Foundation. Consulting and operations team: McKinsey.
    • A district management system can support the transition to data-driven performance through a combination of processes, tools and capabilities: building KPIs and business processes, and establishing tools and capabilities (weighted 60% / 40% on the slide). Business processes and tools will be tailored to meet districts' schooling and technology realities.
    • Work has started to create visually appealing dashboards for different district archetypes
    • Examples of exercises from the co-development of dashboards with district officials
    • Participants felt the workshop was an "eye-opener" and it motivated them to improve current practices. From the Waterberg design workshop:
      ▪ "The pilot would be a good tool for us to use in the district"
      ▪ "My eyes are open now… we are able to read the statistics and see how they relate"
      ▪ "I feel empowered to change"
      ▪ "It shows we have been haphazard in the way we have been doing things… this is the start of change"
      ▪ "It seemed like child's play at first but now we are empowered"
      ▪ "You can pick up problems through the data"
      ▪ "It was not boring, it was like playing and now we are empowered"
    • We have identified a core set of student and school metrics to guide school performance support and management (PRELIMINARY):
      Learner outputs
      ▪ Achievement: ANA and NSC¹ (levels 1-7 and % 0-100; all archetypes); Common Tests² (levels 1-7 and % 0-100; Experimenter)
      ▪ Progression: grade 9, 10 and 11 pass rates (% 0-100; all); grade 12 / grade 8 learners (% 0-100; all); matric pass rate (% 0-100; all)
      Inputs
      ▪ Attendance: educator attendance (% of days, 0-100; all); learner attendance (% of days, 0-100; all)
      ▪ Curriculum coverage: option 1, Common Tests² (% of curriculum, 0-100; Experimenter); option 2, CAPS syllabus (% of curriculum, 0-100; Ekudibeng)
      ▪ Resourcing: LTSM (%³, 0-100; Experimenter, Architect); vacant educator positions (% 0-100; all)
      ¹ Annual National Assessment, National Senior Certificate (matric). ² Written test, standardised across the region/province, data with/without SBA. ³ Definition of 100% TBD.
      Source: team analysis
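To illustrate how a metric catalogue like this might be carried into a dashboard's configuration, a minimal sketch follows; the entries mirror the table above, but the schema itself is hypothetical, not the project's actual data model.

```python
# Hypothetical encoding of the preliminary metric catalogue above;
# the schema is illustrative, not the project's actual data model.
from dataclasses import dataclass
from typing import Tuple

@dataclass(frozen=True)
class Indicator:
    name: str                          # e.g. "Matric pass rate"
    category: str                      # "Learner outputs" or "Inputs"
    unit: str                          # e.g. "%", "levels", "% of days"
    valid_range: Tuple[float, float]   # valid value range
    archetypes: Tuple[str, ...]        # which district archetypes track it

CATALOGUE = [
    Indicator("Matric pass rate", "Learner outputs", "%", (0, 100), ("All",)),
    Indicator("Learner attendance", "Inputs", "% of days", (0, 100), ("All",)),
    Indicator("CAPS syllabus coverage", "Inputs", "% of curriculum",
              (0, 100), ("Ekudibeng",)),
]

def in_range(indicator, value):
    """Basic validation before a submitted value reaches the dashboard."""
    lo, hi = indicator.valid_range
    return lo <= value <= hi

print(in_range(CATALOGUE[0], 72.5))  # True: 72.5% falls within 0-100
```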
    • We are developing a baseline which will track core metrics along with operational improvements and school feedback:
      ▪ Education outcomes: measures of learner achievement and progression at all stages of the basic education system. Rationale: improvements in achievement and progression are the ultimate goal for the DBE and all stakeholders. Time to impact: long-term.
      ▪ Enablers: measures of the enablers that contribute to achievement and progression improvements. Rationale: enablers are the leading indicators for performance in achievement and progression. Time to impact: medium-term.
      ▪ Operational improvements: measures of the effectiveness of data-driven management processes and of the efficiency of data processes. Rationale: operational indicators measure the levers that the project will use to drive improvements in education outcomes. Time to impact: short-term.
      ▪ Feedback from schools: measures of schools' assessment of the quality, frequency and impact of their interactions with the district. Rationale: schools are in the best position to assess the shift from compliance to support and to give districts feedback on their service to schools. Time to impact: medium-term.
      Source: team analysis
    • Preliminary timeline for pioneering districts (beyond 3 months the timeline is subject to revision as we proceed):
      ▪ 3 months (current phase), prepare for implementation: agree KPIs; map data collection, storage and use; develop an actual dashboard prototype; assess the baseline of outputs and district effectiveness; plan for implementation and roll-out
      ▪ 6 months, begin implementation: implement the change process within pioneer districts; design process and system changes; conduct dashboard and report training; package materials and prepare for the next district
      ▪ 12 months: deliver an online dashboard linked to existing data systems ("Experimenters"); deliver an offline dashboard drawing from existing data systems ("Architects"); create local printed outputs and train "Builders"; deliver training to district, circuit and schools to embed tools and practices; develop the province and national funding base
      ▪ National rollout (timing TBC), all district types: finalize DBE reporting structures; DBE leads the rollout; forge partnerships with provinces to expand; shift output and analysis capabilities to the province; plan further district rollouts
    • In closing, some broader reflections:
      ▪ Consider the power of M&E as the intervention itself: giving government and NGOs the tools and training to manage their own organizations for success
      ▪ Appreciate the context into which our data requests as donors fall: How can we mirror the grantee's existing reporting cycle? How can we align with their other reporting requirements? Can we use existing data? Can we leave behind an M&E system that sustainably benefits the grantee or government entity?
    • Thank you!
    • Comments to reference from Day One:
      "Most M&E within the government is managed from a compliance perspective. People report because they have to, and do not use the M&E information to manage their departments."
      "Accountability is too late. The most important thing is to build a management culture of continuous improvement." - Dr. Ian Goldman, The Presidency
      "One of our most valuable contributions, in jointly setting the agenda with government partners, is being able to help them set more realistic targets for improvement, based on a clear understanding of where you are today and what level of change is possible in the time period." - Gail Campbell, Zenex Foundation
    • Findings suggest that stakeholders face common challenges across all phases of a data-driven decision-making cycle:
      ▪ Create shared focus on measurable goals: stated goals and incentives do not always line up, resulting in managers prioritising the items that lead to funding or publicity above all else
      ▪ Capture data to measure progress against goals: schools are overburdened by reporting requirements, which exist in duplicate and triplicate because districts, provinces and national offices do not share data
      ▪ Input data into a sharable system: there is no single source of accurate data for circuits or districts, as each function sources its own data and EMIS remains outside of core conversations
      ▪ Aggregate and analyse data: despite its promise, SA-SAMS is not being used as a school management tool, and is primarily used by administrative clerks as an "electronic accountability tool"
      ▪ Create actionable outputs for various users: district, circuit and school personnel do not receive feedback on submitted data, as most data is collected and passed upwards in compliance, with limited desire to inform ground-level actors of relevant data insights
      ▪ Establish skills and routines to review data and guide action: outside of planning, few management routines use data to guide decision-making; instead, most rely on touch and feel and ad hoc efforts to manage crises
      * A detailed view of the analyses can be found in the appendix and supporting documents
    • Our current work in implementation preparation is guided by six key questions:
      1. What are the core guiding metrics for data-driven school performance management, and what is the current baseline of performance and practices?
      2. What are the data-driven routines that districts should put in place for supporting schools and managing their performance on a regular (termly) basis to drive lasting improvements in education outcomes?
      3. How can data systems (inputs, processes and access) be optimised to ensure that accurate and unique inputs result in relevant information that supports data-driven support and performance management?
      4. How can school data best be displayed to result in the most effective action by school officials, politicians and other decision-makers to improve R-12 education across all districts?
      5. How will we build the capabilities and mindsets needed to drive data-driven school support?
      6. How do we plan for a successful rollout to all districts in the country?