Making CSI Matter: Project Impact - G Campbell

Project Impact: Beyond the talk of 'what we invest in' to 'what our investments have achieved.'

Transcript

  • 1. Trialogue Conference May 2011
  • 2. The Presentation
    • Introducing the Zenex Foundation
    • The Approach to Monitoring and Evaluation
    • Processes
    • Lessons and Challenges
  • 3. Introducing the Zenex Foundation
    • Independent private foundation dedicated to funding Maths, Science and English education
    • Funded through an endowment fund that has its historical roots with ESSO Oil and Zenex Oil
    • Operating since 1995, and has invested over R300 million in education projects
    • Strong focus on evaluation; has commissioned over 40 evaluations
  • 4. The Zenex Foundation, within the donor sector, is the funder, innovator and think tank for improving learner performance in mathematics, science and language through interventions, research and evaluation in schools.
    • Developing Schools for Maths, Science and Language Excellence. Outcome: improved learner performance, using a pipeline approach across primary and high schools
    • Teacher Development Programme. Outcome: increasing numbers of professionally qualified teachers
    • Learner Development Programme. Outcome: increasing numbers of learners with quality passes
    • Research and Development. Outcome: impact on education policy and classroom practice
    • Cross-cutting programme: capacity building and evaluation across Maths, Science and Language
  • 5. Impact? On Grade R learners, Foundation Phase learners, FET Phase learners, principals and teachers
  • 6. Approach to M & E
    • M & E has been a general weakness in South African donor practice
    • It is often seen as an unnecessary additional expense that could have gone directly to development
    • Too little is spent on evaluation, and poorly designed evaluation studies can tell you very little about the impact of a project
    • The purpose of M & E at Zenex:
        • Accountability: addresses the question of the project's impact
        • Improving and informing project delivery
        • Contributing to Zenex strategy development
        • Learning and developing best practice
  • 7. Approach to M & E
    • M & E is built into the entire project management cycle and involves all role-players in the project. The approach is, as far as possible, consultative and participatory.
    • We use the Logic Model in designing evaluations, and this is reflected from project design through to evaluation design. It is important to unpack the implicit Theory of Change underpinning the project intervention.
    • We mainly use quasi-experimental designs with a combination of qualitative and quantitative methods (a minimal sketch of this kind of analysis follows this slide)
    • As a guide, evaluation is budgeted at 10% of project costs, relative to the size of the project; often these costs have exceeded the 10% benchmark
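
To make the quasi-experimental point concrete, here is a minimal sketch of the kind of pre/post comparison such a design implies: learners in intervention schools and in matched comparison schools are tested before and after the programme, and the difference between the two groups' gains estimates the programme effect. All scores and group labels below are hypothetical illustrations; the presentation does not specify Zenex's actual analysis.

```python
# Minimal pre/post quasi-experimental sketch (difference-in-differences).
# All scores and group labels are hypothetical, not Zenex data.

def mean(xs):
    return sum(xs) / len(xs)

# Hypothetical test scores (percent) for learners tested before and after
# the intervention, in intervention schools and matched comparison schools.
intervention_pre  = [41, 38, 45, 50, 36]
intervention_post = [55, 49, 58, 63, 48]
comparison_pre    = [42, 40, 44, 49, 37]
comparison_post   = [46, 43, 47, 53, 41]

# Average gain in each group over the project period.
gain_intervention = mean(intervention_post) - mean(intervention_pre)
gain_comparison   = mean(comparison_post) - mean(comparison_pre)

# Difference-in-differences: the gain in intervention schools beyond what
# comparison schools achieved, i.e. the quasi-experimental effect estimate.
effect = gain_intervention - gain_comparison

print(f"Intervention gain: {gain_intervention:.1f} points")
print(f"Comparison gain:   {gain_comparison:.1f} points")
print(f"Estimated effect:  {effect:.1f} points")
```

The comparison group is what distinguishes this from a simple pre/post test: it controls for gains learners would have made anyway, which matters for the accountability question the deck raises.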
  • 8. Processes
    • Choosing evaluators
    • Zenex does not go the “tender route”, based on the rationale that there is a small pool of evaluation specialists working in the education sector
    • Zenex has worked with both individuals and agencies; criteria include:
        • Independence
        • Range of experience in various methods
        • Contextual/sector knowledge
        • Technical knowledge in evaluation design
    • Zenex is committed to developing a cadre of Black evaluators, and budgets and plans for this through coaching and mentoring
  • 9. Teacher Programme: Evaluation Design and Methods
    • Evaluation phases: formative, review of models, impact
    • Teachers: interviews; classroom observations of teacher practice; review of portfolios prepared by teachers; review of exam results
    • Learners: testing in content knowledge (pre and post); focus groups
    • Project partner: secondary analysis of documents; observations of academic training and classroom support; interviews
    • Time span: during the project intervention, with one year of tracking
  • 10. Learner Programme: Programme Evaluation Findings
    • Evaluation phases: formative, review of the models, impact
    • Learners: interviews; testing in content
    • School: interviews with teachers and principals; classroom observations
    • Project partner: project activities reviewed (mentor meetings, camps, teacher training)
    • Time span: follows project implementation and tracks learners for two years post-school
  • 11. School Programme: Programme Evaluation Findings
    • Evaluation phases: formative, review of the models, impact
    • Who: evaluators, service providers, the Zenex team; school results
    • Learners: testing in content; learner book reviews
    • Teachers: interviews; classroom observations
    • School: interviews with principals; school audit
    • Project partner: service provider interventions (workshops, in-class support)
    • Time span: follows project implementation
  • 12. Lessons & Challenges
    • What is working:
        • Evaluations have shaped Zenex’s strategy
        • Zenex can negotiate/consult with partners from a position of documented research
        • Early identification of problem areas allows for problem solving and time for reflection
        • Capacity-building programmes have been identified during the evaluation process
        • Evaluation allows for realistic setting and adjustment of project results
    • What we have learnt:
        • Alignment of roles: with so many role-players involved in a project, all engaged in some form of M & E, duplication arises; a classic example is the multiple testing of learners by different role-players
        • External sources of data relied on in evaluation studies have not always been reliable, for example Grade 11 examinations
        • There is a need to invest in feasibility research before project implementation
        • Compromising on evaluation design due to budgetary constraints reduces the value of the study
        • The lag effect of education interventions means extending the evaluation beyond the project intervention
  • 14. Lessons and Challenges
    • Challenges to address:
        • The Zenex grant-making cycle only allows for engagement with potential evaluators after the project proposal has been approved, often leading to delays in commissioning evaluations
        • Continuous use of the same team of evaluators and researchers raises a question of familiarity
        • Quality assurance: should Zenex look at appointing technical advisors?
        • The development debate: the cost of the evaluation versus the outcome. If a project is poorly designed, the evaluation may not reveal very useful information; how evaluable is the project?
  • 15. Thank You