Research in Social Investment - Tshikululu Social Investments workshop 2010

Presented during Tshikululu's first Serious Social Investing workshop, which took place on 25 and 26 February 2010. Godwin Khoza (CEO, Joint Education Trust) discusses the value of research in social investing.

  1. The value of data and research in social investing: our experience (February 2010)
  2. The Khanyisa experience
     - Design: 2004
     - Implementation: 2004–2008
     - Formative evaluation: 2007
     - No impact evaluation
     - Profile
     - Baseline
     - 12 case studies
  3. Pre-implementation profiling
     - Know what you are dealing with
     - Confirm the critical success factors against the project's intended outcomes: general development conditions, student and teacher numbers, resourcing levels
     - Use the information to construct a set of outcomes, outputs and indicators (a logical framework)
  4. Baseline evaluation. Intended uses:
     - Benchmarking
     - Refining the design: feedback to the project team
     - Driving change: feedback to beneficiaries
     - Providing the basis for further investigations
  5. Examples of value derived from the evaluation
  6. Curriculum management practices: QA of assessment (teachers required to submit tests for QA)
  7. Teacher practices: writing in Language (mean number of exercises per year)
  8. Writing in Maths (mean number of exercises per year)
  9. Learner performance: Grade 3 Literacy
  10. Learner performance: Maths
  11. Differential absorptive capacity
  12. Using data to plan for improvement
  13. Pitfalls to avoid
     - Basic designs do not explain why performance does or does not change
     - Khanyisa ran no impact study
     - Monitoring data was not collected
  14. Some lessons to pick up
     - Use a mix of approaches to highlight the pattern of performance and to identify the reasons behind it
     - Evaluate performance at the different levels of the system
     - Valid evaluation results require sample-sensitive designs
     - Evaluation designs should collect a mix of educational and project data
     - Evaluation design and project design should be done iteratively
     - Use data to tweak the design throughout, and change it if it doesn't work
  15. Lessons for JET
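Slides 7 and 8 report teacher writing practices as a mean number of exercises per learner per year. As a minimal sketch of how such an indicator is computed from classroom monitoring data — the school names and exercise counts below are invented purely for illustration, not taken from the Khanyisa evaluation:

```python
from statistics import mean

# Hypothetical monitoring records: counts of written exercises found in
# sampled learners' workbooks over a year, grouped by school.
exercise_counts = {
    "School A": [34, 41, 29, 38],
    "School B": [12, 18, 15, 11],
}

# Mean number of written exercises per learner per year, by school.
mean_per_school = {
    school: round(mean(counts), 1)
    for school, counts in exercise_counts.items()
}

for school, m in sorted(mean_per_school.items()):
    print(f"{school}: {m} exercises per year")
```

A gap between schools on an indicator like this is the kind of pattern the deck's slide 11 ("differential absorptive capacity") points to: the same intervention lands differently depending on a school's starting conditions.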
