How to navigate the maze of evaluation methods and approaches

There is increasing recognition of the importance of evaluation. But when NGOs or government agencies start to plan an evaluation, or to engage an evaluator, it is difficult to find accessible, comprehensive and credible information about how to choose the right combination of evaluation methods and how to implement them well – or how to manage these issues with a potential evaluator. Many guides to evaluation focus on only a few types of data collection methods and a few types of research designs. Few guides cover newer types of data collection (such as GIS logging on mobile phones, or tracking social media mentions), designs and strategies that can be used when experimental or quasi-experimental research designs are not feasible or appropriate, or methods for addressing issues such as clarifying and negotiating the values that should underpin the evaluation. Information about some of these topics is available on various websites, but it can be difficult to locate, of varying quality, and not co-ordinated. Much of it is also partisan towards particular methods, or does not include examples from practice.

How to navigate the maze of evaluation methods and approaches: Presentation Transcript

  • Sharing information to improve evaluation
  • How to navigate the maze of evaluation methods and approaches. A BetterEvaluation Podcast. BetterEvaluation.org. Simon Hearn & Ron Mackay
  • The Challenge: Overwhelming number and diversity of PME tools
  • Wikipedia page for Evaluation Methods
  • Steps Taken: There are just too many tools, so focus on Log Frame and Outcome Mapping
  • Image: Michael Heilemann. http://www.flickr.com/photos/heilemann
  • Image: 8lettersuk. http://www.flickr.com/photos/8lettersuk/
  • Image: Lottery Monkey. http://www.flickr.com/photos/lotterymonkey/
  • Image: Simon Kneebone. http://simonkneebone.com/
  • Image: Susan Kistler. http://aea365.org
  • Image: Reven. http://www.flickr.com/photos/reven/
  • Image: russelldavies. http://www.flickr.com/photos/russelldavies//
  • Image: nickherber. http://www.flickr.com/photos/nickherber/
  • Simple organisational cycle: Plan, Assess, Act
  • CIPP Model: Context, Inputs, Process, Product
  • Image: meg_riordan. http://www.flickr.com/photos/meg_riordan/
  • Evaluation is the activity of systematically collecting, analyzing and reporting information so that it can be used by appropriate people to answer their questions so that they can make decisions about the future of a project or program.
  • This definition raises a number of questions:
    • Who carries out the ‘activity’?
    • Who or what is an evaluator?
    • How does an activity become ‘systematic’?
    • What activities and skills do ‘collecting, analysing and reporting’ involve?
    • How do you go about ‘collecting, analysing and reporting’?
    • Is that an exhaustive list of ‘the activity’ that is called ‘evaluation’?
    • What kind of ‘information’ is being referred to?
    • What is the thing or things you collect information about?
    • Who should be involved in an evaluation?
    • Who are the ‘appropriate people’ who ‘use’ the ‘information’ supplied by an evaluation?
    • And for what purposes can they ‘use’ this information?
    • What kinds of questions can they ask?
  • Rainbow Framework: Over 200 methods/options related to 35 tasks in 7 clusters
  • BetterEvaluation.org
  • See IOCE.net for a list of national evaluation societies
  • Evaluation Competencies:
    1. Reflective Practice
    2. Technical Practice
    3. Situational Practice
    4. Management Practice
    5. Interpersonal Practice
    (Competencies for Canadian Evaluation Practice, Canadian Evaluation Society)
  • Evaluation Standards:
    1. Utility Standards
    2. Feasibility Standards
    3. Propriety Standards
    4. Accuracy Standards
    5. Evaluation Accountability Standards
    (The Program Evaluation Standards, developed by the JCSEE)
  • How to navigate the maze of evaluation methods and approaches. A BetterEvaluation Podcast. BetterEvaluation.org. Simon Hearn & Ron Mackay