Meta-Evaluation
 


Upload Details

Uploaded as Microsoft PowerPoint

Usage Rights

CC Attribution-NonCommercial-NoDerivs License

Meta-Evaluation Presentation Transcript

    • Meta-Evaluation Theory: Definitions, Processes, & Systems. David Passmore, Workforce Education & Development
    •  Meta, from Greek μετά: "after", "beyond", "with", "adjacent", "self". A prefix used to indicate a concept that is an abstraction from another concept.
    •  Educational Evaluation and Decision Making ◦ Published 1971 ◦ Out of print  Conceived in reaction to the difficulty and practical inconsequence of applying experimental and quasi-experimental designs in educational and work settings, which rarely afford opportunities for attaining the gold standard of random assignment of subjects to experimental conditions.
    • Evaluation is…. A process for delineating, obtaining, and providing useful information for judging decision alternatives.
    • A dissection of the definition, term by term:
    • Process: An activity subsuming many methods and involving a number of steps or operations.
    • Decision alternatives: Two or more different actions that might be taken in response to some situation requiring altered action. Situations include: (a) unmet needs exist; (b) some barrier to fulfillment of needs exists; or (c) opportunities exist that ought to be exploited.
    • Information: Quantitative or qualitative data about entities (tangible or intangible) and their relationships in terms of some purpose. Information is derived from many sources and methods; it could come from scientific data, precedence, or experience. Information is more than a collection of facts and data: facts and data must be organized for intelligibility and must reduce the uncertainty that surrounds decision making. Information often is limited and imperfect in evaluation situations in comparison with experimental situations, in which controls are possible and are formalized by rigorous design.
    • Delineating: Identifying the evaluative information required through an inventory of the decision alternatives to be weighed and the criteria to be applied in weighing them. Knowing at least two elements is essential: (a) the decision alternatives and (b) the values or criteria to be applied. These are obtained only from the clients for evaluations, who must make or ratify decisions. Therefore, evaluation has an "interface" role as well as a "technical" role, because its worth is defined by meeting client needs.
    • Obtaining: Collecting, organizing, and analyzing information through such formal means as observation, review of artifacts, measurement, data processing, and statistical analysis. Ways of obtaining information are guided by scientific criteria and disciplinary preferences.
    • Providing: Fitting information together into systems and subsystems that best serve the purposes of the evaluation, and reporting the information to the decision maker. Providing involves interaction between the evaluator and the various audiences for information; multiple audiences are common. The direct audience is the decision maker, but people and organizations who must ratify decisions also are important, as are others who hold strong stakeholder positions. The information delivery preferences of these audiences vary in specificity, modality, and timing.
    • Useful: Satisfying the criteria for evaluation and matched to the judgment criteria to be employed in choosing among decision alternatives. Criteria to be imposed on evaluations include scientific criteria (internal and external validity, reliability, objectivity); practical criteria (relevance, importance, scope, credibility, timeliness, and pervasiveness); and prudential criteria (efficiency). Useful information must meet the client's identified information needs.
    • Judging: The act of choosing among decision alternatives. Judging is central because evaluation is meant to serve decision making. Although judging is central to evaluation, the act of judging is not central to an evaluator's role.
    • Evaluation process serves decision–making
    •  Planning decisions  Context evaluation—This type of evaluation determines goals and objectives; defines the relevant environment, its actual and desired conditions, its unmet needs and unused opportunities, and why needs and opportunities are not being met and used; examines contingencies that pressure and promote improvements; and assesses congruities between actual and intended performance.
    •  Structuring decisions  Input evaluation—Essentially, this type of evaluation helps state objectives operationally and determine whether their accomplishment is feasible.
    •  Implementing decisions  Process evaluation—This type of evaluation is used to: identify and monitor potential sources of failure in an activity; to service preprogrammed decisions that are to be made during the implementation of an activity; and to record events that have occurred so that “lessons learned” can be delineated at the end of an activity. Process evaluation assesses the extent to which procedures operate as intended.
    •  Recycling decisions  Product evaluation—This type of evaluation measures criteria associated with the objectives for an activity, compares these measurements with predetermined absolute or relative standards, and makes rational interpretations of these outcomes using recorded context, input, and process information. Product evaluation investigates the extent to which objectives have been, or are being, attained.
    •  C = Context  I = Input  P = Process  P = Product
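The CIPP pairing of decision types with evaluation types can be written as a small lookup table. This is only an illustrative sketch of the taxonomy described above; the names and function are mine, not part of the CIPP framework itself.

```python
# Illustrative sketch: each decision type in the CIPP model is served
# by one evaluation type (mapping taken from the slides above).
CIPP = {
    "planning": "context",      # determines goals, needs, and opportunities
    "structuring": "input",     # states objectives operationally, tests feasibility
    "implementing": "process",  # monitors whether procedures operate as intended
    "recycling": "product",     # measures the extent to which objectives are attained
}

def evaluation_type_for(decision: str) -> str:
    """Return the CIPP evaluation type that serves a given decision type."""
    return CIPP[decision.lower()]

print(evaluation_type_for("Planning"))  # → context
```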
    • Who needs what when?
    • How?
    • To whom and in what form?
    • Evaluation is, itself, evaluated….  Scientific criteria—  Practical criteria  Prudential criteria More expansive than evaluating quality of research
    • Evaluation is, itself, evaluated….  These criteria primarily assess “representational goodness;” that is, they assess how well the evaluation depicts a situation. These criteria, in their detailed and technical forms, are quite familiar to most people with research training. More expansive that evaluating quality of research
    • Evaluation is, itself, evaluated….  Internal validity  External validity  Reliability  Objectivity More expansive that evaluating quality of research
    • Evaluation is, itself, evaluated….  Extent to which evaluation findings can be attributed to the activity evaluated rather than to some other factor or artifact.
    • Evaluation is, itself, evaluated….  The extent to which evaluation findings are generalizable over people, places, and time.
    • Evaluation is, itself, evaluated….  Simply said, the extent to which evidence is measured accurately.
    • Evaluation is, itself, evaluated….  Intersubjectivity, in the sense of agreement on conclusions between independent observers of the same evidence: the "Do you see what I see?" phenomenon.
    • Evaluation is, itself, evaluated….  Relevance  Credibility  Importance  Timeliness  Scope  Persuasiveness Practical criteria…
    • Evaluation is, itself, evaluated….  Evaluation information is obtained to serve decision making. If the information obtained does not match these decisions, then this information, no matter how well obtained scientifically, is useless.
    • Evaluation is, itself, evaluated….  Information needs to be more than nominally relevant to a decision.
    • Evaluation is, itself, evaluated….  Not all information is important. Information with the highest importance (relevance graded by quality) must be obtained and provided, within budget constraints.
    • Evaluation is, itself, evaluated….  The evaluator must determine what the client believes is important. In some cases, the evaluator might suggest to the client what information ought to be considered important because the evaluator often has considerable experience with obtaining and using some types of information.
    • Evaluation is, itself, evaluated….  This is a "completeness" criterion. That is, is all information obtained and provided that is needed to make a decision?
    • Evaluation is, itself, evaluated….  Clients often are not in a position to judge whether information obtained and provided by an evaluator meets scientific criteria. Therefore, the trust invested in the evaluator by the client is an important dimension of the quality of the outcomes of an evaluation project.
    • Evaluation is, itself, evaluated….  Just like the scientific criterion of internal validity pertains to a particular evaluation situation, credibility is never generalizable and always refers to a particular situation.
    • Evaluation is, itself, evaluated…. ◦ Information from evaluation projects is provided to identified audiences, and this information is meant to be used. ◦ The criterion of pervasiveness is met if all of the people and organizations who should know about and use evaluative information do, in fact, know about and use it.
    • Evaluation is, itself, evaluated….  Proper application of practical criteria of relevance, importance, and scope should remedy many inefficiencies in evaluation.  However, the conduct of an evaluation project must be weighed against alternative evaluation designs that could have achieved the same outcome with different time, financial, and personnel resources.
    • Meta–Evaluation Theory: Definitions, Processes, & Systems David Passmore Workforce Education & Development