Chapter 3
Research Methods in Education 6th Edition


  1. RESEARCH AND EVALUATION
     © LOUIS COHEN, LAWRENCE MANION & KEITH MORRISON
  2. STRUCTURE OF THE CHAPTER
     • Research and evaluation: similarities and differences
     • Research, politics and policy making
  3. DEFINING EVALUATION
     The provision of information about specified issues upon which judgements are based and from which decisions for action are taken.
  4. SIMILARITIES BETWEEN EVALUATION AND RESEARCH
     • Evaluation can examine the effectiveness of programmes or policies, as can research;
     • They share the same methodologies (styles, instrumentation, sampling, ethics, reliability, validity, data analysis techniques, reporting and dissemination mechanisms).
  5. DIFFERENCES BETWEEN EVALUATION AND RESEARCH
     (Smith, M. and Glass, G. (1987) Research and Evaluation in the Social Sciences. New Jersey: Prentice Hall)
     • The intents and purposes of the investigation;
     • The scope of the investigation;
     • Values in the investigation;
     • The origins of the study;
     • The uses of the study;
     • The timeliness of the study;
     • Criteria for judging the study;
     • The agendas of the study.
  6. DIFFERENCES BETWEEN EVALUATION AND RESEARCH
     (Norris, N. (1990) Understanding Educational Evaluation. London: Kogan Page)
     • The motivation of the enquirer;
     • The objectives of the research;
     • Laws versus description;
     • The role of explanation;
     • The autonomy of the enquiry;
  7. DIFFERENCES BETWEEN EVALUATION AND RESEARCH
     (Norris, N. (1990) Understanding Educational Evaluation. London: Kogan Page)
     • Properties of the phenomena that are assessed;
     • Universality of the phenomena studied;
     • Salience of the value question;
     • Investigative techniques;
     • Criteria for assessing the activity;
     • Disciplinary base.
  8. CONFORMATIVE EVALUATION
     (Stronach, I. and Morris, B. (1994) Polemical notes on educational evaluation in an age of ‘policy hysteria’. Evaluation and Research in Education, 8 (1&2), pp. 5–19)
     • Short-term; takes project goals as given;
     • Ignores the evaluation of longer-term outcomes;
     • Gives undue weight to the perceptions of programme participants who are responsible for the successful development and implementation of the programme: ‘over-reports’ change;
     • Neglects/‘under-reports’ the views of some practitioners and critics;
  9. CONFORMATIVE EVALUATION
     • Adopts an atheoretical approach, and regards the aggregation of opinion as the determination of significance;
     • Involves a tight contractual relationship with programme sponsors that disbars public reporting or encourages self-censorship to protect future funding;
     • Risks implicit advocacy of the programme in its reporting style.
  10. MODELS OF EVALUATION
      • Survey;
      • Experiment;
      • Illuminative;
      • The CIPP model:
        – Context, Input, Process, Product;
        – Look for congruence between what was intended to happen and what actually happened in these four areas.
      • Objectives:
        – How far have the objectives been achieved?
  11. STAKE’S MODEL OF EVALUATION
      Congruence between intentions and observations – what actually happened:

        INTENTIONS                 Congruence     OBSERVATIONS
        Intended antecedents    ←––––––––––––→    Actual antecedents
        Intended transactions   ←––––––––––––→    Actual transactions
        Intended outcomes       ←––––––––––––→    Actual outcomes

      Antecedents = initial conditions
      Transactions = processes; what takes place during the programme
  12. RESEARCH, POLITICS & POLICY MAKING
      Politics, research and evaluation are inextricably linked in respect of:
      – Funding
      – Policy-related research
      – Commissioned research
      – Control and release of data and findings
      – Dissemination of research
      – How does research influence policy?
      – Who judges research utilization?
      – Consonance with political agendas
  13. RESEARCH, POLITICS & POLICY MAKING
      Researchers and policy makers may have conflicting:
      • Interests
      • Agendas
      • Audiences
      • Time scales
      • Terminology
      • Concern for topicality
  14. RESEARCH, POLITICS & POLICY MAKING
      Policy makers like:
      • A simple impact model
      • Superficial facts
      • Unequivocal data
      • Short-term solutions
      • Simple, clear remedies for complex, generalized social problems
      • Certainty
      • Positivist methodologies
      Researchers work with:
      • Complex models
      • Complex data
      • Uncertain findings
      • Longer-term time scales
      • Subtle and provisional data on complex and multi-faceted issues
      • Conjecture
      • Diverse methodologies (fitness for purpose)
