This document discusses explanatory evaluation as a methodology to better understand the effects of policy interventions. Explanatory evaluation aims to determine how and why interventions work (or not), providing a finer level of detail than traditional impact evaluations. It involves reconstructing the underlying theory of the intervention, developing hypotheses about change mechanisms, and testing these hypotheses through field studies with different stakeholders. The goal is to gain insights into what works for whom and in what contexts to inform future policy. The methodology is proposed as a way to complement impact evaluations and provide more explanatory power for policymakers.
1. Explanatory Evaluation: how to better understand effects of interventions
Henk Sligte
Kohnstamm Institute for Educational Research
University of Amsterdam
Henk.Sligte@UvA.NL
http://kohnstamminstituut.uva.nl/htm/english.htm
2. Research for the Dutch Ministry of Education
• Report (in Dutch): Pater, C., Sligte, H., & van Eck, E. (2012). Verklarende evaluatie, een methodiek [Explanatory evaluation, a methodology for policy evaluation]. Amsterdam: Kohnstamm Instituut.
• http://www.kohnstamminstituut.uva.nl/rapporten/pdf/ki882.pdf
3. Evaluation
• Evidence-based policy (and practice): effect (impact) evaluation answers "what works?"
• Dutch Ministry: we need explanations for effects, so we need a methodology to complement impact evaluation
• The other side of the coin: explanatory evaluation asks "what works for whom in what circumstances?" How and why does the policy work (or not)? Finer granularity is needed
• In future: both types to be applied
4. Tentative question here and now: if the research method works in explaining effects of policy interventions, can it then also work in explaining interventions and innovations in education, in schools?
5. Effect (Impact) evaluation
• Through (quasi-)experiments, demonstration of a causal relation between the intervention and the effects found
– Experiment: randomized controlled trials
– Quasi: natural experiment
– Difference-in-difference: compare with business as usual (the whole population minus the experimental group)
• Pretest-posttest model
• Compare the experimental groups with similar control or reference groups
• Effects can then be exclusively ascribed to the intervention
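The difference-in-difference comparison mentioned above is simple arithmetic: the change in the experimental group minus the change in the business-as-usual group. A minimal sketch, with purely hypothetical dropout figures:

```python
def difference_in_difference(treat_pre, treat_post, control_pre, control_post):
    """Effect estimate: change in the experimental group minus the
    change in the business-as-usual comparison group."""
    return (treat_post - treat_pre) - (control_post - control_pre)

# Hypothetical dropout rates (fractions) before and after an intervention:
effect = difference_in_difference(
    treat_pre=0.20, treat_post=0.12,      # experimental schools
    control_pre=0.20, control_post=0.18,  # rest of the population
)
print(round(effect, 2))  # dropout fell 6 points more than business as usual
```

The subtraction of the control group's change is what separates the intervention's effect from the trend that would have occurred anyway.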
6. BUT:
• The rationale (the why) behind these effects remains unknown: a black box
• Explanatory evaluation focuses specifically on this and can further validate the results of an impact assessment (effect evaluation)
• Questions addressed:
– How and why does an intervention work?
– If there is no effect, or a smaller effect than expected, why is that the case?
– Which unwanted/unintended effects occur (including unexpectedly positive ones)?
7. The development of the methodology
• Literature study, especially Pawson, R. & Tilley, N. (1997). Realistic Evaluation. London: SAGE Publications
• Study of two Dutch policy interventions in the area of early school leaving
– VM2: prevent drop-out
– De Wijkschool: stimulate drop-outs with multiple problems (drugs, criminality, debts, broken families, etc.) to go back to school
8. Effects of VM2
• VM2
– School drop-out in the first year of vocational education is enormous
– Intervention: combine the systems of preparatory and regular vocational education
– Effects: compare the experimental group with the population as a whole on numbers of dropouts
– Conclusion: less dropout… "in xx percentage points of the cases a significantly lower rate of dropouts was the result"…
– How to find more differential effects, and how to understand them better?
9. Explanatory evaluation (EE)
• Realistic Evaluation: get under the surface of the directly observable...
• What works for whom, in what circumstances, and how and why?
• Ex ante (before), ex durante (during), ex post (after)
• Central to answering this question is the reconstruction of the policy (or programme) theory
• The policy theory is the sum of the assumptions or hypotheses that explain how the policy should work
• These can be represented in statements like: If (intervention) ... then (outcome), and in some cases also conditions: But (take care that…)
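The "If … then … But" statements above can be captured in a small data structure. A minimal sketch; the class, field names, and example content are illustrative, not part of the original methodology:

```python
from dataclasses import dataclass, field

@dataclass
class Hypothesis:
    """One assumption from a reconstructed policy theory."""
    intervention: str   # If ...
    outcome: str        # then ...
    conditions: list = field(default_factory=list)  # But (take care that ...)

h = Hypothesis(
    intervention="combine preparatory and regular vocational education",
    outcome="fewer students drop out in the transition year",
    conditions=["teachers cooperate across both systems"],
)
print(f"If {h.intervention}, then {h.outcome}.")
```

Making each assumption explicit like this is what allows it to be tested later against what informants in the field actually report.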
10. EE: contexts differ
• In the reconstruction of the theory, it is important to have an eye for the conditions and factors that play a role in a variety of contexts.
• The context (or conditions) can be physical, but also social, for example certain groups or characteristics of target groups (e.g. an intervention works only for motivated students).
• Attention to circumstances helps explain why an intervention works better in one case than in another.
• The intervention works through certain mechanisms: mechanisms that make "things work" and can be seen as the driving force behind an intervention.
11. EE
• Analysis of policy documents
• Creation of concrete causal generative schemes (= hypotheses) on the basis of this analysis
• Context-Mechanism-Outcome (CMO) schemes
– Problem mechanisms, leading to unwanted outcomes or results in (parts of) society
– Intervention that generates change mechanism(s)
– Change mechanisms overcome problem mechanisms
– Better outcomes
• Check with staff members (at the Ministry) who were responsible for the policy intervention
• Adjust CMO schemes, make new CMO schemes
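The chain described in this slide (context, problem mechanism, intervention, change mechanism, outcome) can be sketched as a single CMO record. All field names and content below are hypothetical illustrations, loosely modelled on the VM2 case:

```python
# A CMO scheme as a plain dictionary; content is illustrative only.
cmo_scheme = {
    "context": "first year of vocational education, weak student guidance",
    "problem_mechanism": "students get lost in the transition between systems",
    "intervention": "merge preparatory and regular vocational education (VM2)",
    "change_mechanism": "one continuous learning route with familiar teachers",
    "outcome": "lower dropout rate",
}

def describe(cmo):
    """Render the scheme as the kind of statement checked with policy staff."""
    return (f"In context '{cmo['context']}', the intervention "
            f"'{cmo['intervention']}' triggers '{cmo['change_mechanism']}', "
            f"leading to '{cmo['outcome']}'.")

print(describe(cmo_scheme))
```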
12.
13. EE: field studies
• Test the theory, test the hypotheses (CMO)
• Enter the field studies. The researcher must identify the different groups of actors involved.
• Consider what information can be obtained from whom, and what should be asked of whom: who knows what?
• Formulate relevant questions for each (type of) informant to answer, and think about the most appropriate method for each type of person involved.
14. Field studies
• Reconstruction of, and comparison with, the policy theory takes place in a learning dialogue at different levels.
• Types (levels) of respondents:
– Policy implementers (e.g. project leaders)
– Intermediaries (e.g. teachers)
– Target group (e.g. students, the elderly, etc.)
• Researchers: an open attitude, but a focus on understanding and consensual knowledge (do we agree that this is what really happened?)
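The "who knows what" planning in slides 13 and 14 amounts to a mapping from respondent type to the questions that respondent can credibly answer. A minimal sketch; the example questions are invented for illustration:

```python
# Hypothetical interview planning: one entry per respondent level.
questions_by_respondent = {
    "policy implementers": ["Was the intervention implemented as designed?"],
    "intermediaries": ["Which assumed mechanisms do you recognise in practice?"],
    "target group": ["Did the intervention change your behaviour, and how?"],
}

for respondent, questions in questions_by_respondent.items():
    for q in questions:
        print(f"{respondent}: {q}")
```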
15. EE: Field studies
• Each actor can reflect, from their own role in the operation of the policy, on the assumed mechanisms and explain (un)planned and (un)desired effects.
• Questions to the dialogue partners:
– whether the assumptions indeed work for them
– whether the assumptions about their behaviour are correct
– what side effects they face
• Cyclical process: stopping rules...
16. EE: Critical evaluation
• Here the explanations are found for why the policy intervention does or does not have the desired effects: the goal of an explanatory evaluation.
• If the effects of an intervention are not as positive as expected, the researcher may ask the following questions:
– Is the theory or policy measure(s) plausible?
– Was the theory sufficiently differentiated?
– Was the implementation successful?
– Were the necessary conditions met?
– What should be modified?
• Learning effects for policy makers…
18. R&D in education
• Development of interventions/innovations
• Do you measure effects (outcomes), and how? Pre-test/post-test?
• Compare with reference/control groups?
• What do you measure?
– Knowledge, skills, attitudes, higher-order skills (reflection, learning to learn, learning to see activities as learning), motivation? Other, new things?
• Find out what works: outcomes, effects
• Find out what works for whom, in what contexts, and why
19. Explanatory evaluation
• What is the theory behind the development?
• Reconstruct the assumptions (if-then/CMO)
• Distinguish assumptions at levels:
– The ideas, the theories
– The structures and systems developed and used
– The anticipated effects on various groups of actors
• Do field research
• Evaluate whether your assumptions are shared with various actors at different levels: test the hypotheses
20. Generic interview scheme
• THEN – NOW – LATER
• Contexts-Mechanisms-Outcomes
• Start with NOW
• NOW: What outcomes have been realised? Differentiate (level of ideas/theories, structures/systems, behaviour)
• THEN: What hypotheses? What processes (mechanisms) caused the outcomes?
• What were crucial success and failure factors?
• LATER: What new/adjusted outcomes?
• How to achieve those outcomes?
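The ordering of this scheme (start with NOW, then look back, then look ahead) can be sketched as a simple script outline. The phase labels come from the slide; the question wordings are paraphrased for illustration:

```python
# Generic interview scheme: phases in the order they are asked.
interview_scheme = [
    ("NOW", "What outcomes are realised, at the level of ideas/theories, "
            "structures/systems, and behaviour?"),
    ("THEN", "What hypotheses and mechanisms caused these outcomes? "
             "What were crucial success and failure factors?"),
    ("LATER", "What new or adjusted outcomes are aimed for, and how "
              "can they be achieved?"),
]

for phase, question in interview_scheme:
    print(f"{phase}: {question}")
```

Starting from the present anchors the dialogue in outcomes the informant can actually observe before asking for causal reconstruction or projection.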
21. Interview scheme as a timeline: PAST – PRESENT – FUTURE
• PRESENT: 1) What Outcomes are now realized?
– a) ideas/theories
– b) structures/new roles
– c) the work floor
– Differences in Contexts?
• PAST: 2) Go back to the start of the project. 3) What Hypotheses? What Mechanisms caused the Outcomes? Success and failure factors?
• FUTURE: 4) Go to the end of the project. 5) What Outcomes are expected? What is to be realized? What new ideas/theories? What success and failure? 6) What Interventions and Mechanisms are needed to achieve the Outcomes?