Keynote 1. How can you tell if it is not working? Evaluating the impact of educational innovations: David Streatfield, Global Libraries Initiative Consultant
Theory-Based Approaches for Assessing the Impact of Integrated Systems Research - Brian Belcher, Royal Roads University. Measuring the Impact of Integrated Systems Research (September 27–30, 2021). Three-day virtual workshop co-hosted by the CGIAR Research Programs on Water, Land and Ecosystems (WLE); Forests, Trees and Agroforestry (FTA); Policies, Institutions, and Markets (PIM); and SPIA, the Standing Panel on Impact Assessment of the CGIAR. The workshop took stock of existing and new methodological developments in monitoring, evaluation and impact assessment work, and discussed which are suitable to evaluate and assess complex, integrated systems research.
Some musings on evaluating the impacts of integrated systems research - Karl Hughes, PIM. Measuring the Impact of Integrated Systems Research (September 27–30, 2021). Three-day virtual workshop co-hosted by the CGIAR Research Programs on Water, Land and Ecosystems (WLE); Forests, Trees and Agroforestry (FTA); Policies, Institutions, and Markets (PIM); and SPIA, the Standing Panel on Impact Assessment of the CGIAR. The workshop took stock of existing and new methodological developments in monitoring, evaluation and impact assessment work, and discussed which are suitable to evaluate and assess complex, integrated systems research.
Use of Qualitative Approaches for Impact Assessments of Integrated Systems Research: Our Experience - Monica Biradavolu, SPIA. Measuring the Impact of Integrated Systems Research (September 27–30, 2021). Three-day virtual workshop co-hosted by the CGIAR Research Programs on Water, Land and Ecosystems (WLE); Forests, Trees and Agroforestry (FTA); Policies, Institutions, and Markets (PIM); and SPIA, the Standing Panel on Impact Assessment of the CGIAR. The workshop took stock of existing and new methodological developments in monitoring, evaluation and impact assessment work, and discussed which are suitable to evaluate and assess complex, integrated systems research.
Using case studies to explore the generalizability of 'complex' development i... - Barb Knittel
Discussion of the questions of internal and external validity and how case-based approaches are relevant for informing replication and scale up. Case studies can help to extrapolate key facts regarding context dynamics, process mechanisms, implementation capability, and trajectories of change (Michael Woolcock, World Bank).
Impact evaluations aim to predict the future, but they are rooted in particular contexts, and to what extent they generalize is an open and important question. I founded an organization to systematically collect and synthesize impact evaluation results on a wide variety of interventions in development. These data allow me to answer this and other questions for the first time using a large data set of studies. I consider several measures of generalizability, discuss the strengths and limitations of each metric, and provide benchmarks based on the data. I use the example of the effect of conditional cash transfers on enrollment rates to show how some of the heterogeneity can be modelled and the effect this can have on the generalizability measures. The predictive power of the model improves over time as more studies are completed. Finally, I show how researchers can estimate the generalizability of their own study using their own data, even when no data from comparable studies exist.
Read more at: www.hhs.se/site
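The abstract above describes benchmarking generalizability from a pool of impact evaluations without naming a specific estimator. One standard way to operationalise the idea (a sketch of a common technique, not necessarily the speaker's own metric) is a DerSimonian-Laird random-effects meta-analysis: the between-study variance τ² captures how much true effects differ across contexts, and a prediction interval indicates where the effect in a new context is likely to fall.

```python
import math

def random_effects_summary(effects, ses):
    """DerSimonian-Laird random-effects pooling.

    effects: per-study effect estimates (e.g. enrollment-rate changes)
    ses:     their standard errors
    Returns the pooled effect, its standard error, and the between-study
    variance tau^2, which drives how well any one result generalises.
    """
    w = [1.0 / se ** 2 for se in ses]                 # fixed-effect weights
    mu_fe = sum(wi * y for wi, y in zip(w, effects)) / sum(w)
    # Cochran's Q: heterogeneity beyond sampling error
    q = sum(wi * (y - mu_fe) ** 2 for wi, y in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                     # between-study variance
    # re-weight including tau^2 and pool again
    w_re = [1.0 / (se ** 2 + tau2) for se in ses]
    mu_re = sum(wi * y for wi, y in zip(w_re, effects)) / sum(w_re)
    se_re = math.sqrt(1.0 / sum(w_re))
    return mu_re, se_re, tau2

def prediction_interval(mu, se, tau2, z=1.96):
    """Approximate 95% interval for the true effect in a *new* context."""
    half = z * math.sqrt(se ** 2 + tau2)
    return mu - half, mu + half
```

When τ² is large relative to the pooled standard error, the prediction interval is much wider than the confidence interval, which is one way of saying that a single study's result transfers poorly to new settings.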
Measuring the impact of integrated systems research
Panel Speakers: Vincent Gitz, Natalia Estrada Carmona, Monica Biradavolu and Karl Hughes. Measuring the Impact of Integrated Systems Research (September 27–30, 2021). Three-day virtual workshop co-hosted by the CGIAR Research Programs on Water, Land and Ecosystems (WLE); Forests, Trees and Agroforestry (FTA); Policies, Institutions, and Markets (PIM); and SPIA, the Standing Panel on Impact Assessment of the CGIAR. The workshop took stock of existing and new methodological developments in monitoring, evaluation and impact assessment work, and discussed which are suitable to evaluate and assess complex, integrated systems research.
Reviewing the evidence on implementation and long-term impact of integrated landscape approaches - James Reed, CIFOR. Measuring the Impact of Integrated Systems Research (September 27–30, 2021). Three-day virtual workshop co-hosted by the CGIAR Research Programs on Water, Land and Ecosystems (WLE); Forests, Trees and Agroforestry (FTA); Policies, Institutions, and Markets (PIM); and SPIA, the Standing Panel on Impact Assessment of the CGIAR. The workshop took stock of existing and new methodological developments in monitoring, evaluation and impact assessment work, and discussed which are suitable to evaluate and assess complex, integrated systems research.
Cluster evaluation: Learning to complete the virtuous circle! - James Wilson, Orkestra
An article about cluster evaluation written by James Wilson in collaboration with Madeline Smith and Emily Wise for the TCI Network's 'Shared Values' publication, distributed at the European Conference in Bulgaria (March 2018).
Does evidence actually influence policy? What can be done to improve the record?
Presentation by Priya Deshingkar, Research Director of the Migrating out of Poverty RPC
This annotated compendium of evaluation planning guides can help you understand the basics of conducting an evaluation; learn how to create a logic model and indicators; understand evaluation terminology; develop performance management metrics; and evaluate your research, knowledge translation and commercialization activities, outputs and outcomes.
The implementation 'black box' and evaluation as a driver for change. Presentation by Katie Burke and Claire Hickey of the Centre for Effective Services.
A ceLTIc project webinar. The ceLTIc project shows how to enable LTI (Learning Tools Interoperability) connectors to build a flexible infrastructure. This session will discuss how the JISC-funded ceLTIc:sharing project is evaluating the use of LTI to provide a shared service for institutions interested in evaluating WebPA. It will include a demonstration of linking to the tool from Blackboard Learn 9 and Moodle, as well as how the outcomes service, along with the unofficial memberships and setting extensions, is being used to enhance this integration in a VLE-independent way.
Jisc conference 2012
The field of program evaluation presents a diversity of images and claims about the nature and role of evaluation that confounds any attempt to construct a coherent account of its methods or confidently identify important new developments. We take the view that the overarching goal of the program evaluation enterprise is to contribute to the improvement of social conditions by providing scientifically credible information and balanced judgment to legitimate social agents about the effectiveness of interventions intended to produce social benefits. Because of its centrality in this perspective, this review focuses on outcome evaluation, that is, the assessment of the effects of interventions upon the populations they are intended to benefit. The coverage of this topic is concentrated on literature published within the last decade, with particular attention to the period subsequent to the related reviews by Cook and Shadish (1994) on social experiments and Sechrest and Figueredo (1993) on program evaluation.
The word ‘evaluation’ has become increasingly common in the language of community, health and social services and programs. The growth of evaluation talk and practice in these fields has often been promoted and encouraged by funders and commissioners of services and programs. Following the interest of funders, there has been a growth in the study and practice of evaluation by community, health and social service practitioners and academics. When we consider why this move in evaluative thinking and practice has occurred, we can assume the position of the funder and simply answer, ‘...because we want to know if this program or service works’. Practitioners, specialists and academics in these fields have been called upon by governments and philanthropists to aid the development of effective evaluation. Over time, they have also led their own thinking and practice independently. Evaluation in its simplest form is about understanding the effect and impact of a program, service, or indeed a whole organization. Evaluation as a practice is not so simple, however, largely because in order to assess impact we need to be very clear at the beginning about what effect or difference we are trying to achieve.
The literature review begins with an overview of qualitative and quantitative research methods, followed by a description of key forms of evaluation. Health promotion evaluation and advocacy and policy evaluation will then be explored as two specific domains. These domains are not evaluation methodologies, but forms of evaluation that present unique requirements for effective community development evaluation. Following this discussion, the review will explore eight key evaluation methodologies: appreciative enquiry, empowerment evaluation, social capital, social return on investment, outcomes-based evaluation, performance dashboards and scorecards, and developmental evaluation. Each of these sections will include specific methods, the values base of each methodology ...
An Adaptive Learning Process for Developing and Applying Sustainability Indicators with Local Communities
For more information, please see the websites below:
Organic Edible Schoolyards & Gardening with Children
http://scribd.com/doc/239851214
Double Food Production from your School Garden with Organic Tech
http://scribd.com/doc/239851079
Free School Gardening Art Posters
http://scribd.com/doc/239851159
Increase Food Production with Companion Planting in your School Garden
http://scribd.com/doc/239851159
Healthy Foods Dramatically Improves Student Academic Success
http://scribd.com/doc/239851348
City Chickens for your Organic School Garden
http://scribd.com/doc/239850440
Simple Square Foot Gardening for Schools - Teacher Guide
http://scribd.com/doc/239851110
Project Cycle and Causal Hypothesis _ Theory of Change.pptx - GeorgeKabongah2
The project life cycle is the order of processes and phases used in delivering projects. It describes the high-level workflow of delivering a project and the steps you take to make things happen.
Similar to Keynote 1. How can you tell if it is not working? Evaluating the impact of educational innovations: David Streatfield, Global Libraries Initiative Consultant
Exploring accessibility challenges in library systems for visually impaired users:
A case study of an Accessibility Audit and training programme carried out at MU Library - CONUL T&L Annual Seminar 2024
Keynote 1. How can you tell if it is not working? Evaluating the impact of educational innovations: David Streatfield, Global Libraries Initiative Consultant
2. “… any effect of the service (or of an event or initiative) on an individual, group or community.”
May be positive or negative; may be intended or accidental; may affect users, library staff, senior managers and others.
(Markless and Streatfield, 2013)
3. The impact can show itself in individual cases, or in more general changes, such as shifts in:
• quality of life: e.g. self-esteem; confidence; feeling included; work or social engagement
• educational and other outcomes: e.g. skills acquired; educational attainment; levels of knowledge.
4. Changes in:
What people do (behaviour)
How they do things
How much they know
Their attitudes (e.g. confidence; valuing library staff!)
5. Phase 1: Stealing from educational evaluation
Phase 2: Stealing from international development evaluation
Phase 3: Stealing from the programme theory community
6. The simple logic model
Hypothesis refuting ... and rewriting
Illuminative evaluation
Impact evaluation
The problem of complexity
7. Some people who may be affected by library and information services:
History students
Library staff who teach
Mathematics teachers and learners
Music students
PhD candidates
People who wish to be (more) employable
School archive users
Students becoming more digitally literate
Students with intellectual disabilities
8. Recognising complexity: the complex logic model, where the paths from action to impact are complex, with disproportionate relationships (in which, at critical levels, a small change can make a big difference).
Emergent evaluation: emergent impacts (which cannot readily be specified at the outset).
(Rogers, 2008)
9. In the 1990s, education and development evaluation (especially in the USA) was focused on attribution studies based on experimental designs and randomised controlled trials.
Now (especially in Europe) evaluators seek to gauge the impact of complex change programmes and new services in complex settings, using rigorous theory-based evaluations.
10. These:
◦ make programme assumptions and implementation issues transparent
◦ enable evaluators to recognise different levels and types of contributions to change, and
◦ pave the way towards credible causal claims, not relying on direct attribution.
11. Various theory-based approaches have been developed, but these all start with articulation of a Theory of Change. This underpins a programme or intervention, creating a framework for focusing and conducting its evaluation and explaining any effects.
12. A ToC describes how and why a desired change is expected to happen in a particular context. It focuses on mapping out or ‘filling in’ what has been described as the ‘missing middle’ between what a programme does (its activities, mechanisms or interventions) and how these can lead to objectives being achieved.
HOW? It identifies the desired long-term objectives and then works back from these to identify how and why people expect things to work to achieve change.
13. Plausible - do evidence and common sense suggest that the activities will lead to the desired outcomes?
Doable - are the resources available to carry out the initiative?
Testable - is the ToC specific and complete enough to track its progress in credible ways?
“A good ToC is embedded in the context of the intervention and is developed incorporating the perspectives of key stakeholders, beneficiaries and the existing relevant research.”
(Mayne, 2012)
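The ‘testable’ criterion implies that a ToC should be explicit enough to check mechanically. As a hypothetical illustration (the programme and node names below are invented, not from the talk), a ToC can be held as a small directed graph from activities through intermediate outcomes to objectives, and walked forward to confirm that each long-term objective is actually reachable from some activity, i.e. that there is no ‘missing middle’:

```python
# A hypothetical ToC for a library information-literacy programme:
# each key maps to the changes it is expected to lead to.
theory_of_change = {
    "run search-skills workshops": ["students search more effectively"],
    "embed librarians in courses": ["students search more effectively"],
    "students search more effectively": ["better use of sources in assignments"],
    "better use of sources in assignments": ["improved attainment"],
}

def reachable_outcomes(toc, activities):
    """Walk the causal chains forward from the named activities and
    return every outcome that the ToC says they can lead to."""
    seen, frontier = set(), list(activities)
    while frontier:
        node = frontier.pop()
        for nxt in toc.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen
```

An objective missing from the returned set signals a gap in the mapped causal chain, which is exactly the kind of incompleteness the ‘testable’ check is meant to surface before data collection begins.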
14. Complexity is now the norm, and complex situations and initiatives require flexible, agile evaluation approaches to deal with multiple factors, relationships and layers.
Rigorous and systematic early articulation of the ways in which the programme expects to bring about clearly identified changes in individuals and communities at a range of different levels (programme theory).
Systematic use of relevant research literature from appropriate disciplines to inform the ToC.
Using the ToC to carefully focus data collection activities, increasingly by using mixed methods.
(Greene, 2008)
15. Theory-based evaluation approaches offer an important change in focus for the LIS field, in which attribution is hard to prove.
Adopting these will make it easier for LIS leaders to be seen as credible. They will less often be required to make simplistic and unconvincing attribution claims for their services.
However, more credible LIS evaluation requires a significant and sustained investment of time and resources.
16. Greene, J. C. (2008). Is mixed methods social inquiry a distinctive methodology? Journal of Mixed Methods Research, 2(1), 7-22. https://doi.org/10.1177/1558689807309969
Mayne, J. (2012). Contribution analysis: coming of age? Evaluation, 18(3), 270-280. https://www.researchgate.net/publication/254091562_Contribution_Analysis_Coming_of_Age
Rogers, P. J. (2008). Using programme theory to evaluate complicated and complex aspects of interventions. Evaluation, 14(1), 29-48. http://dx.doi.org/10.1177/1356389007084674
17. Realist evaluation – addressing ‘What works for whom, in what circumstances, in what respects and how?’; evaluation participants offer context, mechanism and outcome statements (CMOs) against which to evaluate (Pawson and Tilley, 2004).
Contribution analysis – designs a ToC to make credible causal claims: how and to what extent has an intervention plausibly contributed to programme/service goals? Evidence is systematically collected in relation to these claims, to build a contribution story.
Outcomes harvesting – a six-step process that systematically and collaboratively identifies and classifies outcomes (changes in individuals, groups or communities), then gathers views about these and how they are achieved. The results are analysed and tabulated to provide evidence-based contribution statements.
Editor's Notes
Still focus in many places on attribution and causality
Testable does NOT mean RCT!
By complexity we mean a system in which relationships, including causality, are non-linear; there are multiple perspectives to encompass, and the system is dynamic and produces unpredictable change. This is ‘messy space’: everything is connected, because incidents or changes in one part of the system affect all other parts (Eoyang and Berkas, 1999; Preskill and Gopal, 2014). Focus on mixed methods; relationships and levels of mixing; and innovative data collection (LIS is good at this, e.g. photographs, pictures, group concept mapping...).
The point is that these are now more widely accepted approaches, e.g. by DFID and the European Commission.
CMOs are the theory of change in realist evaluation.
Whole methodologies are built around ToC/programme theory.