JISC Assessment & Feedback Programme Online workshop, 7th November 2011
Designing useful evaluations
Dr Rachel A Harris
Aims of the session
• Look at the evaluation cycle.
• Consider project ‘logic’.
• Review anticipated outcomes & impact.
• Appraise which project activities to evaluate.
• Identify evaluation questions.
• Review how the evaluation will be undertaken.
• Find out what other projects are doing!
• Consider the baseline.
• Other evaluation issues?
Curriculum Delivery Programme & the role of Evaluation
• The JISC & Becta funded programme invited projects to “transform how they deliver and support learning across a curriculum area through the effective use of technology”.
  • Emphasis on gathering and using evidence
  • Identifying what works
  • Adding value by generating evidence
  • How did projects determine if delivery was ‘transformed’ and the technology was ‘effective’?
Project outputs: http://jiscdesignstudio.pbworks.com/
Initial evaluation stages
• Envision potential impact/achievements:
  • potential indicators of impact/achievement,
  • credible evidence of impact, from different stakeholders’ perspectives.
• Baseline:
  • first set of evaluation data
  • review the current context
  • identify existing practice
  • determine stakeholders’ attitudes
  • clarify challenges (and identify barriers)
A multitude of approaches
• Action research (Atelier-D, Duckling, KUBE)
• Independent internal/external evaluator (Cascade)
• Appreciative inquiry (ESCAPE)
• Balanced scorecard (COWL)
• Formative evaluation (MAC)
• Micro & macro data sources (Springboard TV)
• Rich qualitative methods (ISCC)
Action Research
McNiff and Whitehead’s (2006) ‘action-reflection cycle’: Observe → Reflect → Act → Evaluate → Modify → Move in new directions.
Suggested evaluation cycle for Assessment & Feedback projects: Baseline → Pilot → Revisit → Review.
CIPP Evaluation (Cascade)
Assessment & Feedback Programme
• The programme focuses on “large-scale changes in assessment and feedback practice, supported by technology, with the aim of enhancing the learning and teaching process and delivering efficiencies and quality improvements”.
  • Pedagogic enhancements
  • Workload efficiencies
  • Building the business case
Designing your evaluation (I)
• What do you want to happen as a result of your project’s assessment and feedback innovation? (Anticipated outputs and outcomes, & anticipated impact)
• Who and/or what might this innovation impact on? (Stakeholders; think about levels)
• How will you know the intended outcomes have been achieved? Or what indicators could you use to demonstrate what has been achieved? (Measures)
(An illustrative sketch of recording outcomes, stakeholders and indicators follows below.)
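To make the mapping from intended outcomes to stakeholders and indicators concrete, here is a minimal sketch in Python. It is purely illustrative: the outcome wording, stakeholder levels and indicators below are invented examples, not taken from any programme project.

```python
# Hypothetical sketch of an evaluation plan: intended outcomes, the
# stakeholder levels they touch, and the indicators that would
# evidence them. All names and values are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Outcome:
    description: str                  # intended outcome
    stakeholders: list[str]           # who/what it might impact (levels)
    indicators: list[str] = field(default_factory=list)  # measures

evaluation_plan = [
    Outcome(
        description="Students act on feedback before their next assignment",
        stakeholders=["students", "module tutors"],
        indicators=[
            "proportion of submissions referencing earlier feedback",
            "survey item 'feedback helped me improve', baseline vs follow-up",
        ],
    ),
    Outcome(
        description="Reduced tutor time spent producing written feedback",
        stakeholders=["tutors", "department"],
        indicators=["self-reported marking hours per script, baseline vs pilot"],
    ),
]

for outcome in evaluation_plan:
    print(outcome.description)
    for indicator in outcome.indicators:
        print("  indicator:", indicator)
```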
What do you want to know? & What indicators could you use?
Report back
Big picture evaluation questions
• Process evaluation – administrative and pedagogical design, and implementation.
• Outcome evaluation – value of outcomes.
• Learning – barriers and enablers, surprises, causal explanations.
• Overarching questions about value/worth.
• Forward or outward focused evaluation questions – reusability, sustainability.
Questions ‘cheat sheet’
• How well was the project designed & implemented?
• How valuable were the outcomes to participants (students, lecturing staff, administrators)? To the institution, the community, the economy?
• What were the barriers and enablers?
• What else was learned?
• Was the project worth implementing?
• To what extent are the project’s content, design or implementation likely to be valuable elsewhere?
Share your evaluation questions
Some examples:
• What value do students and tutors gain from using spoken e-feedback?
• How has providing spoken rather than written feedback impacted on tutor workload?
• What value did teachers and students gain from collaborating long-term on assessment and feedback?
• What impact has providing ‘balance of assessment’ data had on programme teams?
Designing your evaluation (II)
• How will you find out whether the intended outcomes have been achieved? (What, how and when evidenced)
• How will you use this information?
• Who else might this information be of use to?
• What other questions should be considered?
Existing sources
• What existing sources of evidence are you planning to use?
  • Previous course evaluations
  • Departmental timesheets
  • Data from previous studies
  • Centrally held institutional data
  • NSS data
  • No plans to use existing data
  • Others?
(A sketch of drawing a baseline from existing data follows below.)
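Existing institutional data can often supply a quantitative baseline with little extra effort. The sketch below is a minimal, hypothetical Python example using pandas; the file name and column names are invented, so substitute whatever your institution actually holds.

```python
# Hypothetical sketch: deriving a pre-project trend from existing data.
# The CSV file and its columns ("year", "score") are invented examples.
import pandas as pd

# e.g. one row per student response to an assessment & feedback question
nss = pd.read_csv("nss_assessment_feedback.csv")

# Mean satisfaction per year, giving a trend to compare
# post-project results against.
baseline_trend = (
    nss.groupby("year")["score"]
       .agg(["mean", "count"])
       .rename(columns={"mean": "mean_score", "count": "responses"})
)
print(baseline_trend)
```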
How will you find out?
The Baseline
• Where are you starting from?
• Qualitative versus quantitative evidence
• Hitting a moving target
• Previous projects’ baselines: http://jiscdesignstudio.pbworks.com/w/page/46422956/Example%20baseline%20reports
(A sketch comparing baseline and follow-up measures follows below.)
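For quantitative evidence, the simplest use of a baseline is a before-and-after comparison on the same indicator. The figures in this Python sketch are invented for illustration; a real evaluation would use the project's own instruments and samples.

```python
# Hypothetical sketch: did a quantitative indicator move between the
# baseline and the follow-up? The scores below are invented examples
# (e.g. per-module means on a 1-5 Likert item about feedback).
from statistics import mean, stdev

baseline  = [3.1, 2.8, 3.4, 3.0, 2.9, 3.2]
follow_up = [3.8, 3.5, 4.1, 3.9, 3.6, 4.0]

print(f"baseline:  {mean(baseline):.2f} (sd {stdev(baseline):.2f})")
print(f"follow-up: {mean(follow_up):.2f} (sd {stdev(follow_up):.2f})")
print(f"change:    {mean(follow_up) - mean(baseline):+.2f}")
```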
What happens next?
• Reflect on your thinking from today.
• Review the feedback on your draft plans.
• Consider how you might want to refine your evaluation plans.
• Use your evaluation questions to inform baseline activities.
• Programme mapping!
Any questions?
Programme Evaluation Resources
https://programmesupport.pbworks.com/w/page/42511148/Evaluation%20Resources
Dr Rachel A Harris
twitter: raharris
email: rachel@inspire-research.co.uk
web: www.inspire-research.co.uk
Speaker notes
  • 13 projects funded by JISC, and 2 by Becta
  • The traditional view would be that by describing the current position, the baseline can “provide a sound basis on which the success of the project could later be evaluated” (Cascade). However, a baseline study can also be seen as the start of the essential project management requirement “to be aware of all factors which may impact positively or negatively on the effectiveness of the project” (KUBE).
  • Atelier-D (Achieving Transformation, Enhanced Learning & Innovation through Educational Resources in Design), Open University: Investigated web technologies for developing a virtual design studio space to support student learning throughout the Design programme of the Open University.
  • Cascade, University of Oxford: Implemented new strategies to improve curriculum delivery models to allow the University of Oxford’s Department for Continuing Education to respond more flexibly to stakeholders’ needs.
  • COWL (Coventry Online Writing Laboratory), Coventry University: Developed and extended the pedagogy, reach and diversity of academic writing services, through a technology-enhanced teaching and learning environment.
  • DUCKLING (Delivering University Curricula: Knowledge, Learning & Innovation Gains), University of Leicester: Developed delivery, presentation and assessment processes to enhance the work-based learning experience of students studying remotely.
  • ESCAPE (Effecting Sustainable Change in Assessment Practice & Experience), University of Hertfordshire: Responding to national and institutional concerns regarding assessment and feedback, the project worked with two Schools to develop assessment for learning activities to enhance the assessment experience for learners and staff.
  • ISCC (Information Spaces for Creative Conversations), Middlesex University and City University: Addressed a recurrent problem in design education of students sometimes being disengaged from key creative conversations, a problem that can be exacerbated by learning technologies.
  • KUBE (Kingston Uplift for Business Education), Kingston College: Set out to enhance the learning experience of students studying on higher-level business education programmes delivered at Kingston College on behalf of Kingston University.
  • MAC (Making Assessment Count), University of Westminster: Enhanced curriculum delivery through the development of an innovative assessment feedback process, eReflect.
  • Springboard TV (An internet TV station to enrich teaching & learning), College of West Anglia: Set out to address challenges associated with recruitment, learner satisfaction, engagement, progression and employability by designing an innovative learner journey delivered in a simulated TV production and broadcast environment.
  • “Action research is a form of enquiry that enables practitioners everywhere to investigate and evaluate their work” [1]. For DUCKLING, the cycle started by ‘observing’ via a baseline study of staff, students and employers from both disciplines involved in the project. This was conducted using surveys and interviews and addressed the challenges of course delivery. The team ‘reflected’ by analysing the results of the baseline and feeding these back to course teams to inform the course redesign. The ‘action’ involved integrating four technologies into the redesign. This was subsequently ‘evaluated’ by gathering feedback from students and staff, analysing the findings and feeding this back to the course teams to inform any further ‘modifications’. The blue circular arrows show the suggested evaluation cycle that projects within the Assessment and Feedback programme are likely to follow.
[1] McNiff, J. & Whitehead, J. (2006). All you need to know about action research. London: SAGE Publications Ltd. See p. 9.
  • CIPP – Context, Input, Process & Product/Outcome Evaluation. The independent evaluator first devised an Evaluation Plan for Cascade, with evaluation questions, activities, and data-collection methods, and defined measures of success. This was developed after reviewing project documents, such as the JISC call and the project plan, and meeting with the project team. Two days were spent identifying aims and key measures of success for the Cascade focus areas. The relationship of project aims to focus areas, and from there to the evaluation areas, is shown above. This demonstrates how the project aims were used as the starting point for the Cascade evaluation.
  • The programme call notes the focus “is on large-scale changes in assessment and feedback practice, supported by technology, with the aim of enhancing the learning and teaching process and delivering efficiencies and quality improvements”. Projects will be expected to address (at least some of) these areas in their evaluation activities. This will feed into the development of an Evaluation and Synthesis Framework. The Framework will identify broad focus areas, such as those highlighted above. It will include key questions that we anticipate will be answered by project activities, but also the key evaluation questions projects highlight in their evaluation plans.
  • Adapted from the HEA Enhancement Academy Evaluation and Impact Assessment Approach. 1. What do you want to happen as a result of your project’s activities? (Intended outcomes) 2. Who and/or what might your activities impact on? (Levels of indicators) 3. How will you know you have achieved your intended outcomes? Or what indicators could you use to demonstrate what you have achieved? (Indicators) Remember that it may not be possible to evaluate the full range of project activities. (How might you investigate why the impact occurred?)
  • Graphic downloaded from the American Evaluation Association (AEA) eLibrary: http://bit.ly/evalsignificance
  • Adapted from Davidson (2009), who also provides a useful evaluation questions “cheat sheet”:
1. What was the quality of the project’s content/design and how well was it implemented?
2. How valuable were the outcomes to participants (students, lecturing staff, administrators)? To the institution, the community, the economy?
3. What were the barriers and enablers that made the difference between successful and disappointing implementation and outcomes?
4. What else was learned (about how or why the effects were caused/prevented, what went right/wrong, lessons for next time)?
5. Was the project worth implementing? Did the value of the outcomes outweigh the value of the resources used to obtain them?
6. To what extent is the project, or aspects of its content, design or implementation, likely to be valuable in other settings? How reusable is it elsewhere?
7. How strong is the project’s sustainability? Can it survive/grow in the future with limited additional resources?
Davidson, E.J. (2009). Improving evaluation questions and answers: Getting actionable answers for real-world decision makers. Presented at the American Evaluation Association conference, Orlando, Florida. Retrieved November 20, 2009, from http://comm.eval.org/EVAL/EVAL/Resources/ViewDocument/Default.aspx?DocumentKey=e5bac388-f1e6-45ab-9e78-10e60cea0666
  • Adapted from the HEA Enhancement Academy Evaluation and Impact Assessment Approach.
  • This slide was run as a poll within the session.
  • Gill Ferrell spoke to this slide regarding the experience of the Curriculum Design projects with Baselining.
  • Graphic downloaded from the American Evaluation Association (AEA) eLibrary: http://bit.ly/evalcorrelation