
Designing useful evaluations - An online workshop for the Jisc AF programme (Inspire Research Ltd)


  1. JISC Assessment & Feedback Programme online workshop, 7th November 2011
     Designing useful evaluations
     Dr Rachel A Harris
  2. Aims of the session
     • Look at the evaluation cycle.
     • Consider project ‘logic’.
     • Review anticipated outcomes & impact.
     • Appraise which project activities to evaluate.
     • Identify evaluation questions.
     • Review how the evaluation will be undertaken.
     • Find out what other projects are doing!
     • Consider the baseline.
     • Other evaluation issues?
  3. Curriculum Delivery Programme & the role of Evaluation
     The JISC & Becta funded programme invited projects to “transform how they deliver and support learning across a curriculum area through the effective use of technology”.
       - Emphasis on gathering and using evidence
       - Identifying what works
       - Adding value by generating evidence
       - How did projects determine if delivery was ‘transformed’ and the technology was ‘effective’?
     Project outputs: http://jiscdesignstudio.pbworks.com/
  4. Initial evaluation stages
     • Envision potential impact/achievements:
       - potential indicators of impact/achievement,
       - credible evidence of impact, from different stakeholders’ perspectives.
     • Baseline:
       - first set of evaluation data
       - review the current context
       - identify existing practice
       - determine stakeholders’ attitudes
       - clarify challenges (and identify barriers)
  5. A multitude of approaches
     • Action research (Atelier-D, Duckling, KUBE)
     • Independent internal/external evaluator (Cascade)
     • Appreciative inquiry (ESCAPE)
     • Balanced scorecard (COWL)
     • Formative evaluation (MAC)
     • Micro & macro data sources (Springboard TV)
     • Rich qualitative methods (ISCC)
  6. Action Research
     McNiff and Whitehead’s (2006) ‘action-reflection cycle’: Observe → Reflect → Act → Evaluate → Modify → Move in new directions
     [Cycle diagram annotated: Baseline, Pilot, Revisit, Review]
  7. CIPP (Context, Input, Process, Product) Evaluation (Cascade)
  8. Assessment & Feedback Programme
     The programme focuses on “large-scale changes in assessment and feedback practice, supported by technology, with the aim of enhancing the learning and teaching process and delivering efficiencies and quality improvements”.
       - Pedagogic enhancements
       - Workload efficiencies
       - Building the business case
  9. Designing your evaluation (I)
     • What do you want to happen as a result of your project’s assessment and feedback innovation? (Anticipated outputs and outcomes, & anticipated impact)
     • Who and/or what might this innovation impact on? (Stakeholders; think about levels)
     • How will you know the intended outcomes have been achieved? Or: what indicators could you use to demonstrate what has been achieved? (Measures)
  10. What do you want to know? & What indicators could you use?
  11. Report back
  12. Big picture evaluation questions
     • Process evaluation – administrative and pedagogical design, and implementation.
     • Outcome evaluation – value of outcomes.
     • Learning – barriers and enablers, surprises, causal explanations.
     • Overarching questions about value/worth.
     • Forward or outward focused evaluation questions – reusability, sustainability.
  13. Questions ‘cheat sheet’
     • How well was the project designed & implemented?
     • How valuable were the outcomes to participants (students, lecturing staff, administrators)? To the institution, the community, the economy?
     • What were the barriers and enablers?
     • What else was learned?
     • Was the project worth implementing?
     • To what extent are the project’s content, design or implementation likely to be valuable elsewhere?
  14. Share your evaluation questions
     Some examples:
     • What value do students and tutors gain from using spoken e-feedback?
     • How has providing spoken rather than written feedback impacted on tutor workload?
     • What value did teachers and students gain from collaborating long-term on assessment and feedback?
     • What impact has providing ‘balance of assessment’ data had on programme teams?
  15. Designing your evaluation (II)
     • How will you find out whether the intended outcomes have been achieved? (What, how and when evidenced)
     • How will you use this information?
     • Who else might this information be of use to?
     • What other questions should be considered?
  16. Existing sources
     What existing sources of evidence are you planning to use?
     • Previous course evaluations
     • Departmental timesheets
     • Data from previous studies
     • Centrally held institutional data
     • NSS data
     • No plans to use existing data
     • Others?
  17. How will you find out?
  18. The Baseline
     • Where are you starting from?
     • Qualitative versus quantitative evidence
     • Hitting a moving target
     • Previous projects’ baselines: http://jiscdesignstudio.pbworks.com/w/page/46422956/Example%20baseline%20reports
  19. What happens next?
     • Reflect on your thinking from today.
     • Review the feedback on your draft plans.
     • Consider how you might want to refine your evaluation plans.
     • Use your evaluation questions to inform baseline activities.
     • Programme mapping!
  20. Any questions?
  21. Programme Evaluation Resources
     https://programmesupport.pbworks.com/w/page/42511148/Evaluation%20Resources
     Dr Rachel A Harris
     twitter: raharris
     email: rachel@inspire-research.co.uk
     web: www.inspire-research.co.uk
