
UKOER synthesis & evaluation



Presentation for UKOER Phase 2 projects in Elluminate - October 2010

Published in: Education


UK OER 2: synthesis and evaluation

Allison Littlejohn, Helen Beetham, Lou McGill, Isobel Falconer, Claire Carroll
Caledonian Academy
Glasgow Caledonian University, UK
Vision: Extending reputation

Evaluation and Synthesis Team
- Allison Littlejohn
- Isobel Falconer
- Helen Beetham
- Lou McGill
Aims of the session
- Introduce the evaluation & synthesis framework developed during the pilot phase
- Identify pertinent evaluation questions for this phase
- Consider how you will gather evidence to answer your questions, and how we will support you to do this
- Prioritise issues for future Elluminate sessions
Focus areas
- Approaches to OER release
- Developing, managing and sharing OERs
- Expertise
- Business cases and benefits realisation
- Cultural issues
- Roles, rewards and division of labour
- Legal issues
- Technical and hosting issues
- Quality issues
- Pedagogy and end-use issues

Which areas do you need more support/information about?
Which areas do you expect to enhance with new knowledge and evidence?
(10 minutes to think about this and comment in the text chat box.)
Rationale for a framework
A structure that reflects the key dimensions of the programme:
- provides a foundation and common language for collating data
- offers a range of questions and issues to support evaluation and review
- supports the collation of key messages, challenges, solutions and outputs
- enables the identification of key areas of interest and highlights useful approaches

What do we mean by 'issues'?
- Issues for JISC/HEA programme managers: questions that have an answer for this programme; problems that have a solution for this programme. E.g. 'We're having problems getting our consortium agreement signed off – do you have any suggested wording?'
- Issues for programme support: problems that need expert advice from those working on OER. E.g. 'We have a number of approaches to hosting that we're considering – can you advise us of the pros and cons?'
- Issues for evaluation: questions that don't have a clear answer (yet) but that the programme is investigating. E.g. ...
Evaluation example: global issues (benefits, uses and impacts)
- What are sustainable business/benefit models, building on the Pilot phase and the Good Intentions report?
- What are the benefits of OER in the current financial climate? Do they offer economies of scale?
- How can we ensure learners make the best use of OER? What are the opportunities for learners to share content?
- How are OERs used by academics in their own teaching? What new skills and expertise are needed?
- Will OER contribute to a growth in online/distance learning and new pedagogies (will the HEFCE Online Learning Task Force work enable this)?
- Who are OERs showcased to, and with what impact?
- What kinds of communities benefit from OER sharing and/or have existing 'open' practices?

Evaluation example: local issues (strand-specific and contextual issues)
- What is the impact of cascade activities on capacity and organisational readiness?
- How are Web 2.0 tools best used to maximise discoverability and re-use?
- How are repositories – including JorumOpen, local and community repositories – being used, and how could they be used more effectively?
- What is the impact of collection/collation on discovery and re-use?
- What is the impact of OER release in specific topic areas to meet sector challenges?
- What can case studies and tracking data teach us about uses, benefits and impacts?

Activity
- Which evaluation issues are you interested in? Try to re-frame questions in a way that is specific to your project. Make the question tractable – i.e. something you can realistically find evidence to answer through the work of your project (NOT a separate research study!)
- What evidence do you expect to be able to gather to help answer or enrich our understanding of this issue? Make your proposals for evidence gathering realistic within the time and resources you have available.

See our handout on evidence for the kind of things we mean.
Approaches: supporting evaluation activities
- Projects will be paired up to provide mutual support and share experience (light touch)
- Each pair will be offered support from one of us

Activity (in task bar)
- How could pairing up with a project help you? What activities would be useful?
- What support do you think you need?
Project pairing: possible activities
- Check evaluation plans are sensible, credible, valid, do-able and relevant to programme aims
- Provide peer review of content, interfaces and other practical outputs from the projects
- Share ideas, experiences and interim outcomes of evaluation
- Swap time and expertise, e.g. analyse data for one another (extra objectivity)
- Share resources, e.g. external consultants; buddy up for workshops and critical moments (writing up etc.)

Role of the evaluation team
Support projects in evaluation work:
- Wiki, blog, evaluation resources, one-to-one support
- Collate evidence across projects

Evaluate the programme overall against its original objectives:
- OERs released and collected
- Practices around OER reviewed/reformed/cascaded
- Lessons learned about OER release, management, discovery and use
- Benefits in terms of challenges and stakeholders

Report on findings, e.g. alternative approaches to OER, approaches that are sustainable and usable, evidence of uptake and use, identified benefits to users.
Link with other members of the support team and across strands, including with the 'users' study.
Wiki -
Blog -
Activity
Re-visit the evaluation questions you listed earlier.
- Are these already answered on our wiki? Go and find one Pilot Phase output that will help you.
- Do you need to refine your questions?
- Do we need to refine the framework?
- Have you already made connections with another project – would it be useful to pair up with them?
Innovating e-Learning: JISC online conference, 23rd–26th November 2010
Bringing innovation to life: from adversity comes opportunity
25th November: Sustaining OER innovation through collaboration and partnership
Simon Thomson (Leeds Metropolitan University) and Andy Beggan (University of Nottingham)
Priorities for future Elluminate sessions
What should future Elluminate sessions focus on?