
ROER4D Impact Studies Workshop


A record of the Impact Studies workshop held at Wawasan University, 1-5 December 2014



  1. 1. OER Impact Studies Workshop Dr. Rob Farrow 1-5 Dec 2014 Penang, Malaysia
  2. 2. ROER4D Impact Studies Workshop Day 1
  3. 3. Introduction Rob Farrow
  4. 4. @philosopher1978
  5. 5. #oerrhub
  6. 6. Goals of workshop facilitation • Shared understanding of ROER4D-IS • Harmonization of impact studies • Sharing the OERRH experience • Refinement of ROER4D-IS proposals
  7. 7. Methods for workshop facilitation • Critical discussion of existing research • Peer review • Facilitating reflection on methods and claims • Exploration of key concepts • Making explicit what is assumed or implied • Identifying problematic areas • Effective planning
  8. 8. Things to avoid ✖ Dictating what methodology should be ✖ Being disrespectful or patronising ✖ Pleasing me
  9. 9. Icebreaker Where in the world? • Name, institution, country • One key question • Swapping places to present partner institutions
  10. 10. Overview of ROER4D-IS (CHW) • Overview of objectives, activity, progress • Expectations of impact studies [see other slide deck]
  11. 11. ROER4D Objectives • Build an empirical knowledge base • Develop research capacity • Build scholarship networks • Open curation of research • Communicate research to influence policy
  12. 12. ROER4D Strategies • Knowledge building (degrees of openness, OA) • Building research capacity (harmonization) • Building networks through conferences, workshops, etc. • Open curation (repositories, social media) • Collaborative, supportive approach to leadership • Seeking out creative synergies • Effective (agile?) methods for collaboration • Iterative evaluation
  13. 13. Expectations of ROER4D-IS • Case studies provide detail relative to the broad understanding of the Global South developed through survey work and ROER4D as a whole • Balancing needs of the network with individual needs • Open by default: CC-BY, open data, OA publishing
  14. 14. Lunch, then Presentation by grantees
  15. 15. AVU / Teacher Education in Sub-Saharan Africa • Need for trained teachers and an updated curriculum • OER offer promise of addressing issues of access, quality and cost • AfDB / UNDP resources in core subjects (Teacher Education) • Fullan (2006) theory of change underpins change knowledge • Examination of the conditions that sustain OER use • Comparative analysis across 12 institutions • Participatory approach to the research; qualitative data; phenomenology
  16. 16. Darakht-e Danesh / Afghanistan • Conflict has destroyed educational infrastructure • OER give educators independent access to content • OER support much-needed adaptation and localization • DD Library accessed via web, e-learning lab and mobile • “Effective measurement” of impact on teaching quality • Assumption that access to CPD resources will improve learning outcomes (via improved literacies/competences) • Survey-based approach (which questions?) supported by analytics from learning lab and website access; student records • Theory of change: how is openness playing a role?
  17. 17. OER Impact in Asian non-formal ed. / Mongolia, India • Plurality of ‘impacts’ (knowledge, skills, aspirations, attitudes) on learners and trainers from various OER types • Focused on strategies for collaboration and sharing between formal and non-formal learning providers • Identify policies that improve quality and affordability of learning • Using Bennett’s (1979) hierarchy of outcomes to evaluate impact • Performance indicators = quantitative, qualitative, financial • Open = openly licensed? (If not, what?)
  18. 18. OER in teacher education / OU Sri Lanka • Action Research methodology (communities of practice) • Fullan (1993) as a framework for understanding change • 4 hypotheses: changing pedagogical beliefs & practices; reducing the cost of learning; improving the quality of learning • Running workshops to raise awareness • Stakeholders: learners, teachers from six provinces & various levels of study, subjects, etc. (nb. teachers as learners) • Interpretative Phenomenological Analysis (IPA) as organising framework for the qualitative data collected – emerging themes / meanings
  19. 19. OU UK / Teacher Education in E. Africa • Some research suggests that ‘quality’ teachers improve learning • National policies advocate ‘learner-centred’ education but this is vague • Focus on co-construction of knowledge as a feature of openness • TESSA is a consortium of OER-producing universities & other organizations that developed a repository of OER for teacher learning • Practitioner responses to OER – attitudinal? Wider changes? • 5 institutions: qualitative data; interpretation; phenomenology • Ontological & epistemological ‘shifts’ – is this clear? • How precise a conception of openness is appropriate here?
  20. 20. Practices and Openness in African HE / UCT • Global South tends to be seen as a recipient rather than a provider • UCT has several MOOCs available or in production (FutureLearn) • Various dimensions of openness: access, licensing, instruction • Impact of MOOCs on educator and student practice & views of open • Impact of MOOCs on valuing and repurposing of OER • How MOOCs initiate OER use and creation • Methods: surveys, interviews, learning analytics, case studies • Attempt to map research questions to the MOOC development cycle
  21. 21. Cost-Effectiveness Analysis of OER / U Philippines OU • Comparison of open vs non-open course development costs • Quasi-experimental research design • Participants chosen randomly from three disciplines (education, health, management) • Strict separation of OER vs standard groups • Measuring: teacher competence; learner performance; quality of materials – but how? Key indicators around savings per unit, efficacy
  22. 22. Virtual University Pakistan / Impact of OER in Pakistan • Study split between two institutions • Target of 88% ‘literacy’ by 2015 – only 60% at the moment • Internet access and use is rising (nb. laptop scheme) • Focus on lecture delivery; student performance; policy • Large-scale survey augmented by interviews • Using Fullan’s theory of change • COUP framework to assess cost difference and impact on student outcomes
  23. 23. ROER4D Impact Studies Workshop End of Day 1
  24. 24. ROER4D Impact Studies Workshop Day 2
  25. 25. Enhancing Research Value Between OER Practitioners across the Global North/South Divide Through Open Collaboration Dr. Rob Farrow
  26. 26. • Research project at The Open University (UK) • Funded by William & Flora Hewlett Foundation for two years • Tasked with building the most comprehensive picture of OER impact • Organised by eleven research hypotheses • Collaboration model works across different educational sectors • Global reach but with a USA focus • Openness in practice: methods, data, dissemination OER Research Hub #oerrhub
  27. 27. Project Co-PILOT
  28. 28. Keyword / Research Hypothesis: • Performance: OER improve student performance/satisfaction • Openness: People use OER differently from other online materials • Access: OER widen participation in education • Retention: OER can help at-risk learners to finish their studies • Reflection: OER use leads educators to reflect on their practice • Finance: OER adoption brings financial benefits for students/institutions • Indicators: Informal learners use a variety of indicators when selecting OER • Support: Informal learners develop their own forms of study support • Transition: OER support informal learners in moving to formal study • Policy: OER use encourages institutions to change their policies • Assessment: Informal assessments motivate learners using OER. ‘Evidence’ is only evidence in relation to a claim or hypothesis: the project hypotheses form the core of the metadata model.
  29. 29. OER Evidence Report 2014
  30. 30. OER Impact Map
  31. 31. • Research instruments applied consistently across collaborations: surveys, interview questions, focus groups, etc. • Supplemented by integration of secondary research • ‘Agile’ research, sprinting • Thematic and methodological cohesion provided by research hypotheses Research Process
  32. 32. • Synthesis and aggregation of other case studies • Sharing networks, resources and experiences • Comparisons with Global North • Initial agile enquiry through OLnet, SCORE and OERRH fellows networks • Capacity for further, responsive research Essence of the proposal
  33. 33. Synthesis Synthesis Methods • Isolating data by hypothesis, sector, country, or any combination • Collaborative curation of research data • Data visualization, reporting • Editorial quality control exercised centrally Validation • Iteration through current and future patterns of evidence • Open citation trails allow public auditing of evidence • Community voting
  34. 34. in service of The Open University
  35. 35. Flowmap changes according to country selected…
  36. 36. Hypothesis Reporting
  37. 37. Interface prototype with 6,000+ OERRH survey responses across 180 countries
  38. 38. Survey Data Explorer (prototype)
  39. 39. Project timeline, Dec 2014 – Jul 2016: D1 Project Plan • D2 Agile Research Guidelines • D3 Methodological Framework • D4 Data Harmonization • D5 Hypothesis Review • D6 ROER4D Impact Map • D7 Competition • D8 ROER4D Data Explorer • D9 Outreach • D10 Open Dissemination • D11 Webinar Programme • D12 Risk & Issues Log • D13 Final Report
  40. 40. Exercise: Activity Theory
  41. 41. Exercise: Activity Theory Scandinavian school of AT seeks to synthesize several approaches, including constructivism; pragmatism & actor-network theory Context is important
  42. 42. Exercise: Activity Theory Scandinavian school of AT seeks to synthesize several approaches, including constructivism; pragmatism & actor-network theory We are not interested here in AT as a tool of analysis or explanation, but as a way of describing the different elements of the socio-technical systems around OER implementation that will be studied in ROER4D-IS Emphasis on tacit knowledge: you know your own context
  44. 44. subject: actor(s) involved in a process • object: purpose of the system • community: social context • instruments: tools & technologies • division of labour: among actors / power • rules: what regulates the system • [outcome: what actually happens]
  46. 46. Exercise: Activity Theory One approach that can work for this exercise is to complete the grid before and after the intervention to broadly identify ‘impact’ • How does it compare to the stated research question? Can we begin to refine? • Similarities and differences between contexts • Share your description
  47. 47. Exercise: Activity Theory Goals of the exercise: • Improved description of the research context • Identification of similarities / differences across case studies • Identifying a possible partner for the peer review exercise on day 3 • Steps towards a general understanding of the Global South context?
  48. 48. Focusing on relation between hypotheses and evidence
  49. 49. For your research hypothesis, what would be the “perfect” evidence or ‘proof’?
  50. 50. What would be the next best thing? … and if everything else failed?
  51. 51. Examples from OERRH Hypothesis: OER improve student performance/satisfaction Gold: Longitudinal study pre/post OER intervention grades; control of all variables
  52. 52. Examples from OERRH Hypothesis: OER improve student performance/satisfaction Silver: proxy data from surveys (confidence, interest, motivation, etc.)
  53. 53. Examples from OERRH Hypothesis: People use OER differently from other online materials Gold: Covert tracking of openly licensed vs non-open materials
  54. 54. Examples from OERRH Silver: Triangulation of survey questions around behaviours Bronze: Anecdotal evidence; interviews; focus groups
  55. 55. Examples from OERRH Hypothesis: Open education acts as a bridge to formal education Gold: repository analytics with click through to formal registration (OpenLearn)
  56. 56. Examples from OERRH Silver: Triangulation of survey questions around attitudes Bronze: Anecdotal evidence; interviews; focus groups
  57. 57. For your case study / hypotheses… Gold: Pre/post intervention – but what are the metrics? • Attitudinal data where this is appropriate for the hypothesis • Concept mapping to illustrate changing pedagogical beliefs • Rich qualitative description of change • A theory of change that can explain patterns in findings • Establishes a causal relationship between intervention and effect. Silver: Establishing relationships of correlation • Proxies from survey data. Bronze:
  58. 58. We will pick up on this again when we look at risk assessment (Day 4)…
  59. 59. Thinking through key terms and language
  60. 60. Impact as… • a change over time (in what?) • influence (on what?) • negative / positive / neutral • immediate vs medium-long term • intended vs unintended consequences • direct vs indirect
  61. 61. Open as… • openly licensed • free • online • sharing • participatory • accessible • “unfettered” / empowering • openness as general scholarly ethos, open-mindedness • decentralization of knowledge? / democratization (of what?) • a set of practices (OEP) • directed towards social justice / public good?
  62. 62. OER as… • Context of production vs. context of use • Openly licensed resources • Amenable to the 4 Rs • Public domain • Free? (zero cost or freely available?) • Educational! • Designed to support learning?
  63. 63. Two issues in OER impact research: 1. No agreed definition/metrics for ‘impact’ 2. Isolating particular influence of openness on educational outcomes
  64. 64. OERRH strategies for amelioration: 1. Holistic, agile approach to data collection 2. Embrace multiplicity of interpretations
  65. 65. ROER4D strategies for amelioration: Theories of change 1. Sharing → increased access → better lessons / student performance 2. Viral openness / enacted practice leads to participation 3. OER production encourages an important kind of collaboration
  66. 66. ROER4D strategies for amelioration: Theories of change 4. Participation in 4 Rs changes / challenges epistemological assumptions 5. Adaptation influences quality 6. Local adaptation makes resources more locally relevant 7. Integrating OER into teaching leads to changes in practice
  67. 67. ROER4D strategies for amelioration: • Shared understanding of OER as free & openly licensed • Making explicit the interpretation of openness used in context • Precise indicators of OER impact (direct/indirect) • Clarity with regard to the rationale, conceptual framework and methodologies used
  68. 68. ROER4D Impact Studies Workshop End of day 2 • Homework = prepare for peer review of proposals (ideally pair up based on shared elements, e.g. sector, geography, hypotheses)
  69. 69. Self-critique of proposals • Any questionable assumptions? • Suggestions for improvement? • Can it be made clearer?
  70. 70. ROER4D Impact Studies Workshop Day 3
  71. 71. ROER4D Impact Studies Workshop Day 3 • It gets easier from here as we move from difficult conceptual issues to refining existing proposals
  72. 72. ROER4D Impact Studies Workshop Day 3 Critique of proposals • Any questionable assumptions? • Suggestions for improvement? • Can it be made clearer?
  73. 73. AVU / Teacher Education in Sub-Saharan Africa • Lack of trained teachers / limited teaching resources for teacher education • Curriculum for maths and science is localized, but to what extent is this integrated into policy and practice? • 10 institutions with potential to further examine impact on teachers in training – how is localization affecting them? • Main challenges around finding data that can illustrate the relationship between OER use and outcomes around teacher training • What kind of evidence? Curriculum adaptation (changes in learning design?) plus descriptions of adaptation
  74. 74. Darakht-e Danesh / Afghanistan • Making it clearer how openness plays a role through collaboration • Making sure that this focuses on openness rather than just being a general evaluation of the DD platform • Indicators – site analytics to measure uptake, but how are improved knowledge, learning and practice going to be measured? • Differences in patterns of access / use according to gender should be especially interesting here
  75. 75. OER Impact in Asian non-formal ed. / Mongolia, India • Do these materials count as OER if they are not openly licensed? Shall we just adopt the 4 Rs model? Hewlett definition? • Prioritising impact as a theme • Relating policy to practice through key PIs • For each hypothesis: identify indicators and relate to the theory of change • Difficulty of accessing farming materials in the Mongolian and Tamil languages • Are the OER used ‘native’ or coming from outside the community? • Learning analytics? How will this happen? Contingency plans? • Need to allow adequate time for institutional approval
  76. 76. OER in teacher education / OU Sri Lanka • Methods = teaching observations; interviews; activity logs • Looking for evidence of pedagogical change: learning design • Evidence expected from analysis of teaching materials used
  77. 77. OU UK / Teacher Education in E. Africa • Variable uptake of OER, some already in place – how is this influencing practices? • How are teacher trainees changing their understanding of ‘knowledge’ and their own practices as a result of TESSA? • Teacher educators from 5 institutions will provide data through a survey – data will be concept-mapped and used as the basis of interviews, etc.
  78. 78. Practices and Openness in African HE / UCT • Lecturers express difficulty in making MOOC materials open • What is the influence of OER on the pedagogies used? • Do MOOC structures require openness? If so, what does this mean? • What is the impact of use/creation of OER on other aspects of pedagogical practice? • Evidence expected from a baseline survey of lecturer behaviour/attitudes as well as from analysing the various artefacts that are created and shared
  79. 79. Cost-Effectiveness Analysis of OER / U Philippines OU • Interest in quality of OER, impact on cost/access • Will focus on how faculty choose and adapt OER • Mandatory openness – faculty may not use copyrighted materials and are even obliged to become OER creators • Distinguishing direct/indirect costs? • Cost-benefit analysis may be most difficult in year one as start-up costs will be applied in this year • Purposive rather than randomized sampling because only some courses are using OER
  80. 80. Virtual University Pakistan / Impact of OER in Pakistan • There is a baseline study around the use of IT resources already in place to provide comparison with OER • High drop-out rate, low-quality textbooks – can these be ameliorated by OER? Compare attrition rates of OER-using and non-OER-using students, but need to be aware of the length of the study relative to the academic year • Surveys to learn about the impact of OER on teaching practice • Potential difficulty of evaluating / comparing textbook quality? • Ability to separate OER and non-OER student cohorts could produce useful comparative data, but need to be clear about the metrics
  81. 81. Plenary discussion: review of proposals • What is the problem to which OER is a potential solution? How? • Importance (and difficulty) of separating the impact of the general intervention from the impact of the ‘open’ elements of the intervention • Different dimensions of openness • Impact of OER on course design and pedagogical methods (how should this be captured?) • What is the process for getting the revised proposals approved? • Can we identify synergies between the impact studies? Can these inform the creation of working groups within the IS?
  82. 82. Plenary discussion: review of proposals • Hypotheses should be as clearly stated as possible • Hypotheses should make clear how the ‘open’ element is under examination • The evidence that is collected should be connected clearly with both the hypothesis and the open element of the hypothesis • All proposals should include a section on the objectives of the study – there is a general objective for all studies (around the impact of OER in the Global South) and some specific objectives related to the research questions • A further objective is concerned with the effective communication of results to influence future practice/policy
  83. 83. Plenary discussion: review of proposals • Seek out opportunities for harmonization between the impact studies • Share methods and research instruments where possible
  84. 84. Individual feedback and planning End of Day 3
  85. 85. ROER4D Impact Studies Workshop Day 4
  86. 86. ROER4D Impact Studies Workshop Day 4 Time for reflection
  87. 87. ROER4D Impact Studies Workshop Day 4 Time for reflection 30 mins: rephrase hypothesis; make the theory of change explicit; describe all relevant aspects of context; methodology. Share by email with ROER4D staff
  88. 88. Harmonization workshop • Harmonization facilitates comparison across ROER4D sub-projects and the wider research literature • Developing a model for best practice in harmonization • Aggregation and categorization of existing OER research surveys • Clarification of concepts with original research teams over 9 months
  89. 89. Research Question Harmonisation in ROER4D – Henry Trotter, ROER4D Impact Studies Workshop, Penang, Malaysia, 4 December 2014
  90. 90. ROER4D Objectives 1. Knowledge building: Build an empirical knowledge base on the use and impact of OER in education 2. Research capacity: Develop the capacity of OER researchers 3. Networking: Build a network of OER scholars 4. Curation & Communication: Curate research documents and communicate research to inform education policy and practice
  91. 91. Research capacitation through Question Harmonisation – 4 goals: • Harmonise our research questions, where possible, with those of other OER studies such as OER Research Hub, OER Asia, JISC OER, etc. • Harmonise our research questions, where possible, across our 12 projects • Use this QH process to build the research capacity of our sub-project researchers and research associates • Provide a model of best practices for other research
  92. 92. 1. Consulted 9 major OER surveys to develop a bank of potential questions…
  93. 93. …and multiple OER studies to compare those questions
  94. 94. 2. Discussed question options, chose the best & recorded rationale for decision
  95. 95. 3. Shared Qs with researchers, showing how they would appear in survey form
  97. 97. 4. Engaged with researchers online via Adobe Connect to harmonise questions 15 synchronous sessions over 9 month period
  98. 98. …but to do so, we had to work out everyone’s time zones & best meeting time
  101. 101. 5. Continued discussion off-line via discussion forum and/or email
  102. 102. 6. Harmonised concepts as part of process (via Adobe Connect & Google Docs)
  105. 105. 7. Piloted survey based on harmonised questions with ROER4D members and other OER colleagues (version 1)
  106. 106. 8. Assessed results and gave feedback to researchers on pilot survey
  107. 107. 9. Revised the questions and shared them with network (version 2) …
  108. 108. …providing access to all QH session videos & docs that went into the process
  109. 109. 10. Enjoined researchers to share their adaptations of the harmonised survey for their own sub-projects via webinar sessions…
  110. 110. …and recruited some of them to share their research knowledge and experience with us next year during the bi-weekly Adobe Connect sessions. Evaluation Question: What research skills could YOU contribute to research capacity building? • Formulating research instrument questions (5): Cheryl Hodgkinson-Williams (research questionnaire development); Meenu Sharma (developing research instruments); Sanjaya Mishra (scale development); Mohan Menon (development of research tools); Jose Dutra (instrument development) • Analysing qualitative data (2): Cheryl Hodgkinson-Williams; Tess Cartmill (using NVivo) • Developing a conceptual framework (2): Cheryl Hodgkinson-Williams; Meenu Sharma • Report writing (2): Sukaina Walji; Meenu Sharma • Writing a research question (1): Cheryl Hodgkinson-Williams • Presenting research work (1): Sukaina Walji • Analysing quantitative data (1): George Sciadas
  111. 111. Outcomes (positive) 1. Through extensive collaboration, deliberation and testing, we developed a set of questions that were: • well-harmonised with other large OER surveys • sensitive to and adapted for the Southern context • successful at obtaining useful data on academics’ creation and use of OER 2. The process allowed us to sharpen and harmonise our concepts, creating a better understanding of the terms that we use across the entire project. 3. It created a strong sense of community amongst the researchers that participated, a valuable outcome given that many feel alone as OER researchers in their contexts. (This also helped fulfill ROER4D’s third objective, which is to build a network of OER scholars.) 4. Increased the research capacity of many of the scholars that participated, which was the broader objective of this question harmonisation effort.
  112. 112. Outcomes (negative) 1. Research capacitation was uneven for a variety of reasons. Some researchers: • were unable to attend due to time conflicts • were uninterested in the process • missed the point of the exercise (despite attending sessions) • did not avail themselves of support structures outside the webinars (mentors, etc.) to shore up the knowledge or concepts to which they were exposed. 2. The technology (especially Adobe Connect and our institutional broadband connections) often let us down, turning vibrant conversations into clunky, painful interactions. 3. The process took longer than anticipated. 4. The sub-project which could have benefited most from this process, and utilised the harmonised questions in a powerful and extensive manner, essentially decided not to use them, thereby reducing the impact that the process could have had.
  113. 113. Lessons learned What worked? 1. Having regular sessions: the consistency of the process was crucial for creating the opportunities necessary to build research capacity and to develop a sense of community amongst participants. 2. Inviting researchers to share their own work: this allowed members to get valuable feedback and to feel “heard” by their peers. 3. Working collaboratively and “openly” (within the project): the transparency of the process – especially the network team’s creation of “public” Google docs with which researchers could engage – created greater credibility and accountability, enhancing members’ buy-in. What didn’t work? 1. The “voluntary” model: for practical and pedagogical reasons, we chose to make this a voluntary process, but this resulted in uneven attendance and interest. 2. Initiating the process after other key issues had already been decided: the process would likely have run more smoothly if it had been built into the programme from the beginning, with clear expectations.
  114. 114. So the question is… Would some sort of question or concept harmonisation process be useful for the ROER4D Impact Studies group? And if so, how would it work?
  115. 115. Harmonization of Impact Studies • Working from a common vocabulary (n.b. translation issues; getting caught in semantics) • Shared methods for shared hypotheses? • Use existing ROER4D survey questions where possible • Problem of differing research paradigms / assumptions / contexts • Thematic classification of results • Harmonization of research processes?
  116. 116. Harmonization of Impact Studies • OER can expand access to education (4) (n.b. formal / informal) • Local adaptation of OER leads to improvement in learning (6) • Exposure to OER leads to open practice (6) • Reuse/re-purposing leads to changed pedagogy (7) • Integration of OER improves quality of teaching resources (3) • OER can provide alternative perspectives that are useful for teaching and learning (2) • OER use reduces student attrition (in public schools) • Key concepts: openness; impact; quality; access; reuse; repurposing; adoption; cost; adaptation; practice • Adoption team to share work already done in the area of concept mapping
  117. 117. ROER4D Impact Studies Workshop Day 4 Review of contemporary research
  118. 118. Examples of exemplary OER research • There are none! • Wiley (2009) ‘Decade of Development’ – history of the OER movement • McAndrew et al. (2012) Assessing OER impact… (Bridge to Success) • CHW (2014) ‘Degrees of Ease…’ • Schaffhauser (2014) 5 ideas for spreading OER / 5 myths of OER
  119. 119. Examples of exemplary OER research • Link to ROER4D bibliography • Any references provided by Raj?
  120. 120. What are the features of effective OER research? • Clear research questions • Builds on existing relevant disciplinary knowledge • Context sensitive • Original • Ethical • Robust, clearly articulated design • Clarity around assumptions • Awareness of roles/interests • Clear terminology • Explicit conceptual framework • Clear methodologies • Good analysis • Relevance • Advances thinking in the field • Replicability • Cost-effectiveness • Communication • Awareness of limitations • Reliable
  121. 121. IMPACT research is necessarily empirical (based on experience) … but there is still going to be INTERPRETATION of the data that is collected
  122. 122. ‘Eyes that Survey the World’: the latest data snapshot from OER Research Hub – B. de los Arcos, R. Farrow, L.A. Perryman, B. Pitt, The Open University, UK @OER_Hub
  123. 123. Photo CC BY-SA 2.0
  124. 124. Photo CC BY-NC 2.0 Data • 20+ surveys; • 60+ interviews with educators and OER experts; • 6 focus groups; • Impact statements
  125. 125. Sample Photo CC BY-SA 2.0
  126. 126. 6,390 responses from 180 countries: 50.3% informal learners, 24.7% formal learners, 21.6% educators, 3.4% librarians; 50.1% female; 48.7% male; 64% speakers of English as first language; 9.9% declare a disability; 33.3% hold a postgraduate degree; 34.8% use OER in Science.
  127. 127. Photo CC BY-SA 2.0 marfis75; Photo CC BY-NC 2.0 Alex Proimos
  128. 128. ‘OER increase student satisfaction with the learning experience’
  129. 129. ‘OER lead to improved student grades’
  130. 130. “Over the course of an entire semester all the kids turned in on average 82% of their homework, which is significant for me as an instructor because that made me feel that what I was asking them to do at home, (…) whatever it happened to be, that they saw the meaning in doing that.” “The greatest impact comes when I share the MERLOT website with students. They instantly connect with others who share their best practices. Then they develop their own best practices to share with their students and colleagues. There is such a great ripple effect when people are willing to share; especially when the information is easy to locate.”
  131. 131. Photo CC BY-SA 2.0 Photo CC BY 2.0
  132. 132. Photo CC BY-SA 2.0 86.3% of educators adapt OER to suit their needs
  133. 133. “The problem where I teach now is that we have no money; my textbooks, my Science textbooks are 20 years old, they’re so outdated, they don’t relate to kids (...) so I pick and pull from a lot of different places to base my units.” “I will maybe look and find an instructional video that’s maybe 2 or 3 minutes long that gets to the point better than I could, and I would use it, or I will look for lessons and if they are for Grade 5 or Grade 3 I don’t use all of it, I just adapt it, I take out what I don’t want and rearrange it.” “What I do is I look at a lot of free resources but I don’t usually give them directly to my students because I usually don’t like them as much as something I would create, so what I do is I get a lot of ideas.”
  134. 134. • I’ve created resources 95% • I’ve created resources and published them online 44% • I’ve created resources and published them online under a CC license 5% (Flipped Learning)
  135. 135. Reflection Photo CC BY-NC-SA 2.0
  136. 136. ‘I use a broader range of teaching and learning tools’ 40.6% ‘I reflect more on the way that I teach’ 37% ‘I have broadened my coverage of the curriculum’ 36.7% ‘I more frequently compare my teaching with others’ 32.1%
  137. 137. “It used to be that when I thought about preparing for a lesson I would look at a book and see what they did and I then would kind of teach a lesson similar to it but now I can go online watch a video or look at somebody else’s material that they put out there, see what they’re doing and either modify what they’re doing and bring it into my classroom or just get a totally different perspective on it and allow my students to get multiple perspectives on a topic.”
  138. 138. Photo CC BY-SA 2.0
  139. 139. Do students save money using OER?
  140. 140. Do institutions save money using OER?
  141. 141. “Down the road they may. Students talk to other potential students. When they find out that teachers care about cost and readability, they are more likely to choose your college” “Since we are all using online version, the school saves a lot of paper and money” “Without any doubt my students are saving money! Only one has purchased a copy of the textbook - everyone else uses their laptop, tablet, or prints out what they want.”
  142. 142. Photo CC BY-NC-ND 2.0
  143. 143. 57% of informal learners already have a degree 31% of formal learners used OER to try university-level content before signing up for a paid-for course 88.4% of all learners choose OER for the opportunity to study at no cost
  144. 144. Evidence Photo CC BY-SA 2.0
  145. 145. ‘COUP’ Framework The COUP is the Open Education Group’s framework for studying the impact of open textbooks, open educational resources, and open pedagogy in secondary and post-secondary education. COUP stands for: - Cost - Outcomes - Use - Perceptions
  146. 146. ‘COUP’ Framework Presentation by David Wiley (2012 OER Asia) Example of use
  147. 147. John Hilton III’s slides from Open Education 2014 (thanks for sharing, John!)
  148. 148. A Review of Research on the Perceptions and Efficacy of OER (and a call for more!) John Hilton III Open Education Group
  149. 149. Problem A recent nationally representative survey of 2,144 faculty members in the United States found that “most faculty remain unaware of OER.”  Source: Babson 2014 Survey, “Opening the Curriculum.”
  150. 150. Possible Solutions  Increasing efforts to “market” OER.  Increasing the number of outstanding OER materials.  Increasing the number of academic, peer-reviewed studies regarding the efficacy and teacher and student perceptions of OER materials.
  151. 151. Increasing the number of academic, peer-reviewed studies regarding the efficacy and teacher and student perceptions of OER materials. The Babson 2014 survey found that college professors rate “proven efficacy” and “trusted quality” as the number 1 and number 2 most important criteria for selecting teaching resources.
  152. 152. Published Efficacy and Perception Studies 1. Article focused on efficacy or perception in actual practice (not simply theory). 2. The resource(s) examined in the study needed to be OER that were the primary learning resource(s) used in the class. 3. In order to be selected for inclusion in this study, the research needed to have been published by a peer-reviewed journal, or be an institutional research report. Blog posts and conference proceedings were excluded from this data set.
  153. 153. References  Allen, I., & Seaman, J. (2014). Opening the Curriculum: Open Educational Resources in U.S. Higher Education, 2014.  Bliss, T., Robinson, T. J., Hilton, J., & Wiley, D. (2013). An OER COUP: College teacher and student perceptions of Open Educational Resources. Journal of Interactive Media in Education, 1–25.  Bowen, W. G., Chingos, M. M., Lack, K. A., & Nygren, T. I. (2012). Interactive Learning Online at Public Universities: Evidence from Randomized Trials. Ithaka S+R.  Feldstein, A., Martin, M., Hudson, A., Warren, K., Hilton, J., & Wiley, D. (2012). Open textbooks and increased student access and outcomes. European Journal of Open, Distance and E-Learning.  Hilton, J., Gaudet, D., Clark, P., Robinson, J., & Wiley, D. (2013). The adoption of open educational resources by one community college math department. The International Review of Research in Open and Distance Learning, 14(4), 37–50.  Hilton, J., & Laman, C. (2012). One college’s use of an open psychology textbook. Open Learning: The Journal of Open and Distance Learning, 27(3), 201–217.  Lindshield, B., & Adhikari, K. (2013). Online and campus college students like using an open educational resource instead of a traditional textbook. Journal of Online Learning & Teaching, 9(1), 1–7.  Lovett, M., Meyer, O., & Thille, C. (2008). The Open Learning Initiative: Measuring the effectiveness of the OLI statistics course in accelerating student learning. Journal of Interactive Media in Education, 2008(1).  Pawlyshyn, Braddlee, Casper and Miller (2013). Adopting OER: A case study of cross-institutional collaboration and innovation. Educause Review.  Petrides, L., Jimes, C., Middleton‐Detzner, C., Walling, J., & Weiss, S. (2011). Open textbook adoption and use: Implications for teachers and learners. Open Learning, 26(1), 39–49.  Robinson, T. J., Fischer, L., Wiley, D. A., & Hilton, J. (2014). The impact of open textbooks on secondary science learning outcomes. Educational Researcher, 43(7), 341–351.  Wiley, D., Hilton, J., Ellington, S., & Hall, T. (2012). A preliminary examination of the cost savings and learning impacts of using open textbooks in middle and high school science classes.
  154. 154. Efficacy and Perception Studies 1. Lovett et al. (2008) measured the results of implementing an online OER component of Carnegie Mellon University’s Open Learning Initiative (OLI). Over two semesters, forty-four students utilized the OER as part of this study. Researchers compared the test scores (three midterms and one final exam) of students who took the traditional course with those of students who utilized the OER materials. They found no significant difference between the two groups.
  155. 155. Efficacy and Perception Studies 2. Petrides et al. (2011) surveyed instructors and students who utilized an open statistics textbook called Collaborative Statistics. In total, 31 instructors and 45 students participated in oral interviews or focus groups that explored their perceptions of the OER they had utilized. The researchers found that “Cost reduction for students was the most significant factor influencing faculty adoption of open textbooks” (p. 43), partly because it increased student access. 65% of students surveyed reported a preference for using open textbooks in the future because they are generally easier to use.
  156. 156. Efficacy and Perception Studies 3. Bowen et al. (2012) compared the use of a traditional statistics textbook with Carnegie Mellon’s OLI at six different institutions. Participating students were randomly assigned to either a face-to-face class with a traditional textbook or a “hybrid” class that used the OER. Both groups took the same standardized test at the beginning and end of the semester, as well as a final examination. 605 students took the OER version of the course, while 2,439 took the traditional version. Students who utilized OER performed slightly better on the standardized exam than those who did not. However, the difference in outcomes was not statistically significant.
  157. 157. Efficacy and Perception Studies 4. Hilton and Laman (2012) focused on introductory psychology courses taught at Houston Community College (HCC). In the fall of 2011, twenty-three sections composed of 690 students used an open psychology textbook. The textbook was available for free online, and digital supplements produced by faculty were also freely available to HCC students. The introduction of an open textbook was correlated with an increase in class grade point average, an increase in the average score on the departmental final examination, and a lower course withdrawal rate. No causation was claimed.
  158. 158. Efficacy and Perception Studies 4. (Cont.) One hundred and fifty-seven students completed surveys regarding their perceptions of the OER. 84% of students surveyed agreed with the statement that “Having a free online book helps me go to college.”
  159. 159. Efficacy and Perception Studies 5. Wiley et al. (2012) examined the standardized test scores of students using the open textbooks in secondary science classes in three different school districts. Approximately 1,200 students used open textbooks during this study. Researchers examined their end-of-year standardized test results and found no apparent differences between the results of students who used traditional and open textbooks.
  160. 160. Efficacy and Perception Studies 6. Research by Feldstein et al. (2012) took place at Virginia State University, where OER were implemented across nine different courses in the business department. 1,393 students took courses utilizing OER. Researchers found that students in courses that used OER tended to have better grades and lower failure and withdrawal rates than their counterparts in courses that did not use OER. While their results were statistically significant, the two sets of courses were not identical because of a new core curriculum employed at Virginia State University’s business school. Thus, while these data provide interesting correlations, they cannot establish causality.
  161. 161. Efficacy and Perception Studies 6. (Cont.) Three hundred and fifteen students completed a survey regarding their perspective on the shift to the OER. Almost 95% of responding students strongly agreed or agreed that the OER were “easy to use”, and 78% of respondents felt that the OER “provided access to more up-to-date material than is available in my print textbooks.” Approximately two-thirds of students strongly agreed or agreed that the digital OER were more useful than traditional textbooks and that they preferred the OER digital content to traditional textbooks.
  162. 162. Efficacy and Perception Studies 7. Bliss et al. (2013) studied OER adoption at eight different institutions of higher education. Fifty-eight teachers and 490 students across the eight colleges completed surveys regarding their experiences in utilizing OER. Approximately 50% of students said that the OER materials had the same quality as traditional textbooks, and nearly 40% said that they were better. Students focused on several benefits of the open textbooks: many cited technical advantages of the digital texts, and the free cost of their open texts seemed critical to many students. 55% of teachers reported that the open materials were of the same quality as the materials that had previously been used, and 35% felt that they were better.
  163. 163. Efficacy and Perception Studies 8. Lindshield and Adhikari (2013) studied the perceptions of students who utilized a digital OER textbook in a Human Nutrition class. One hundred and ninety-eight students completed a survey in which they shared their perceptions of the OER text. “Students favorably rated their level of satisfaction, liking the idea of the [digital OER], ease of [digital OER] use, not having to buy a textbook, and preferring the [digital OER] versus buying a textbook for the course.” Moreover they found that students disagreed or somewhat disagreed with statements to the effect that they would like to have a traditional textbook in addition to the OER.
  164. 164. Efficacy and Perception Studies 9. Pawlyshyn et al. report on the adoption of OER at Mercy College. In the fall of 2012, 695 students utilized OER in Mercy’s basic math course, and their pass rates were compared with those of the fall of 2011, in which no OER were utilized. Researchers found that the pass rates increased from 63.6% in fall 2011 (when traditional learning materials were employed) to 68.9% in fall 2012 when all courses were taught with OER. Similarly, students who were enrolled in OER versions of a reading course performed better than their peers who enrolled in the same course using non-OER materials.
  165. 165. Efficacy and Perception Studies 10. Hilton et al. (2013) chronicle a study that took place at Scottsdale Community College (SCC). In the fall of 2012, OER were employed throughout five different math courses at SCC, affecting 1,400 students. Issues with the initial placement tests meant that only four of the courses could be compared; nevertheless, the results of Fall 2012 (when OER were used) compared to Fall 2011 and 2010 showed that student results on department exams were approximately the same before and after the OER implementation.
  166. 166. Efficacy and Perception Studies 10. (Cont.) Surveys were completed by 910 students and eighteen faculty members at SCC who reported on their view of the OER. The majority of students (78%) said they would recommend the OER to their classmates. Similarly, 83% of students agreed with the statement that “Overall, the materials adequately supported the work I did outside of class” (only 5% of students disagreed with this statement). Faculty members were likewise positive about the open materials. 50% said that it was of the same quality as traditional textbooks, 33% said it was better, and 17% said it was worse.
  167. 167. Efficacy and Perception Studies 11. Robinson et al. (2014) examine the use of open science textbooks in three secondary science subjects across several schools in a suburban school district. This rigorous study used propensity score matched groups in order to control for teacher effect, socioeconomic status, and eight other potentially confounding variables. There were 1,274 students in each condition, treatment and control. In examining the results of the end-of-year state standardized test, there was a small but statistically significant difference between the two groups, favoring those who utilized OER.
  168. 168. Efficacy and Perception Studies 12. Allen and Seaman, in their Babson Survey (2014), surveyed 2,144 college professors regarding their opinions on OER. Of the 34% (729) who expressed awareness of OER, 61.5% said that OER materials had about the same “trusted quality” as traditional resources, 26.3% said that traditional resources were superior, and 12.1% said that OER were superior. 68.2% said that “proven efficacy” was about the same, 16.5% said that OER had superior efficacy, and 15.3% said that traditional resources had superior efficacy.
  169. 169. Synthesizing  In terms of student and teacher perspectives on OER, 2,115 students and 836 faculty members had their perceptions surveyed across the seven studies pertaining to perceptions of OER. In no instance did a majority of students or teachers report that the OER were of inferior quality. Across multiple studies in various settings, students consistently reported that they faced financial difficulties and that OER provided a financial benefit to them. A general finding seems to be that roughly half of teachers and students find OER to be comparable to traditional resources, a sizeable minority believe they are superior, and a smaller minority find them to be inferior.
  170. 170. Synthesizing  7,301 students were reported to have utilized OER materials across the eight studies that attempted to measure results pertaining to student efficacy. While causality was not claimed by any researcher, the use of OER was sometimes correlated with higher test scores and lower failure and/or withdrawal rates. In none of the eight studies that measured efficacy did students who utilized OER perform worse than their peers who used traditional textbooks.
  171. 171. Synthesizing  While some may be disappointed that OER materials have not been found to significantly increase student learning outcomes, this “non-finding” is nevertheless very important.  Given that (1) students and teachers generally find OER to be as good as or better than traditional textbooks, and (2) students do not perform worse when utilizing OER, students, parents and taxpayers stand to save literally billions of dollars without any negative impact on learning through the adoption of OER.
  172. 172. Two Requests 1. If you are aware of a peer-reviewed efficacy or perceptions study that I have not mentioned, will you please let me know? 2. Will you initiate research studies focused on perceptions and efficacy of OER? Scholarly articles in this arena will increase awareness and adoption of OER. If you would like help in designing or implementing such studies, my colleagues at the Open Education Group are happy to assist.
  173. 173. Publishing is not that hard! 1. International Review of Research on Open and Distance Learning 2. Journal of Interactive Media in Education 3. Open Praxis 4. Subject-specific journals (e.g., Science Education, Math Education, etc.)
  174. 174. A Review of Research on the Perceptions and Efficacy of OER John Hilton III Open Education Group
  175. 175. ROER4D Impact Studies Workshop Day 4 Communications Strategy
  176. 176. Who are the stakeholders?
  177. 177. How will you communicate with them?
  178. 178. How will you know whether you have been heard?
  179. 179. What actions are you trying to inspire?
  180. 180. Smart Chart Communication Planning Tool
  181. 181. Open Research Communication
  182. 182. Open Access Publication YES!
  183. 183. Open Release of Research Data CC0 (Public Domain) / CC-BY
  184. 184. Open Research: Process “Open research is research conducted in the spirit of free and open source software. Much like open source schemes that are built around a source code that is made public, the central theme of open research is to make clear accounts of the methodology freely available via the internet, along with any data or results extracted or derived from them. This permits a massively distributed collaboration, and one in which anyone may participate at any level of the project.”
  185. 185. Open Research: Process Five principles: 1. Radical, realtime transparency 2. Make work discoverable 3. Minimise barriers to participation 4. Update in regular rhythm 5. Use social media to publicly engage
  186. 186. Ethics and Risk perspectives
  187. 187.
  188. 188. The field of ethics (or moral philosophy) involves systematizing, defending, and recommending concepts of right and wrong behavior. Internet Encyclopedia of Philosophy
  189. 189. Post World War II, war crimes trials produced the Nuremberg Code (1947) for research involving human subjects. The Belmont Report (1979) sets out the principles of ethical research and still acts as the basis for experimental research. It was criticised by Shore (2006) for its failure to recognize difference (gender, ethnicity, culture, geography, etc.)
  190. 190. Principles of Ethical Research • Exercise control over research process • Ethical research design, sampling, data collection • Respect for the autonomy and self-determination of research participants • Informed (and freely given) consent • Privacy & confidentiality (including data management) • Fairness, impartiality & transparency • Non-maleficence (do no harm) • Beneficence (maximise benefits of research)
  191. 191. Open Research When you make research open, novel and interesting things happen to the research process
  192. 192. Ethics in OER Research Hub (1/2) Considerations in line with ‘traditional’ research: • Compliance with UK Data Protection Act (1998) and the USA’s Protection of Human Subjects (45 CFR 46) • Risk assessment • Free recruitment of research participants • Institutional approvals (IRB) as needed • Informed consent • Data collection / storage in compliance with policy of The Open University (UK)
  193. 193. Ethics in OER Research Hub (2/2) New dimensions resulting from greater openness: • collaborative research design; agile working in partnership needs to maintain epistemological integrity • third-party data; respecting the consent provided at the time • open release of research data; issues around privacy and security of data; obligations to participants; wording of consent form • open licensing of research instruments; responsibility to set standards for research excellence • open dissemination: blogging, open access publication, School of Open course, duty to share findings widely
  194. 194. Openness in education The digital nature of OER and the particular methods of producing and using them represent a considerable challenge to existing practice in education: • Implications for proprietary methods of publication, dissemination • Evolving pedagogical roles & responsibilities • Relation to academic career development • Correct use (and attribution) of intellectual property • Blurring boundaries between private and ‘connected’ life • Building consensus and influencing policymakers
  195. 195.
  196. 196. Morality and open education “When educational materials can be electronically copied and transferred around the world at almost no cost, we have a greater ethical obligation than ever before to increase the reach of opportunity. When people can connect with others nearby or in distant lands at almost no cost to ask questions, give answers, and exchange ideas, the moral imperative to meaningfully enable these opportunities weighs profoundly. We cannot in good conscience allow this poverty of educational opportunity to continue when educational provisions are so plentiful, and when their duplication and distribution costs so little.” Caswell, Henson, Jensen & Wiley (2008)
  197. 197. Morality and open education Paris Declaration on OER (2012) builds on the previous ten years of OER advocacy as well as article 26 of the Universal declaration on human rights (UDHR, 1948) and article 13.1 of The International Covenant on Economic, Social and Cultural Rights (UN, 1966) in recognition of “the right of everyone to education”
  198. 198. “Publicly funded resources should be openly licensed resources”
  199. 199. Morality and open education • Are we morally obliged to release OER? For its own sake? For the sake of improving access to education as a moral good? • Are we morally obliged to release data openly? Can there be adequate safeguards? Is the risk too great? • Do we need more evidence around OER efficacy? • Education as common good supported indirectly by OER, open data, etc. • The moral significance of inaction
  200. 200. Risks that might affect the research… Changing currency exchange rates Failure to secure IRB ethical approval(s) Security of the research sites / equipment Risk to human participants (instability) Robbery / criminal activity in research sites Lack of professionalism / skills Scheduling issues – academic year, etc. Collaborator dependencies Subcontracting; recontracting Insufficient data is gathered in time Key stakeholders become unavailable Translation issues Reliability of data collected online
  201. 201. advice and guidance being ethical
  202. 202. OERRH Ethics Manual: Guidance It’s not possible to anticipate every possible effect of openness in unmonitored spaces: • Understanding the potential for collected information to be personally, professionally or commercially sensitive • Policies should make it clear when data can be shared with others and under what conditions, licence, etc. • Though open, dissemination strategies should respect existing agreements with those who have been recorded or provided data • Openly available third party materials should be used fairly. • Data mined from social networks may need to be treated with caution
  203. 203. Summary of Guidance • Just because it’s legal doesn’t mean that it is ethical • Check terms & conditions thoroughly if you’re at all unsure on legal side • Think about the control you exercise over the process and how to use influence. • CC-BY-NC/ND license options may give more control over data, but are arguably less open – is there a balance to be struck? Open versions of familiar principles: • Minimize harm • Ensure that consent is as informed as it reasonably can be • Respect for privacy and personhood
  204. 204.
  205. 205. School of Open course on #openresearch
  206. 206.
  207. 207. Ethics in the Open / Ethics, Openness and the Future of Education
  208. 208. ROER4D Impact Studies Workshop End of Day 4
  209. 209. ROER4D Impact Studies Workshop Day 5
  210. 210. ROER4D Impact Studies Workshop Day 5 Planning next steps
  211. 211. Risks that might affect the research…
  212. 212.
  213. 213. Planning next steps • Timelines • Processes • Milestones • Sharing results
  214. 214. OER Impact Evaluation Methodologies Workshop Next Steps and Timelines
  215. 215. Next Steps 15 December 2014 – 1 January 2015 Proponents to submit revised proposals, abstracts and budgets to WOU  from 15 January 2015 Proponents can expect to receive feedback on their revised proposals  from 15 January 2015 WOU to send out Memorandum of Grant Conditions  15 January 2015 – 15 February 2015 Proponents to return signed Memorandum of Grant Conditions  February 2015 WOU to send out 1st tranche grant funds to Sub-projects (85% of project expenses)
  216. 216. Schedule of Financial and Technical Reports 01 March 2015 Official commencement date for all Sub-projects 31 August 2016 Official completion date for all Sub-projects
  217. 217. Schedule of Financial and Technical Reports  01 March 2015 Official commencement date for all Sub-projects  15 June 2015 1st Technical Reports due from Sub-projects (covering 3-month period from 1 March – 31 May 2015) 15 September 2015 2nd Technical Reports due from Sub-projects (covering 3-month period from 1 June – 31 August 2015)
  218. 218. Schedule of Financial and Technical Reports  15 March 2016 3rd Technical Reports due from Sub-projects (covering 6-month period from 1 September 2015 – 28 February 2016) 1st Financial Reports due from Sub-projects (covering 12-month period from 1 March 2015 – 28 February 2016)  31 August 2016 Official completion date for all Sub-projects
  219. 219. Schedule of Financial and Technical Reports 30 September 2016 4th (Final) Technical Reports covering entire grant period, from 1 March 2015 – 31 August 2016 2nd (Final) Financial Reports covering entire grant period, from 1 March 2015 – 31 August 2016  from 30 October 2016 Final fund disbursements to Sub-projects (up to the remaining 15% of project expenses)
  220. 220. OER Impact Evaluation Methodologies Workshop Budget Preparation
  221. 221. Budget Categories Research Personnel: Include remuneration, honoraria, allowances, and benefits paid to the principal investigator, co-investigators and other project staff Project advisors may be included if they are being paid on a regular basis and are hired for a longer period (more than a year). International travel costs for research personnel are covered in a separate budget item – International Travel
  222. 222. Budget Categories Consultants: Include all expenses related to acquiring the services of a consultant for a specific activity within the project Include fees, travel, accommodation, living expenses, and support services hired directly by the consultant for the project Indicate the total cost for each consultant as a single lump sum, and use a note to give a breakdown of the costs
  223. 223. Budget Categories International Travel: Include costs for international travel by project staff listed under “Research Personnel” Include costs for ground transportation, accommodation, meals, airfare, departure taxes, travel insurance and other expenses related to international travel Adhere to travel management processes of own institution but must follow terms stipulated in grant agreement, i.e. economy class travel and most direct route
  224. 224. Budget Categories Research Expenses: Include all costs related to carrying out the research and disseminating the research findings Include items such as payments to people who gather data or provide casual labour, consumable goods, computer services, in-country travel, reference materials, translation, printing, etc.
  225. 225. Budget Categories Indirect costs: Include administrative costs not directly related to the research Include clerical, accounting, or secretarial help, communications costs, and photocopying. In total, indirect costs must not exceed 10% of the total project cost If the grant-seeking institution is absorbing the indirect costs partially or in total, indicate accordingly and deduct the amount from the total project cost
  226. 226. Currencies and Bank Transaction Costs Budgets must adhere to the upper limits stipulated in the Call – MYR150,000 – MYR225,000 All budgets must be submitted in MYR, based on local currency calculations Exchange rate and date of conversion to MYR must be shown in the budget Grant payments will be made in MYR. Note that there will be no reimbursements for additional costs for bank charges and currency fluctuations. The PI of the sub-project must deal with any shortfall in the budget due to exchange rate loss and/or bank charges, by adjusting project expenses
  227. 227. In General Budget line items must be accompanied by clear budget notes Ensure budgets are apportioned appropriately across the 18-month project timeline Two payment tranches will be made (initial payment for 12 months’ expenses and final payment for six months’ expenses upon submission of the final project reports)
  228. 228. Some Anticipated FAQs  What happens if project costs are incurred before the grant agreement is signed? Such costs cannot be covered by the grant.  Can I revise my budget during the grant period? This can be done only with the agreement and approval of the Project Coordinators, and with proper justification. The revised grant total budget must be within the limit of the original approved budget.  Can I change the working currency of the project during the period of the grant? Normally, no, unless there are exceptional circumstances.
  229. 229. Some Anticipated FAQs  Do the Technical and Financial Reports have to be submitted in a certain format? The formats for each type of report will be provided to grantees.  Do I need to maintain a separate bank account for the grant monies?  No, unless the grantee institution prefers to do so.  Do I need to submit all receipts for project expenses? You must retain a proper accounting record of all project expenses, together with supporting invoices/receipts, and they should be available for submission with your financial reports, if required.
  230. 230. ROER4D Impact Studies Workshop Day 5 Evaluation
  231. 231. OERRH Evaluation Framework
  232. 232. Key questions for evaluation (1/2) 1. What aspect(s) of the project should be evaluated? 2. Who is the evaluation for? 3. What is it they want to find out? 4. What evaluation methods will be used? 5. What changes will be made when the results are gathered?
  233. 233. Key questions for evaluation (2/2) 6. What are the evaluation criteria and what is their source? 7. When will the evaluation take place? 8. Who will be involved in the evaluation? 9. What constraints will be placed upon the evaluation? 10. How and when will the evaluation results be disseminated?
  234. 234. ROER4D Impact Studies Workshop Day 5 Wrapping up…
  235. 235. Final thoughts…  Value of individual discussion – less ‘confrontational’  Support going forward  Communication between IS grantees  Communication with ROER4D ‘mothOERship’  Learning more than expected! (working harder than expected!)  Google Group for contact: docs, discussion, hangouts  Apr 2015 meeting? (tbc)  Possible attendance at OE Consortium Conference in Banff 2015  Thanks everyone
  236. 236. Thank you! …and good luck! @philosopher1978
  237. 237. Join us in building understanding of open education School of Open course on #openresearch OERRH Evidence Report OERRH Ethics Manual Contribute to OER Impact Map