Networked Learning Conference 2011


An xDelia presentation on the evaluation framework devised to guide the interdisciplinary design and development work within the project.


1. Using Participatory Evaluation to Support Collaboration in an Interdisciplinary Context
   Gill Clough, Gráinne Conole & Eileen Scanlon
2. Agenda
   - Context for the Design and Evaluation Framework
     - Developed as part of the evaluation of the xDelia European project
   - Theoretical approach
     - Participatory, iterative, useful
   - The D&E framework in action
     - How it supports project collaborations
     - Applied to the Games Design Workshop
   - Supporting interdisciplinary collaboration
3. xDelia: a 3-year EU Project
   - Its goal: to use bio-sensors and serious games to identify and address the effects of emotional bias on financial decision making in three fields:
     - Traders
     - Investors
     - Individuals aged 16 to 29
4. Challenges and Opportunities
   - Interdisciplinary consortium
     - CIMNE, Spain (Project Management)
     - OU Business School, UK (Traders and Investors)
     - Bristol University, UK (Individual Financial Capability)
     - Blekinge Tekniska Högskola, Sweden (Serious Games)
     - Erasmus University, The Netherlands (Cognitive Psychology)
     - Forschungszentrum Informatik, Germany (Bio-sensors)
     - OU Institute of Educational Technology, UK (Evaluation)
   - Challenge: to collaborate effectively to achieve a common goal
   - Opportunity: to learn from each other and conduct top-quality interdisciplinary research
5. xDelia D&E Framework
   - A framework for the design and evaluation of effective project interventions
   - Benefits:
     - Provides a lens for reflection
     - Captures project experiences at key moments
     - Ensures a common understanding of the underpinning approach to design and evaluation
     - Guides the development of the research questions and methods
   - Supports interdisciplinary collaboration in a distributed project
6. Theoretical Approach
   - Approach: participatory, iterative, useful
   - Theoretical perspectives:
     - Practical participatory evaluation (Cousins and Whitmore, 1998)
     - Collaborative inquiry (Oliver et al., 2007; Poth and Shulha, 2008)
     - Participatory design (Namioka and Schuler, 1993; Poth and Shulha, 2008)
   - Also draws on:
     - Utilization-focused evaluation (Patton, 2008)
     - Learning design (Conole, 2009)
   [Diagram: dimensions of participation, ranging from researcher controlled to practitioner controlled, from consultation to deep participation, and from primary users to all groups]
7. Evaluations
   - Workshop interventions:
     - Type i) Prototype development workshops
     - Type ii) Substantive, subject-orientated workshops
     - Type iii) Design and evaluation workshops
   - Study interventions:
     - Research activities that aim to provide data for the research
8. Design and Evaluation Framework
9. The Framework in Action
   - The D&E framework works at two levels:
     - Macro level: overall conceptual framework and approach
     - Micro level: specific support from design question through to analysis
   - Ongoing meta-evaluation analysis:
     - Articulates key moments in the project
     - Captures different partner stakeholder perspectives
     - Identifies the success factors for, and challenges of, this kind of interdisciplinary research
10. (image-only slide, no text)
11. Questions
   - Game design questions:
     - What games do we want to develop further?
     - What concepts do we want to develop further?
     - What are the key questions for games to improve financial capability?
   - Evaluation questions:
     - What aspects of the workshop worked and what did not?
     - What are partners' initial perceptions of the project?
12. Interventions
   - Design layer:
     - Briefing on financial capability
     - Mixed-partner groupings
     - Brainstorm game models
     - Select one game and prototype it
     - Group evaluation
   - Evaluation layer:
     - Audio and video recording of the workshop
     - Interviews with partners
     - Post-workshop survey
13. Analysis
   - Design:
     - Identify key questions for games
     - Build a better collective understanding of the game design process and evaluation criteria for games
     - Document an initial set of prototypes
   - Evaluation:
     - Abstract examples of good practice
       - Interdisciplinary learning evaluated through a post-event survey
     - Feed back a critical evaluation of the event
       - Workshop template and guidelines page on the wiki
     - Identify and address issues of interdisciplinarity
       - Analyse baseline interview data to identify themes
       - Provide support and guidance for collaborative working founded on interdisciplinary learning
14. Interdisciplinarity Themes
15. Collaboration Tools
   - Wiki and email
   - WP6 participation in collaborations:
     - Video-conferences
     - Studies
   - Cloud-based file repositories
16. Implications for Practice
   - The evaluation team adopts a reflective participatory role
     - Two-way participation in both workshop and study interventions
     - Identification of xDelia mediating technology artefacts and analysis of their impact on collaboration and learning: FlashMeeting, Adobe Connect, wiki, Google Docs, Google Wave, email, Dropbox, Diigo
   - Support local interventions via an online evaluation toolbox
     - The toolbox consists of an evolving 'pick and mix' of evaluation methods
     - Available on the Cloudworks social media site
     - Each evaluation method is a 'Cloud' on Cloudworks with its own discussion space
     - Anyone can improve the Clouds by adding further links and references
17. Into the Future
   - Evolving project glossary supporting shared understandings
   - Participatory evaluation of study interventions
   - Refine the evaluation toolbox for micro-level application of the D&E framework
   - Apply the D&E framework to the study interventions
   - In-depth analysis of data on interdisciplinary collaboration
   - Additional partner interviews
   - Analysis of mediating technologies used in the project
18. Contact
   Gill Clough [email_address]
   Gráinne Conole
   Institute of Educational Technology
   The Open University
   Walton Hall
   Milton Keynes MK7 6AA
19. References
   - Clough, G., Conole, G. and Scanlon, E. (2009). Behavioural finance and immersive games: A pan-European framework for design and evaluation. In Same Places, Different Spaces: Proceedings Ascilite Auckland 2009.
   - Conole, G. (2009). Capturing and representing practice. In Tait, A., Vidal, M., Bernath, U. and Szucs, A. (Eds.), Distance and E-learning in Transition: Learning Innovation, Technology and Social Challenges. London: John Wiley and Sons.
   - Conole, G. and Culver, J. (2009). The design of Cloudworks: Applying social networking practice to foster the exchange of learning and teaching ideas and designs. Computers and Education, Learning in Digital Worlds: Selected Contributions from the CAL09 Conference.
   - Cousins, J. B. and Whitmore, E. (1998). Framing participatory evaluation. New Directions for Evaluation, 80, 5-23.
   - Oliver, M., Harvey, J., Conole, G. and Jones, A. (2007). Evaluation. In Conole, G. and Oliver, M. (Eds.), Contemporary Perspectives in E-Learning Research (pp. 203-216). London: Routledge.
   - Patton, M. Q. (2008). Utilization-Focused Evaluation. London: Sage Publications.
   - Poth, C. A. and Shulha, L. (2008). Encouraging stakeholder engagement: A case study of evaluator behaviour. Studies in Educational Evaluation, 34, 218-233.