From proof of concept to evidence of impact: evaluating learning design tools

Presentation to accompany paper for the Art & Science of Learning Design workshop 2011

  • >>CLICK Objective of the LDSE project in terms of evaluation. This conflates objectives 1 and 2, where objective 1 addresses teachers’ pedagogy in general and objective 2 focuses on TEL. >>CLICK What do we understand by “impact”? The everyday dictionary definition is as good a place as any to start: “a strong influence,” where “influence” can be defined as “the capacity to have an effect on the character, development, or behaviour of someone or something” (Oxford Dictionaries Online). >>CLICK In terms of teaching and learning, Biggs (2003, in his book on university teaching) implies that to bring about “enduring change” in their approach, the lecturer needs both to act and to think differently about teaching. However, the verb “think” conceals an array of constructs encompassing beliefs, attitudes, intentions and a host of other “affective variables”; section 2 of the written paper goes into some of these, which we will meet on a later slide. >>CLICK Finally, most measurements of impact are quantitative, and at bottom what will really demonstrate the impact of the LDSE, for many, will be the difference in students’ learning outcomes, which are of course measured quantitatively. However, where our purpose is to change behaviour then, as James Wertsch (2002) reminds us, “with the introduction of a new […] tool into the flow of human action we should be on the lookout for qualitative transformation of that action rather than a mere increment in efficiency or some other quantitative change.” Thus, we should give primacy to qualitative methods: descriptive evidence from observations, and reactions and feedback collected directly from participants in think-aloud procedures, interviews, focus groups and/or questionnaires.
  • So, as a result of their interactions with the LDSE, what do we want to see people doing more of, doing less of, or doing that they didn’t do before in relation to TEL design? Recall the step in the LD cycle that we are addressing. I interviewed ten “informant-practitioners” (IPs) in semi-structured interviews during the requirements elicitation phase of the LDSE project. All had at least five years’ teaching experience, were well versed in digital technologies, and represented a range of roles: subject lecturers, staff developers and learning technologists. We could therefore expect them to be highly aware of their own behaviours and also well placed to articulate the behaviours and conceptions (or misconceptions) of early-career lecturers, as well as those of seasoned academics who have yet to engage with TEL. The questions were devised to explore further the characteristics of TEL practice that we had encountered in our own previous research or that are acknowledged in the TEL research literature. Skate through these; they are explored in the paper. >>CLICK 1. Innovating in relation to one’s current practice, particularly in relation to TEL. >>CLICK 2. Giving due weight to students’ needs (i.e. as opposed to being driven by content). >>CLICK 3. Espousing theories of learning and teaching appropriate to their context: to inform the process of design; as a scaffold for reflecting on one’s own practice; to lend weight to arguments about pedagogy. >>CLICK 4. Engaging in critical reflection on one’s practice, where reflection is informed by appropriate theories and frameworks of teaching and learning, not merely by measures of professional competence, and informs the subsequent (re)design and facilitation of students’ learning. >>CLICK 5. Building personal professional knowledge: building on the work of others; consulting the findings of pedagogic research. >>CLICK 6. Participating in a community of teachers and contributing to the development of collective professional knowledge: acknowledging that one’s own practice can be a source of inspiration to others; being willing both to share one’s work for peer review and to peer-review other teachers’ work.
  • When deciding how to measure impact, look at it along a number of dimensions. >>CLICK Depth within the individual at whom the impact is directed, analysed in terms of: behaviour (1st-order change, 2nd-order change); intentions; attitudes; beliefs. >>CLICK Organisational penetration (i.e. into the sociocultural context): individual → small group (e.g. course teams) → department or faculty → institution. >>CLICK Time: the length of time that participants are given to engage with the LDSE, and the period over which participants are followed by the researchers and, hence, the maximum depth and penetration observable. Note that Ertmer reports research suggesting that school teachers may take up to six years to develop social-constructivist approaches to their use of TEL. So, use this framework to: define the boundaries of our expectations when we are setting up an evaluation; identify the possible extent of impact from the data that we collect; and, hence, prevent us from making facile generalisations and claims on the basis of our findings.
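  • The three dimensions above can be sketched as a simple data structure for delimiting an evaluation’s scope. This is only an illustrative sketch in Python; the names (EvaluationScope, within_scope, the enum members and their ordering) are hypothetical and are not part of the LDSE project or its instruments.

```python
from dataclasses import dataclass
from enum import IntEnum


class Penetration(IntEnum):
    """Organisational penetration into the sociocultural context."""
    INDIVIDUAL = 1
    SMALL_GROUP = 2    # e.g. course teams
    DEPARTMENT = 3     # department or faculty
    INSTITUTION = 4


class Depth(IntEnum):
    """Depth of impact within the individual; ordering is illustrative only."""
    BEHAVIOUR_1ST_ORDER = 1
    BEHAVIOUR_2ND_ORDER = 2
    INTENTIONS = 3
    ATTITUDES = 4
    BELIEFS = 5


@dataclass
class EvaluationScope:
    """Boundaries of expectation for one evaluation study."""
    max_depth: Depth
    max_penetration: Penetration
    engagement_weeks: int   # time participants spend with the tool
    follow_up_weeks: int    # period over which they are followed

    def within_scope(self, depth: Depth, penetration: Penetration) -> bool:
        """Guard against claims that exceed what the study design can show."""
        return depth <= self.max_depth and penetration <= self.max_penetration


# A short workshop study can only warrant modest claims:
workshop = EvaluationScope(Depth.INTENTIONS, Penetration.SMALL_GROUP,
                           engagement_weeks=1, follow_up_weeks=4)
print(workshop.within_scope(Depth.BEHAVIOUR_1ST_ORDER, Penetration.INDIVIDUAL))  # True
print(workshop.within_scope(Depth.BELIEFS, Penetration.INSTITUTION))             # False
```

The point of the sketch is the guard: findings about beliefs or institution-wide penetration simply cannot be claimed from a study whose design bounds them out.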
  • The written paper lays out a number of methodological concerns associated with assessing changes in behaviour and thinking. These may be resolved, in part, by this adaptation of the impact levels defined by the HEA Evaluation and Impact Assessment Approach (2009), which I believe may have its origins in Kirkpatrick’s writings (1966). NB These examples relate to use of the LDSE, not to the behaviours previously mentioned. >>CLICK 1. Awareness: recognising the potential for enhancing practice through engaging in this activity and making the choice to learn more; awareness of the existence of Web-based supportive resources and of these practices. >>CLICK 2. Reactions: feeling positive about enhancing practice in this way and using it in a workshop, in structured “artificial” activities; perceiving its potential benefits and choosing to explore it further; making an informed choice to go a little deeper. >>CLICK 3. Engagement: engaging with enhancing practice in this way and applying the LDSE to an authentic work task, whether within the “protective” context of a workshop or in their workday situation; identifying new ideas, approaches and perspectives that could be applicable to their work, e.g. from pedagogic analysis by the LDSE, patterns, A&G (i.e. the beginnings of a developmental/transformational engagement with the tool, though probably too early to be classed as a 1st-order change); beginning to learn and develop some ideas relevant to their practice.
  • >>CLICK 3 times to redisplay the first 3 criteria. >>CLICK 4. Learning: learning or developing ideas relevant to their practice; using the LDSE to redesign their teaching/learning to take account of these new ideas and insights, perhaps on an incremental basis, building on initial experiences. So, beginning to plan how to use these ideas in their practice: very early indications (1st-order changes) of shifts in their pedagogy (cf. the quote from the LAMS evaluation: “I think LAMS has made me want to be less instructivist”). >>CLICK 5. Applying the learning: applying what they have learned or developed to their practice; running the redesigned sessions with their students; capturing data on outcomes, in both students’ performance and affective response (motivation, enjoyment), and also on their own experience as the teacher/facilitator. So, beginning to identify the effects of applying these new ideas in practice: evidence of some shifts in their pedagogy, but too early to draw any conclusions about whether these are 1st-order or 2nd-order changes. May lie beyond the scope of the LDSE project. >>CLICK 6. Effects on student learning: identifying instances of students learning better as a result of enhanced practice; so, building up an evidence base of “learning better” over time (i.e. we need more than just two or three sessions); confidence to explore the potential for innovating further in their teaching; permanent shifts (2nd-order changes) in their pedagogy. Lies beyond the scope of the current project, but we may look out for early indications of these shifts.
  • Overflow for notes from the previous page. Example from the data: “I got taken along to see what reusable learning objects were and got this lovely example of fulcrum, load and effort and a car crashing into a wall. And I sat there and I watched these beautiful animations and I thought, ‘Well that’s not what I do because I don’t teach a concept that can be grasped like that.’ And there was this moment, and I can’t remember what anybody said, but […] I had an epiphany because I suddenly went, ‘Oh, so when I’m teaching that means I could do this!’ And then I was on board, but it was that… just something suddenly got said that tipped it over and I could see the potential, […] something that connected with my own teaching practice and said, ‘Ah, but you do that.’” This was a Classics lecturer, who went on to develop her own learning object and ended up introducing it into her university. These measures are essentially qualitative, apart from the final quantitative one: improvement in learning outcomes, which lies outside the scope of the project. So we have to look at the qualitative descriptions (i.e. the kinds of ways in which participants change their behaviour) and the extent, in terms of the six levels, to which they have changed it towards the desirable behaviours that we have identified. We would then want to map their accounts of changed behaviour to the desirable changes in behaviour that we identified. What will this tell us? A close mapping would indicate that working with the LDSE has had the desired impact; a poor or patchy mapping could tell us either that a) the LDSE isn’t serving its intended purpose or b) our hypothesis regarding these behaviours is somewhat deficient.
  • Methodological issues. >>CLICK Until we can reach the goal of measuring changes in student learning outcomes (by which time many other factors will have come into play in their learning, not just the teacher’s design) we have to rely on self-reports. But these are unreliable. >>CLICK Constraints in time and resources may limit the reach of our investigations in terms of behaviours and affective variables, meaning that in some cases we may capture only first-order changes in behaviour or emergent shifts in attitude (and, if we are fortunate, beliefs). Neither of these is sufficient to indicate the sustainability of change. >>CLICK And so it may mean shifting our research questions somewhat: from “Have we achieved an impact, through the LDSE, on teachers’ practice in designing TEL?” to “What does it take to achieve an impact on practice, in terms of the individual, the LDSE and the sociocultural context?” and, more narrowly, “What will it take, in terms of the individual’s interactions with the LDSE, to have an impact on practice?” See section 5 of the paper: examine the process as it happens, and also uncover the conditions that may be conducive to having an impact. At present, this entails focusing on the user–tool relationship, which is another whole paper, influenced by the ideas of Wertsch.

    1. From proof of concept to evidence of impact: towards a principled approach to evaluating learning design tools Liz Masterman, University of Oxford, ASLD Workshop, 13th–14th October 2011
    2. Evaluating the impact of the LDSE: conceptual foundations <ul><li>Achieve an impact on lecturers’ practice in designing teaching and learning, particularly in relation to their use of TEL where it is relevant and appropriate </li></ul><ul><li>Strong effect on character, development or behaviour </li></ul><ul><li>Act and think differently about teaching (Biggs, 2003) </li></ul><ul><li>Qualitative transformation, not incremental or quantitative change (Wertsch, 2002) </li></ul>
    3. What constitutes evidence of impact? <ul><li>Innovating in relation to one’s current practice </li></ul><ul><li>Giving due weight to students’ needs </li></ul><ul><li>Espousing appropriate theories of learning and teaching </li></ul><ul><li>Engaging in critical reflection on one’s practice </li></ul><ul><li>Building personal professional knowledge </li></ul><ul><li>Participating in a community of teachers and contributing to the development of collective professional knowledge </li></ul>
    4. Delimiting the field of vision [Flattened diagram: after Koballa (1988), drawing on Fishbein & Ajzen (1975); also Ertmer (2007) and Kaufman et al. (1996). Plotted against organisational penetration and time, it traces a chain from beliefs (a: general pedagogic beliefs, e.g. active construction of knowledge; b: beliefs about digital technologies in T&L, e.g. their potential to benefit learning) to a positive attitude and values, to intentions re TEL (e.g. intends to use PRS to increase student participation in lectures), to behaviours re TEL: first a 1st-order change, which is reversible (uses PRS once or twice only), then a 2nd-order change, which is irreversible (PRS embedded in T&L practice) and confronts and strengthens both belief a) and belief b).]
    5. Qualitative measurement criteria <ul><li>Awareness Recognising the potential for enhancing practice through engaging in this activity </li></ul><ul><li>Reactions Feeling positive about enhancing practice in this way </li></ul><ul><li>Engagement Engaging with enhancing practice in this way… </li></ul>HEA Evaluation and Impact Assessment Approach (2009)
    6. Qualitative measurement criteria <ul><li>Awareness </li></ul><ul><li>Reactions </li></ul><ul><li>Engagement </li></ul><ul><li>Learning Learning or developing ideas relevant to their practice </li></ul><ul><li>Applying the learning Applying what they have learned or developed to their practice </li></ul><ul><li>Effects on student learning Identifying instances of students learning better as a result of enhanced practice </li></ul>HEA Evaluation and Impact Assessment Approach (2009)
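To make the six-level rubric concrete, here is a minimal sketch (in Python) of how coded interview data might be mapped onto it. The names here (ImpactLevel, highest_level, the sample indicators) are hypothetical illustrations, not part of the HEA approach or the LDSE project’s instruments.

```python
from enum import IntEnum


class ImpactLevel(IntEnum):
    """Six qualitative impact levels, after the HEA (2009) approach."""
    AWARENESS = 1        # recognising the potential for enhancing practice
    REACTIONS = 2        # feeling positive about enhancing practice this way
    ENGAGEMENT = 3       # engaging with enhancing practice this way
    LEARNING = 4         # learning or developing ideas relevant to practice
    APPLYING = 5         # applying what they have learned to practice
    STUDENT_EFFECTS = 6  # instances of students learning better


def highest_level(coded):
    """Deepest impact level evidenced in one participant's coded data.

    Returns None when no indicators were coded at all.
    """
    return max(coded, default=None)


# e.g. a participant whose interview shows awareness and engagement,
# but no evidence (yet) of learning, applying, or effects on students:
participant = {ImpactLevel.AWARENESS, ImpactLevel.ENGAGEMENT}
print(highest_level(participant).name)  # ENGAGEMENT
```

The ordered enum mirrors the rubric’s progression, so comparing coded indicators with `max` directly yields the deepest level a participant’s account supports.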
    7. An example <ul><li>“I got taken along to see what reusable learning objects were and got this lovely example of fulcrum, load and effort and a car crashing into a wall. And […] I thought, ‘Well that’s not what I do because I don’t teach a concept that can be grasped like that.’ And there was this moment, […] I had an epiphany because I suddenly went, ‘Oh, so when I’m teaching that means I could do this!’” </li></ul>
    8. Methodological issues <ul><li>Reliance on self-reports: e.g. from interviews and surveys </li></ul><ul><li>Constraints in time and resources → first-order changes in behaviour or emergent shifts in attitude </li></ul><ul><li>Describe impact in the process of happening, rather than measure impact as outcome </li></ul>
