Kate McKegg and Nan Wehipeihana (2010). A practitioners introduction to Developmental Evaluation

Kate McKegg and Nan Wehipeihana (2010). Developmental Evaluation: A practitioner's introduction. A pre-conference workshop presented at the Australasian Evaluation Society (AES) Conference, September 2010, Wellington, New Zealand.

  • Five purposes of developmental evaluation:
    - Ongoing development: adaptation of a programme, strategy or innovation to new conditions in complex, dynamic situations
    - Adapting effective general principles to a new context: the dynamic middle between top-down and bottom-up forces for change
    - Developing a rapid response: in turbulent, disaster situations, exploring real-time solutions and generative, innovative responses
    - Pre-formative development of a potentially scalable innovation: a period of exploration to shape a more fully conceptualised, potentially scalable intervention
    - Major systems change and cross-scale developmental evaluation: how major systems change unfolds, evidencing emergent tipping points
  • Supports ongoing real-time decisions about what to change, expand, close out or further develop.
Formative evaluation is most commonly conceived as evaluative effort aimed at helping programmes with initial design and set-up and at improving implementation, i.e., getting programmes settled and stable – ready for summative evaluation – or, more importantly, at establishing a programme model of some sort. Summative evaluation addresses the question: did the programme work? This requires a clear understanding and specification of the ‘it’. The ‘it’ needs to be identifiable if we are going to be able to say something about it, and formative evaluation has played an important role in this process. Summative evaluation’s purpose is most commonly understood to be determining the value or worth of a programme, in order to make decisions about continuing, expanding or stopping it.
DE can be pre-programmatic – such as working with a community to try and find ways to work with youth. Lots of things are tried, piloted and tested out; not everything works, and changes are made. This is development, and it is not just about improvement – it is about exploring the parameters of an innovation as it takes shape. What’s being tried is an approach, more than a model. It is about helping people clarify, focus and articulate what they are doing, as they do it.
DE also supports ongoing development – adaptation – and this is not necessarily the same as improvement. Change is very often conceived as necessary adaptation to new circumstances or conditions. Implementing strategy in organisations, or even programmes in the real world, just doesn’t happen in a neat way. Although there is the intended implementation, what unfolds is always somewhat different. Some things get dropped or go undone, and some new things emerge as new opportunities arise. In this ‘real world’ of developmental implementation, DE helps programmes and organisations track and document the changes and adaptations as they happen.
The distinction between improvement and development is NOT absolute or unambiguous – it is nuanced, and depends on intention and perception. But importantly, DE is driven by the programme or the organisation – so if the programme wants to call what they are doing development, then it’s DE they are doing. (Chapter 2)
  • Providing timely feedback, reflection and generation of learnings for the development process; tracking emergent findings and connections; responding to the unexpected; collaboration and co-creation; capacity development; internal accountability focus (values orientation); dealing with ambiguity and lack of control
  • While this answer IS the most appropriate answer for common living, this puzzle shows how easy it is to give the most obvious answer first and to ignore alternative possibilities. It's so easy not to consider the context of the problem. Different situations need different answers. http://www.brainstorming.co.uk/puzzles/dropblock.html
  • See handout

    1. Developmental evaluation – A practitioner’s introduction
       Kate McKegg and Nan Wehipeihana
    2. Our whakapapa
       Who am I?
       Where am I from?
       What is my history?
    3. Introductions
       Who are you?
       Where are you from?
       What brings you to this workshop?
       What do you hope to learn today?
       Pair up with someone you don’t know (or don’t know very well)
       Listen and pay attention to why each other is here today
       Tell us your partner’s name and what they hope to get out of this session
    5. Workshop Outline
    6. Developmental Evaluation – what is it?
       Developmental evaluation supports innovation development to guide adaptation to emergent and dynamic realities in complex environments.
       (Michael Quinn Patton, 2010)
    7. Five purposes of developmental evaluation
    8. Five purposes of developmental evaluation
       Adapted from Patton (2010)
    9. Development and innovation in complex situations
       Intractable issues, e.g., poverty, homelessness, environmental degradation, family violence, obesity
       Uncertainty about options and solutions
       Political and economic turbulence
       Environment characterised by unpredictability and the unexpected emergence of issues and ideas (e.g., the rise of social networking)
       Rapid adaptation and change occurring across many contexts
    10. DE is informed by complexity and systems thinking (picture sourced from Patton, 2009)
    11. Some key complexity concepts
        Adaptation
        Emergence
        Non-linearity
        Uncertainty
        Interdependence
    12. Exercise One: Handout 1: match each scenario with a complexity concept
    13. Some complexity concepts: Handout 2: practical application
    14. DE – a distinct niche
        Two distinct niches: pre-formative and dynamic situations
        Support exploration and innovation before there is a programme model (pre-formative)
        Support the ongoing development and adaptation of a programme or innovation in emergent and complex situations (dynamic situations)
    15. DE – a distinct niche
        Different to formative and summative:
        Differs from improvement-orientated evaluation (making a programme better)
        It aims to support ongoing real-time decisions – what to change, expand, close out or further develop
        Important distinction:
        Differentiating ongoing strategic thinking from periodic strategic planning
        DE as a form of thinking and acting strategically as an innovative intervention unfolds
    16. The story of He Oranga Poutama
        Increase participation in sport and recreation – ‘by Māori’
        2007-08 review and programme ‘re-visioning’
        Focus on participation ‘as Māori’
        Tino rangatiratanga (self-determination)
        20 years of Māori development
    17. Case example – He Oranga Poutama and complexity
        Uncertainty around the new programme concept and direction; not sure what ‘as Māori’ participation would involve
        New innovative programme and goal
        Values, principles and vision driven
        No formal evaluation evidence from other efforts about what to expect; no real data available to measure key concepts and variables
        Complex environment – contracting economy, political environment, changing organisation
    18. DE – and creative thinking
        Requires critical AND creative thinking
        We have to think outside the box
        Ask questions about what makes sense as we try to connect evaluation to the programme development
    19. Creative thinking
        Draw four straight lines which go through the middle of all of the dots without taking the pencil off the paper
    20. Creative thinking: thinking outside the box
        http://www.brainstorming.co.uk/puzzles/ninedotsnj.html
        [Diagram: four-step solution to the nine-dots puzzle]
    21. MORNING TEA
    22. When not to use DE
        Not able or willing to commit the time to actively participate in the evaluation and to build and sustain relational trust – no room for the ‘set and forget’ evaluation management approach
        High levels of certainty required – no tolerance for ambiguity
        Lack of openness to experimentation and reflection – high level of design certainty required
        No adaptive capacity – no real ability to make changes
        Unwillingness to ‘fail’ or hear ‘bad news’ – fear-of-failure context overrides learning focus
        Poor relationships between management, staff and evaluators – respectful and trusting relationships are critical in a context of uncertainty, ambiguity and turbulence
    23. Traditional evaluation is not well suited to all contexts
    24. Exercise 2: Teasing out complexity
        Handout 3:
        Read the brief description of Whānau Ora
        What, if anything, is complex about Whānau Ora?
        Share and discuss in small groups
        Make notes to feed back to the larger group
    25. Roles and relationships of the DE evaluator
        Evaluator is part of the innovation team
        Facilitator and learning coach
        Brings evaluative thinking to the group
        Supports or shares innovators’ values and vision
        Supports the feedback process around ‘what is being developed’
    26. DE – the context and the evaluator
    27. Skills, abilities and dispositions
        The ability to build and maintain trusting relationships – critical in a context of uncertainty, ambiguity and turbulence
        High level of facilitation skills across multiple settings, stakeholders and contexts – to generate and ‘test’ ideas, to provide feedback, to ask the ‘tough’ questions; highly engaging, to build evaluative capability
        Deep methodological toolkit – methodological flexibility, eclecticism, and adaptability (Patton, 2010, p. 27)
        Enquiring, observant and critically reflective
        Systematic – attention to data and making data-driven decisions
        Courageous
        Flexible, adaptable, responsive – ‘on the run, on the fly’
    28. LUNCH TIME
    29. What will happen to the piece of wood when the person lets go of it?
    30. [image slide – no text]
    31. DE in practice – inquiry approaches
        Focus of DE is on: What is being developed? What’s emerging? Given what’s emerged, what’s next?
        The DE evaluator:
        Inquires into developments
        Tracks developments
        Facilitates interpretation of developments so that judgments can be made about the what, how, impact and consequence of developments
    32. DE: Asking questions that matter and matching questions to context
        Patton (2010) suggests that questions that matter can be thought of as “a tool for working in complex situations” (p. 227)
        Matching questions to particular situations is a central challenge – situational responsiveness and adaptation
    33. 6 simple rules (Patton, 2010)
        Connect questions with the ideas, language and frameworks of the people you are working with
        Less is more. Limit the number of questions within the enquiry framework. 7 ± 2 is a good rule of thumb
        Keep the evaluation grounded in whatever basic developmental inquiry framework you and those you’re working with choose to guide your work
        Distinguish overarching inquiry questions from detailed implementation questions
        There are (no) stupid questions
        Remember that whatever inquiry framework you are using, the focus is on: What is being developed?
    34. DE – is values based
        ‘[DE] sits alongside, doesn’t control or dampen the core values of innovation’ (Wehipeihana, cited in Patton, 2010)
        Values / processes / relationships become very important
        ‘How’ outcomes / results are achieved is very important
        Process matters
        People and relationships matter
        Where the end is unpredictable and emergent, values and process become anchors
        ‘There will never be enough certain knowledge to guide action’ (Wendell Berry, cited in Patton, 2010)
    35. DE – inquiry approaches
        There is no definitive list of developmental evaluation inquiry approaches – neither should one be constructed
        Developmental evaluation creatively adapts whatever approaches and methods fit the complexities of the situation
        It is responsive, appropriate and credible to those you work with, and supports opening up new understandings and guiding further development
        ‘It’s all about persistently asking questions and pursuing credible answers in time to be used. Questioning is the ultimate method. Questions are the quintessential tools.’ (Patton, 2010)
    36. DE – uses a range of interpretive frameworks
        But applies evaluation logic and thinking in complex situations
        Enables people to engage in data-based, ongoing evaluative sense-making and reasoned argument:
        ensuring the criteria (values) people use to decide what’s ‘good’ and desirable are open to scrutiny and made explicit
        ensuring the decisions made about development / the ‘forks in the road’ are documented and systematically reflected upon in a critical and open-minded way
    37. Our experience: introducing DE to the uninitiated
        Small steps; getting people grounded, on the same page, really matters
        The most simple of frameworks: what, why, when, how, where and who
        What is happening and what has changed? Why?
        Who has been affected? Who is involved?
        How did this happen?
        Where is the situation / programme at now?
        When do key people need to know what?
        How do people feel about the change? Why?
        etc.
    38. Or some slightly more evaluative questions:
        What’s being developed? (WHAT?)
        What sense can we make of emerging issues, evidence, data about this development? (WHAT’S EMERGING?)
        What is the value / importance / significance of what we are doing / achieving? What does it mean to us now and in the future? (SO WHAT?)
        What does this mean for how we should now act? What are our options? And into the future? (NOW WHAT?)
    39. Applying DE questions
    40. Some useful interpretive frameworks:
        See Handout 5
        Appreciative Inquiry
        Success Case Method
        Most Significant Change
        Systems approaches – with an emphasis on perspectives, boundaries and interrelationships
        Outcome mapping
        Action research
    41. Example – ‘as Māori’: a developmental journey
        See Handout 6
        The process:
        ‘As Māori’ not prescribed in the provider selection process
        Series of facilitated discussions – 1- and 2-day provider hui (n=4); individual providers
        Ongoing iteration – cycle of feedback loops
        What’s emerging: shared understanding, co-constructed – but still emergent
        Emergent understanding of ‘as Māori’:
        It’s something that is led by tikanga
        In a Māori way, underpinned by Māori ways, values and perspectives
        It may also be in Māori places… but not necessarily
        It may also be doing Māori activities… but not necessarily
        It’s not specific just to Māori contexts, e.g., Māori people participating as Māori in the wider world context
        There’s not going to be one answer
        I just want some clear definitions of what non-Māori are thinking of it
    42. Co-construction of ‘as Māori’
        Te reo
        Kanohi ki te kanohi
        Guided by kawa and tikanga
        Whānau, hapu, iwi and Māori
        Uses Māori institutions, e.g., marae, kohanga
        Whanaungatanga
        Traditional Māori sports
        Spirit of us
        Creating a feeling of belonging and empowerment
    43. What’s the value of ‘as Māori’?
        It’s our own self-determination of how we do it, in ways that we want to do it
        To do the things we want to do, in ways that we want to do them
        Opportunity to portray our uniqueness
        Effective and better engagement achieved when cultural values and aspects are core and/or included
        Traditional sports are just as valuable – in terms of engaging our communities
    44. Example – ‘as Māori’: a developmental journey
        An ‘as Māori’ continuum with five dimensions emerging:
    45. [diagram slide – no text]
    46. AFTERNOON TEA
    47. Some final thoughts, tips and questions
        FAQs we have encountered:
        How systematic is DE?
        How does DE differ from Action Research?
        Isn’t DE just a bunch of stories – when do you get to outcomes? And measurement? And judgment?
        What questions do you have about DE?
    48. What we’ve learned along the way
        It’s a journey with others; this is not for the sole trader, nor for the faint-hearted!
        As an evaluation team, we model the reflection process together
        This is hard thinking stuff; it doesn’t come easy
        Great rewards possible
        Thank goodness we chose and are able to use DE; it’s the right way to go, but it is hard work
    49. Wrap up
        One thing you learned today
        One thing you will do as a result of today
        One thing you will follow up on
    50. References
        Gamble, J. (2008). A Developmental Evaluation Primer. J.W. McConnell Family Foundation.
        Patton, M. Q. (2010). Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use. Guilford Press.
        Patton, M. Q. (2008). Utilization-Focused Evaluation (4th ed.). Sage.
        Wehipeihana, N., & McKegg, K. (2009). Developmental Evaluation in an Indigenous Context: Reflections on the journey to date. Presentation to the American Evaluation Association conference, Orlando, Florida.
        Westley, F., Zimmerman, B., & Patton, M. Q. (2006). Getting to Maybe: How the World Is Changed. Random House Canada.
