Kate McKegg and Nan Wehipeihana (2010). Developmental Evaluation: A practitioner's introduction. A pre-conference workshop presented at the Australasian Evaluation Society (AES) Conference, September 2010, Wellington, New Zealand.
6. Developmental Evaluation - what is it? Developmental evaluation supports innovation development to guide adaptation to emergent and dynamic realities in complex environments. (Michael Quinn Patton, 2010)
8. Five purposes of developmental evaluation Adapted from Patton (2010)
9. Development and innovation in complex situations
Intractable issues, e.g., poverty, homelessness, environmental degradation, family violence, obesity
Uncertainty about options and solutions
Political and economic turbulence
Environment characterised by unpredictability and the unexpected emergence of issues and ideas (e.g., the rise of social networking)
Rapid adaptation and change occurring across many contexts
10. DE is informed by complexity and systems thinking (picture sourced from Patton, 2009)
11. Some key complexity concepts: adaptation, emergence, non-linearity, uncertainty, interdependence
14. DE - a distinct niche
Two distinct niches: pre-formative and dynamic situations
Pre-formative: supporting exploration and innovation before there is a programme model
Dynamic situations: supporting the ongoing development and adaptation of a programme or innovation in emergent and complex situations
15. DE - a distinct niche
Different to formative and summative evaluation; differs from improvement-oriented evaluation (making a programme better)
It aims to support ongoing real-time decisions – what to change, expand, close out or further develop
Important distinction: differentiating ongoing strategic thinking from periodic strategic planning
DE as a form of thinking and acting strategically as an innovative intervention unfolds
16. Increase participation in sport and recreation – ‘by Māori’
2007-08 review and programme ‘re-visioning’
Focus on participation ‘as Māori’
Tino rangatiratanga (self-determination)
20 years of Māori development
The story of He Oranga Poutama
17. Case example – He Oranga Poutama and complexity
Uncertainty around the new programme concept and direction – not sure what ‘as Māori’ participation would involve
New, innovative programme and goal; values-, principles- and vision-driven
No formal evaluation evidence from other efforts about what to expect; no real data available to measure key concepts and variables
Complex environment – contracting economy, political environment, changing organisation
18. DE – and creative thinking
Requires critical AND creative thinking
We have to think outside the box
Ask questions about what makes sense as we try to connect evaluation to the programme's development
19. Creative thinking Draw four straight lines which go through the middle of all of the dots without taking the pencil off the paper
20. Creative thinking: Thinking outside the box http://www.brainstorming.co.uk/puzzles/ninedotsnj.html (animated solution steps shown on slide)
22. When not to use DE
Not able or willing to commit the time to actively participate in the evaluation and to build and sustain relational trust – no room for a ‘set and forget’ evaluation management approach
High levels of certainty required – no tolerance for ambiguity
Lack of openness to experimentation and reflection – a high level of design certainty required
No adaptive capacity – no real ability to make changes
Unwillingness to ‘fail’ or hear ‘bad news’ – a fear-of-failure context overrides the learning focus
Poor relationships between management, staff and evaluators – respectful and trusting relationships are critical in a context of uncertainty, ambiguity and turbulence
24. Exercise 2: Teasing out complexity
Handout 3: Read the brief description of Whānau Ora
What, if anything, is complex about Whānau Ora?
Share and discuss in small groups
Make notes to feed back to the larger group
25. Roles and relationships of the DE evaluator
Evaluator is part of the innovation team
Facilitator and learning coach
Brings evaluative thinking to the group
Supports or shares innovators’ values and vision
Supports the feedback process around “what is being developed”
27. Skills, abilities and dispositions
The ability to build and maintain trusting relationships – critical in a context of uncertainty, ambiguity and turbulence
A high level of facilitation skills, across multiple settings, stakeholders and contexts – to generate and ‘test’ ideas, to provide feedback, to ask the ‘tough’ questions, to be highly engaging, to build evaluative capability
A deep methodological toolkit – methodological flexibility, eclecticism, and adaptability (Patton, 2010, p. 27)
Enquiring, observant and critically reflective
Systematic – attention to data and making data-driven decisions
Courageous
Flexible, adaptable, responsive – ‘on the run, on the fly’
29. What will happen to the piece of wood when the person lets go of it?
31. DE in practice - Inquiry approaches
The focus of DE is on: What is being developed? What’s emerging? Given what’s emerged, what’s next?
The DE evaluator:
Inquires into developments
Tracks developments
Facilitates interpretation of developments so that judgements can be made about the what, how, impact and consequences of developments
32. DE: Asking questions that matter and matching questions to context
Patton (2010) suggests that questions that matter can be thought of as “a tool for working in complex situations” (p. 227)
Matching questions to particular situations is a central challenge – situational responsiveness and adaptation
33. Six simple rules (Patton, 2010)
Connect questions with the ideas, language and frameworks of the people you are working with
Less is more: limit the number of questions within the enquiry framework – 7 ± 2 is a good rule of thumb
Keep the evaluation grounded in whatever basic developmental inquiry framework you and those you’re working with choose to guide your work
Distinguish overarching inquiry questions from detailed implementation questions
There are (no) stupid questions
Remember that whatever inquiry framework you are using, the focus is on: What is being developed?
34. DE – is values-based
‘[DE] sits alongside, doesn’t control or dampen the core values of innovation’ (Wehipeihana, cited in Patton, 2010)
Values, processes and relationships become very important
‘How’ outcomes/results are achieved is very important – process matters; people and relationships matter
Where the end is unpredictable and emergent, values and process become anchors: ‘there will never be enough certain knowledge to guide action’ (Wendell Berry, cited in Patton, 2010)
35. DE – inquiry approaches
There is no definitive list of developmental evaluation inquiry approaches – nor should one be constructed
Developmental evaluation creatively adapts whatever approaches and methods fit the complexities of the situation. It is responsive, appropriate, and credible to those you work with, and supports opening up new understandings and guiding further development
‘It’s all about persistently asking questions and pursuing credible answers in time to be used. Questioning is the ultimate method. Questions are the quintessential tools.’ (Patton, 2010)
36. DE – uses a range of interpretive frameworks
But applies evaluation logic and thinking in complex situations
Enables people to engage in data-based, ongoing evaluative sense-making and reasoned argument:
ensuring the criteria (values) people use to decide what’s ‘good’ and desirable are open to scrutiny and made explicit
ensuring the decisions made about development – the ‘forks in the road’ – are documented and systematically reflected upon in a critical and open-minded way
37. Our experience: introducing DE to the uninitiated
Small steps; getting people grounded, on the same page, really matters
The simplest of frameworks: what, why, when, how, where and who
What is happening and what has changed? Why?
Who has been affected? Who is involved?
How did this happen?
Where is the situation/programme at now?
When do key people need to know what?
How do people feel about the change? Why? etc.
38. Or some slightly more evaluative questions:
What’s being developed? (WHAT?)
What sense can we make of emerging issues, evidence and data about this development? (WHAT’S EMERGING?)
What is the value/importance/significance of what we are doing and achieving? What does it mean to us now and in the future? (SO WHAT?)
What does this mean for how we should now act? What are our options? And into the future? (NOW WHAT?)
40. Some useful interpretive frameworks (see Handout 5):
Appreciative Inquiry
Success Case Method
Most Significant Change
Systems approaches – with an emphasis on perspectives, boundaries and interrelationships
Outcome Mapping
Action research
42. Co-construction of ‘as Māori’
Te reo
Kanohi ki te kanohi
Guided by kawa and tikanga
Whānau, hapū, iwi and Māori
Uses Māori institutions, e.g. marae, kōhanga
Whanaungatanga
Traditional Māori sports
‘Spirit of us’ – creating a feeling of belonging and empowerment
43. What’s the value of ‘as Māori’?
‘It’s our own self-determination of how we do it, in ways that we want to do it’
‘To do the things we want to do, in ways that we want to do them’
Opportunity to portray our uniqueness
Effective and better engagement is achieved when cultural values and aspects are core and/or included
Traditional sports are just as valuable in terms of engaging our communities
44. Example – ‘as Māori’ a developmental journey An ‘as Māori’ continuum with five dimensions emerging:
47. Some final thoughts, tips and questions
FAQs we have encountered:
How systematic is DE?
How does DE differ from action research?
Isn’t DE just a bunch of stories – when do you get to outcomes? And measurement? And judgement?
What questions do you have about DE?
48. What we’ve learned along the way
It’s a journey with others – this is not for the sole trader, nor for the faint-hearted!
As an evaluation team, we model the reflection process together
This is hard thinking stuff; it doesn’t come easy
Great rewards are possible
Thank goodness we chose and are able to use DE – it’s the right way to go, but it is hard work
49. Wrap up One thing you learned today One thing you will do as a result of today One thing you will follow up on
50. References
Gamble, J. (2008). A Developmental Evaluation Primer. J.W. McConnell Family Foundation.
Patton, M. Q. (2008). Utilization-Focused Evaluation (4th ed.). Sage.
Patton, M. Q. (2010). Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use. Guilford Press.
Wehipeihana, N., & McKegg, K. (2009). Developmental Evaluation in an Indigenous Context: Reflections on the journey to date. Presentation to the American Evaluation Association Conference, Orlando, Florida.
Westley, F., Zimmerman, B., & Patton, M. Q. (2006). Getting to Maybe: How the World Is Changed. Random House Canada.
Editor's Notes
Ongoing development: adaptation of a programme, strategy or innovation to new conditions in complex, dynamic situations.
Adapting effective general principles to a new context: the dynamic middle between top-down and bottom-up forces for change.
Developing a rapid response: in turbulent, disaster situations, exploring real-time solutions and generative, innovative responses.
Pre-formative development of a potentially scalable innovation: a period of exploration to shape a model into a more fully conceptualised, potentially scalable intervention.
Major systems change and cross-scale developmental evaluation: how major systems change unfolds, evidencing emergent tipping points.
Supports ongoing real-time decisions about what to change, expand, close out or further develop.
Formative evaluation is most commonly conceived as evaluative effort aimed at helping programmes with initial design and set-up and at improving implementation, i.e., getting programmes settled and stable – ready for summative evaluation – or, more importantly, aimed at setting up a programme model of some sort. Summative evaluation addresses the question: did the programme work? This requires a clear understanding and specification of the IT. The IT needs to be identifiable if we are going to be able to say something about it, and formative evaluation has played an important role in this process. Summative evaluation’s purpose is most commonly understood to be about determining the value or worth of a programme, in order to make decisions about continuing, expanding or stopping it.
DE can be pre-programmatic – such as working with a community to try to find ways to work with youth: lots of things are tried, piloted and tested out; not everything works, and changes are made. This is development, and it is not just about improvement – it is about exploring the parameters of an innovation as it takes shape. What’s being tried is an approach, more than a model. It is about helping people clarify, focus and articulate what they are doing, as they do it.
DE also supports ongoing development – adaptation. And this is not necessarily the same as improvement. Change is very often conceived as necessary adaptation to new circumstances or conditions. Implementing strategy in organisations, or even programmes in the real world, just doesn’t happen in a neat way. Although there is the intended implementation, what unfolds is always somewhat different. Some things get dropped or go undone, and some new things emerge as new opportunities arise. In this ‘real world’ of developmental implementation, DE helps programmes and organisations track and document the changes and adaptations as they happen.
The distinction between improvement and development is NOT absolute or unambiguous – it is nuanced, and depends on intention and perception. But importantly, DE is driven by the program, or by the organisation – so if the program wants to call what they are doing development, then it’s DE they are doing. (Chapter 2)
Providing timely feedback, reflection and generation of learnings for the development process
Tracking emergent findings and connections
Responding to the unexpected
Collaboration and co-creation
Capacity development
Internal accountability focus – values orientation
Dealing with ambiguity and lack of control
While this answer IS the most appropriate answer for common living, this puzzle shows how easy it is to give the most obvious answer first and to ignore alternative possibilities. It's so easy not to consider the context of the problem. Different situations need different answers. http://www.brainstorming.co.uk/puzzles/dropblock.html