A Case Study on the Use of Developmental Evaluation  for Navigating Uncertainty during Social Innovation
  • Welcome! Thanks for coming today. It is truly exciting to speak on the topic of developmental evaluation, and many thanks to the Evaluation Use TIG for making this possible. Today, I want to share the results of a project I've been working on for some time, and in doing so, challenge our collective thinking around the space of possibility created by developmental evaluation. For this paper, I want to focus on the innovation space and on the process of innovating. The slides are already available at www.chiyanlam.com and will be available shortly under the Eval Use TIG elibrary.
  • Let me frame this presentation with a quote by Albert Einstein.
  • In 1994, Patton made the observation that some clients resisted typical formative/summative evaluation approaches because of the work that they do. They themselves don't see a point in freezing a program in time in order to have it assessed and evaluated. He described an approach where he worked collaboratively, as part of the team, to help render evaluative data that would help these program staff adapt and evolve their programs. So, the process of engaging in "developmental evaluation" becomes paramount.
  • Fast forward to 2011, ... These are concepts that I will only touch on briefly.
  • Developmental evaluation is positioned as a response for evaluators who work in complex spaces. First described by Patton in 1994, and further elaborated in 2011, developmental evaluation proposes a collaborative and participatory approach to involving the evaluator in the development process.
  • So what underlies DE is a commitment to reality testing. Patton positions it as one of the approaches one could take within a utilization-focused framework: formative/summative/developmental.
  • You only need to attend conferences like this one to hear the buzz and curiosity over developmental evaluation. In February, a webinar was offered by UNICEF, and over 500+...
  • If we move beyond the excitement and buzz around DE, we see that DE is still very new. There is not a lot of empirical or practical basis to the arguments. If we are serious about the utilization of DE, we need research on its scope, limitations, and suitability across contexts.
  • So, in the remaining time, I want us to dig deep into a case of developmental evaluation.
  • The observation that Patton made back in 1994 is very acute: social innovations don't rest, programs don't stand still, and addressing social problems means aiming at a moving target that shifts as society changes...
  • Let's put it in a learning context. Simple would be teaching you CPR: I'll keep the steps simple and rehearse them many times, so that you can do it when needed. Complicated would be teaching teacher candidates how to plan a lesson while taking into consideration curriculum expectations, learning objectives, instructional methods/strategies, and assessment methods. Complex would be preparing teacher candidates to become professionals: there are many different parts (practicum, foci, professional classes) through which they learn to think, behave, and participate like a contributing member of the profession.
  • Document analysis of 10 design meetings over a 10-month period reveals the developmental "footprint" and tracks and identifies the specific developmental concerns being unpacked at a given point in the project. Interviews with the core design team members (2 instructors, 1 lead TA) illuminate which aspects of the DE the program designers found helpful.
  • The instructors in the case were responsible for teaching classroom assessment to teacher candidates enrolled in a teacher education program. The field of teacher education, particularly classroom assessment, is experiencing a very tricky situation: teachers are not assessing in the ways that we know help with student learning. Much of what teachers currently do focuses on traditional notions of testing. At the level of teacher education, the problem is felt more acutely, because teacher candidates are not necessarily exhibiting the kinds of practice we would like to see from them. At my institution, we have two instructors responsible for delivering a module in classroom assessment in 7 hours total. In brief, there are many constraints that we have to work around, many of which we have little control over. What we do have control over is how we deliver that instruction. After a survey of different options, we were interested in integrating social media into teacher education as a way of building a community of learners. Our thinking was that assessment learning requires learners to actively engage with peers and to challenge their own experiences and conceptions of assessment. That was the vision that guided our work.
  • So those were the beginning conditions. Let me fast forward and describe for you how far we got in the evaluation. Then we'll look at how the innovation came to be.
  • The development was marked by several key representative episodes: (1) creating a learning environment; (2) the use of Appreciative Inquiry to help explicate values, so as to gain clarity into the quality of the program; (3) an example of how DE promotes collaboration; and (4) the use of DE findings to engage clients in sense-making in order to formulate next steps.
  • When I look back at the data, uncertainty was evident throughout the evaluation. The team was uncertain about...
  • ... Typically, data in an evaluation are produced at the program level, describing quality...
  • So let's return now and unpack the case.

    1. A Case Study on the Use of Developmental Evaluation for Navigating Uncertainty during Social Innovation. Chi Yan Lam, MEd. AEA 2012, @chiyanlam, October 25, 2012. Assessment and Evaluation Group, Queen's University. Slides available now at www.chiyanlam.com
    2. "The significant problems we have cannot be solved at the same level of thinking with which we created them." http://yareah.com/wp-content/uploads/2012/04/einstein.jpg
    3. Developmental Evaluation in 1994 • collaborative, long-term partnership • purpose: program development • observation: clients who eschew clear, specific, measurable goals
    4. Developmental Evaluation in 2011 • takes on a responsive, collaborative, adaptive orientation to evaluation • complexity concepts • systems thinking • social innovation
    5. Developmental Evaluation (Patton, 1994, 2011). DE supports innovation development to guide adaptation to emergent and dynamic realities in complex environments. DE brings to innovation and adaptation the processes of: asking evaluative questions; applying evaluation logic; gathering and reporting evaluation data to inform and support project, program, product, and/or organizational development in real time. Thus, feedback is rapid. The evaluator works collaboratively with social innovators to conceptualize, design, and test new approaches in a long-term, ongoing process of adaptation, intentional change and development. Primary functions of the evaluator: elucidate the innovation and adaptation processes; track their implications and results; facilitate ongoing, real-time, data-based decision-making in the developmental process.
    6. Developmental Evaluation (Patton, 1994, 2011). [Build of slide 5.]
    7. Developmental Evaluation (Patton, 1994, 2011). Improvement vs. Development. [Build of slide 5.]
    8. Developmental Evaluation is reality testing.
    9. • DE: novel, yet-to-be-developed empirical & practical basis • Research on Evaluation: scope and limitations; utility & suitability in different contexts • Legitimize DE
    10. Overview • Theoretical Overview (only briefly) • Case Context • Case Study
    11. Research Purpose: to learn about the capacity of developmental evaluation to support innovation development (from nothing to something).
    12. Research Questions 1. To what extent does the Assessment Pilot Initiative qualify as a developmental evaluation? 2. What contribution does developmental evaluation make to enable and promote program development? 3. To what extent does developmental evaluation address the needs of the developers in ways that inform program development? 4. What insights, if any, can be drawn from this development about the roles and the responsibilities of the developmental evaluator?
    13. Social Innovation • Social innovations aspire to change and transform social realities (Westley, Zimmerman, & Patton, 2006) • generating "novel solutions to social problems that are more effective, efficient, sustainable, or just than existing solutions and for which the value created accrues primarily to society as a whole rather than private individuals" (Phills, Deiglmeier, & Miller, 2008)
    14. Complexity Thinking • Situational Analysis • Complexity Concepts: "sensemaking" frameworks that attune the evaluator to certain things
    15. Simple, Complicated, Complex, Chaos (Westley, Zimmerman, Patton, 2008). Simple: predictable; replicable; known; causal if-then models. Complicated: predictable; replicable; known; many variables/parts working in tandem in sequence; requires expertise/training; causal if-then models. Complex: unpredictable; difficult to replicate; unknown; many interacting variables/parts; systems thinking?; complex dynamics?
    16. http://s3.frank.itlab.us/photo-essays/small/apr_05_2474_plane_moon.jpg
    17. Complexity Concepts • understanding dynamical behaviour of systems • description of behaviour over time • metaphors for describing change • how things change • NOT predictive, not explanatory • (existence of some underlying principles; rules-driven behaviour)
    18. Complexity Concepts • Nonlinearity (butterfly flaps its wings, black swan); cause and effect • Emergence: new behaviour emerges from interaction... can't really predetermine indicators • Adaptation: systems respond and adapt to each other, to environments • Uncertainty: processes and outcomes are unpredictable, uncontrollable, and unknowable in advance • Dynamical: interactions within, between, and among subsystems change in an unpredictable way • Co-evolution: change in response to adaptation (growing old together)
    19. Systems Thinking • Pays attention to the influences and relationships between systems in reference to the whole • a system is a dynamic, complex, structured functional unit • there is flow and exchange between systems • systems are situated within a particular context
    20. Complex Adaptive Dynamic Systems
    21. Practicing DE • Adaptive to context, agile in methods, responsive to needs • evaluative thinking - critical thinking • bricoleur • "purpose-and-relationship-driven not [research] method driven" (Patton, 2011, p. 288)
    22. Five Uses of DE (Patton, 2011, p. 194). Five Purposes and Uses: 1. Ongoing development in adapting a program, strategy, policy, etc. 2. Adapting effective principles to a local context 3. Developing a rapid response 4. Preformative development of a potentially broad-impact, scalable innovation 5. Major systems change and cross-scale developmental evaluation
    23. Method & Methodology • Questions drive method (Greene, 2007; Teddlie & Tashakkori, 2009) • Qualitative Case Study • understanding the intricacies of the phenomenon and the context • Case is a "specific, unique, bounded system" (Stake, 2005, p. 436) • Understanding the system's activity, and its function and interactions • Qualitative research to describe, understand, and infer meaning
    24. Data Sources • Three pillars of data: 1. Program development records 2. Development artifacts 3. Interviews with clients on the significance of various DE episodes
    25. Data Analysis 1. Reconstructing the evidentiary base 2. Identifying developmental episodes 3. Coding for developmental moments 4. Time-series analysis
    26. [Image slide.]
    27. Assessment Pilot Initiative • Describes the innovative efforts of a team of 3 teacher educators promoting contemporary notions of classroom assessment • Teaching and learning constraints ($, time, space) • Interested in integrating social media into teacher education (classroom assessment) • The thinking was that assessment learning requires learners to actively engage with peers and challenge their own experiences and conceptions of assessment
    28. [Image slide.]
    29. Book-ending: Concluding Conditions • 22 teacher candidates participated in a hybrid, blended learning pilot. They tweeted about their own experiences around trying to put into practice contemporary notions of assessment • Guided by the script "Think Tweet Share", grounded in e-learning and learning theories • Developmental evaluation guided this exploration, between the instructors, evaluator, and teacher candidates as a collective in this participatory learning experience • DE became integrated; the program became agile and responsive by design
    30. [Image slide.]
    31. How the innovation came to be...
    32. Key Developmental Episodes • Ep 1: Evolving understanding in using social media for professional learning • Ep 2: Explicating values through Appreciative Inquiry for program development • Ep 3: Enhancing collaboration through structured communication • Ep 4: Program development through the use of evaluative data. "Again, you can't connect the dots looking forward; you can only connect them looking backwards." - Steve Jobs
    33. (Wicked) Uncertainty • uncertain about how to proceed • uncertain about in what direction to proceed (given many choices) • uncertain how teacher candidates would respond to the intervention • the more questions we answered, the more questions we raised • Typical of DE: • Clear, Measurable, and Specific Outcomes • Use of planning frameworks • Traditional evaluation cycles wouldn't work
    34. How the innovation came to be... • Reframing what constituted "data" • not intentional, but an adaptive response • informational needs concerning development; collected, analyzed, interpreted • relevant theories, concepts, ideas; introduced to catalyze thinking. Led to learning and un-learning.
    35. Major Findings. RQ1: To what extent does API qualify as a developmental evaluation? 1. Preformative development of a potentially broad-impact, scalable innovation 2. Patton: Did something get developed? ✗ (Improvement vs development vs innovation) ✔ ✔
    36. RQ2: What contribution does DE make to enable and promote program development? 1. Lent a data-informed process to innovation 2. Implication: responsiveness • the program-in-action became adaptive to the emergent needs of users 3. Consequence: resolving uncertainty
    37. RQ3: To what extent does DE address the needs of developers in ways that inform program development? 1. Definition - defining the "problem" 2. Delineation - narrowing down the problem space 3. Collaboration - collaboration processes; drawing on collective strength and contributions 4. Prototyping - integration and synthesis of ideas to ready a program for implementation 5. Illumination - iterative learning and adaptive development 6. Evaluation - formal evaluation processes to reality-test
    38. Implications for Evaluation • One of the first documented case studies of developmental evaluation • Contributions to understanding, analyzing, and reporting development as a process • Delineating the kinds of roles and responsibilities that promote development • The notion of design emerges from this study
    39. Implications for Theory
    40. • Program as co-created • Attending to the "theory" of the program • DE as a way to drive the innovating process • Six foci of development • Designing programs?
    41. Design and Design Thinking
    42. Design + Design Thinking. "Design is the systematic exploration into the complexity of options (in program values, assumptions, output, impact, and technologies) and decision-making processes that results in purposeful decisions about the features and components of a program-in-development that is informed by the best conception of the complexity surrounding a social need. Design is dependent on the existence and validity of highly situated and contextualized knowledge about the realities of stakeholders at a site of innovation. The design process fits potential technologies, ideas, and concepts to reconfigure the social realities. This results in the emergence of a program that is adaptive and responsive to the needs of program users." (Lam, 2011, pp. 137-138)
    43. Implications for Evaluation Practice 1. Manager 2. Facilitator of learning 3. Evaluator 4. Innovation thinker
    44. Limitations • Contextually bound, so not generalizable • but it does add knowledge to the field • The data of the study are only as good as the data collected from the evaluation • better if I had captured the program-in-action • Analysis of the outcomes of API could help strengthen the case study • but not necessary to achieving the research foci • Cross-case analysis would be a better method for generating understanding
    45. Thank You! Let's Connect! @chiyanlam chi.lam@QueensU.ca
