Insights on Using Developmental Evaluation for Innovating: A Case Study on the Co-Creation of an Innovative Program
 


Presentation given at #CESToronto2013. Abstract: Developmental evaluation (DE) supports social innovation and program development by guiding program adaptation to emergent and dynamic social realities (Patton, 2011; Preskill & Beer, 2012). This presentation examines a case study of the pre-formative development of an innovative educational program. It contributes to research on DE by examining the capacity and contribution of DE for innovating. This case provides evidence supporting the utility of DE in developing innovative programs, but challenges our current understanding of DE on two fronts: 1) the necessary move from data-based to data-informed decision-making within a context of innovating, and 2) the use of DE for program co-creation in response to the demands of social innovation. Analysis reveals the pervasiveness of uncertainty throughout development and how the rendering of evaluative data helps to propel development forward. DE enabled a nonlinear, co-evolutionary development process centering on six foci of development (definition, delineation, collaboration, prototyping, illumination, and evaluation) that characterize the innovation process.


    Presentation Transcript

    • Insights on Using Developmental Evaluation for Innovating: A Case Study on the Co-Creation of an Innovative Program. Chi Yan Lam, MEd (@chiyanlam). CES 2013, June 11, 2013. Assessment and Evaluation Group, Queen’s University. Slides available at www.chiyanlam.com
    • Acknowledgements
    • “The significant problems we have cannot be solved at the same level of thinking with which we created them.” (image: http://yareah.com/wp-content/uploads/2012/04/einstein.jpg)
    • Impersonal. Hard to focus. Passive.
    • ✓ Learn from peers ✓ Reflect on prior experiences ✓ Meaning-making ✓ Active construction of knowledge
    • Dilemma: Barriers to teaching and learning: $, time, space. Practicum = out of sight, out of touch. Instructors became interested in integrating Web technologies into teacher education to open up possibilities. The thinking was that assessment learning requires learners to actively engage with peers and challenge their own experiences and conceptions of assessment.
    • So what happened...? 22 teacher candidates participated in a hybrid, blended-learning pilot. They tweeted about their own experiences around trying to put into practice contemporary notions of assessment, guided by the script “Think, Tweet, Share.” Developmental evaluation guided this exploration between the instructors, evaluator, and teacher candidates as a collective in this participatory learning experience. DE became integrated; the program became agile and responsive by design.
    • Research Purpose: to learn about the capacity of developmental evaluation to support innovation development (from nothing to something).
    • Research Questions: 1. To what extent does the Assessment Pilot Initiative qualify as a developmental evaluation? 2. What contribution does developmental evaluation make to enable and promote program development? 3. To what extent does developmental evaluation address the needs of the developers in ways that inform program development? 4. What insights, if any, can be drawn from this development about the roles and responsibilities of the developmental evaluator?
    • Literature Review
    • Developmental Evaluation in 1994: collaborative, long-term partnership; purpose: program development; observation: clients who eschew clear, specific, measurable goals.
    • Developmental Evaluation in 2011: takes on a responsive, collaborative, adaptive orientation to evaluation; complexity concepts; systems thinking; social innovation.
    • Developmental Evaluation: DE supports innovation development to guide adaptation to emergent and dynamic realities in complex environments. DE brings to innovation and adaptation the processes of asking evaluative questions, applying evaluation logic, and gathering and reporting evaluation data to inform and support project, program, product, and/or organizational development in real time. Thus, feedback is rapid. The evaluator works collaboratively with social innovators to conceptualize, design, and test new approaches in a long-term, ongoing process of adaptation, intentional change, and development. Primary functions of the evaluator: elucidate the innovation and adaptation processes; track their implications and results; facilitate ongoing, real-time, data-based decision-making in the developmental process. (Patton, 2011)
    • Improvement vs. Development
    • Developmental evaluation is reality testing.
    • 1. Social Innovation: SI aspires to change and transform social realities (Westley, Zimmerman, & Patton, 2006). SI is about generating “novel solutions to social problems” that are “more effective, efficient, sustainable, or just than existing solutions and for which the value created accrues primarily to society as a whole rather than private individuals” (Phills, Deiglmeier, & Miller, 2008).
    • 2. Complexity Thinking. Situational analysis and complexity concepts: sensitizing frameworks that attune the evaluator to certain things.
    • Simple vs. Complicated vs. Complex (Westley, Zimmerman, & Patton, 2008). Simple: predictable; replicable; known; causal if-then models. Complicated: predictable; replicable; known; many variables/parts working in tandem in sequence; requires expertise/training; causal if-then models. Complex: unpredictable; difficult to replicate; unknown; many interacting variables/parts; systems thinking?; complex dynamics? (And beyond: chaos.)
    • (image: http://s3.frank.itlab.us/photo-essays/small/apr_05_2474_plane_moon.jpg)
    • Complexity Concepts: understanding the dynamical behaviour of systems; description of behaviour over time; metaphors for describing change; how things change; NOT predictive, not explanatory (existence of some underlying principles; rules-driven behaviour).
    • Complexity Concepts: Nonlinearity (a butterfly flaps its wings; black swans): cause and effect. Emergence: new behaviour emerges from interaction; indicators can’t really be predetermined. Adaptation: systems respond and adapt to each other and to environments. Uncertainty: processes and outcomes are unpredictable, uncontrollable, and unknowable in advance. Dynamical: interactions within, between, and among subsystems change in unpredictable ways. Co-evolution: change in response to adaptation (growing old together).
    • 3. Systems Thinking: pays attention to the influences and relationships between systems in reference to the whole; a system is a dynamic, complex, structured functional unit; there is flow and exchange between sub-systems; systems are situated within a particular context.
    • Complex Adaptive Dynamic Systems
    • Practicing DE: adaptive to context, agile in methods, responsive to needs; evaluative thinking as critical thinking; bricoleur; “purpose-and-relationship-driven not [research] method driven” (Patton, 2011, p. 288).
    • Five Purposes and Uses of DE (Patton, 2011, p. 194): 1. Ongoing development in adapting a program, strategy, policy, etc. 2. Adapting effective principles to a local context. 3. Developing a rapid response. 4. Preformative development of a potentially broad-impact, scalable innovation. 5. Major systems change and cross-scale developmental evaluation.
    • Method & Methodology: Questions drive method (Greene, 2007; Teddlie & Tashakkori, 2009). Qualitative case study: understanding the intricacies of the phenomenon and the context. A case is a “specific, unique, bounded system” (Stake, 2005, p. 436). Understanding the system’s activity, and its function and interactions. Qualitative research to describe, understand, and infer meaning.
    • Data Sources. Three pillars of data: 1. Program development records. 2. Development artifacts. 3. Interviews with clients on the significance of various DE episodes.
    • Data Analysis: 1. Reconstructing the evidentiary base. 2. Identifying developmental episodes. 3. Coding for developmental moments. 4. Time-series analysis.
    • How the innovation came to be...
    • Key Developmental Episodes: Ep 1: Evolving understanding in using social media for professional learning. Ep 2: Explicating values through Appreciative Inquiry for program development. Ep 3: Enhancing collaboration through structured communication. Ep 4: Program development through the use of evaluative data. “You can’t connect the dots looking forward; you can only connect them looking backwards.” (Steve Jobs)
    • (Wicked) Uncertainty: uncertain about how to proceed; uncertain about in what direction to proceed (given many choices); uncertain about the effects and outcomes of how teacher candidates would respond to the intervention. The more questions we answered, the more questions we raised. Typical of DE: clear, measurable, and specific outcomes and planning frameworks were absent; traditional evaluation cycles wouldn’t work.
    • How the innovation came to be... Reframing what constituted “data”: not intentional, but an adaptive response to emergent needs: informational needs concerning development (collected, analyzed, interpreted); relevant theories, concepts, and ideas (introduced to catalyze thinking; led to learning and un-learning).
    • Major Findings. RQ1: To what extent does the API qualify as a developmental evaluation? 1. Preformative development of a potentially broad-impact, scalable innovation ✔ 2. Patton: Did something get developed? ✔ (Improvement vs. development vs. innovation ✗)
    • RQ2: What contribution does DE make to enable and promote program development? 1. Lent a data-informed process to innovation. 2. Implication: responsiveness; the program-in-action became adaptive to the emergent needs of users. 3. Consequence: resolving uncertainty.
    • RQ3: To what extent does DE address the needs of developers in ways that inform program development? 1. Definition: defining the “problem.” 2. Delineation: narrowing down the problem space. 3. Collaboration: collaboration processes; drawing on collective strengths and contributions. 4. Prototyping: integration and synthesis of ideas to ready a program for implementation. 5. Illumination: iterative learning and adaptive development. 6. Evaluation: formal evaluation processes to reality-test.
    • Implications for Evaluation: one of the first documented case studies of developmental evaluation; contributions to understanding, analyzing, and reporting development as a process; delineating the kinds of roles and responsibilities that promote development; the notion of design emerges from this study.
    • Implications for Theory
    • Program as co-created; attending to the “theory” of the program; DE as a way to drive the innovating process; six foci of development; designing programs?
    • Design and Design Thinking
    • Design + Design Thinking: “Design is the systematic exploration into the complexity of options (in program values, assumptions, output, impact, and technologies) and decision-making processes that results in purposeful decisions about the features and components of a program-in-development that is informed by the best conception of the complexity surrounding a social need. Design is dependent on the existence and validity of highly situated and contextualized knowledge about the realities of stakeholders at a site of innovation. The design process fits potential technologies, ideas, and concepts to reconfigure the social realities. This results in the emergence of a program that is adaptive and responsive to the needs of program users.” (Lam, 2011, pp. 137-138)
    • Implications for Evaluation Practice: 1. Manager. 2. Facilitator of learning. 3. Evaluator. 4. Innovation thinker.
    • Limitations: Contextually bound, so not generalizable (but it does add knowledge to the field). The data of the study are only as good as the data collected from the evaluation (better if I had captured the program-in-action). Analysis of the outcome of the API could help strengthen the case study (but is not necessary to achieving the research foci). Cross-case analysis would be a better method for generating understanding.
    • Thank You! Let’s Connect. @chiyanlam | chi.lam@QueensU.ca | www.chiyanlam.com