Developmental Evaluation and the Graduate Student Researcher

This presentation introduces the discipline of program evaluation and offers a glimpse into how developmental evaluation answers the call for an evaluation approach suited to complex contexts, such as social innovation. I conclude by introducing the notion of design and design thinking as a way of approaching the problems we face in today's complex world.

I also briefly discuss some of the strategies I employed to manage my own thesis research as a graduate student researcher in education.

Usage Rights

CC Attribution-NonCommercial-ShareAlike License

Presentation Transcript

  • Developmental Evaluation and the Graduate Student Researcher. Chi Yan Lam, Queen’s University. EGSS ScholarShare, February 22, 2012. @chiyanlam. Assessment and Evaluation Group, Queen’s University.
  • Researching: Writing, Data Management, Project Management, Logic-line planning. Researching Evaluation: Thinking about Evaluation, Utilization-Focused Evaluation, Reality-testing. Cutting-edge Development in Evaluation: Developmental Evaluation.
  • Let’s talk cars. http://www.caranddriver.com/features/2013-subaru-brz-and-2013-scion-fr-s-a-study-in-comparison-and-contrast-feature
  • What would you need to know in order to make a purchase decision? http://www.caranddriver.com/features/2013-subaru-brz-and-2013-scion-fr-s-a-study-in-comparison-and-contrast-feature
  • Insurance? Value? Best bang for the buck? Lease rate? Affordability? Social desirability? Reliability? Comfort? Ride quality? Practicality? Future plans?
  • Buying a car is an evaluative act. http://www.caranddriver.com/features/2013-subaru-brz-and-2013-scion-fr-s-a-study-in-comparison-and-contrast-feature
  • As an evaluator, I think (and care!) deeply about evaluation. (Guba & Lincoln, 1989)
  • What? So what? Now what? (Patton, 2011, p. 3)
  • Purposes of Program Evaluation: • Overall judgement - does it meet the needs of participants? Should we keep it? • Learning/improvement - what works, what doesn’t? How can it be improved? How can quality be enhanced? • Accountability - are goals being met? • Monitoring - graduation rates? Retention? • Knowledge generation - what are the patterns of effectiveness? Site A vs. Site B?
  • Utilization-Focused Evaluation (Patton, 2008, 2012): intended use by intended users. • A framework for making decisions about the evaluation in collaboration with primary users. • Attention paid to stakeholders - people affected by the program and evaluation (Greene, 2006). • Focus is on... use!
  • Utilization-Focused Evaluation (Patton, 2008, 2012): intended use by intended users.
  • Utilization-Focused Evaluation is a deep commitment to reality testing.
  • Simple, Complicated, Complex (Westley, Zimmerman, & Patton, 2008): • Simple - predictable, replicable, known; causal if-then models. • Complicated - predictable, replicable, known; many variables/parts working in tandem in sequence; requires expertise/training; causal if-then models. • Complex - unpredictable, difficult to replicate, unknown; many interacting variables/parts; systems thinking? complex dynamics?
  • http://s3.frank.itlab.us/photo-essays/small/apr_05_2474_plane_moon.jpg
  • How do we navigate in a fast-changing world?
  • http://youtu.be/3SuNx0UrnEo
  • Social Complexity & Social Innovation: the world that we live in today is fast-changing, such that the tools we have for evaluation are no longer adequate.
  • Complex Adaptive Dynamic Systems
  • Developmental Evaluation (Patton, 1994, 2011) • supports innovation development to guide adaptation [of programs] to emergent and dynamic realities in complex environments • processes include: asking evaluative questions, applying evaluation logic, gathering real-time data to inform ongoing decision making, and documenting and tracking program development (sense-making) • informed by complexity science and systems thinking
  • Assessment Pilot Initiative: • Contemporary notions of classroom assessment • Teaching and learning constraints • Interested in integrating social media into teacher education (classroom assessment) • The thinking was that assessment learning requires learners to actively engage with peers and challenge their own experiences and conceptions of assessment.
  • Uncertainty: • uncertain about how to proceed • uncertain about what to use in order to proceed • uncertain how teacher candidates would respond. Clear, measurable, and specific outcomes and the use of planning frameworks were not possible; traditional evaluation cycles wouldn’t work.
  • Book-ending: Concluding Conditions • In the end, 22 candidates participated in a pilot program. • Teacher candidates tweeted about their own experiences trying to put contemporary notions of assessment into practice. • Guided by the script: “Think, Tweet, Share”. • Developmental evaluation guided this exploration between the instructors, evaluator, and teacher candidates as a collective in this participatory learning experience.
  • Research Purpose: to learn about the capacity of developmental evaluation to support innovation.
  • Why? • DE is still tentative. • The evaluation community craves “practical knowledge” (Schwandt, 2008) about evaluation approaches: knowledge generation.
  • Research Questions: 1. To what extent does the Assessment Pilot Initiative qualify as a developmental evaluation? 2. What contribution does developmental evaluation make to enable and promote program development? 3. To what extent does developmental evaluation address the needs of the developers in ways that inform program development? 4. What insights, if any, can be drawn from this development about the roles and the responsibilities of the developmental evaluator?
  • Method & Methodology • Questions drive method (Greene, 2007; Teddlie & Tashakkori, 2009) • Qualitative case study: • understanding the intricacies of the phenomenon and its context • a case is a “specific, unique, bounded system” (Stake, 2005, p. 436) • understanding the system’s activity, its function, and its interactions • qualitative research to describe, understand, and infer meaning.
  • Data Sources • Three pillars of data: 1. Program development records 2. Interviews with clients on the significance of various DE episodes 3. My own reactions to the ongoing development, via document-elicitation
  • Data Analysis: 1. Reconstructing the evidentiary base 2. Identifying developmental episodes (p. 47) 3. Coding for developmental moments (p. 49) 4. Time-series analysis
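
As a purely illustrative sketch of steps 3 and 4 (not code from the thesis), coded developmental moments could be tallied by week to surface patterns over time; the dates and records below are invented, and the code labels borrow from the six foci of development named later in the deck.

```python
# Illustrative sketch only (not from the thesis): tally hypothetical coded
# "developmental moments" by ISO week to support a time-series view.
from collections import Counter
from datetime import date

# Each record pairs a (hypothetical) date with the focus code assigned
# during analysis; the code labels borrow from the six foci of development.
moments = [
    (date(2011, 9, 12), "definition"),
    (date(2011, 9, 19), "collaboration"),
    (date(2011, 9, 21), "collaboration"),
    (date(2011, 10, 3), "prototyping"),
    (date(2011, 10, 5), "evaluation"),
]

# Group by ISO week number so patterns over time become visible.
weekly = Counter((d.isocalendar()[1], code) for d, code in moments)

for (week, code), n in sorted(weekly.items()):
    print(f"week {week}: {code} x{n}")
```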
  • Key Developmental Episodes • Ep 1: Evolving understanding in using social media for professional learning. • Ep 2: Explicating values through Appreciative Inquiry for program development. • Ep 3: Enhancing collaboration through structured communication. • Ep 4: Program development through the use of evaluative data.
  • Major Findings, RQ1: To what extent does API qualify as a developmental evaluation? 1. Preformative development of a potentially broad-impact, scalable innovation 2. Patton: Did something get developed? (improvement vs. development vs. innovation) 3. Trends (patterns over time)
  • Development occurred through purposeful interactions (i.e., developmental episodes) • concretization of ideas and thinking (e.g., reflection; green, across) • “intensification” in developmental evaluative activities
  • Major Findings, RQ2: What contribution does DE make to enable and promote program development? 1. Lent a data-informed process to innovation (p. 97) 2. Implication: responsiveness • in candidates’ reactions • in the program 3. Consequence: resolving uncertainty
  • Blue DMs (developmental moments) - the kinds of issues and concerns that surfaced. • Six foci of development: definition, delineation, collaboration, prototyping, illumination, evaluation. • A non-linear, cyclical process.
  • Major Findings, RQ3: To what extent does DE address the needs of developers in ways that inform program development? 1. Through promoting learning, and enacting a learning framework 2. Values and valuing
  • Major Findings, RQ4: What insights, if any, can be drawn from this development about the roles and the responsibilities of the developmental evaluator? 1. Manager 2. Facilitator of learning 3. Evaluator 4. Innovation thinker
  • Conclusion • Innovation developed in a context of complexity, guided by developmental evaluation • Preformative development • Evaluator as someone who co-develops a program and draws upon substantive knowledge/skills to promote development • Innovation process (six foci of development)
  • Design and Design Thinking
  • Design + Design Thinking: “Design is the systematic exploration into the complexity of options (in program values, assumptions, output, impact, and technologies) and decision-making processes that results in purposeful decisions about the features and components of a program-in-development that is informed by the best conception of the complexity surrounding a social need. Design is dependent on the existence and validity of highly situated and contextualized knowledge about the realities of stakeholders at a site of innovation. The design process fits potential technologies, ideas, and concepts to reconfigure the social realities. This results in the emergence of a program that is adaptive and responsive to the needs of program users.” (Lam, 2011, pp. 137-138)
  • Implications for Evaluation • One of the first documented case studies of developmental evaluation • Contributions to understanding, analyzing, and reporting development as a process • Delineates the kinds of roles and responsibilities that promote development • The notion of design emerges from this study
  • Limitations • Contextually bound, so not generalizable • but it does add knowledge to the field • The study’s data are only as good as the data collected from the evaluation • better if I had captured the program-in-action • Analysis of the outcomes of API could help strengthen the case study • but not necessary to achieving the research foci • Cross-case analysis would be a better method for generating understanding.
  • Let’s talk about researching.
  • Writing is thinking. To think is to write.
  • Data Management • Document, document, document: keep a researcher’s log. • Use a hanging-folder method, with a folder for each phase of the project: proposal, lit, ethics, consent forms, transcripts, examples, plus special folders for each type of your research data. • Do the same for your digital files, as sketched below!
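
A minimal sketch of what mirroring the hanging-folder method digitally might look like; the phase names come from the slide, while the root path and log file name are hypothetical.

```python
# Sketch: scaffold one digital folder per thesis-project phase,
# mirroring the hanging-folder method. Paths are hypothetical.
from pathlib import Path

phases = ["proposal", "lit", "ethics", "consent-forms",
          "transcripts", "examples"]

root = Path("thesis-project")  # hypothetical project root
for phase in phases:
    (root / phase).mkdir(parents=True, exist_ok=True)

# Keep the researcher's log alongside the data folders.
(root / "researchers-log.md").touch()
```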
  • Project Management • Think about: deadlines, deliverables, lead time, wait-time. • Tips: budget twice the time; budget for re-writing and re-drafting; work in time blocks (pomodoro 25-4); immerse yourself.
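
As a toy illustration of the “pomodoro 25-4” time-block idea, the sketch below alternates 25-minute work blocks with 4-minute breaks; the four-pomodoro session length is an assumption, not something the slide specifies.

```python
# Minimal sketch of the "pomodoro 25-4" note from the slide:
# 25-minute work blocks separated by 4-minute breaks.
import time

WORK_MIN, BREAK_MIN = 25, 4

def block(label: str, minutes: int) -> None:
    print(f"{label}: {minutes} min")
    time.sleep(minutes * 60)  # wait out the interval

for n in range(1, 5):  # a four-pomodoro session, as an example
    block(f"pomodoro {n} work", WORK_MIN)
    block(f"pomodoro {n} break", BREAK_MIN)
```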
  • Logic-line Planning
  • Thank You! Let’s Connect! @chiyanlam · chi.lam@QueensU.ca