Developmental Evaluation and the Graduate Student Researcher

Chi Yan Lam
Queen’s University
EGSS ScholarShare | @chiyanlam
February 22, 2012
Assessment and Evaluation Group, Queen’s University
Researching & Researching Evaluation

Cutting-edge Developments in Evaluation
•   Thinking about Evaluation
•   Utilization-Focused Evaluation
•   Reality-testing
•   Developmental Evaluation

Researching
•   Writing
•   Data Management
•   Project Management
•   Logic-line planning
Let’s talk cars.
http://www.caranddriver.com/features/2013-subaru-brz-and-2013-scion-fr-s-a-study-in-comparison-and-contrast-feature
What would you need to know in order to make a purchase decision?
http://www.caranddriver.com/features/2013-subaru-brz-and-2013-scion-fr-s-a-study-in-comparison-and-contrast-feature
Insurance? Value? Best bang for the buck? Lease rate? Affordability? Social desirability? Reliability? Comfort? Ride quality? Practicality? Future plans?
Buying a car is an evaluative act.
http://www.caranddriver.com/features/2013-subaru-brz-and-2013-scion-fr-s-a-study-in-comparison-and-contrast-feature
As an evaluator, I think (and care!) deeply about evaluation. (Guba & Lincoln, 1989)
What? So what? Now what? (Patton, 2011, p. 3)
Purposes of Program Evaluation
•   Overall judgement - Does the program meet the needs of participants? Should we keep it?
•   Learning/Improvement - What works and what doesn’t? How can the program be improved? How can quality be enhanced?
•   Accountability - Are goals being met?
•   Monitoring - Graduation rates? Retention?
•   Knowledge generation - What are the patterns of effectiveness? Site A vs. Site B?
Utilization-Focused Evaluation (Patton, 2008, 2012)

“Intended use by intended users”

•   A framework for making decisions about the evaluation in collaboration with primary users.
•   Attention is paid to stakeholders - the people affected by the program and the evaluation (Greene, 2006).
•   The focus is on... use!
Utilization-Focused Evaluation is a deep commitment to reality testing.
Simple
•   predictable
•   replicable
•   known
•   causal if-then models

Complicated
•   predictable
•   replicable
•   known
•   many variables/parts working in tandem in sequence
•   requires expertise/training
•   causal if-then models

Complex
•   unpredictable
•   difficult to replicate
•   unknown
•   many interacting variables/parts
•   systems thinking?
•   complex dynamics?

(Westley, Zimmerman, & Patton, 2008)
http://s3.frank.itlab.us/photo-essays/small/apr_05_2474_plane_moon.jpg
How do we navigate in a fast-changing world?
http://youtu.be/3SuNx0UrnEo
Social Complexity & Social Innovation

The world we live in today is fast-changing, such that the tools we have for evaluation are no longer adequate.
Complex Adaptive Dynamic Systems
Developmental Evaluation (Patton, 1994, 2011)

•   Supports innovation development to guide adaptation [of programs] to emergent and dynamic realities in complex environments.
•   Processes include:
    •   asking evaluative questions
    •   applying evaluation logic
    •   gathering real-time data to inform ongoing decision making
    •   documenting and tracking program development; sense-making
•   Informed by complexity science and systems thinking.
Assessment Pilot Initiative
•   Contemporary notions of classroom assessment
•   Teaching and learning constraints
•   Interested in integrating social media into teacher education (classroom assessment)
    •   The thinking was that assessment learning requires learners to actively engage with peers and to challenge their own experiences and conceptions of assessment.
Uncertainty
•   uncertain about how to proceed
•   uncertain about what to use in order to proceed
•   uncertain about how teacher candidates would respond

Under that uncertainty, the usual tools wouldn’t work:
•   clear, measurable, and specific outcomes
•   use of planning frameworks
•   traditional evaluation cycles
Book-ending: Concluding Conditions
•   In the end, 22 candidates participated in a pilot program.
•   Teacher candidates tweeted about their own experiences of trying to put contemporary notions of assessment into practice.
•   Guided by the script: “Think Tweet Share”
•   Developmental evaluation guided this exploration among the instructors, the evaluator, and the teacher candidates as a collective in this participatory learning experience.
Research Purpose

To learn about the capacity of developmental evaluation to support innovation.
Why?

•   DE is still tentative.
•   The evaluation community craves “practical knowledge” (Schwandt, 2008) about evaluation approaches: knowledge generation.
Research Questions

1. To what extent does the Assessment Pilot Initiative qualify as a developmental evaluation?
2. What contribution does developmental evaluation make to enable and promote program development?
3. To what extent does developmental evaluation address the needs of the developers in ways that inform program development?
4. What insights, if any, can be drawn from this development about the roles and the responsibilities of the developmental evaluator?
Method & Methodology
•   Questions drive method (Greene, 2007; Teddlie & Tashakkori, 2009).
•   Qualitative case study:
    •   understanding the intricacies of the phenomenon and the context
    •   a case is a “specific, unique, bounded system” (Stake, 2005, p. 436)
    •   understanding the system’s activity, its function, and its interactions
•   Qualitative research to describe, understand, and infer meaning.
Data Sources

Three pillars of data:
1. Program development records
2. Interviews with clients on the significance of various DE episodes
3. My own reactions to the ongoing development, captured via document-elicitation
Data Analysis

1. Reconstructing the evidentiary base
2. Identifying developmental episodes (p. 47)
3. Coding for developmental moments (p. 49)
4. Time-series analysis
Key Developmental Episodes
•   Ep 1: Evolving understanding in using social media for professional learning.
•   Ep 2: Explicating values through Appreciative Inquiry for program development.
•   Ep 3: Enhancing collaboration through structured communication.
•   Ep 4: Program development through the use of evaluative data.
Major Findings
RQ1: To what extent does the API qualify as a developmental evaluation?

1. Preformative development of a potentially broad-impact, scalable innovation
2. Patton: Did something get developed? (Improvement vs. development vs. innovation)
3. Trends (patterns over time)
•   Development occurred through purposeful interactions (i.e., developmental episodes)
•   Concretization of ideas and thinking (e.g., reflection; green, across)
•   “Intensification” in developmental evaluative activities
Major Findings
RQ2: What contribution does DE make to enable and promote program development?

1. Lent a data-informed process to innovation (p. 97)
2. Implication: responsiveness
    •   in candidates’ reactions
    •   in the program
3. Consequence: resolving uncertainty
•   Blue DMs: the kinds of issues and concerns that surfaced
•   Six foci of development: definition, delineation, collaboration, prototyping, illumination, evaluation
•   A non-linear, cyclical process
Major Findings
RQ3: To what extent does DE address the needs of developers in ways that inform program development?

1. Through promoting learning and enacting a learning framework
2. Values and valuing
Major Findings
RQ4: What insights, if any, can be drawn from this development about the roles and the responsibilities of the developmental evaluator?

1. Manager
2. Facilitator of learning
3. Evaluator
4. Innovation thinker
Conclusion
•   Innovation developed in a context of complexity, guided by developmental evaluation
•   Preformative development
•   The evaluator as someone who co-develops a program and draws upon substantive knowledge/skills to promote development
•   Innovation process (six foci of development)
Design and Design Thinking
Design + Design Thinking

“Design is the systematic exploration into the complexity of options (in program values, assumptions, output, impact, and technologies) and decision-making processes that results in purposeful decisions about the features and components of a program-in-development that is informed by the best conception of the complexity surrounding a social need.

Design is dependent on the existence and validity of highly situated and contextualized knowledge about the realities of stakeholders at a site of innovation. The design process fits potential technologies, ideas, and concepts to reconfigure the social realities. This results in the emergence of a program that is adaptive and responsive to the needs of program users.” (Lam, 2011, pp. 137-138)
Implications for Evaluation
•   One of the first documented case studies of developmental evaluation
•   Contributions to understanding, analyzing, and reporting development as a process
•   Delineates the kinds of roles and responsibilities that promote development
•   The notion of design emerges from this study
Limitations
•   Contextually bound, so not generalizable
    •   but it does add knowledge to the field
•   The study’s data are only as good as the data collected during the evaluation
    •   better if I had captured the program-in-action
•   Analysis of the outcomes of the API could help strengthen the case study
    •   but was not necessary to achieving the research foci
•   Cross-case analysis would be a better method for generating understanding
Let’s talk about researching.
Writing is thinking.

To think is to write.
Data Management
•   Document, document, document.
    •   Keep a researcher’s log.
•   Use a hanging-folder method (see the sketch below):
    •   a folder for each phase of the project
        •   proposal, literature, ethics, consent forms, transcripts, examples
    •   special folders for each type of your research data
•   Do the same for your digital files!
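As a concrete illustration, here is a minimal Python sketch of what the hanging-folder method might look like when applied to digital files. The folder and file names are illustrative assumptions, not a layout prescribed in the talk.

    # Illustrative sketch only: scaffold a hanging-folder layout for a research project.
    # Folder names below are assumptions standing in for the phases named on the slide.
    from pathlib import Path

    PHASES = ["01-proposal", "02-literature", "03-ethics",
              "04-consent-forms", "05-transcripts", "06-examples"]
    DATA_TYPES = ["interviews", "documents", "field-notes"]  # one folder per type of research data

    def scaffold(root: str) -> None:
        base = Path(root)
        for phase in PHASES:                      # one folder per project phase
            (base / phase).mkdir(parents=True, exist_ok=True)
        for data_type in DATA_TYPES:              # special folders for each data type
            (base / "data" / data_type).mkdir(parents=True, exist_ok=True)
        (base / "researcher-log.md").touch()      # the researcher's log lives at the top level

    if __name__ == "__main__":
        scaffold("my-thesis-project")             # hypothetical project name

The same structure works on paper or on disk; the point is that every phase and every data type has exactly one home.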
Project Management
•   Think about:
    •   deadlines
    •   deliverables
    •   lead time
    •   wait-time
•   Budget twice the time.
•   Budget for re-writing and re-drafting.
•   Work in time blocks:
    •   Pomodoro 25-4 (see the sketch below)
    •   immerse yourself
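For the time-block idea, a bare-bones Pomodoro timer might look like the Python sketch below. Reading “25-4” as 25-minute work blocks followed by 4-minute breaks is an assumption; the talk does not spell it out.

    # Illustrative sketch only: a minimal Pomodoro-style timer.
    # "25-4" is interpreted here as 25 minutes of work, then a 4-minute break;
    # that interpretation (and everything below) is an assumption, not from the talk.
    import time

    WORK_MINUTES = 25
    BREAK_MINUTES = 4

    def pomodoro(cycles: int = 4) -> None:
        for i in range(1, cycles + 1):
            print(f"Block {i}: work for {WORK_MINUTES} minutes.")
            time.sleep(WORK_MINUTES * 60)   # immerse yourself in the task
            print(f"Block {i} done: take a {BREAK_MINUTES}-minute break.")
            time.sleep(BREAK_MINUTES * 60)

    if __name__ == "__main__":
        pomodoro()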
Logic-line Planning
Thank You! Let’s Connect!

@chiyanlam
chi.lam@QueensU.ca
