Organization Development and Change
Thomas G. Cummings
Christopher G. Worley
Evaluating and Institutionalizing OD Interventions
Cummings & Worley, 8e
(c)2005 Thomson/South-Western
Learning Objectives for Chapter Eleven
• To understand the issues associated with evaluating OD interventions
• To understand the process of institutionalizing OD interventions and the factors that contribute to it
Issues in Evaluating OD Interventions
• Implementation and Evaluation Feedback
• Measurement
– Select the right variables to measure
– Design good measurements
• Operational
• Reliable
• Valid
• Research Design
Implementation Feedback
• Feedback aimed at guiding implementation efforts
• Milestones, intermediate targets
• Measures of the intervention’s progress

Evaluation Feedback
• Feedback aimed at determining impact of intervention
• Goals, outcomes, performance
• Measures of the intervention’s effect
Implementation and Evaluation Feedback
[Flow diagram] Diagnosis → Design and Implementation of Interventions (drawing on Alternative Interventions) → Implementation of Intervention
• Implementation Feedback — measures of the intervention and immediate effects — feeds back to clarify intentions and plan next steps
• Evaluation Feedback — measures of long-term effects
Sources of Reliability
• Rigorous Operational Definition
– Provide precise guidelines for measurement: How high does a team have to score on a five-point scale to say that it is effective?
• Multiple Measures
– Multiple items on a survey
– Multiple measures of the same variable (survey, observation, unobtrusive measure)
• Standardized Instruments
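An aside for readers who work with survey data: the multiple-measures idea above is commonly checked with an internal-consistency statistic such as Cronbach's alpha. A minimal Python sketch, using only the standard library and illustrative data (the scores and function name are assumptions, not from the slides):

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for k survey items answered by the same respondents.

    items: list of k lists, each holding one item's scores across respondents.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(respondent totals))
    """
    k = len(items)
    item_vars = sum(pvariance(col) for col in items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    total_var = pvariance(totals)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Three 5-point items answered by five respondents (illustrative data)
items = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 4, 2, 4, 3],
]
print(round(cronbach_alpha(items), 2))  # high alpha: items move together
```

Values near 1 suggest the items reliably tap the same underlying variable; low values suggest the measures disagree.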
Types of Validity
• Face Validity: Does the measure “appear” to reflect the variable of interest?
• Content Validity: Do “experts” agree that the measure appears valid?
• Criterion or Convergent Validity: Do measures of “similar” variables correlate?
• Discriminant Validity: Do measures of “non-similar” variables show no association?
Elements of Strong Research Designs in OD Evaluation
• Longitudinal Measurement
– Change is measured over time
• Comparison Units
– Appropriate use of “control” groups
• Statistical Analysis
– Alternative sources of variation have been controlled
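In the simplest case, longitudinal measurement and a comparison unit combine into a pre/post design with a control group. A minimal Python sketch of the resulting difference-in-differences estimate (the scores are illustrative assumptions, not from the slides):

```python
# Performance scores before and after an intervention, for the unit that
# received it and for a comparable "control" unit that did not.
intervention_pre, intervention_post = [62, 58, 65, 60], [74, 70, 78, 72]
control_pre, control_post = [61, 59, 64, 63], [63, 60, 66, 64]

def mean(xs):
    return sum(xs) / len(xs)

# Longitudinal measurement: change within each group over time
treated_change = mean(intervention_post) - mean(intervention_pre)
control_change = mean(control_post) - mean(control_pre)

# Difference-in-differences: the change left after subtracting the
# control group's drift, i.e. the effect attributable to the intervention
effect = treated_change - control_change
print(effect)  # prints 10.75
```

The subtraction is what "controlling alternative sources of variation" means in this stripped-down form; real evaluations would add statistical tests and more time points.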
Evaluating Different Types of Change
• Alpha Change
– Movement along a stable dimension
• Beta Change
– Recalibration of units of measure in a stable dimension
• Gamma Change
– Fundamental redefinition of dimension
Institutionalization Framework
[Framework diagram] Organization Characteristics and Intervention Characteristics → Institutionalization Processes → Indicators of Institutionalization
Organization Characteristics
• Congruence
– Extent to which an intervention supports or aligns with the current environment, strategic orientation, or other changes taking place
• Stability of Environment and Technology
• Unionization
Intervention Characteristics
• Goal Specificity
• Programmability
• Level of Change Target
• Internal Support
• Sponsor
Institutionalization Processes
• Socialization
• Commitment
• Reward Allocation
• Diffusion
• Sensing and Calibration
Indicators of Institutionalization
• Knowledge
• Performance
• Preferences
• Normative Consensus
• Value Consensus
