Pilots often evolve into larger rollouts with unexpected results. Decisions to extend programs before evaluation is complete can be driven by popularity, management decisions, or political pressure. Evaluators face challenges in meeting information needs within tight timeframes and demonstrating impact. Pilots may not transfer well to larger settings due to changes in stakeholders and complexity. Evaluators must build relationships, understand interventions, inform all phases, and communicate sensitively while maintaining independence.
Pilot to National Launch: Evaluator's Role in Scaling Interventions
1. Pilot to National Launch: Scale Matters for Evaluators
Anne Dowden (Partner – Evaluation, Research New Zealand)
John Wren (Principal Research Advisor, ACC Research)
Presented at "Evaluation in a changing landscape: People, Politics and Policy", Aotearoa New Zealand Evaluation Association Conference, Te Papa, 8–10 August 2011
2. Some big questions
• What drives rollout decisions?
• Is the role of evaluators recognised?
• What do evaluators understand as their role?
• How can evaluators deliver what is needed?
3. General Observations
• Pilots have a life of their own
– they grow, transform and evolve into larger rollouts
• Not all pilots are evaluated
– some evaluations run alongside
– quite a few evaluations are ‘after the fact’
4. However … in our experience
• Pilots throw up surprising results (that warrant careful monitoring)
– with real implications for extensions
• Decisions to extend may be sudden, or made before evaluation data is in:
– good idea (after all, it is evidenced in the international literature)
– popularity factor (stakeholders’ buy-in)
– senior management decision (political)
– pressure from sector/Minister …
5. However … in our experience
• Stakeholders/organisations do not recognise what evaluators’ roles can include
– evaluators’ roles in implementation (in particular) are not always clear to, or understood by, stakeholders
• Stakeholders/organisations see evaluators:
– as there to ‘prove’, not to improve
– as proving impact, not as designers of proof (setting up frameworks)
6. The challenges …
• Evaluation in the current environment
– evaluators face real challenges in meeting clients’ information needs
– tight timeframes
– demands for hard evidence
• Evaluation ‘after the fact’
– immediate outcomes are not measured
– only high-level outcomes are measured
7. Resulting in …
• Limited evidence of effectiveness
– no evidence that high-level outcomes are due to the intervention
• Stakeholders’/organisations’ views:
– outcome evaluation “can’t be done”
– evaluation has no value
8. So what …?
• What is the evaluator’s role?
• How can evaluators deliver what is needed?
9. The challenge of the evaluator’s role
• We need to do our best to communicate
– the right knowledge
– in a timely manner
– to the right people
– with sensitivity
– while maintaining
• integrity
• independence
• objectivity
10. Facing the challenge: Scale Matters
• Moving from small to large settings often means:
– a change in business group owner (BGO)
– more stakeholders
– more at stake
– more complexity
– the dynamics of the intervention change, but this is not always understood or allowed for
11. Facing the challenge: Scale Matters
• Consequences
“Small problems, manageable at the local level, become significantly more difficult in larger and more complex settings”
12. Facing the challenge: ‘Owners’ role
• Pilot stakeholders
– high level of understanding and ownership of the intervention
• Pilot owners
– under pressure to “do something”
13. Facing the challenge: ‘Owners’ role
• Implications
– reputations are at risk
– objectivity can be problematic
– the intervention concept is not always shared / understood by others
– understanding the intervention concept ≠ good project design, management or evaluation
14. Facing the challenge: Engagement
• “Advisory Groups”
– established for the pilot; stakeholders design the intervention
– no Advisory Group for the extension
15. Facing the challenge: Engagement
• Consequences
– Loss of ‘buy-in’ that threatens engagement with, and acceptance of, the intervention
– Lack of relevance, as a pilot designed for a local context may not transfer well
Compounded by
– Lack of knowledge of the intervention: without it, teams can’t identify local levers or make critical adaptations for the new locations
16. Facing the challenge: Project Planning
• Poor project planning
– Limited resourcing
– Limited experience
– Time pressure to deliver early
17. Facing the challenge: Project Planning
• Consequences
– Poor project planning compromises intervention delivery and the evaluation
• Inadequate “buy-in” by key contributors and stakeholders
• Interventions are not fully developed and their design changes over time
– Risks to delivering intervention outcomes
– Risks to the evaluation processes
18. Facing the challenge: Communicating the right information
• Evaluators need to build strong relationships with owners early
• Evaluators need to understand the intervention logic, and the owners’ stake in the intervention
• Evaluators need to inform the design, implementation & monitoring (data collection, data collation) phases
“Evaluators have to manage competing perspectives on a project simultaneously with expectations about the evaluation results”
19. Facing the challenge: Communication sensitivity
• Stakeholder relationships are critical
– Be aware of stakeholders’ commitment, investment and risk in the intervention
For challenging results
– Socialise findings early
– Use sensitivity when sharing broadly
– Time results dissemination to coincide with solutions-focused responses
20. Facing the challenge: Communication tools
• Well-designed PowerPoint
– Enables delivery of more timely information
– Gives a succinct presentation of key results
– Is more influential than written material
– Is more cost-effective than written reports
– Provides a focus for discussion and debate
– Provides transparency for end-users and shows the value-add of analysis
21. Facing the challenge: Communication tools
• Mind Maps [diagrams] are useful analytical and communication tools
– Show the intervention’s intent/intents
– Show relationships
• Different stakeholder perspectives on a key issue (eg what drives behaviour change)
• Relative importance of components to each other (eg operational interface, sector relationships, key activities)
– Enable evaluators to focus when discussing with decision-makers
22. Example Mind Map for Analysis & Communication
[Mind map of stakeholder perspectives on the Better@Work intervention]
– GPs: “Great, a service that works”; “Good: there is a safe role”
– Employers: “I don’t have suitable work”; “Well maybe that will work …”; “It’s good to be involved”; “Great to know they’re safe”
– PHOs & A+Ms: “Revenue stream”; “Link to GPs”; “Best practice!”
– Unions: “Just ACC cost savings?”; “Not sure about this …”
– Injured workers: “Ok … if you say so”; “I don’t think there is work …”; “Good: someone to sort it with the boss”; “Great: got to get back”
– ACC: “This is an excellent idea”; “Seeding the return to work idea on day one is exactly right”
23. Mind Map presenting analytical insight: Early Return to Work Savings
[Diagram: Better@Work introduces a graduated decrease in weekly compensation hours, with employer wages/salary payments progressively replacing ACC weekly compensation, meaning less time on full weekly compensation]
24. Facing the challenge: Independence of thinking
• Evaluator independence is critical
– Be aware of stakeholders’ biases
– Be mindful of evaluation capture
– Socialise concerns about weaknesses & threats to the intervention early
– Present change as an opportunity to strengthen the intervention / a way to achieve intervention outcomes
– Use and reuse evaluation tools for critical analysis (logic models, outcome frameworks)
25. Partnership by evaluators: internal:external partnership
External evaluator
• Recognises wider context & wider drivers
• Pulls in external examples for sense-making
• Uses tried & true communications/project management strategies
• Less captured by issues of the project
• Uses due process to maintain independence (critical analysis)
• Expects evaluation use
• Works to promote use
Internal evaluator
• Contextual intelligence
• Pre-knowledge aids & hinders
• Pulls in internal examples/explanations
• Intense involvement in the project steers sensitivity level
• Internal politics apparent
• Views evaluation use as dictated by environment
• Understands systems for evaluation use
26. The evaluator’s role: 21st Century
• Strong relationships
• Communication tools
– Mind Maps
– PowerPoint
• Yet retain key evaluation tools
– Logic models
– Outcome frameworks
• We need to communicate
– the right knowledge
– in a timely manner
– to the right people
– with sensitivity
– while maintaining integrity, independence & objectivity
Editor's Notes
- no clear intervention outcome
- no clarity on data to collect
- various data collected
- changes to intervention
- data no longer relevant
- Use logic models to build relationships, gain shared understanding
- GP and hospital interface; economic recession as a driver of behaviour change