2. Introduction
• Value
– Must be defined, attained, and demonstrable by
measurement to ensure that resources advanced the
organization’s mission and vision
– The investment of resources in any project should
produce value for the organization’s stakeholders
3. Measurement
• Uses information drawn from databases, often
collected specifically for a project based on its
anticipated outcomes
• Information is collected according to a
predesigned plan for its use
• Information must be relevant to the project and
sensitive enough to measure real differences in
the project’s anticipated impact
4. Measurement—cont’d
• Unbiased and comprehensive information is needed
to capture the scope and magnitude of the project
• Timely information is required to measure the
impact of the project
• Value is expressed by first identifying the
stakeholders’ interests
5. Measurement—cont’d
• Project Manager
– Must be clear about stakeholders’ needs and
wants to establish an appropriate baseline for
comparison of outcomes
– If the project impacts patient care, must consider
patients’ perceptions
– Measuring value: complex and requires
considerable forethought and design
6. Measurement—cont’d
• Project Manager—cont’d
– Will play multiple roles
– Measuring value of impact in any project must be
considered before project is initiated
– Must understand genesis of the project
– Needs to consider cultural context of the work
– Must be strategic
– Must function as planner, communicator, data
analyst, and database administrator
7. Project Evaluation
• Definition: the systematic collection of
information about a project’s activities,
characteristics, and outcomes in order to make
judgments about it
• Steps:
– Considering the evaluation purpose
– Identifying stakeholders
– Assessing evaluation expertise
– Gathering relevant evidence
– Building consensus
8. Aligning Metrics with Project Aims
• Projects vary in terms of the:
– Magnitude and scope of the change
– Stakeholders involved
– Degree of linearity or complexity associated with
the approach taken to manage the project
9. Case Study Application
• Project manager appointed based on a
community’s desire to be more heart healthy
• Project aims
• See Table 13-1
• Information management plan
10. Principles of Project
Evaluation Methods
• Project Evaluation
– Effort made to measure impact of project-based
change
– Value proposition to be measured derived from
stakeholders
– Project manager responsibility: measuring impact
of project objectives or aims
11. Principles of Project
Evaluation Methods—cont’d
• Program Evaluation
– Diligent investigation of program’s characteristics
and merits to provide information about its
effectiveness in optimizing outcomes, efficiency,
and quality of health care
– Appraises achievement of a project’s goals and
objectives and extent of its impact and costs
12. Principles of Project
Evaluation Methods—cont’d
• Project and Program Evaluation
– Major task: to identify a program’s merits
– Methods used to capture data include both
quantitative and qualitative strategies
– Quantitative methods measure impact through
numeric mechanisms such as counts, rates, and
statistical comparisons
– Qualitative data provide a rich context for
evaluating project impact
13. Information Dissemination: Roles and
Responsibilities for Communication
• Information Dissemination
– Face-to-face communication: helpful, but
insufficient for data presentation
– Charts, survey tools, graphs, and other data
displays are needed to fully reflect the project’s
impact during implementation and at the conclusion
of the intervention, and to monitor post-intervention
effectiveness and restabilization
14. Information Dissemination: Roles and
Responsibilities for Communication
—cont’d
• Presentation of feedback through quantitative
and qualitative data offers opportunity for
project leader to:
– Anticipate risk
– Integrate findings with lived reality
– Communicate a sense of purpose to stakeholders
– Design plans for improving the project past its
due date and weaving it into the fabric of the
organization’s work
15. Information Dissemination: Roles and
Responsibilities for Communication
—cont’d
• Information Dissemination
– Statistical process control charts: help to
determine the impact of the change and reset a
new and improved standard
– Final report (summary email): a closure event to
mark achievements
– Run charts with notes attached: denote
accomplishments
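The control limits behind a statistical process control chart for individual measurements (an XmR chart) are conventionally set at the mean ± 2.66 × the average moving range. As a minimal sketch, assuming hypothetical weekly clinic wait-time data (the numbers below are illustrative, not drawn from the case study):

```python
# Sketch of an individuals (XmR) control chart calculation.
# Data values here are hypothetical wait times in minutes.

def xmr_limits(values):
    """Return (mean, lower, upper) 3-sigma limits for an individuals chart."""
    mean = sum(values) / len(values)
    # Average moving range between consecutive observations
    mr_bar = sum(abs(a - b) for a, b in zip(values, values[1:])) / (len(values) - 1)
    # 2.66 is the standard XmR constant converting the mean moving
    # range into 3-sigma control limits
    sigma3 = 2.66 * mr_bar
    return mean, mean - sigma3, mean + sigma3

def out_of_control(values):
    """Return indices of points outside the control limits (special cause)."""
    _, lo, hi = xmr_limits(values)
    return [i for i, v in enumerate(values) if v < lo or v > hi]

if __name__ == "__main__":
    waits = [32, 30, 35, 31, 33, 29, 34, 30, 70, 32]  # hypothetical minutes
    mean, lo, hi = xmr_limits(waits)
    print(f"mean={mean:.1f}, limits=({lo:.1f}, {hi:.1f})")
    print("special-cause points:", out_of_control(waits))
```

Points flagged as special cause signal that the change (or a problem) has shifted the process; if the shift is desirable and sustained, the limits can be recomputed to reset the new standard, as the slide describes.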
16. Tools of the Trade
• Tools of the Trade
– Statistical process control software: the control
chart it produces documents change and
variability
– Graphic presentation of data: a dashboard
compares and contrasts metrics in a single
document
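A dashboard of the kind described above can be as simple as a side-by-side view of baseline, current, and target values for each metric. The metric names and numbers below are illustrative assumptions, not figures from the case study:

```python
# Sketch of a minimal text dashboard consolidating project metrics.
# Metric names and values are hypothetical.

def dashboard(metrics):
    """Render (name, baseline, current, target, higher_is_better) rows
    as a single side-by-side summary string."""
    lines = [f"{'Metric':<28}{'Baseline':>10}{'Current':>10}{'Target':>10}  Status"]
    for name, baseline, current, target, higher_is_better in metrics:
        met = current >= target if higher_is_better else current <= target
        status = "on track" if met else "at risk"
        lines.append(f"{name:<28}{baseline:>10}{current:>10}{target:>10}  {status}")
    return "\n".join(lines)

if __name__ == "__main__":
    metrics = [
        ("BP screenings per month", 120, 210, 200, True),
        ("Mean clinic wait (min)",   42,  38,  30, False),
        ("Participant satisfaction", 3.1, 4.2, 4.0, True),
    ]
    print(dashboard(metrics))
```

Collecting every metric into one display is what lets stakeholders compare and contrast progress at a glance, which is the dashboard's purpose on this slide.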
17. Summary
• Value obtained by knowing specific
stakeholder wants and needs related to
project.
• Value can vary widely; it often includes access
to service, cost-effective delivery of service,
and satisfaction with the project’s outcomes.
• Metrics encompass art and science of
measuring value.
18. Summary—cont’d
• Metrics must be informative, relevant,
unbiased and comprehensive, action oriented,
performance targeted, and cost effective.
• Projects vary in complexity and the metrics
will vary accordingly.
19. Summary—cont’d
• Metrics can include both quantitative and
qualitative data, which are complementary
concepts; whereas the former provides
information about specific points of achievement,
the latter provides context.
• Project leaders are accountable for fair and honest
representation of project and should be prepared
to reveal progress toward project’s aims, as well
as unanticipated outcomes (positive and
negative).
20. Summary—cont’d
• Project leaders should use decision science
tools, such as those associated with statistical
process control and dashboard mapping to
represent their work and to adapt projects as
needed.
• Data should support the project from before
the onset of the project through to project
stabilization, until the work is sustained as
part of the way work is done.