Introduction
Where last week we examined specific and quantifiable methods
for tracking implementation progress,
this week we look at a few qualitative things managers can do
to ensure that IT projects succeed. As
you will learn from the text, there are abundant examples of IT
projects in the public sector that are not
deemed successful. Even the most seasoned IT manager gets a
bit depressed when reading the
examples in these notes and the text. There are political/legal,
structural, operational/managerial, and
cultural barriers to development of public information
technology and e-government, forming a
complex intermixture of forces. Challenges in these areas are
even greater for inter-jurisdictional
and inter-sectoral initiatives than interdepartmental ones
(Kernaghan, 2005). As a result, there is no
“formula for success” and even the most commonly cited
success factors may prove inappropriate in
certain circumstances.
Success Factors?
Unfortunately, the rate at which IT projects fail in both the
public and private sectors is very high. The
Center for Technology in Government suggests that more than
half of all government IT projects are
viewed as failures by organizational stakeholders. Typical reasons for failure are:
1. Complexity – projects are too large and complex
2. Commitment Failure – lack of commitment to the project
from stakeholders
3. Planning Failure – creation of poor implementation plans
4. Vision Failure – underlying assumptions are unrealistic
5. Inappropriate Methods – agency development methods may be a poor match for the technology
6. Short Time Horizon – unrealistic schedules
7. Turbulent Environments – rapid rates of change can beget lack of
coordination and system failure
8. Failure to Support End Users – end users have no incentive to
use the technology or are unable to
use it
IT Failures in the 1990s
In the 1990s, just before the U.S. federal government embraced
enterprise resource planning (ERP)
systems on a massive scale, the corporate world had
experienced a number of traumatic ERP incidents.
Universal Oil launched an ERP system in 1991 but found the
system unusable and sued Andersen
Consulting, the vendor, for $100 million. Andersen countersued
Universal for libel. In 1996, FoxMeyer
Corporation hired SAP and Andersen Consulting to install an
ERP system, the problems with which
resulted in the bankruptcy of the company and a billion dollar
lawsuit. Tri Valley Growers, a giant
agricultural cooperative, initiated a $6 million ERP system in
1996, finding that none of the Oracle software
performed as promised, some of it was incompatible with the hardware, and
the company filed a $20 million
lawsuit. Oracle countersued and the company went into
bankruptcy. In 1998-1999, W. W. Grainger Inc.
launched a $9 million ERP system which miscounted inventory,
crashed recurrently, and led to a $23
million loss in profits. Grainger persevered, however, and
worked with the vendor (SAP) to fix problems.
Hershey Foods Corp., seeking to meet Halloween/Christmas
candy demand, forced installation of a new
$112 million ERP system in just seven months. The new system
miscounted inventory, caused delayed
shipments and incomplete orders, leading to a 12% drop in sales
(Nash, 2000).
Of course, ERP software was not unique. Other very large IT
projects had failed in the private sector in
the 1990s as well. AMR Corp. attempted in the early 1990s to
create the “Confirm” reservation system
for Budget Rent A Car Corp., Hilton Hotels Corp., and Marriott
International Inc. After four years the
project was in shambles, each side sued the other, and AMR
took a $109 million write-off. Snap-on Inc.
installed a new automated order entry system in 1997 only to
discover the system delayed orders,
miscounted inventory, and cost the company $50 million in lost
sales while increasing operating costs by
40% with a net 22% decline in company profits over the
previous year.
Greyhound’s $6 million “Trips” reservation and bus dispatch
system was installed in 1993 but crashed
when Greyhound offered sale prices. Agents were forced to
revert to manual methods and ridership
dropped 12% in a month. Greyhound posted a $61 million loss
in 1994 and although “Trips” was
eventually fixed, Greyhound never regained its competitive
position. Norfolk Southern Corp., in merging
with Conrail, relied in 1998-1999 on custom logistics software.
When a dispatcher mistakenly entered
erroneous data, Norfolk Southern suffered train backups for
over a year, incurring $80 million in extra
overtime pay for workers and had to pay to fix the system.
Oxford Health Plans launched a new
automated billing and claims processing system in 1996, only to
find that it had mis-estimated medical
costs; delays led to massive protests from doctors and
patients. Oxford suffered its first quarterly
losses ($78 million) and was fined an additional $3 million by
the state of New York. Much of the system
was abandoned and replaced with other commercial modules
(Nash, 2000).
The Real Y2K Disaster?
Between 1995 and 2001, six surveys reported disturbing failure
rates for information systems projects.
Reviewing these surveys, one consulting firm summarized, “An
IT project is more likely to be
unsuccessful than successful - only about 1 out of 5 IT projects
is likely to bring full satisfaction (and) the
larger the project the more likely the failure” (IT-Cortex, 2006).
• The OASIG Survey (1995, for the Organizational Aspects of
Information Technology special
interest group) of 14,000 computer user groups in the UK found
a reported IT success rate of 20-
30%, with at least 70% of projects failing in some respect.
• The Chaos Report (1995) was a much-cited Standish Group
study of 365 US IT executive
managers spread across the public and private sectors, finding
31% of projects were canceled
before completion and ran over budget (by an average 189%).
Standish estimated organizations
paid $81 billion annually for canceled projects and another $59
billion in cost overruns. Only
16% of projects were completed on time, on budget, and to
original specifications.
• The KPMG Canada Survey (1997) was a poll of 176 leading
public and private sector
organizations in Canada, finding over 61% of projects were
deemed failures by respondents.
Over three quarters of projects were behind schedule 30% or
more, and over half had
substantial cost overruns.
• The Conference Board Survey (2001) was a study of 117 U.S.
companies which had attempted
ERP projects, finding 40% of projects failed to achieve their
business case goals within one year
after implementation, with implementation costing an average
of 25% over budget.
• The Robbins Gioia Survey (2001) was another study of ERP
project implementation, covering
232 public and private sector organizations and finding over
half (51%) evaluated their ERP
implementation as unsuccessful.
Failure statistics: private sector
In data covering 2004, the Standish Group, reporting on IT project
outcomes in over 9,000 primarily large
corporate US or European projects, found that only 29% of all
projects succeeded (delivered on time, on
budget, with required features and functions). Some 53% fell
in its “challenged” category (late, over
budget and/or with less than the required features and
functions). Another 18% failed outright
(cancelled prior to completion or delivered and never used)
(Standish Group, 2004). This represented an
improvement over its corresponding figures a decade earlier,
when only 20% of projects could be
considered successful. On the other hand, failure remains
prevalent in IT projects in the private sector.
Examples of IT Failures in the Public Sector
Certainly the public sector has not been immune from project
failures. IT project failure, meaning projects that come in
over budget, behind schedule, and/or
lack originally intended functionality,
continues to be common in spite of a decade of efforts by the
Office of Management and Budget, the
Government Accountability Office, and other oversight bodies
to remedy the problem:
• The Veterans Affairs Department spent $342 million on its
Core Financial and Logistics System
before abandoning it in 2004. The VA also had to pull the plug
on its $300 million HR Links
automated personnel system (Mosquera, 2005c).
• In 2005, the FBI abandoned its $170 million computerized
case management system, the Virtual
Case File System, after consultants judged it obsolete before its
completion and riddled with
problems (Associated Press, 2006b).
• In 2005, United Airlines at Denver International Airport
abandoned its 10-year-old automated
baggage-handling system in favor of the old manual one,
estimating it would save $1 million a
month by doing so. The automated system had been plagued by
construction delays, cost
overruns, lost and damaged baggage, and long lines. The city of
Denver had paid a reported
$441 million for the system, and United Airlines is obligated to
pay $60 million/year for its
Denver facilities under a 25-year contract. The system vendor,
BAE Automated Systems, has
ceased to exist (Associated Press, 2005).
• In the Environmental Protection Agency, the Clean Air
Markets Division Business System
(CAMDBS, a tool supporting emission trading programs) as of
2005 was already $2.8 million and
two years over the originally budgeted plan. Similarly,
PeoplePlus came in $3.7 million and one
year over EPA original plans (Thormeyer, 2005b).
• The Senate Committee on Homeland Security and
Governmental Affairs’ Permanent
Subcommittee on Investigations held hearings in September,
2005, and found that the Defense
Travel System (a web-based travel booking system, meant to be
similar to Expedia or
Travelocity) was over budget by over $200 million and was four
years behind schedule (Onley,
2005a).
• A 2005 Inspector General’s report on systems employed by the
Federal Aviation Administration
found that 9 of 16 major acquisitions had experienced schedule
delays of 2 to 12 years, and 11
had experienced cost growth of about $5.6 billion over initial
budgets (Thormeyer, 2005e).
Often after such failures, as in the case of the Veterans
Administration, the proposed solution was even
larger-scale “enterprise” software systems. Many more
examples of public sector IT failure are tracked
in The Risks Digest.
The disaster surrounding Hurricane Katrina, which left the city
of New Orleans flooded in 2005, provided
a case example relevant to why information technology projects
may fail. Rep. Nydia Velazquez (D-NY),
ranking member of the House Small Business Committee, noted
that the Small Business Administration’s
Office of Disaster Assistance (ODA) was hampered in being
responsive due to ongoing centralization of
ODA core applications into a new Disaster Credit Management
System (DCMS), operated from a single
location in Fort Worth, to which staff were still adjusting
(Thormeyer, 2005a). Though created months
earlier, the DCMS was reported in the press to have “stumbled
badly because there haven't been
enough new computers or staff trained to use them” and also
because “those who have made it out into
the field have discovered that they can't always link up from the
disaster area to handle new loan
applications and file reports on existing ones” (Gaouette,
Miller, and Alonso-Zaldivar, 2005). Though
these represent potentially solvable problems, they also
illustrate common factors which may
undermine IT implementation efforts: neglect of training and
shortage of resources.
What to Do?
The list of IT failures is daunting, yet public organizations are
increasingly dependent on information
technology to engage citizens, deliver services and provide
information. IT management consultant
Robin Goldsmith (2005) suggests there are three prime reasons
for the failure of IT projects, all having to
do with top-down implementation. The Gartner Group, a leading
systems consulting firm, gives similar
reasons (Olsen, 2005). Below are four critical mistakes IT
managers make when implementing IT
projects:
• Failure to freeze requirements. As the project develops,
“requirements creep” occurs as
management adds previously unanticipated and unbudgeted new
scope and new requirements.
When the project proceeds before system specifications are
defined and final, the chances of
failure escalate markedly.
• Bad budgeting. Failure may be linked to management
mandating budgets rigidly, without due
regard for requirements necessary for success. Alternatively,
cost-plus contract budgeting gives
little incentive to constrain costs.
• Inadequate time. In order to meet pre-established deadlines,
project quality and production
values are sacrificed. Similarly, the pace of technology or
programmatic urgencies (ex., terrorism
in DHS IT projects) can mean unrealistic time frames and
failure.
• Hierarchical, top-down, control-oriented approaches to
systems development may relate to any
of these three causes of IT failure and are themselves a widely cited
failure factor (ex., Brown and
Brudney, 2004).
Failure Factors
A 2006 Commerce Department report on the technical and
economic impact of new Internet protocols
on government IT projects concluded that failure to plan was
one of the most significant
implementation risks. In this case, planning for new network
arrangements required that
administrators benchmark existing network performance.
Without this benchmarking, it is difficult to
gauge new performance specifications in IT contracting
(Government Computer News, 2006b).
Neglect of human factors in technology implementation
continues to be the prime reason for
technology failure. For instance, Rinkus et al. (2005) studied
healthcare technology, finding that many
projects failed due to inattention to such human factors as
workflow, organizational change, and
usability. Rinkus found that even when technology project
managers do look at human factors explicitly,
they usually limit their focus to designing better user interfaces,
whereas what is needed more is greater
consideration of the functionality users actually need.
Neglect of the human dimensions of information technology
may be linked to the narrowly technocratic
nature of leadership frequently found in IT projects. A Cutter
Consortium survey published in CIO
Magazine found that human factors, not lack of expertise or
technological experience, accounted for all of the “top
five failure factors” for information technology officers. Based
on a study of 250 senior IT executives
who were asked to describe the attributes of “the worst IT
manager” they had ever known, the five
failure factors were:
• Poor interpersonal skills (58%)
• Being self-centered (56%)
• Failure to acknowledge problems (55%)
• Untrustworthiness (54%)
• Weak management skills (52%)
Noting that failed IT leadership was associated with lack of
empathy, lack of emotional ability, and
inability to connect with others, the authors of the survey
identified the overall main cause of IT
leadership failure as “lack of emotional intelligence” (Prewitt,
2005).
Poor data as a failure factor
New Jersey’s Division of Youth and Family Services (DYFS)
was sued in 2003 by the New York Times
and other newspapers for access to its Services Information
System (SIS) database. Documents
subsequently revealed that “outdated, inefficient technology”
plagued the DYFS. One social worker was
quoted as saying, “most of what we put on SIS is wrong”
(Peterson, 2003b: 28). While DYFS has since
implemented a new information system, its 2003 plight
illustrated the GIGO (“garbage in, garbage out”)
principle which applies to all IT systems.
Overdependence on automated systems as a failure factor
Perry, Wears, and Cook (2005) studied a near-fatal IT failure in
a hospital setting, where an automated
dispensing unit (ADU) failed to dispense needed medicines,
instead giving a “Printer not available” error
message and locking the keyboard. All drugs became
unavailable to all nursing units. The authors note
that, “Ironically, the more reliable the technological system, the
more deskilled workers become in
responding to the eventually inevitable anomalies” (p. 60).
Hospital workers did not have in place
procedures to handle IT systems failure manually and, indeed,
the ADU was designed to prevent manual
access to medicines, some of which were controlled substances.
The authors conclude that while
automation can be an effective tool, it can also be “a potent
source of new vulnerability” (p. 60).
Inappropriateness for local culture as a failure factor
E-government “transparency” projects are an example of global
technology transfer, but the failure to
take into account organizational culture in the local context can
lead to failure (Heeks, 2005). Likewise,
in a study of European back-office support for e-governance
services, Westholm (2005) found that
governance structures are “historically grown,” preventing
otherwise innovative models for integrated
e-government services from being copied from one nation to
another in any simple manner. Culture as
an obstacle to technology transfer has been confirmed in
numerous other studies (ex., Rezgui, Wilson,
Olphert, & Damodaran, 2005).
Internal Success Factors
Despite the problems IT projects have encountered, there is a
consensus emerging as to why IT projects
succeed. Success factors include:
1. Management Support – active involvement of top
management.
2. Stakeholder Motivation – typical resistance to change must
be overcome by convincing
stakeholders of the benefits.
3. Goal Clarity – project scope must be clear.
4. Support for Organization Culture.
5. Participatory Implementation – reduces employee resistance.
6. User Friendliness – a way to increase stakeholder motivation.
7. Adequate Budget and Time Horizon.
8. Phased Implementation – extension of goal clarity
9. Process and Software Engineering – dealing with legacy
systems
10. Project Management – professional project management is better than ad hoc.
Commonly cited failure factors include incomplete user
requirements, inadequate management
involvement, ineffective communication, immature work
processes, and technicians’ unwillingness to be
constrained by formal standards (Richardson and Ives, 2005). It
is not uncommon for agencies to
identify long lists of success factors in formal planning
documents. Here, for instance, is such a list
from the Food and Drug Administration (FDA, 2005: Appendix
B):
1. Effective communications between FDA IT and our
customers and suppliers, that increase the
opportunities for mutual understanding and collaboration
2. Clear and consistent IT management guidance and feedback
3. An adequately sized and skilled work force with an excellent
customer service ethic
4. Adequate tools for the work force to accomplish the FDA
mission
5. Effective and well documented governance, policies,
processes, and procedures
6. Adequate funding for the IT infrastructure
7. Adequate facilities to house the IT work force and
infrastructure
8. Clear measurements of performance
Such agency lists are a combination of common sense, project
experience, and wish-list, not necessarily
based on any systematic study in a social science sense.
Nonetheless, they give a flavor of the
conventional wisdom on the subject. Internationally, the
Organization for Economic Cooperation
and Development has listed 10 success factors “to get IT right”
(OECD, 2001):
1. establish appropriate governance structures;
2. think small;
3. use known technologies;
4. identify and manage risk;
5. ensure compliance with best practices for project
management;
6. hold business managers accountable;
7. recruit and retain talent;
8. prudently manage knowledge;
9. establish environments of trust with private vendors; and
10. involve end users.
However, as discussed above, there is no one “right” list of
success factors, and any given success factor
(ex., “think small”) may be inappropriate in a particular setting.
Success stories
Agency success stories are a second source of insight into the
internal success factors thought important
in public information systems. For instance, implementing ERP
(Enterprise Resource Planning) software
can be a challenging or even disaster-prone effort, but a success
story is the Department of
Transportation’s (DOT) implementation of the Oracle Federal
Financial e-Business suite. DOT’s effort
was so successful, it was designated by the OMB as a “center of
excellence” for the Financial
Management Line of Business initiative. Six key success
factors cited as underlying this success story
are:
1. Avoiding customization of the software, instead making
agency practices conform to pre-
determined process formats dictated by the vendor.
2. Gradual rather than rushed implementation, over the 2000 -
2003 period.
3. Getting stakeholder buy-in by creating test labs in each
agency to demonstrate workability.
4. Having a contract which required the vendor to maintain the
software, including cost of
upgrades.
5. Providing individual desk-side transitional assistance for the
first six weeks.
6. Creating a user support group to share problems and solutions
across the department’s units.
The completed DOT system allowed the department to produce
financial statements by 8 a.m. on the
first of every month, and to meet deadlines for year-end
financial statements (Government Computer
News, 2006c).
Leadership
In a 22-country survey of e-government activity in 2001
compared to 2000, Hunter (2001) categorized
countries’ progress on e-government into three bands (innovative leaders,
visionary followers, and steady
achievers). Progress was measured in terms of service maturity
(number and depth of online services)
and delivery maturity (degree of progress toward the ideal of a single-
point-of-access, cross-agency web presence).
“Interestingly,” he concluded, “the research found that overall
progress in e-Government is not closely
correlated with similar social, political or economic
characteristics.” Rather, adoption was driven by
“leadership, political will, commitment of deliverables and
accountability for results.” (Hunter, 2001)
Supportive organizational culture
As the information economy replaces the production economy,
the challenge of the typical organization
becomes increasingly one of management of professional
workers. Professionalization has often been
associated with the ethos of autonomy, and professional workers
have been characterized as resistant
to close managerial supervision. However, a study of
professional expectations of managers by Hernes
(2005) found not so much a desire for autonomy from managers
as for a supportive style of
management. Supportive management is characterized by
managers giving priority to relationships,
constructive feedback, encouragement and motivation, and to
clear and realistic communication of
management expectations.
Power struggles and organizational politics are associated with
the dissemination of information
technology, which in turn requires greater collaboration and
partnering skills in public management
(Kernaghan and Gunraj, 2004). Management of information
technology, especially given the increased
importance of public-private partnerships, requires a team based
collaborative organizational culture to
motivate and retain knowledge workers. Classical approaches to
technology management do not
adequately take politics and organizational culture into account
(Haynes, 2005).
A discussion of IT-based change in Computer World
(Karacsony, 2006) emphasized implementation
success factors such as recognizing and working with
organizational politics, involving all staff from the
beginning, emphasizing communications, being non-threatening,
showing benefits of change, and
providing support and training for the change process.
In particular, a criticism-tolerant organizational culture can be
an important success factor. In a study of
IT in the setting of public schools, Hinman, Saarlas, and Lloyd
Puryear (2004) found that “A key factor in
successful collaborative efforts is camaraderie and trust among
a group’s members that problems (and
failures) can be shared without negative consequences” (p.
S10).
Participative IT initiatives
End-user participation is almost universally listed as a success
factor in IT implementation (ex., OECD,
2001: 6). When in 2005 it came time for the Census Bureau to
plan the enterprise architecture for its
2010 Census, priority was given to employee participation in its
development. “The people who have
participated in developing it, I think, are seeing the benefits
because they are referencing it,” said
Marjorie Martinez, branch chief of the Census Bureau’s
software and systems standards division
(Mosquera, 2005b).
Adequate staffing
Adequate staffing prevents work overload and provides career
paths necessary for employee
motivation. Kim’s (2005) study of state government information
technology staff found that work
exhaustion combined with lack of opportunity for advancement
were prime correlates of job turnover,
while salary satisfaction was not a significant correlate.
Outcome-based contracts as success factor
Principal-agent theory points to the importance of outcome-
based contracting as the key success factor
(Mahaney and Lederer, 2003), viewing IT implementers as
“agents” who must be held clearly
accountable by the CEO “principal.” Whether or not the principal-
agent analogy is a good one, few disagree
with the idea that goal clarity is important in successful IT
implementation and conversely, when
management or external forces make the goal a shifting target,
failure is much more likely to occur.
The role of training
Lack of training can lead to IT system disuse. For instance,
DHS’s Homeland Security Information
Network employs Common Operating Picture (COP)
applications software to share information and
maps related to activities and events in disaster response. In late
2006, Senators Susan Collins (R-ME)
and Joseph Lieberman (I-CT) charged in a letter to DHS that
COP was “hugely” underutilized, used
regularly by fewer than 6% of first responders. The senators
wrote, “DHS has done little to inform first
responders about the common operating picture or to train them
how to use it” (Lipowicz, 2006n).
Training for workforce information literacy is an oft-cited
critical success factor (Fowell and Elliot 2002).
In general, the critical role of training has been emphasized in
any number of lists of success factors for
IT implementation (ex., OECD, 2001: 6).
There is a direct correlation between the amount of attention
and resources devoted to training and the
success of implementing IT. The main aim of training is to
make staff feel competent using new IT.
Training requires time that is almost always taken from other
tasks; this requires the support of
management and contingency plans for handling displaced work.
Timing is important in training: too early or
too late can undermine success. Development of a supportive
culture for training and development is
essential.
External Success Factors
An agency’s external environment can be a major determinant
of IT success. External success factors
include:
1. Partnerships with Vendors and Other Strategic Partners – partnering can often tap into knowledge bases unavailable in house. Partnering can be short term or long term.
2. Creating Independence from Vendors – project implementation is more likely to be successful if the agency is not dependent on the vendor. Agencies should avoid allowing contracts to deplete the in-house staff to the point where in-house expertise to evaluate contracts is either limited or even eliminated.
3. Accountability to the Political Layer – IT projects will be more successful if long-term funding is secure and the benefits are made clear to the political funders.
External funding
Partnering and strategic alliances are a key success factor for
growth in competitive IT markets (U.S.
Commercial Service, 2005). Non-profit organizations,
particularly in human services, are often on the
wrong side of the digital divide. For non-profits, external
funding through grants is a major factor in
whether the organization acquires and uses information
technology. External funding is more of a
determinant than is organizational size (Humphrey, Kim, and
Dudley, 2005).
Professionalization
Based on a regression study of e-government in the 50 states,
research suggests that more extensive e-
government is correlated with legislative professionalization,
state professional networks, and
Republican control of the legislature. Contrary to (citizen)
demand theory, states with more Internet
access and use actually have less developed e-government
features (McNeal, Tolbert, Mossberger, and
Dotterweich, 2003).
Evaluation of Public Information Systems
Strategies for Evaluating Public Information Technology.
Traditionally there has been little incentive to
attempt systematic evaluation in the public sector. Some questions are:
1. What are the criteria for evaluation? Are they to be
weighted?
2. How to value intangibles in dollar terms?
3. Should dollar values be expressed in net present value? What financial models are to be
used?
4. How far into the future to measure benefit stream?
5. Should potential value be included? If so, how?
6. Should benefits be discounted for other causal factors of
results?
7. Should internal benefits be counted in return on investment?
8. Should subjective benefits be included in evaluations?
Questions Regarding Measurement of Costs:
1. Will cost measurement be total or marginal cost?
2. How much agency infrastructure will be allocated to the IT
cost analysis and by what formula?
3. Are life cycle costs included?
4. Are training costs included?
5. Is depreciation of facilities, equipment, and software
included?
Life Cycle Costing is an important aspect of cost evaluation.
This includes taking into account ownership
costs such as operations, maintenance, and disposition.
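The life-cycle costing idea above can be sketched as a simple net-present-value calculation: acquisition cost plus each year's ownership costs (operations, maintenance, disposition) discounted back to the present. This is a minimal illustration, not an official costing method; the function name, discount rate, and every dollar figure below are hypothetical.

```python
# Hypothetical life-cycle cost sketch: acquisition plus discounted
# annual ownership costs (operations, maintenance, disposition).
def life_cycle_cost(acquisition, annual_costs, discount_rate):
    """Net present value of total cost of ownership.

    annual_costs: list of yearly ownership costs, year 1 onward.
    """
    npv = acquisition
    for year, cost in enumerate(annual_costs, start=1):
        npv += cost / (1 + discount_rate) ** year
    return npv

# Illustrative figures only: a $2M purchase, four years of O&M,
# and a final year that includes a disposition cost, at a 5% rate.
total = life_cycle_cost(
    acquisition=2_000_000,
    annual_costs=[400_000, 400_000, 400_000, 400_000, 550_000],
    discount_rate=0.05,
)
print(round(total))
```

Note that the undiscounted total would simply be the sum of all payments; discounting answers the "should dollars be in net present value?" question above by making later costs weigh less than near-term ones.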
The Private Sector Model: Return on Investment (ROI)
From a managerial perspective, IT investment tends to shift
from a “honeymoon” period to a later
period of “competitive realism.” Return on investment analyses
search for elusive tangible benefits.
The private sector model of calculation focuses on quantitative
indicators, but this is often difficult to
transfer to the public sector because of the difficulty of accounting for intangibles.
In the public sector, intangible benefits of IT can be as
important as the tangible ones. These would
include such items as client and citizen awareness, participation
and satisfaction, upholding democratic
accountability and transparency, policy consistency, more
effective targeting of services, staff morale,
and improved citizen choices.
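One way to operationalize this is to pair the conventional tangible-return calculation with a separate, explicitly qualitative intangibles score, rather than forcing intangibles into dollar terms. The sketch below is a hypothetical illustration: the figures, weights, and rating categories are invented for the example, not drawn from any agency method.

```python
# Hypothetical public-sector ROI sketch: classic ROI on tangible
# dollar flows, reported alongside a weighted qualitative score for
# intangibles (citizen satisfaction, transparency, staff morale, etc.).
def roi(tangible_benefits, total_cost):
    """Classic ROI: net tangible benefit as a fraction of cost."""
    return (tangible_benefits - total_cost) / total_cost

def intangible_score(ratings, weights):
    """Weighted average of stakeholder ratings on a 0-10 scale.

    ratings/weights: parallel lists, one entry per intangible
    category (all categories and weights here are illustrative).
    """
    return sum(r * w for r, w in zip(ratings, weights)) / sum(weights)

# Illustrative figures only.
print(f"Tangible ROI: {roi(1_500_000, 1_200_000):.0%}")
print(f"Intangibles:  {intangible_score([7, 9, 5], [2, 3, 1]):.1f}/10")
```

Keeping the two numbers separate avoids the dubious step of assigning dollar values to things like transparency, while still making the intangible dimension visible in the evaluation.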
ROI an increasing federal priority
A survey of 97 federal IT executives by CDW Government, Inc.,
found that while cybersecurity remained
a top but declining priority (24% in 2006, compared to
43% in 2005), proving return on
investment (ROI) was a rising second priority (18% in 2006,
compared to 10% in 2005) (Miller, 2006i).
The problem with “hard” ROI evaluations
Behind every ROI study, there are assumptions, often very
questionable ones. An example was the
Defense Department’s 2006 claim that its new travel
information system would save $56 million per
year. When the Government Accountability Office investigated,
however, it found that the $56 million
figure had two components. First, DoD claimed over $24
million would be saved through personnel reductions,
but the GAO found that the Navy and Air Force, when queried,
denied that there would be any personnel
savings, though some personnel might be shifted to other
functions in the future if warranted. Second,
the DoD claimed a $31 million savings in reduced commercial
travel fees, but the GAO found there had
been no actual fiscal analysis at all, but rather the figure was
derived from an article in a travel trade
journal (Onley, 2006b). The case raised issues common to ROI,
even when confined to economic impact
measurement: From the DoD view, displacing personnel was
“savings,” but not necessarily from the
viewpoint of Congress, which the GAO serves. Likewise, the
DoD claimed guesstimates found in the
press were better than no data at all, but are they?
Unfortunately, every such estimate of costs and
benefits is fraught with similar problems of measurement.
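The lesson generalizes: an ROI total is only as strong as its weakest component. A minimal sketch that separates claimed savings from what was actually substantiated; the dollar figures loosely echo the case above, but the confidence weights are hypothetical, invented for illustration:

```python
# Each claimed saving carries a confidence weight (0-1) reflecting how
# well it was substantiated; the weights here are hypothetical.
claims = {
    "personnel savings": (24_000_000, 0.0),    # the services disclaimed these
    "reduced travel fees": (31_000_000, 0.2),  # sourced from a trade journal
}

claimed = sum(amount for amount, _ in claims.values())
substantiated = sum(amount * weight for amount, weight in claims.values())
print(f"claimed: ${claimed:,}; substantiated: ${substantiated:,.0f}")
```

Making each component's evidentiary basis explicit, even crudely, exposes how far a headline ROI figure can drift from what the evidence supports.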
Soft ROI evaluations?
A strikingly more flexible version of a ROI evaluation strategy
emanated from a 2006 report of the
Center for Technology in Government, University at Albany, in
conjunction with SAP, Inc. (Cresswell,
Burke, & Pardo, 2006; Government Technology, 2006k). (SAP
is one of the leading producers of
Enterprise Resource Planning [ERP] software). The CTG/SAP strategy emphasized how IT projects can create value, whether governmental improvements or services to groups and individuals, that goes well beyond the usual measures of expected direct financial savings.
Such value includes, for instance,
increasing the integrity of government through greater
transparency. In a set of six international case
studies, the report set forth a framework for incorporating such
important but non-financial values into
ROI evaluations.
The CTG/SAP framework (which is non-proprietary) calls on evaluators to identify stakeholder groups and then, for each group, to identify not only financial but also political, social, strategic, ideological, and “stewardship” (trust, integrity, and legitimacy) impacts, finding ways to measure each.
In addition to the usual efficiency (input/output ratio
improvements) and effectiveness (achievement of
organizational goals) impacts, value is construed to include also
enablement (positive externalities) and
intrinsic enhancements (public goods like legitimacy and
transparency). The framework acknowledges
that actual measurement of impacts often must be qualitative
rather than quantitative, though case
studies give examples of each. Finally, the framework urges that
risk analysis include not only the usual
analysis of development risks (risks the project may not be
implemented on time, within budget, and to
specifications) but also benefit risks (risks that anticipated
beneficial impacts identified in stakeholder
analysis will not materialize). In summary, the CTG/SAP
framework for ROI stands the traditional ROI
approach on its head, making it quite different from financial
analysis and embracing many long-
standing criticisms of ROI for being too narrowly economic in
nature.
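A minimal sketch of how the framework's categories described above might be captured as a data model; the stakeholder group, impact descriptions, and risk figures are all hypothetical, not taken from the report:

```python
# Illustrative data model for a CTG/SAP-style assessment: for each
# stakeholder group, record impacts keyed by value type, plus both risk
# categories the framework distinguishes.
from dataclasses import dataclass, field

VALUE_TYPES = ["financial", "political", "social",
               "strategic", "ideological", "stewardship"]

@dataclass
class StakeholderAssessment:
    group: str
    impacts: dict = field(default_factory=dict)  # often qualitative
    development_risk: float = 0.0  # late / over budget / off spec
    benefit_risk: float = 0.0      # anticipated benefits never materialize

citizens = StakeholderAssessment(
    group="citizens",
    impacts={"stewardship": "greater transparency of spending data",
             "financial": "no direct savings expected"},
    development_risk=0.2,
    benefit_risk=0.4,
)
print(sorted(citizens.impacts))  # ['financial', 'stewardship']
```

The point of the structure is that benefit risk is recorded alongside development risk, which is exactly where the framework departs from traditional ROI risk analysis.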
Stakeholder analysis through Value Measuring Methodology
(VMM)
Although VMM is a form of cost-benefit analysis, the intent of
its authors was to conduct such analysis
in relation to stakeholder groups. A governmental example was
the application of VMM to the project
which created the XML.gov registry (Johnson and Foley, 2005).
This project was intended to spread the
gospel of Extensible Markup Language (XML) as a solution to
interoperability problems associated with
sharing documents and data, and to facilitate cooperative efforts
among government agencies and the
private sector toward this end. A specific problem was that many agencies were creating their own separate XML registries, ironically creating a lack of interoperability. A VMM evaluation calculated expected benefits for each stakeholder group (direct users, government financial, government operational, social, and political/strategic actors). Stakeholder scores were aggregated into a total benefit score for each of three proposed alternatives, one of which was establishing a unified federal XML registry. Proceeding in this way highlighted which groups benefited most, where costs occurred, and where the greatest risks lay, allowing alternatives to be evaluated in terms of risk-adjusted scores. The stakeholder focus of VMM can dovetail with other stakeholder-oriented organization development and organization change efforts.
Evolution of a Public Sector IT Model for Evaluation
Public administration makes a continual attempt to improve
evaluation of services provided. Over the
years a wide variety of performance measures for IT have been
practiced or advocated. However, over
time, the measures have shifted to emphasize effectiveness in
terms of mission fulfillment.
Key Legislation
- Chief Financial Officers Act of 1990
- Government Performance and Results Act of 1993
- Federal Acquisition Streamlining Act of 1994
- Paperwork Reduction Act of 1995
- Clinger-Cohen Act of 1996
Models used for evaluation:
Traditional Value Ranking Methods – establishment of value
categories
Value-Measuring Methodology (VMM) – a multicriterion decision-making methodology
The Program Assessment Rating Tool (PART) – identifies a
program’s strengths and weaknesses
The Performance Reference Model (PRM) – calls for
government agencies to develop their own
performance metrics
Evaluation of Governmental Web Sites
In evaluating government websites, it may be important to
emphasize interactive features, not simply
for the intrinsic value of e-participation and e-democracy, but
also because such features seem to play a
critical role in attracting users to government portals. For instance, in 2005 FirstGov.gov dropped from being the most-visited federal website to ninth place, according to a study by Darrell West of Brown University’s Taubman Center for Public Policy. West attributed the drop to the failure of FirstGov.gov to add new features, such as the interactive e-mail feature that the White House website added, which West believed helped propel it into first place (Lipowicz, 2005f). However, in West’s 2006 survey, FirstGov returned to being one of the two most-visited websites, along with the USDA site.
Melitski, Holzer, and Kim (2005) suggest Seoul, Hong Kong, Singapore, New York, and Shanghai are the top five large cities worldwide in providing digital government, using a five-stage model for evaluating public sector websites that includes:
1. Security & Privacy
2. Usability
3. Up-to-date, accurate content
4. Service provision
5. Citizen participation
OMB Standards for Federal Websites
The 2004 OMB requirements for federal websites, incorporating requirements found in the E-Government Act, the Paperwork Reduction Act, and OMB Circular A-130, include ten specifications (Miller, 2004r):
1. Use .gov, .mil, or fed.us domains
2. Implement security controls
3. Maintain accessibility for the disabled
4. Protect the privacy of data
5. Have an information quality assurance process
6. Follow requirements for managing federal records
7. Maintain schedules for posting information
8. Establish and enforce agencywide linking policies
9. Communicate with state and local government, and citizens, regarding information needs
10. Provide a search function on all agency websites
The Range and Type of Services Approach
Many evaluation strategies focus on the range and quality of
government services provided over the
Internet. The Brown University (West) surveys cited above
ranked over 1,500 state and federal websites
based on content, including contact information, comment
forms, automatic email notification,
publications, databases, audio and video clips, foreign language
features, disability services,
advertisements, user fees, and security and privacy statements.
The survey found 77% of government web portals offered fully executable services online in 2006, up from 73% in 2005. Some 71% had posted privacy policies, up from 69% in 2005 (Thormeyer, 2006q).
A second services approach is exemplified by Scott (2005), who
evaluated municipal websites using
measures on five different dimensions of service quality.
Confirming previous studies, Scott found that
city size was a key predictor of quality, but there was wide variation in Internet services.
Similarly, the National Policy Research Council (NPRC)
undertook to rank all known official U.S. state and
local government websites in 2006 (Government Technology,
2006aa) using a combined range-of-
functionality and type-of-content approach, giving higher
ratings to websites that had more of some 25
measured features in these categories:
• website navigation aids
• home page content and links
• handicapped accessibility options
• website outreach via foreign language translations, e-surveys, and online comment forms
• legislative, mapping, and socioeconomic statistical information
• online payment options for taxes and services
• online business and/or construction permit options
• online procurement information and transactions
• government job listings and online job applications.
The Usability Approach
A manual containing design and usability guidelines was
published in 2003 by the Department of Health
and Human Services in partnership with the General Services
Administration. This manual, Research-Based Web Design & Usability Guidelines (HHSD, 2006), has since become perhaps the single most widely
used resource of its type and reflects the input of a wide variety
of public, private, and academic sector
experts. The manual contains a large number of guidelines
ranked by importance. As of the 2006
edition, those in the top category of importance (category 5)
were:
• Provide useful content
• Establish user requirements
• Understand and meet user’s expectations
• Involve users in establishing user requirements
• Do not display unsolicited windows or graphics
• Comply with section 508
• Design forms for users using assistive technology
• Do not use color alone to convey information
• Enable access to the homepage
• Show all major options on the homepage
• Create a positive first impression of your site
• Place important items consistently
• Avoid cluttered displays
• Place important items at top center
• Eliminate horizontal scrolling
• Use clear category labels
• Use meaningful link labels
• Distinguish required and optional data entry fields
• Label push buttons clearly
• Make action sequences clear
• Organize information clearly
• Facilitate scanning
• Ensure that necessary information is displayed
• Ensure usable search results
• Design search engines to search the entire site
“Relative importance” was determined by a panel of external reviewers (16 experts for the original manual, 36 more for the revised manual), half web designers and half usability specialists. In addition, the following criteria were judged to be in the top category by strength of evidence in reported web research (which depends as much on what researchers choose to study as on importance, hence the difference between expert opinion and empirical research):
• Provide useful content
• Standardize task sequences
• Design for working memory limitations
• Align items on a page
• Use descriptive headings liberally
• Use black text on plain, high contrast backgrounds
• Use attention attracting features when appropriate
• Use familiar fonts
• Emphasize importance
• Order elements to maximize user performance
• Use data entry fields to speed performance
• Use simple background images
• Use video, animation, and audio meaningfully
• Use images to facilitate learning
• Use mixed case with prose
• Group related elements
• Use color for grouping
• Use an iterative design approach
While the HHSD guidelines are meant as design criteria, they may be adapted as criteria for evaluating government websites. As a group, this set of criteria focuses on usability, downplaying or omitting considerations such as interactivity, e-democracy support, or even range and type of services.
“Citizen-centric” evaluation strategies
Wang, Bretschneider, and Gant (2005) have proposed an
evaluation approach which emphasizes why
some web designs are better than others in facilitating citizens'
information seeking tasks. They contrast
this “citizen-centric” approach with most existing evaluation
methodologies, which either focus on
usage and accessibility or on user reactions. Counting number
of page hits, for instance, shows usage in
quantitative terms but does not show if citizens are finding what
they were seeking. Rejecting private
sector evaluation methods which assess websites in terms of
contribution to competitive advantage of
the firm, the Wang-Bretschneider-Gant approach instead focuses on (1) “task-related information” (relevancy of the website to tasks citizens seek to accomplish), (2) “task complexity” (skills needed to utilize information presented), and (3) effectiveness of “perceptual stimuli” such as text, audio, video, and charts utilized to convey information.
A simplified version of the “citizen-centric” evaluation strategy
is simply counting interactive forms
available to the citizen, since these are almost by definition
oriented to citizen tasks. The University of
Minnesota Extension Service (2005), for instance, notes
“Perhaps the quickest measurement of e-
government service to citizens is the number of forms on the
website. They can be simple contact or
feedback forms, requests for services, subscriptions to email
lists, search boxes, or easy to use e-
commerce functions for paying taxes, obtaining licenses, or
buying a county souvenir. The more forms
and the easier they are to use, the more audience focused the
government website. No forms means no
interaction— failed website.”
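The Extension Service's form-counting heuristic is easy to automate. A minimal sketch using Python's standard html.parser; the sample markup below is invented for illustration:

```python
from html.parser import HTMLParser

class FormCounter(HTMLParser):
    """Counts <form> elements as a crude citizen-interactivity metric."""
    def __init__(self):
        super().__init__()
        self.forms = 0

    def handle_starttag(self, tag, attrs):
        if tag == "form":
            self.forms += 1

sample = """
<html><body>
  <form action="/contact">...</form>
  <form action="/pay-taxes">...</form>
</body></html>
"""
counter = FormCounter()
counter.feed(sample)
print(counter.forms)  # 2
```

As the quote above suggests, the count is only a proxy: two well-designed forms may serve citizens better than ten confusing ones, so a raw tally is best read alongside usability criteria.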
Preliminary Project Schedule (Project: Web Based Tool; dated Sun 7/23/17)

ID | Task Name | Duration | Start | Finish
1 | PROJECT DESIGN | 63 days | Mon 9/4/17 | Wed 11/29/17
2 | DEVELOP FUNCTIONAL SPECIFICATIONS | 10 days | Mon 9/4/17 | Fri 9/15/17
3 | DEVELOP SYSTEM ARCHITECTURE | 20 days | Mon 9/18/17 | Fri 10/13/17
4 | DEVELOP PRELIMINARY DESIGN SPECIFICATIONS | 2.5 days | Mon 10/16/17 | Wed 10/18/17
5 | DEVELOP DETAILED DESIGN SPECIFICATIONS | 20 days | Wed 10/18/17 | Wed 11/15/17
6 | DEVELOP ACCEPTANCE TEST PLAN | 10.5 days | Wed 11/15/17 | Wed 11/29/17
7 | PROJECT DEVELOPMENT | 157 days | Thu 11/30/17 | Fri 7/6/18
8 | DEVELOP COMPONENTS | 60 days | Thu 11/30/17 | Wed 2/21/18
9 | PROCURE SOFTWARE | 25 days | Thu 2/22/18 | Wed 3/28/18
10 | PROCURE HARDWARE | 25 days | Thu 3/29/18 | Wed 5/2/18
11 | DEVELOPMENT ACCEPTANCE TEST PACKAGE | 25 days | Thu 5/3/18 | Wed 6/6/18
12 | PERFORM UNIT/INTEGRATION TEST | 22 days | Thu 6/7/18 | Fri 7/6/18
13 | PROJECT DELIVERY | 9.5 days | Mon 7/9/18 | Fri 7/20/18
14 | INSTALL SYSTEM | 2 days | Mon 7/9/18 | Tue 7/10/18
15 | TRAIN CUSTOMERS | 1 day | Wed 7/11/18 | Wed 7/11/18
16 | PERFORM ACCEPTANCE TEST | 3 days | Thu 7/12/18 | Mon 7/16/18
17 | PERFORM POST PROJECT REVIEW | 3 days | Tue 7/17/18 | Thu 7/19/18
18 | PROVIDE WARRANTY SUPPORT | 1 day | Tue 7/17/18 | Tue 7/17/18
19 | ARCHIVE MATERIALS | 0.5 days | Fri 7/20/18 | Fri 7/20/18
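Durations in the schedule are working days. A minimal sketch for sanity-checking a duration against its start and finish dates, assuming weekends are excluded and holidays are ignored (the default behavior of common scheduling calendars):

```python
from datetime import date, timedelta

def business_days(start: date, finish: date) -> int:
    """Count Monday-Friday days from start through finish, inclusive."""
    count, d = 0, start
    while d <= finish:
        if d.weekday() < 5:  # Monday=0 ... Friday=4
            count += 1
        d += timedelta(days=1)
    return count

# Task 1, PROJECT DESIGN: Mon 9/4/17 through Wed 11/29/17
print(business_days(date(2017, 9, 4), date(2017, 11, 29)))  # 63
```

The check reproduces the schedule's own figures: PROJECT DESIGN spans 63 working days, and DEVELOP FUNCTIONAL SPECIFICATIONS (9/4/17 to 9/15/17) spans 10.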
Budget Estimate - New Web Based Tool

PROJECT TASKS | LABOR HOURS | LABOR COST ($) | MATERIAL COST ($) | OTHER COST ($) | TOTAL PER TASK
PROJECT DESIGN
Develop Functional Specifications | 80.0 | $3,100.00 | $2,499.00 | $750.00 | $6,429.00
Develop System Architecture | 160.0 | $1,200.00 | $1,500.00 | $0.00 | $2,860.00
Develop Preliminary Design Specification | 20.0 | $300.00 | $1,295.00 | $0.00 | $1,615.00
Develop Detailed Design Specifications | 160.0 | $1,100.00 | $1,700.00 | $0.00 | $2,960.00
Develop Acceptance Test Plan | 85.0 | $1,500.00 | $2,500.00 | $0.00 | $4,085.00
Subtotal | 505.0 | $7,200.00 | $9,494.00 | $750.00 | $17,949.00
PROJECT DEVELOPMENT
Develop Components | 480.0 | $1,700.00 | $6,250.00 | $0.00 | $8,430.00
Procure Software (Capital Equipment) | 200.0 | $400.00 | $15,500.00 | $0.00 | $16,100.00
Procure Hardware (Capital Equipment) | 200.0 | $600.00 | $27,500.00 | $0.00 | $28,300.00
Development Acceptance Test Package | 200.0 | $1,200.00 | $1,000.00 | $0.00 | $2,400.00
Perform Unit/Integration Test | 180.0 | $650.00 | $1,000.00 | $0.00 | $1,830.00
Subtotal | 1,260.0 | $4,550.00 | $51,250.00 | $0.00 | $57,060.00
PROJECT DELIVERY
Install System | 16.0 | $320.00 | $1,750.00 | $500.00 | $2,586.00
Train Customers | 8.0 | $160.00 | $200.00 | $612.00 | $980.00
Perform Acceptance Test | 24.0 | $480.00 | $500.00 | $0.00 | $1,004.00
Perform Post Project Review | 24.0 | $480.00 | $0.00 | $0.00 | $504.00
Provide Warranty Support | 8.0 | $160.00 | $0.00 | $0.00 | $168.00
Archive Materials | 4.0 | $80.00 | $150.00 | $0.00 | $234.00
Subtotal | 84.0 | $1,680.00 | $2,600.00 | $1,112.00 | $5,476.00
PROJECT MANAGEMENT
Customer Progress Meetings/Reports | 480.0 | $1,000.00 | $0.00 | $0.00 | $1,480.00
Internal Status Meetings/Reports | 480.0 | $1,000.00 | $0.00 | $0.00 | $1,480.00
Subcontractors | 480.0 | $1,000.00 | $0.00 | $2,500.00 | $3,980.00
Interface to Other Internal Departments | 480.0 | $1,000.00 | $0.00 | $0.00 | $1,480.00
Configuration Management | 480.0 | $1,000.00 | $0.00 | $0.00 | $1,480.00
Quality Assurance | 480.0 | $1,000.00 | $0.00 | $0.00 | $1,480.00
Overall Project Management | 480.0 | $1,000.00 | $0.00 | $0.00 | $1,480.00
Subtotal | 3,360.0 | $7,000.00 | $0.00 | $2,500.00 | $12,860.00
OTHER COST
Travel | 0.0 | $0.00 | $0.00 | $4,396.00 | $4,396.00
Telecommunications | 0.0 | $0.00 | $0.00 | $1,759.00 | $1,759.00
Documentation | 0.0 | $0.00 | $0.00 | $500.00 | $500.00
Subtotal | 0.0 | $0.00 | $0.00 | $6,655.00 | $6,655.00
Total Project Funding Requirements | 5,209.0 | $20,430.00 | $63,344.00 | $11,017.00 | $100,000.00
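The budget's section subtotals can be reconciled against the stated total in a few lines; the figures below are copied from the budget estimate itself:

```python
# Section subtotals from the budget estimate; check that they sum to the
# stated $100,000 total project funding requirement.
subtotals = {
    "PROJECT DESIGN": 17_949.00,
    "PROJECT DEVELOPMENT": 57_060.00,
    "PROJECT DELIVERY": 5_476.00,
    "PROJECT MANAGEMENT": 12_860.00,
    "OTHER COST": 6_655.00,
}
total = sum(subtotals.values())
print(f"${total:,.2f}")  # $100,000.00
```

The subtotals do reconcile exactly, which is a quick integrity check worth running on any flattened or re-keyed budget.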
Development & Testing Report - New Web Based Tool (testing results)

Test Item | Test Condition | Expected Result | Procedure | Pass/Fail | Defect ID# | Browser/Device | Bug Issue Description | Owner | Comments
Search Form 1.a | Stable | 5.00 | Complete | Pass | 325670 | Windows 8 (IE10) | None | Jasmine McCord |
Navigation 1.a | Critical | 5.00 | Escalated to testing for fix | Fail | 325672 | Windows 8 (IE10) | Cannot move mouse to menu area without it disappearing (see screenshot) | Jasmine McCord |
Navigation 1.b | Critical | 5.00 | Escalated to testing for fix | Fail | 325673 | Windows 8 (IE10) | Cannot scroll on pages 4, 6, 8 | Jasmine McCord |
Search Form 1.b | Stable | 5.00 | Complete | Pass | 325671 | Windows 8 (IE10) | None | Jasmine McCord |
Image Carousel 1-3.a | Critical | 5.00 | Escalated to testing for fix | Fail | 325674 | Windows 8 (IE10) | The left and right arrows in box do not scroll | Jasmine McCord |
Customer Support Field 1.a | Critical | 5.00 | Escalated to testing for fix | Fail | 325675 | Windows 8 (IE10) | Customer cannot submit form unless all required fields are filled out | Jasmine McCord | Need an extra 10 lead time for resolution
Data Entry Field (feedback) | Critical | 5.00 | Escalated to testing for fix | Fail | 325676 | Windows 8 (IE10) | Some fields get stuck when data is entered; customer cannot get past field without restarting browser | Jasmine McCord | Need to resolve ASAP - priority level
Preliminary Project Charter Worksheet
CPMGT/305 Version 12
University of Phoenix Material
Complete this project charter worksheet according to the
instructions in section 4.1 of A Guide to the Project
Management Body of Knowledge.
Completed by: Learning Team A
Date: 7/22/2017
1. Project title: New Web Based Tool
2. High-level project scope (fewer than 50 words)
A new performance management process will be created using a web-based system, providing a convenient way to document and track employee performance goals, deliver regularly updated status reports, and give managers a vehicle for feedback. The system will use quantified data for salary/bonus treatment at year’s end.
3. Problem to be solved or opportunity to be realized by this project (fewer than 25 words)
The new performance management system will provide an efficient way to quantify employee performance, providing a fairer, more accurate system for performance appraisals.
4. Project purpose or justification including specific measurable business impacts or results (fewer than 50 words)
The new performance management system is needed because of the lack of accuracy in past employee performance reviews and merit increases. The new system will provide both qualitative and quantitative metric data, resulting in more precise analysis of employee work performance.
5. Measurable project objectives and related success criteria including metrics (Provide three to four objectives with metrics)
· Determine which performance-measuring criteria should be implemented in the new system, and describe how the system utilizes them.
· Develop the performance management system platform.
· Integrate the system into the company website.
· Achieve an increase in employee performance of at least 15% within 12 months of implementation.
6. High-level requirements (fewer than 100 words)
The performance management system project will require a
hand-selected project team with extensive background in human
resources and supervisory management. Two human resource
clerks will participate, in addition to a single project manager.
The team will also require the services of a software
development consultant and her team of two software engineers.
The project will be staged in a small suite of offices on the third floor of the downtown Seattle office. The project will require the purchase of a new web server and a dedicated internet connection during preliminary stages, prior to implementation in a production environment.
7. High-level risks (fewer than 100 words)
Some high-level risks include: developmental delays, scope
creep, excessive change requests, technical difficulties, lack of
employee enthusiasm for new performance system.
8. Summary of high-level milestones schedule (identify the
major deliverables and subtasks)
1. Develop Employee Performance Grading System
· Analyze industry standards, averages
· Create methodology for quantifying performance goals/results
· Develop template for employee performance status reports
· Create grading matrix for salary/bonus criteria
2. Develop software platform for Performance Grading System
· Develop outline for software development
· Create software for Performance Grading System
· Pilot test group prior to production rollout
· Work out bugs, smooth out feature set
· Implementation
3. Company Intranet Integration
· Implement Performance Management software into Web space
· Develop supervisor feedback system that integrates with PMS
· Test and Troubleshoot
· Pilot software platform/web site
· Full implementation
9. Summary of high-level budget including expense dollars,
capital dollars, and headcount (identify costs for major
deliverables and tasks identified in the preceding milestone
schedule)
Deliverables | Estimated Cost | Due Date
DEVELOP FUNCTIONAL SPECIFICATIONS | Labor Costs = $7,200 | September 9th, 2017
DEVELOP SYSTEM ARCHITECTURE | Labor Costs = $7,000 | October 15th, 2017
DEVELOP PRELIMINARY DESIGN SPECIFICATIONS | Labor Costs = $4,500 | October 18th, 2017
DEVELOP DETAILED DESIGN SPECIFICATIONS | Labor Costs = $1,100 | November 15th, 2017
DEVELOP COMPONENTS | Labor Costs = $600 | August 22nd, 2017
PROCURE SOFTWARE | Labor Costs = $7,200 | March 28th, 2018
DEVELOPMENT ACCEPTANCE TEST PACKAGE | Labor Costs = $1,680 | June 6th, 2018
PERFORM UNIT/INTEGRATION TEST | Labor Costs = $7,000 | July 6th, 2018
INSTALL SYSTEM | Labor Costs = $3,200 | July 10th, 2018
Project Plan Overview & Preliminary Project Plan
Team A
Project Selection Criterion
One of the biggest decisions an organization makes is which projects to take on. Once a scope of work, bids, or a proposal has been presented as project material, there are many factors to consider. The most valuable options should be chosen, keeping in mind the goals and objectives of the organization's portfolio. The benefits and objectives of a project are the main reason for choosing it. Another consideration is feasibility: what is the likelihood of this project getting off the ground and through to completion? Timing must be well thought out for each project; some projects are conducted in an emergency state, whereas some are planned far in advance. Budget is one of the biggest reasons for a project to fail or succeed, and a chosen project must maintain allocated funding from stakeholders.
The first step in project selection is defining the project and its criteria. The second step is to score the project against other projects. The last step is to inform stakeholders for approval. For this particular project, the selection criterion was that the sales force needed a web-based tool that could help manage client accounts, track sales, and maintain customer proposals. It was chosen for the value of customer satisfaction versus budget constraints. The benefit-to-cost ratio was analyzed and found to be a profitable advantage for the sales team and their customers. The payback period was also examined; the finding was that within a five-year period the payback would exceed the cost of the project.
As the sales team begins to manage customers in a more organized way, they can take on more customers while keeping satisfaction and personable service and maintaining production. Stakeholders chose this project for growth of the client base. Another selection criterion was time management. With the purchase and implementation of this web-based tool, the sales force will have the tools to maintain client accounts in a timely fashion. The web-based tool can notify sales staff and clients when their products are low, and it can also perform database functions: the sales team inputs what products the client has, the client uses the tool for production accountability, and when stock gets low the tool sends a notification to that client's salesperson for further ordering direction. This cost value and customer satisfaction were the criteria for the stakeholders' decision.
Overview
The nature and scope of this project is to develop a web-based tool for a client organization. Team A works for an information technology organization. A client has asked our organization to help build and implement a new web-based internal management system for its sales department to manage customer accounts. Our client's sales managers and sales executives will use this tool to develop account plans for their customers, track their own sales results, and extract a quarterly sales report for each sales representative. The focus of this project is to deliver a program that sales representatives can manage easily from anywhere, and from which they can pull statistical information about their clients, themselves, and sales criteria.
The purpose of this internal management system implementation is to help our client's sales department get more organized, input client information, and build programs specific to the needs of its clients. The implementation will also allow sales representatives to manage their client base more easily, track their clients' progress, and keep up-to-date statistical information about their clients in the system. The system is being customized and developed specifically to fit our client's needs and internal requirements. The long-term goal is for our client to manage its clients better and address their needs in a more specific fashion.
We will be developing the project based on the following criteria:
· A description of your internal client's requirements for this
sales and account management system.
· Identification and engagement of the appropriate stakeholders
to define the requirements for the new system.
· Development and testing of the system.
· Implementation of the system.
· A method for obtaining client feedback.
Task | Project Manager | Due Date | Owner
A description of your internal client's requirements for this sales and account management system | TBD | August 14, 2017 | TBD
Identification and engagement of the appropriate stakeholders to define the requirements for the new system | TBD | August 14, 2017 | TBD
Development and testing of the system | TBD | August 14, 2017 | TBD
A method for obtaining client feedback | TBD | August 14, 2017 | TBD
Preliminary Project Schedule
Preliminary Budgetary Plan
Reference
Project Management Institute. (2013). A guide to the project management body of knowledge (PMBOK guide) (5th ed.).
Session 1 - Introduction to IS and foundations of IS in business.pptSession 1 - Introduction to IS and foundations of IS in business.ppt
Session 1 - Introduction to IS and foundations of IS in business.ppt
 
SOFIT Final Public Copy
SOFIT Final Public CopySOFIT Final Public Copy
SOFIT Final Public Copy
 
ESP - PwC - EU Emissions Trading Scheme 2006
ESP - PwC - EU Emissions Trading Scheme 2006ESP - PwC - EU Emissions Trading Scheme 2006
ESP - PwC - EU Emissions Trading Scheme 2006
 

More from normanibarber20063

Assist with first annotated bibliography.  Assist with f.docx
Assist with first annotated bibliography.  Assist with f.docxAssist with first annotated bibliography.  Assist with f.docx
Assist with first annotated bibliography.  Assist with f.docxnormanibarber20063
 
Assistance needed with SQL commandsI need assistance with the quer.docx
Assistance needed with SQL commandsI need assistance with the quer.docxAssistance needed with SQL commandsI need assistance with the quer.docx
Assistance needed with SQL commandsI need assistance with the quer.docxnormanibarber20063
 
assingment Assignment Agenda Comparison Grid and Fact Sheet or .docx
assingment Assignment Agenda Comparison Grid and Fact Sheet or .docxassingment Assignment Agenda Comparison Grid and Fact Sheet or .docx
assingment Assignment Agenda Comparison Grid and Fact Sheet or .docxnormanibarber20063
 
Assimilate the lessons learned from the dream sequences in Defense o.docx
Assimilate the lessons learned from the dream sequences in Defense o.docxAssimilate the lessons learned from the dream sequences in Defense o.docx
Assimilate the lessons learned from the dream sequences in Defense o.docxnormanibarber20063
 
Assignmnt-500 words with 2 referencesRecognizing the fa.docx
Assignmnt-500 words with 2 referencesRecognizing the fa.docxAssignmnt-500 words with 2 referencesRecognizing the fa.docx
Assignmnt-500 words with 2 referencesRecognizing the fa.docxnormanibarber20063
 
Assignmnt-700 words with 3 referencesToday, there is a crisi.docx
Assignmnt-700 words with 3 referencesToday, there is a crisi.docxAssignmnt-700 words with 3 referencesToday, there is a crisi.docx
Assignmnt-700 words with 3 referencesToday, there is a crisi.docxnormanibarber20063
 
Assignment  For Paper #2, you will pick two poems on a similar th.docx
Assignment  For Paper #2, you will pick two poems on a similar th.docxAssignment  For Paper #2, you will pick two poems on a similar th.docx
Assignment  For Paper #2, you will pick two poems on a similar th.docxnormanibarber20063
 
Assignment Write an essay comparingcontrasting two thingspeople.docx
Assignment Write an essay comparingcontrasting two thingspeople.docxAssignment Write an essay comparingcontrasting two thingspeople.docx
Assignment Write an essay comparingcontrasting two thingspeople.docxnormanibarber20063
 
Assignment Travel Journal to Points of Interest from the Early Midd.docx
Assignment Travel Journal to Points of Interest from the Early Midd.docxAssignment Travel Journal to Points of Interest from the Early Midd.docx
Assignment Travel Journal to Points of Interest from the Early Midd.docxnormanibarber20063
 
Assignment What are the factors that influence the selection of .docx
Assignment What are the factors that influence the selection of .docxAssignment What are the factors that influence the selection of .docx
Assignment What are the factors that influence the selection of .docxnormanibarber20063
 
Assignment Write a research paper that contains the following.docx
Assignment Write a research paper that contains the following.docxAssignment Write a research paper that contains the following.docx
Assignment Write a research paper that contains the following.docxnormanibarber20063
 
Assignment Thinking about Managers and Leaders· Identifya man.docx
Assignment Thinking about Managers and Leaders· Identifya man.docxAssignment Thinking about Managers and Leaders· Identifya man.docx
Assignment Thinking about Managers and Leaders· Identifya man.docxnormanibarber20063
 
Assignment Talk to friends, family, potential beneficiaries abou.docx
Assignment Talk to friends, family, potential beneficiaries abou.docxAssignment Talk to friends, family, potential beneficiaries abou.docx
Assignment Talk to friends, family, potential beneficiaries abou.docxnormanibarber20063
 
Assignment The objective of assignment is to provide a Power .docx
Assignment The objective of assignment is to provide a Power .docxAssignment The objective of assignment is to provide a Power .docx
Assignment The objective of assignment is to provide a Power .docxnormanibarber20063
 
Assignment During the on-ground, residency portion of Skill.docx
Assignment During the on-ground, residency portion of Skill.docxAssignment During the on-ground, residency portion of Skill.docx
Assignment During the on-ground, residency portion of Skill.docxnormanibarber20063
 
Assignment PurposeThe first part of this assignment will assist.docx
Assignment PurposeThe first part of this assignment will assist.docxAssignment PurposeThe first part of this assignment will assist.docx
Assignment PurposeThe first part of this assignment will assist.docxnormanibarber20063
 
Assignment PowerPoint Based on what you have learned so .docx
Assignment PowerPoint Based on what you have learned so .docxAssignment PowerPoint Based on what you have learned so .docx
Assignment PowerPoint Based on what you have learned so .docxnormanibarber20063
 
Assignment In essay format, please answer the following quest.docx
Assignment In essay format, please answer the following quest.docxAssignment In essay format, please answer the following quest.docx
Assignment In essay format, please answer the following quest.docxnormanibarber20063
 
Assignment NameUnit 2 Discussion BoardDeliverable Length150-.docx
Assignment NameUnit 2 Discussion BoardDeliverable Length150-.docxAssignment NameUnit 2 Discussion BoardDeliverable Length150-.docx
Assignment NameUnit 2 Discussion BoardDeliverable Length150-.docxnormanibarber20063
 
Assignment In essay format, please answer the following questions.docx
Assignment In essay format, please answer the following questions.docxAssignment In essay format, please answer the following questions.docx
Assignment In essay format, please answer the following questions.docxnormanibarber20063
 

More from normanibarber20063 (20)

Assist with first annotated bibliography.  Assist with f.docx
Assist with first annotated bibliography.  Assist with f.docxAssist with first annotated bibliography.  Assist with f.docx
Assist with first annotated bibliography.  Assist with f.docx
 
Assistance needed with SQL commandsI need assistance with the quer.docx
Assistance needed with SQL commandsI need assistance with the quer.docxAssistance needed with SQL commandsI need assistance with the quer.docx
Assistance needed with SQL commandsI need assistance with the quer.docx
 
assingment Assignment Agenda Comparison Grid and Fact Sheet or .docx
assingment Assignment Agenda Comparison Grid and Fact Sheet or .docxassingment Assignment Agenda Comparison Grid and Fact Sheet or .docx
assingment Assignment Agenda Comparison Grid and Fact Sheet or .docx
 
Assimilate the lessons learned from the dream sequences in Defense o.docx
Assimilate the lessons learned from the dream sequences in Defense o.docxAssimilate the lessons learned from the dream sequences in Defense o.docx
Assimilate the lessons learned from the dream sequences in Defense o.docx
 
Assignmnt-500 words with 2 referencesRecognizing the fa.docx
Assignmnt-500 words with 2 referencesRecognizing the fa.docxAssignmnt-500 words with 2 referencesRecognizing the fa.docx
Assignmnt-500 words with 2 referencesRecognizing the fa.docx
 
Assignmnt-700 words with 3 referencesToday, there is a crisi.docx
Assignmnt-700 words with 3 referencesToday, there is a crisi.docxAssignmnt-700 words with 3 referencesToday, there is a crisi.docx
Assignmnt-700 words with 3 referencesToday, there is a crisi.docx
 
Assignment  For Paper #2, you will pick two poems on a similar th.docx
Assignment  For Paper #2, you will pick two poems on a similar th.docxAssignment  For Paper #2, you will pick two poems on a similar th.docx
Assignment  For Paper #2, you will pick two poems on a similar th.docx
 
Assignment Write an essay comparingcontrasting two thingspeople.docx
Assignment Write an essay comparingcontrasting two thingspeople.docxAssignment Write an essay comparingcontrasting two thingspeople.docx
Assignment Write an essay comparingcontrasting two thingspeople.docx
 
Assignment Travel Journal to Points of Interest from the Early Midd.docx
Assignment Travel Journal to Points of Interest from the Early Midd.docxAssignment Travel Journal to Points of Interest from the Early Midd.docx
Assignment Travel Journal to Points of Interest from the Early Midd.docx
 
Assignment What are the factors that influence the selection of .docx
Assignment What are the factors that influence the selection of .docxAssignment What are the factors that influence the selection of .docx
Assignment What are the factors that influence the selection of .docx
 
Assignment Write a research paper that contains the following.docx
Assignment Write a research paper that contains the following.docxAssignment Write a research paper that contains the following.docx
Assignment Write a research paper that contains the following.docx
 
Assignment Thinking about Managers and Leaders· Identifya man.docx
Assignment Thinking about Managers and Leaders· Identifya man.docxAssignment Thinking about Managers and Leaders· Identifya man.docx
Assignment Thinking about Managers and Leaders· Identifya man.docx
 
Assignment Talk to friends, family, potential beneficiaries abou.docx
Assignment Talk to friends, family, potential beneficiaries abou.docxAssignment Talk to friends, family, potential beneficiaries abou.docx
Assignment Talk to friends, family, potential beneficiaries abou.docx
 
Assignment The objective of assignment is to provide a Power .docx
Assignment The objective of assignment is to provide a Power .docxAssignment The objective of assignment is to provide a Power .docx
Assignment The objective of assignment is to provide a Power .docx
 
Assignment During the on-ground, residency portion of Skill.docx
Assignment During the on-ground, residency portion of Skill.docxAssignment During the on-ground, residency portion of Skill.docx
Assignment During the on-ground, residency portion of Skill.docx
 
Assignment PurposeThe first part of this assignment will assist.docx
Assignment PurposeThe first part of this assignment will assist.docxAssignment PurposeThe first part of this assignment will assist.docx
Assignment PurposeThe first part of this assignment will assist.docx
 
Assignment PowerPoint Based on what you have learned so .docx
Assignment PowerPoint Based on what you have learned so .docxAssignment PowerPoint Based on what you have learned so .docx
Assignment PowerPoint Based on what you have learned so .docx
 
Assignment In essay format, please answer the following quest.docx
Assignment In essay format, please answer the following quest.docxAssignment In essay format, please answer the following quest.docx
Assignment In essay format, please answer the following quest.docx
 
Assignment NameUnit 2 Discussion BoardDeliverable Length150-.docx
Assignment NameUnit 2 Discussion BoardDeliverable Length150-.docxAssignment NameUnit 2 Discussion BoardDeliverable Length150-.docx
Assignment NameUnit 2 Discussion BoardDeliverable Length150-.docx
 
Assignment In essay format, please answer the following questions.docx
Assignment In essay format, please answer the following questions.docxAssignment In essay format, please answer the following questions.docx
Assignment In essay format, please answer the following questions.docx
 

Recently uploaded

ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTiammrhaywood
 
CELL CYCLE Division Science 8 quarter IV.pptx
CELL CYCLE Division Science 8 quarter IV.pptxCELL CYCLE Division Science 8 quarter IV.pptx
CELL CYCLE Division Science 8 quarter IV.pptxJiesonDelaCerna
 
DATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginnersDATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginnersSabitha Banu
 
Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxNirmalaLoungPoorunde1
 
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptxVS Mahajan Coaching Centre
 
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdfssuser54595a
 
भारत-रोम व्यापार.pptx, Indo-Roman Trade,
भारत-रोम व्यापार.pptx, Indo-Roman Trade,भारत-रोम व्यापार.pptx, Indo-Roman Trade,
भारत-रोम व्यापार.pptx, Indo-Roman Trade,Virag Sontakke
 
Full Stack Web Development Course for Beginners
Full Stack Web Development Course  for BeginnersFull Stack Web Development Course  for Beginners
Full Stack Web Development Course for BeginnersSabitha Banu
 
EPANDING THE CONTENT OF AN OUTLINE using notes.pptx
EPANDING THE CONTENT OF AN OUTLINE using notes.pptxEPANDING THE CONTENT OF AN OUTLINE using notes.pptx
EPANDING THE CONTENT OF AN OUTLINE using notes.pptxRaymartEstabillo3
 
Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxthorishapillay1
 
Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Celine George
 
History Class XII Ch. 3 Kinship, Caste and Class (1).pptx
History Class XII Ch. 3 Kinship, Caste and Class (1).pptxHistory Class XII Ch. 3 Kinship, Caste and Class (1).pptx
History Class XII Ch. 3 Kinship, Caste and Class (1).pptxsocialsciencegdgrohi
 
Historical philosophical, theoretical, and legal foundations of special and i...
Historical philosophical, theoretical, and legal foundations of special and i...Historical philosophical, theoretical, and legal foundations of special and i...
Historical philosophical, theoretical, and legal foundations of special and i...jaredbarbolino94
 
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdfFraming an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdfUjwalaBharambe
 
Types of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptxTypes of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptxEyham Joco
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentInMediaRes1
 
Roles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in PharmacovigilanceRoles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in PharmacovigilanceSamikshaHamane
 
Biting mechanism of poisonous snakes.pdf
Biting mechanism of poisonous snakes.pdfBiting mechanism of poisonous snakes.pdf
Biting mechanism of poisonous snakes.pdfadityarao40181
 
Pharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdfPharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdfMahmoud M. Sallam
 

Recently uploaded (20)

ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
 
CELL CYCLE Division Science 8 quarter IV.pptx
CELL CYCLE Division Science 8 quarter IV.pptxCELL CYCLE Division Science 8 quarter IV.pptx
CELL CYCLE Division Science 8 quarter IV.pptx
 
DATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginnersDATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginners
 
Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptx
 
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
 
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
 
भारत-रोम व्यापार.pptx, Indo-Roman Trade,
भारत-रोम व्यापार.pptx, Indo-Roman Trade,भारत-रोम व्यापार.pptx, Indo-Roman Trade,
भारत-रोम व्यापार.pptx, Indo-Roman Trade,
 
Full Stack Web Development Course for Beginners
Full Stack Web Development Course  for BeginnersFull Stack Web Development Course  for Beginners
Full Stack Web Development Course for Beginners
 
OS-operating systems- ch04 (Threads) ...
OS-operating systems- ch04 (Threads) ...OS-operating systems- ch04 (Threads) ...
OS-operating systems- ch04 (Threads) ...
 
EPANDING THE CONTENT OF AN OUTLINE using notes.pptx
EPANDING THE CONTENT OF AN OUTLINE using notes.pptxEPANDING THE CONTENT OF AN OUTLINE using notes.pptx
EPANDING THE CONTENT OF AN OUTLINE using notes.pptx
 
Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptx
 
Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17
 
History Class XII Ch. 3 Kinship, Caste and Class (1).pptx
History Class XII Ch. 3 Kinship, Caste and Class (1).pptxHistory Class XII Ch. 3 Kinship, Caste and Class (1).pptx
History Class XII Ch. 3 Kinship, Caste and Class (1).pptx
 
Historical philosophical, theoretical, and legal foundations of special and i...
Historical philosophical, theoretical, and legal foundations of special and i...Historical philosophical, theoretical, and legal foundations of special and i...
Historical philosophical, theoretical, and legal foundations of special and i...
 
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdfFraming an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
 
Types of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptxTypes of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptx
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media Component
 
Roles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in PharmacovigilanceRoles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in Pharmacovigilance
 
Biting mechanism of poisonous snakes.pdf
Biting mechanism of poisonous snakes.pdfBiting mechanism of poisonous snakes.pdf
Biting mechanism of poisonous snakes.pdf
 
Pharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdfPharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdf
 

Introduction Where last week we examined specific and quan.docx

  • 1. Introduction Where last week we examined specific and quantifiable methods for tracking implementation progress, this week we look at a few qualitative things managers can do to ensure that IT projects succeed. As you will learn from the text, there are abundant examples of IT projects in the public sector that are not deemed successful. Even the most seasoned IT manager gets a bit depressed when reading the examples in these notes and the text. There are political/legal, structural, operational/managerial, and cultural barriers to development of public information technology and e-government, forming a complex intermixture of forces. Challenges in these area are even greater for inter-jurisdictional and inter-sectoral initiatives than interdepartmental ones (Kernaghan, 2005). As a result, there is no “formula for success” and even the most commonly cited success factors may prove inappropriate in certain circumstances. Success Factors?
  • 2. Unfortunately, the rate at which IT projects fail in both the public and private sectors is very high. The Center for Technology in Government suggests that more than half of all government IT projects are viewed as failures by organizational stakeholders. Projects Fail. Typical reasons for failure are: 1. Complexity – projects are too large and complex 2. Commitment Failure – lack of commitment to the project from stakeholders 3. Planning Failure – creation of poor implementation plans 4. Vision Failure – underlying assumptions are unrealistic 5. Inappropriate Methods – agency methods may not match IT 6. Short Time Horizon – unrealistic schedules 7. Turbulent Environments- rapid rates of change can beget lack coordination and system failure 8. Failure to Support End Users – end users have no incentive to use the technology or are unable to use it
experienced a number of traumatic ERP incidents. Universal Oil launched an ERP system in 1991, found the system unusable, and sued the vendor, Andersen Consulting, for $100 million; Andersen countersued Universal for libel. In 1996, FoxMeyer Corporation hired SAP and Andersen Consulting to install an ERP system; the resulting problems drove the company into bankruptcy and a billion-dollar lawsuit. Tri Valley Growers, a giant agricultural cooperative, launched a $6 million ERP system in 1996, found that none of the Oracle software performed as promised and that some of it was incompatible with its hardware, and filed a $20 million lawsuit. Oracle countersued, and the cooperative went into bankruptcy. In 1998-1999, W. W. Grainger Inc. launched a $9 million ERP system that miscounted inventory, crashed recurrently, and led to a $23 million loss in profits. Grainger persevered, however, and worked with the vendor (SAP) to fix the problems. Hershey Foods Corp., seeking to meet Halloween and Christmas candy demand, forced installation of a new $112 million ERP system in just seven months. The new system miscounted inventory and caused delayed shipments and incomplete orders, leading to a 12% drop in sales (Nash, 2000).
Of course, ERP software was not unique; other very large private-sector IT projects failed in the 1990s as well. AMR Corp. attempted in the early 1990s to create the "Confirm" reservation system for Budget Rent A Car Corp., Hilton Hotels Corp., and Marriott International Inc. After four years the project was in shambles, each side sued the other, and AMR took a $109 million write-off. Snap-on Inc. installed a new automated order-entry system in 1997 only to discover that the system delayed orders, miscounted inventory, and cost the company $50 million in lost sales while increasing operating costs by 40%, with a net 22% decline in company profits over the previous year. Greyhound's $6 million "Trips" reservation and bus-dispatch system was installed in 1993 but crashed when Greyhound offered sale prices. Agents were forced to revert to manual methods, and ridership dropped 12% in a month. Greyhound posted a $61 million loss in 1994, and although "Trips" was eventually fixed, Greyhound never regained its competitive position. Norfolk Southern Corp., in merging with Conrail, relied in 1998-1999 on custom logistics software. When a dispatcher mistakenly entered erroneous data, Norfolk Southern suffered train backups for over a year, incurred $80 million in extra overtime pay for workers, and had to pay to fix the system. Oxford Health Plans launched a new automated billing and claims-processing system in 1996 only to find that it had mis-estimated medical costs; delays led to massive protests from doctors and patients. Oxford suffered its first quarterly loss ($78 million) and was fined an additional $3 million by the state of New York. Much of the system was abandoned and replaced with other commercial modules (Nash, 2000).
The Real Y2K Disaster?
Between 1995 and 2001, six surveys reported disturbing failure rates for information systems projects.
Surveying these surveys, one consulting firm summarized, "An IT project is more likely to be unsuccessful than successful - only about 1 out of 5 IT projects is likely to bring full satisfaction (and) the larger the project the more likely the failure" (IT-Cortex, 2006).
• The OASIG Survey (1995, for the Organizational Aspects of Information Technology special interest group) of 14,000 computer user groups in the UK found a reported IT success rate of 20-30%, with at least 70% of projects failing in some respect.
• The Chaos Report (1995) was a much-cited Standish Group study of 365 US IT executive managers across the public and private sectors. It found that 31% of projects were canceled before completion and that cost overruns averaged 189% of original estimates. Standish estimated that organizations paid $81 billion annually for canceled projects and another $59 billion in cost overruns. Only 16% of projects were completed on time, on budget, and to original specifications.
• The KPMG Canada Survey (1997) was a poll of 176 leading public and private sector organizations in Canada, finding that over 61% of projects were deemed failures by respondents. Over three quarters of projects were 30% or more behind schedule, and over half had substantial cost overruns.
• The Conference Board Survey (2001) was a study of 117 U.S. companies that had attempted ERP projects, finding that 40% of projects failed to achieve their business-case goals within one year after implementation, with implementation costing an average of 25% over budget.
• The Robbins-Gioia Survey (2001) was another study of ERP project implementation, covering 232 public and private sector organizations and finding that over half (51%) evaluated their ERP implementation as unsuccessful.
Failure statistics: private sector
In data covering 2004, the Standish Group reported on IT project outcomes in over 9,000 primarily large corporate US or European projects and found that only 29% of all projects succeeded (delivered on time, on budget, with required features and functions). Some 53% fell in its "challenged" category (late, over budget, and/or with less than the required features and functions). Another 18% failed outright (cancelled prior to completion, or delivered and never used) (Standish Group, 2004). This represented an improvement over the corresponding figures a decade earlier, when only 20% of projects could be considered successful. Even so, failure remains prevalent in private sector IT projects.
Examples of IT Failures in the Public Sector
Certainly the public sector has not been immune from project failures. IT project failures, defined as projects that come in over budget, behind schedule, and/or lacking originally intended functionality, continue to be common despite a decade of efforts by the Office of Management and Budget, the Government Accountability Office, and other oversight bodies to remedy the problem:
• The Veterans Affairs Department spent $342 million on its Core Financial and Logistics System before abandoning it in 2004. The VA also had to pull the plug on its $300 million HR Links automated personnel system (Mosquera, 2005c).
• In 2005, the FBI abandoned its $170 million computerized case management system, the Virtual Case File system, after consultants judged it obsolete before its completion and riddled with problems (Associated Press, 2006b).
• In 2005, United Airlines at Denver International Airport abandoned its 10-year-old automated baggage-handling system in favor of the old manual one, estimating it would save $1 million a month by doing so. The automated system had been plagued by construction delays, cost overruns, lost and damaged baggage, and long lines. The city of Denver had paid a reported $441 million for the system, and United Airlines is obligated to pay $60 million a year for its Denver facilities under a 25-year contract. The system vendor, BAE Automated Systems, has ceased to exist (Associated Press, 2005).
• At the Environmental Protection Agency, the Clean Air Markets Division Business System (CAMDBS, a tool supporting emission-trading programs) was, as of 2005, already $2.8 million and two years over the originally budgeted plan. Similarly, PeoplePlus came in $3.7 million and one year over EPA's original plans (Thormeyer, 2005b).
• The Senate Committee on Homeland Security and Governmental Affairs' Permanent Subcommittee on Investigations held hearings in September 2005 and found that the Defense Travel System (a web-based travel booking system meant to be similar to Expedia or Travelocity) was more than $200 million over budget and four years behind schedule (Onley, 2005a).
• A 2005 Inspector General's report on systems employed by the Federal Aviation Administration found that 9 of 16 major acquisitions had experienced schedule delays of 2 to 12 years, and that 11 had experienced cost growth of about $5.6 billion over initial budgets (Thormeyer, 2005e).
Often after such failures, as in the case of the Veterans Administration, the proposed solution was an even larger-scale "enterprise" software system. Many more examples of public sector IT failure are tracked in The Risks Digest.
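The outcome categories used in these surveys can be made concrete with a small illustrative calculation. The following minimal Python sketch uses the Standish Group 2004 category shares quoted above; the class and method names are our own, for illustration only:

```python
from dataclasses import dataclass

@dataclass
class OutcomeShares:
    """Project-outcome shares from a survey, as fractions summing to 1."""
    succeeded: float   # on time, on budget, with required features
    challenged: float  # late, over budget, and/or missing features
    failed: float      # cancelled before completion, or delivered but never used

    def unsuccessful(self) -> float:
        # Everything short of outright success: challenged plus failed.
        return self.challenged + self.failed

# Standish Group 2004 figures as quoted in the notes above.
standish_2004 = OutcomeShares(succeeded=0.29, challenged=0.53, failed=0.18)

print(f"Unsuccessful share: {standish_2004.unsuccessful():.0%}")
```

Run against the 2004 figures, the "unsuccessful" share comes to 71%, which is the basis for statements like "an IT project is more likely to be unsuccessful than successful."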
The disaster surrounding Hurricane Katrina, which left the city of New Orleans flooded in 2005, provided a case example relevant to why information technology projects may fail. Rep. Nydia Velazquez (D-NY), ranking member of the House Small Business Committee, noted that the Small Business Administration's Office of Disaster Assistance (ODA) was hampered in being responsive due to the ongoing centralization of ODA core applications into a new Disaster Credit Management System (DCMS), operated from a single location in Fort Worth, to which staff were still adjusting (Thormeyer, 2005a). Though created months earlier, the DCMS was reported in the press to have "stumbled badly because there haven't been enough new computers or staff trained to use them" and also because "those who have made it out into the field have discovered that they can't always link up from the disaster area to handle new loan applications and file reports on existing ones" (Gaouette, Miller, and Alonso-Zaldivar, 2005). Though these represent potentially solvable problems, they also illustrate two common factors which may undermine IT implementation efforts: neglect of training and shortage of resources.
What to Do?
The list of IT failures is daunting, yet public organizations are increasingly dependent on information technology to engage citizens, deliver services, and provide information. IT management consultant Robin Goldsmith (2005) suggests there are three prime reasons for the failure of IT projects, all having to do with top-down implementation. The Gartner Group, a leading systems consulting firm, gives similar reasons (Olsen, 2005). Below are four critical mistakes IT managers make when implementing IT projects:
• Failure to freeze requirements. As the project develops, "requirements creep" occurs as management adds previously unanticipated and unbudgeted new scope and new requirements. When the project proceeds before system specifications are defined and final, the chances of failure escalate markedly.
• Bad budgeting. Failure may be linked to management mandating budgets rigidly, without due
regard for requirements necessary for success. Alternatively, cost-plus contract budgeting gives little incentive to constrain costs.
• Inadequate time. In order to meet pre-established deadlines, project quality and production values are sacrificed. Similarly, the pace of technology or programmatic urgencies (ex., terrorism in DHS IT projects) can mean unrealistic time frames and failure.
• Hierarchical, top-down, control-oriented approaches to systems development may relate to any of these three causes of IT failure and are themselves a widely cited failure factor (ex., Brown and Brudney, 2004).
Failure Factors
A 2006 Commerce Department report on the technical and economic impact of new Internet protocols on government IT projects concluded that failure to plan was one of the most significant implementation risks. In this case, planning for new network arrangements required that administrators benchmark existing network performance.
Without this benchmarking, it is difficult to gauge new performance specifications in IT contracting (Government Computer News, 2006b).
Neglect of human factors in technology implementation continues to be the prime reason for technology failure. For instance, Rinkus et al. (2005) studied healthcare technology, finding that many projects failed due to inattention to such human factors as workflow, organizational change, and usability. Rinkus found that even when technology project managers do look at human factors explicitly, they usually limit their focus to designing better user interfaces, whereas what is needed more is greater consideration of the functionality users actually require. Neglect of the human dimensions of information technology may be linked to the narrowly technocratic nature of leadership frequently found in IT projects. A Cutter Consortium survey published in CIO Magazine found that human factors, not lack of expertise or technological experience, accounted for all of the "top five failure factors" for information technology officers. Based on a study of 250 senior IT executives who were asked to describe the attributes of "the worst IT manager" they had ever known, the five
failure factors were:
• Poor interpersonal skills (58%)
• Being self-centered (56%)
• Failure to acknowledge problems (55%)
• Untrustworthiness (54%)
• Weak management skills (52%)
Noting that failed IT leadership was associated with lack of empathy, lack of emotional ability, and inability to connect with others, the authors of the survey identified the overall main cause of IT leadership failure as "lack of emotional intelligence" (Prewitt, 2005).
Poor data as a failure factor
New Jersey's Department of Youth and Family Services (DYFS) was sued in 2003 by the New York Times and other newspapers for access to its Services Information System (SIS) database. Documents subsequently revealed that "outdated, inefficient technology" plagued the DYFS. One social worker was
quoted as saying, "most of what we put on SIS is wrong" (Peterson, 2003b: 28). While DYFS has since implemented a new information system, its 2003 plight illustrated the GIGO ("garbage in, garbage out") principle, which applies to all IT systems.
Overdependence on automated systems as a failure factor
Perry, Wears, and Cook (2005) studied a near-fatal IT failure in a hospital setting, where an automated dispensing unit (ADU) failed to dispense needed medicines, instead giving a "Printer not available" error message and locking the keyboard. All drugs became unavailable to all nursing units. The authors note that, "Ironically, the more reliable the technological system, the more deskilled workers become in responding to the eventually inevitable anomalies" (p. 60). Hospital workers did not have in place procedures to handle IT systems failure manually and, indeed, the ADU was designed to prevent manual access to medicines, some of which were controlled substances. The authors conclude that while automation can be an effective tool, it can also be "a potent source of new vulnerability" (p. 60).
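The GIGO principle mentioned above is, at bottom, an argument for validating data before it enters a system. The following is a minimal, hypothetical sketch (the field names and rules are invented, not drawn from the DYFS system) of how a case record might be checked before it can pollute downstream reports:

```python
# Minimal sketch of the GIGO principle: a system's outputs are only as good
# as the data entering it. Field names and validation rules are hypothetical.

def validate_case_record(record: dict) -> list:
    """Return a list of data-quality problems; an empty list means usable."""
    problems = []
    if not record.get("case_id"):
        problems.append("missing case_id")
    if not record.get("caseworker"):
        problems.append("missing caseworker")
    age = record.get("client_age")
    if age is not None and not (0 <= age <= 120):
        problems.append(f"implausible client_age: {age}")
    return problems

# A record with bad data is flagged at entry rather than discovered later.
bad = {"case_id": "NJ-1044", "caseworker": "", "client_age": 312}
print(validate_case_record(bad))
```

Even a screen of simple checks like this, applied at data entry, addresses the social worker's complaint that "most of what we put on SIS is wrong."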
Inappropriateness for local culture as a failure factor
E-government "transparency" projects are an example of global technology transfer, but the failure to take into account organizational culture in the local context can lead to failure (Heeks, 2005). Likewise, in a study of European back-office support for e-governance services, Westholm (2005) found that governance structures are "historically grown," preventing otherwise innovative models for integrated e-government services from being copied from one nation to another in any simple manner. Culture as an obstacle to technology transfer has been confirmed in numerous other studies (ex., Rezgui, Wilson, Olphert, & Damodaran, 2005).
Internal Success Factors
Despite the problems IT projects have encountered, there is an emerging consensus as to why IT projects succeed. Success factors include:
1. Management Support – active involvement of top management.
2. Stakeholder Motivation – typical resistance to change must be overcome by convincing
stakeholders of the benefits.
3. Goal Clarity – project scope must be clear.
4. Support for Organization Culture.
5. Participatory Implementation – counters the employee resistance factor.
6. User Friendliness – a way to increase stakeholder motivation.
7. Adequate Budget and Time Horizon.
8. Phased Implementation – an extension of goal clarity.
9. Process and Software Engineering – dealing with legacy systems.
10. Project Management – professional is better.
Conversely, commonly cited failure factors include incomplete user requirements, inadequate management involvement, ineffective communication, immature work processes, and technicians' unwillingness to be constrained by formal standards (Richardson and Ives, 2005). It is not uncommon for agencies to identify long lists of success factors in formal planning documents. Here, for instance, is such a list from the Food and Drug Administration (FDA, 2005: Appendix B):
1. Effective communications between FDA IT and our customers and suppliers that increase the opportunities for mutual understanding and collaboration
2. Clear and consistent IT management guidance and feedback
3. An adequately sized and skilled work force with an excellent customer service ethic
4. Adequate tools for the work force to accomplish the FDA mission
5. Effective and well documented governance, policies, processes, and procedures
6. Adequate funding for the IT infrastructure
7. Adequate facilities to house the IT work force and infrastructure
8. Clear measurements of performance
Such agency lists are a combination of common sense, project experience, and wish-list, not necessarily based on any systematic study in a social science sense. Nonetheless, they give a flavor of the conventional wisdom on the subject. Internationally, the Organization for Economic Cooperation and Development has listed 10 success factors "to get IT right" (OECD, 2001):
1. establish appropriate governance structures;
2. think small;
3. use known technologies;
4. identify and manage risk;
5. ensure compliance with best practices for project management;
6. hold business managers accountable;
7. recruit and retain talent;
8. prudently manage knowledge;
9. establish environments of trust with private vendors; and
10. involve end users.
However, as discussed above, there is no one "right" list of success factors, and any given success factor (ex., "think small") may be inappropriate in a particular setting.
Success stories
Agency success stories are a second source of insight into the internal success factors thought important in public information systems. For instance, implementing ERP (Enterprise Resource Planning) software
can be a challenging or even disaster-prone effort, but a success story is the Department of Transportation's (DOT) implementation of the Oracle Federal Financial e-Business suite. DOT's effort was so successful that it was designated by the OMB as a "center of excellence" for the Financial Management Line of Business initiative. Six key success factors cited as underlying this success story are:
1. Avoiding customization of the software, instead making agency practices conform to predetermined process formats dictated by the vendor.
2. Gradual rather than rushed implementation, over the 2000-2003 period.
3. Getting stakeholder buy-in by creating test labs in each agency to demonstrate workability.
4. Having a contract which required the vendor to maintain the software, including the cost of upgrades.
5. Providing individual desk-side transitional assistance for the first six weeks.
6. Creating a user support group to share problems and solutions across the department's units.
The completed DOT system allowed the department to produce financial statements by 8 a.m. on the first of every month, and to meet deadlines for year-end financial statements (Government Computer News, 2006c).
Leadership
In a 22-country survey comparing e-government activity in 2001 with that in 2000, Hunter (2001) categorized countries' progress on e-government into three groups (innovative leaders, visionary followers, and steady achievers). Progress was measured in terms of service maturity (number and depth of online services) and delivery maturity (degree of progress toward the ideal of a single-point-of-access, cross-agency web presence). "Interestingly," he concluded, "the research found that overall progress in e-Government is not closely correlated with similar social, political or economic characteristics." Rather, adoption was driven by "leadership, political will, commitment of deliverables and accountability for results" (Hunter, 2001).
Supportive organizational culture
As the information economy replaces the production economy, the challenge of the typical organization becomes increasingly one of managing professional workers. Professionalization has often been associated with an ethos of autonomy, and professional workers have been characterized as resistant to close managerial supervision. However, a study of professionals' expectations of managers by Hernes (2005) found not so much a desire for autonomy from managers as for a supportive style of management. Supportive management is characterized by managers giving priority to relationships, constructive feedback, encouragement and motivation, and clear and realistic communication of management expectations.
Power struggles and organizational politics are associated with the dissemination of information technology, which in turn requires greater collaboration and partnering skills in public management (Kernaghan and Gunraj, 2004). Management of information technology, especially given the increased
importance of public-private partnerships, requires a team-based, collaborative organizational culture to motivate and retain knowledge workers. Classical approaches to technology management do not adequately take politics and organizational culture into account (Haynes, 2005).
A discussion of IT-based change in Computer World (Karacsony, 2006) emphasized implementation success factors such as recognizing and working with organizational politics, involving all staff from the beginning, emphasizing communications, being non-threatening, showing the benefits of change, and providing support and training for the change process. In particular, a criticism-tolerant organizational culture can be an important success factor. In a study of IT in the setting of public schools, Hinman, Saarlas, and Lloyd Puryear (2004) found that "A key factor in successful collaborative efforts is camaraderie and trust among a group's members that problems (and failures) can be shared without negative consequences" (p. S10).
Participative IT initiatives
End-user participation is almost universally listed as a success factor in IT implementation (ex., OECD, 2001: 6). When in 2005 it came time for the Census Bureau to plan the enterprise architecture for its 2010 Census, priority was given to employee participation in its development. "The people who have participated in developing it, I think, are seeing the benefits because they are referencing it," said Marjorie Martinez, branch chief of the Census Bureau's software and systems standards division (Mosquera, 2005b).
Adequate staffing
Adequate staffing prevents work overload and provides the career paths necessary for employee motivation. Kim's (2005) study of state government information technology staff found that work exhaustion combined with lack of opportunity for advancement were the prime correlates of job turnover, while salary satisfaction was not a significant correlate.
Outcome-based contracts as a success factor
Principal-agent theory points to the importance of outcome-based contracting as the key success factor (Mahaney and Lederer, 2003), viewing IT implementers as "agents" who must be held clearly accountable by the CEO "principal." Whether or not the principal-agent analogy is a good one, few disagree with the idea that goal clarity is important in successful IT implementation; conversely, when management or external forces make the goal a shifting target, failure is much more likely to occur.
The role of training
Lack of training can lead to IT system disuse. For instance, DHS's Homeland Security Information Network employs Common Operating Picture (COP) applications software to share information and maps related to activities and events in disaster response. In late 2006, Senators Susan Collins (R-ME) and Joseph Lieberman (I-CT) charged in a letter to DHS that COP was "hugely" underutilized, used regularly by fewer than 6% of first responders. The senators wrote, "DHS has done little to inform first responders about the common operating picture or to train them how to use it" (Lipowicz, 2006n).
Training for workforce information literacy is an oft-cited critical success factor (Fowell and Elliot, 2002). In general, the critical role of training has been emphasized in any number of lists of success factors for IT implementation (ex., OECD, 2001: 6). There is a direct correlation between the amount of attention and resources devoted to training and the success of IT implementation. The main aim of training is to make staff feel competent using new IT. Training requires time that is almost always taken from other tasks; this requires the support of management and a contingency plan for handling the displaced work. Timing is also important: training delivered too early or too late can undermine success. Development of a supportive culture for training and development is essential.
External Success Factors
An agency's external environment can be a major determinant of IT success. External success factors include:
1. Partnerships with Vendors and Other Strategic Partners –
partnering can often tap into knowledge bases unavailable in-house. Partnering can be short term or long term.
2. Creating Independence from Vendors – project implementation is more likely to be successful if the agency is not dependent on the vendor. Agencies should avoid allowing contracts to deplete in-house staff to the point where in-house expertise to evaluate contracts is limited or even eliminated.
3. Accountability to the Political Layer – IT projects will be more successful if long-term funding is secure and the benefits are made clear to the political funders.
External funding
Partnering and strategic alliances are a key success factor for growth in competitive IT markets (U.S.
Commercial Service, 2005). Non-profit organizations, particularly in human services, are often on the wrong side of the digital divide. For non-profits, external funding through grants is a major factor in whether the organization acquires and uses information technology. External funding is more of a determinant than is organizational size (Humphrey, Kim, and Dudley, 2005).
Professionalization
Based on a regression study of e-government in the 50 states, research suggests that more extensive e-government is correlated with legislative professionalization, state professional networks, and Republican control of the legislature. Contrary to (citizen) demand theory, states with more Internet access and use actually have less developed e-government features (McNeal, Tolbert, Mossberger, and Dotterweich, 2003).
Evaluation of Public Information Systems
Strategies for Evaluating Public Information Technology
Traditionally there has been little incentive to attempt systematic evaluation in the public sector. Some questions regarding the measurement of benefits are:
1. What are the criteria for evaluation? Are they to be weighted?
2. How are intangibles to be valued in dollar terms?
3. Should dollar values be expressed in net present value? Which financial models should be used?
4. How far into the future should the benefit stream be measured?
5. Should potential value be included? If so, how?
6. Should benefits be discounted for other causal factors of results?
7. Should internal benefits be counted in return on investment?
8. Should subjective benefits be included in evaluations?
Questions regarding the measurement of costs:
1. Will cost measurement be total or marginal cost?
2. How much agency infrastructure will be allocated to the IT cost analysis, and by what formula?
3. Are life cycle costs included?
4. Are training costs included?
5. Is depreciation of facilities, equipment, and software included?
Life cycle costing is an important aspect of cost evaluation. It takes into account ownership costs such as operations, maintenance, and disposition.
The Private Sector Model: Return on Investment (ROI)
From a managerial perspective, IT investment tends to shift from a "honeymoon" period to a later period of "competitive realism." Return on investment analyses search for elusive tangible benefits. The private sector model of calculation focuses on quantitative indicators, but this is often difficult to transfer to the public sector because of the need to account for intangibles. In the public sector, the intangible benefits of IT can be as important as the tangible ones. These include such items as client and citizen awareness, participation, and satisfaction; upholding democratic accountability and transparency; policy consistency; more effective targeting of services; staff morale; and improving citizen choices.
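The mechanics behind the net-present-value question raised above can be sketched briefly. All figures below are illustrative, not drawn from any real project; years 1-5 represent annual benefits net of life-cycle costs (operations, maintenance, eventual disposition), and year 0 is the up-front acquisition cost.

```python
# Hedged sketch of an NPV-based ROI calculation: a multi-year benefit stream
# is discounted back to today's dollars before computing return on investment.

def npv(cash_flows, discount_rate):
    """Net present value of year-indexed cash flows (year 0 = today)."""
    return sum(cf / (1 + discount_rate) ** year
               for year, cf in enumerate(cash_flows))

# Year 0: acquisition cost; years 1-5: benefits net of life-cycle costs.
flows = [-1_000_000, 250_000, 300_000, 300_000, 300_000, 200_000]
value = npv(flows, discount_rate=0.05)
roi = value / 1_000_000  # NPV-based ROI relative to the initial investment
print(f"NPV: ${value:,.0f}  ROI: {roi:.1%}")
```

Note how sensitive the result is to the assumptions flagged in the questions above: the discount rate chosen, how far into the future the benefit stream is measured, and whether intangible benefits are monetized at all.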
ROI an increasing federal priority
A survey of 97 federal IT executives by CDW Government, Inc., found that while cybersecurity remained a top but declining priority (24% in 2006, compared to 43% in 2005), proving return on investment (ROI) was a rising second priority (18% in 2006, compared to 10% in 2005) (Miller, 2006i).
The problem with "hard" ROI evaluations
Behind every ROI study there are assumptions, often very questionable ones. An example was the Defense Department's 2006 claim that its new travel information system would save $56 million per year. When the Government Accountability Office investigated, however, it found the $56 million figure had two components. First, DoD claimed over $24 million would be saved through personnel savings, but the GAO found that the Navy and Air Force, when queried, denied there would be any personnel savings, though some personnel might be shifted to other functions in the future if warranted. Second, the DoD claimed a $31 million savings in reduced commercial travel fees, but the GAO found there had
been no actual fiscal analysis at all; rather, the figure was derived from an article in a travel trade journal (Onley, 2006b). The case raised issues common to ROI, even when confined to economic impact measurement: from the DoD view, displacing personnel was "savings," but not necessarily from the viewpoint of Congress, which the GAO serves. Likewise, the DoD treated guesstimates found in the press as better than no data at all, but are they? Unfortunately, every such estimate of costs and benefits is fraught with similar problems of measurement.
Soft ROI evaluations?
A strikingly more flexible version of an ROI evaluation strategy emanated from a 2006 report of the Center for Technology in Government, University at Albany, in conjunction with SAP, Inc. (Cresswell, Burke, & Pardo, 2006; Government Technology, 2006k). (SAP is one of the leading producers of Enterprise Resource Planning [ERP] software.) The CTG/SAP strategy emphasized how IT projects can create value, whether governmental improvements or services to groups and individuals, that goes well beyond the usual measures of expected direct financial savings.
Such value includes, for instance, increasing the integrity of government through greater transparency. In a set of six international case studies, the report set forth a framework for incorporating such important but non-financial values into ROI evaluations. The CTG/SAP framework (which is non-proprietary) calls on evaluators to identify stakeholder groups and then, for each group, to identify not only financial but also political, social, strategic, ideological, and "stewardship" (trust, integrity, and legitimacy) impacts, finding ways to measure these impacts. In addition to the usual efficiency (input/output ratio improvements) and effectiveness (achievement of organizational goals) impacts, value is construed to include enablement (positive externalities) and intrinsic enhancements (public goods like legitimacy and transparency). The framework acknowledges that actual measurement of impacts often must be qualitative rather than quantitative, though the case studies give examples of each. Finally, the framework urges that risk analysis include not only the usual
analysis of development risks (risks that the project may not be implemented on time, within budget, and to specifications) but also benefit risks (risks that anticipated beneficial impacts identified in stakeholder analysis will not materialize). In summary, the CTG/SAP framework for ROI stands the traditional ROI approach on its head, making it quite different from financial analysis and embracing many long-standing criticisms of ROI for being too narrowly economic in nature.
Stakeholder analysis through Value Measuring Methodology (VMM)
Although VMM is a form of cost-benefit analysis, the intent of its authors was to conduct such analysis in relation to stakeholder groups. A governmental example was the application of VMM to the project which created the XML.gov registry (Johnson and Foley, 2005). This project was intended to spread the gospel of Extensible Markup Language (XML) as a solution to interoperability problems associated with sharing documents and data, and to facilitate cooperative efforts among government agencies and the private sector toward this end. A specific problem was that many agencies were creating their own
separate XML registries, ironically creating a lack of interoperability. A VMM evaluation calculated expected benefits for each stakeholder group (direct users, government financial, government operational, social, and political/strategic actors). Stakeholder scores were aggregated into a total benefit score for each of three proposed alternatives, one of which was establishing a unified federal XML registry. By proceeding in this way, the evaluation highlighted which groups benefited the most, where costs occurred, and where the greatest risks lay. This allowed alternatives to be evaluated in terms of risk-adjusted scores. The stakeholder focus of VMM allows it to dovetail with other stakeholder-oriented organization development and organization change efforts.
Evolution of a Public Sector IT Model for Evaluation
Public administration makes a continual attempt to improve evaluation of the services it provides. Over the years a wide variety of performance measures for IT have been practiced or advocated. Over time, however, the measures have shifted to emphasize effectiveness in terms of mission fulfillment.
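The VMM aggregation described above, with per-stakeholder benefit scores rolled up into a weighted total and then discounted for benefit risk per alternative, can be sketched roughly as follows. The stakeholder groups follow the XML.gov example, but all weights, scores, and risk figures are invented for illustration; real VMM studies derive them from stakeholder interviews and risk workshops.

```python
# Illustrative VMM-style aggregation: weighted stakeholder benefit scores,
# discounted by the probability that the benefits fail to materialize.
# All numbers are invented; only the stakeholder groups come from the text.

WEIGHTS = {
    "direct_users": 0.30,
    "government_financial": 0.20,
    "government_operational": 0.20,
    "social": 0.15,
    "political_strategic": 0.15,
}

def risk_adjusted_score(benefit_scores, benefit_risk):
    """Weighted stakeholder benefit total, discounted by benefit risk (0-1)."""
    total = sum(WEIGHTS[group] * score
                for group, score in benefit_scores.items())
    return total * (1 - benefit_risk)

alternatives = {
    "status quo": ({"direct_users": 40, "government_financial": 50,
                    "government_operational": 45, "social": 30,
                    "political_strategic": 35}, 0.05),
    "unified XML registry": ({"direct_users": 80, "government_financial": 60,
                              "government_operational": 75, "social": 65,
                              "political_strategic": 85}, 0.25),
}

for name, (scores, risk) in alternatives.items():
    print(f"{name}: {risk_adjusted_score(scores, risk):.1f}")
```

The per-group breakdown, not just the final score, is the point: it shows which stakeholders benefit most and where the greatest risks sit, which is what allowed the XML.gov alternatives to be compared on risk-adjusted terms.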
Key Legislation
- Chief Financial Officers Act of 1990
- Government Performance and Results Act of 1993
- Federal Acquisition Streamlining Act of 1994
- Paperwork Reduction Act of 1995
- Clinger-Cohen Act of 1996
Models used for evaluation:
- Traditional Value Ranking Methods – establishment of value categories
- Value-Measuring Methodology (VMM) – a multicriterion decision-making methodology
- The Program Assessment Rating Tool (PART) – identifies a program's strengths and weaknesses
- The Performance Reference Model (PRM) – calls for government agencies to develop their own performance metrics
Evaluation of Governmental Web Sites
In evaluating government websites, it may be important to emphasize interactive features, not simply for the intrinsic value of e-participation and e-democracy, but also because such features seem to play a critical role in attracting users to government portals. For instance, in 2005 FirstGov.gov dropped from being the most-visited federal website to ninth place, according to a study by Darrell West of Brown University's Taubman Center for Public Policy. West attributed the drop to the failure of FirstGov.gov to add new features, such as the interactive e-mail feature which the White House website added, helping propel it, West believed, into first place (Lipowicz, 2005f). However, in West's 2006 survey, FirstGov returned to being one of the top two most-visited websites, along with the USDA.
Melitski, Holzer, and Kim (2005) suggest Seoul, Hong Kong, Singapore, New York, and Shanghai are the top five large cities worldwide in providing digital government, using a five-stage model for evaluating public sector websites that includes:
1. Security & Privacy
2. Usability
3. Up-to-date, accurate content
4. Service provision
5. Citizen participation
OMB Standards for Federal Websites
The 2004 OMB requirements for federal websites, incorporating requirements found in the E-Government Act, the Paperwork Reduction Act, and OMB Circular A-130, include ten specifications (Miller, 2004r):
1. Use .gov, .mil, or fed.us domains
2. Implement security controls
3. Maintain accessibility for the disabled
4. Protect the privacy of data
5. Have an information quality assurance process
6. Follow requirements for managing federal records
7. Maintain schedules for posting information
8. Establish and enforce agencywide linking policies
9. Communicate with state and local government, and citizens,
regarding information needs
10. Provide a search function on all agency websites
The Range and Type of Services Approach
Many evaluation strategies focus on the range and quality of government services provided over the Internet. The Brown University (West) surveys cited above ranked over 1,500 state and federal websites based on content, including contact information, comment forms, automatic email notification, publications, databases, audio and video clips, foreign language features, disability services, advertisements, user fees, and security and privacy statements. The surveys found 77% of government web portals offered fully executable services online in 2006, up from 73% in 2005. Some 71% had posted privacy policies, up from 69% in 2005 (Thormeyer, 2006q). A second services approach is exemplified by Scott (2005), who evaluated municipal websites using measures on five different dimensions of service quality. Confirming previous studies, Scott found that city size was a key predictor of quality, but there was wide variation in Internet services.
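Range-and-type-of-content scoring of the kind used in these surveys reduces, at its simplest, to a checklist count: how many of a fixed list of features does a site offer? The sketch below is hypothetical; the checklist is a small invented subset of the feature categories the surveys score, not the surveys' actual instrument.

```python
# Hedged sketch of a range-of-functionality website rating: the score is the
# share of a fixed feature checklist the site offers. The checklist is a
# hypothetical subset of the features scored in West/NPRC-style surveys.

CHECKLIST = [
    "contact_information", "comment_forms", "email_notification",
    "publications", "databases", "foreign_language", "disability_services",
    "privacy_statement", "online_payments", "job_listings",
]

def rate_site(features_present):
    """Percentage of checklist features the site offers."""
    matched = set(features_present) & set(CHECKLIST)
    return 100 * len(matched) / len(CHECKLIST)

city_portal = {"contact_information", "publications", "privacy_statement",
               "online_payments", "job_listings"}
print(f"{rate_site(city_portal):.0f}% of measured features present")
```

The simplicity is both the appeal and the limitation of this approach: it scales to thousands of sites but, as the usability and citizen-centric approaches below argue, counting features says nothing about whether citizens can actually use them.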
Similarly, the National Policy Research Council (NPRC) undertook to rank all known official U.S. state and local government websites in 2006 (Government Technology, 2006aa) using a combined range-of-functionality and type-of-content approach, giving higher ratings to websites that had more of some 25 measured features in these categories:
• website navigation aids
• home page content and links
• handicapped accessibility options
• website outreach via foreign language translations, e-surveys, and online comment forms
• legislative, mapping, and socio-economic statistical information
• online payment options for taxes and services
• online business and/or construction permit options
• online procurement information and transactions
• government job listings and online job applications
The Usability Approach
A manual containing design and usability guidelines was published in 2003 by the Department of Health and Human Services in partnership with the General Services Administration. This manual, Research-Based Web Design & Usability Guidelines (HHSD, 2006), has since become perhaps the single most widely used resource of its type and reflects the input of a wide variety of public, private, and academic sector experts. The manual contains a large number of guidelines ranked by importance. As of the 2006 edition, those in the top category of importance (category 5) were:
• Provide useful content
• Establish user requirements
• Understand and meet users' expectations
• Involve users in establishing user requirements
• Do not display unsolicited windows or graphics
• Comply with Section 508
• Design forms for users using assistive technology
• Do not use color alone to convey information
• Enable access to the homepage
• Show all major options on the homepage
• Create a positive first impression of your site
• Place important items consistently
• Avoid cluttered displays
• Place important items at top center
• Eliminate horizontal scrolling
• Use clear category labels
• Use meaningful link labels
• Distinguish required and optional data entry fields
• Label push buttons clearly
• Make action sequences clear
• Organize information clearly
• Facilitate scanning
• Ensure that necessary information is displayed
• Ensure usable search results
• Design search engines to search the entire site
"Relative importance" was determined by a panel of external reviewers (16 experts for the original manual, 36 more for the revised manual), half web designers and half usability specialists. In addition, the following criteria were judged to be in the top category by strength of evidence in reported web research (which depends as much on what researchers choose to study as on importance, hence the differential between expert opinion and empirical research):
• Provide useful content
• Standardize task sequences
• Design for working memory limitations
• Align items on a page
• Use descriptive headings liberally
• Use black text on plain, high contrast backgrounds
• Use attention attracting features when appropriate
• Use familiar fonts
• Emphasize importance
• Order elements to maximize user performance
• Use data entry fields to speed performance
  • 45. • Use simple background images • Use video, animation, and audio meaningfully • Use images to facilitate learning • Use mixed case with prose • Group related elements • Use color for grouping • Use an iterative design approach While the HHSD guidelines are meant to be design criteria, they may be converted for use as criteria when evaluating government websites. As a group, this set of criteria focus on usability, downplaying or omitting considerations such as interactivity or e-democracy support or even range and type of services. “Citizen-centric” evaluation strategies Wang, Bretschneider, and Gant (2005) have proposed an evaluation approach which emphasizes why some web designs are better than others in facilitating citizens' information seeking tasks. They contrast this “citizen-centric” approach with most existing evaluation methodologies, which either focus on
usage and accessibility or on user reactions. Counting the number of page hits, for instance, shows usage in quantitative terms but does not show whether citizens are finding what they were seeking. Rejecting private sector evaluation methods, which assess websites in terms of their contribution to the competitive advantage of the firm, the Wang-Bretschneider-Gant approach instead focuses on (1) "task-related information" (the relevancy of the website to the tasks citizens seek to accomplish), (2) "task complexity" (the skills needed to utilize the information presented), and (3) the effectiveness of "perceptual stimuli" such as text, audio, video, and charts used to convey information.

A simplified version of the "citizen-centric" evaluation strategy is simply counting the interactive forms available to the citizen, since these are almost by definition oriented to citizen tasks. The University of Minnesota Extension Service (2005), for instance, notes: "Perhaps the quickest measurement of e-government service to citizens is the number of forms on the website. They can be simple contact or feedback forms, requests for services, subscriptions to email lists, search boxes, or easy to use e-commerce functions for paying taxes, obtaining licenses, or buying a county souvenir. The more forms and the easier they are to use, the more audience focused the government website. No forms means no interaction—failed website."

New Web Based Tool — Project Schedule (Microsoft Project extract; Project: Web Based Tool, dated Sun 7/23/17)

ID | Task Name | Duration | Start | Finish
-- | --------- | -------- | ----- | ------
1 | PROJECT DESIGN | 63 days | Mon 9/4/17 | Wed 11/29/17
2 | DEVELOP FUNCTIONAL SPECIFICATIONS | 10 days | Mon 9/4/17 | Fri 9/15/17
3 | DEVELOP SYSTEM ARCHITECTURE | 20 days | Mon 9/18/17 | Fri 10/13/17
4 | DEVELOP PRELIMINARY DESIGN SPECIFICATIONS | 2.5 days | Mon 10/16/17 | Wed 10/18/17
5 | DEVELOP DETAILED DESIGN SPECIFICATIONS | 20 days | Wed 10/18/17 | Wed 11/15/17
6 | DEVELOP ACCEPTANCE TEST PLAN | 10.5 days | Wed 11/15/17 | Wed 11/29/17
7 | PROJECT DEVELOPMENT | 157 days | Thu 11/30/17 | Fri 7/6/18
8 | DEVELOP COMPONENTS | 60 days | Thu 11/30/17 | Wed 2/21/18
9 | PROCURE SOFTWARE | 25 days | Thu 2/22/18 | Wed 3/28/18
10 | PROCURE HARDWARE | 25 days | Thu 3/29/18 | Wed 5/2/18
11 | DEVELOPMENT ACCEPTANCE TEST PACKAGE | 25 days | Thu 5/3/18 | Wed 6/6/18
12 | PERFORM UNIT/INTEGRATION TEST | 22 days | Thu 6/7/18 | Fri 7/6/18
13 | PROJECT DELIVERY | 9.5 days | Mon 7/9/18 | Fri 7/20/18
14 | INSTALL SYSTEM | 2 days | Mon 7/9/18 | Tue 7/10/18
15 | TRAIN CUSTOMERS | 1 day | Wed 7/11/18 | Wed 7/11/18
16 | PERFORM ACCEPTANCE TEST | 3 days | Thu 7/12/18 | Mon 7/16/18
17 | PERFORM POST PROJECT REVIEW | 3 days | Tue 7/17/18 | Thu 7/19/18
18 | PROVIDE WARRANTY SUPPORT | 1 day | Tue 7/17/18 | Tue 7/17/18
19 | ARCHIVE MATERIALS | 0.5 days | Fri 7/20/18 | Fri 7/20/18
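The "count the forms" heuristic quoted above from the University of Minnesota Extension Service can be sketched as a small script. This is only an illustration: the HTML below is a made-up stand-in for a fetched agency homepage, and counting form tags is a rough proxy for interactivity, not a full usability evaluation.

```python
from html.parser import HTMLParser

class FormCounter(HTMLParser):
    """Counts interactive elements as a rough 'citizen-centric' score."""
    def __init__(self):
        super().__init__()
        self.forms = 0
        self.inputs = 0

    def handle_starttag(self, tag, attrs):
        if tag == "form":
            self.forms += 1
        elif tag in ("input", "select", "textarea"):
            self.inputs += 1

# Hypothetical snippet standing in for a real government homepage.
page = """
<html><body>
  <form action="/feedback"><textarea name="msg"></textarea></form>
  <form action="/search"><input type="search" name="q"></form>
  <p>Office hours: 9-5</p>
</body></html>
"""

counter = FormCounter()
counter.feed(page)
print(counter.forms, counter.inputs)  # → 2 2
```

By the Extension Service's rule of thumb, a page scoring zero on both counts offers no interaction at all.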
New Web Based Tool — Budget Estimate

PROJECT TASKS | LABOR HOURS | LABOR COST ($) | MATERIAL COST ($) | OTHER COST ($) | TOTAL PER TASK
------------- | ----------- | -------------- | ----------------- | -------------- | --------------
PROJECT DESIGN | | | | |
Develop Functional Specifications | 80.0 | $3,100.00 | $2,499.00 | $750.00 | $6,429.00
Develop System Architecture | 160.0 | $1,200.00 | $1,500.00 | $0.00 | $2,860.00
Develop Preliminary Design Specification | 20.0 | $300.00 | $1,295.00 | $0.00 | $1,615.00
Develop Detailed Design Specifications | 160.0 | $1,100.00 | $1,700.00 | $0.00 | $2,960.00
Develop Acceptance Test Plan | 85.0 | $1,500.00 | $2,500.00 | $0.00 | $4,085.00
Subtotal | 505.0 | $7,200.00 | $9,494.00 | $750.00 | $17,949.00
PROJECT DEVELOPMENT | | | | |
Develop Components | 480.0 | $1,700.00 | $6,250.00 | $0.00 | $8,430.00
Procure Software (Capital Equipment) | 200.0 | $400.00 | $15,500.00 | $0.00 | $16,100.00
Procure Hardware (Capital Equipment) | 200.0 | $600.00 | $27,500.00 | $0.00 | $28,300.00
Development Acceptance Test Package | 200.0 | $1,200.00 | $1,000.00 | $0.00 | $2,400.00
Perform Unit/Integration Test | 180.0 | $650.00 | $1,000.00 | $0.00 | $1,830.00
Subtotal | 1,260.0 | $4,550.00 | $51,250.00 | $0.00 | $57,060.00
PROJECT DELIVERY | | | | |
Install System | 16.0 | $320.00 | $1,750.00 | $500.00 | $2,586.00
Train Customers | 8.0 | $160.00 | $200.00 | $612.00 | $980.00
Perform Acceptance Test | 24.0 | $480.00 | $500.00 | $0.00 | $1,004.00
Perform Post Project Review | 24.0 | $480.00 | $0.00 | $0.00 | $504.00
Provide Warranty Support | 8.0 | $160.00 | $0.00 | $0.00 | $168.00
Archive Materials | 4.0 | $80.00 | $150.00 | $0.00 | $234.00
Subtotal | 84.0 | $1,680.00 | $2,600.00 | $1,112.00 | $5,476.00
PROJECT MANAGEMENT | | | | |
Customer Progress Meetings/Reports | 480.0 | $1,000.00 | $0.00 | $0.00 | $1,480.00
Internal Status Meetings/Reports | 480.0 | $1,000.00 | $0.00 | $0.00 | $1,480.00
Subcontractors | 480.0 | $1,000.00 | $0.00 | $2,500.00 | $3,980.00
Interface to Other Internal Departments | 480.0 | $1,000.00 | $0.00 | $0.00 | $1,480.00
Configuration Management | 480.0 | $1,000.00 | $0.00 | $0.00 | $1,480.00
Quality Assurance | 480.0 | $1,000.00 | $0.00 | $0.00 | $1,480.00
Overall Project Management | 480.0 | $1,000.00 | $0.00 | $0.00 | $1,480.00
Subtotal | 3,360.0 | $7,000.00 | $0.00 | $2,500.00 | $12,860.00
OTHER COST | | | | |
Travel | 0.0 | $0.00 | $0.00 | $4,396.00 | $4,396.00
Telecommunications | 0.0 | $0.00 | $0.00 | $1,759.00 | $1,759.00
Documentation | 0.0 | $0.00 | $0.00 | $500.00 | $500.00
Subtotal | 0.0 | $0.00 | $0.00 | $6,655.00 | $6,655.00
Total Project Funding Requirements | 5,209.0 | $20,430.00 | $63,344.00 | $11,017.00 | $100,000.00

Development & Testing Report — New Web Based Tool (testing results)

Test Item | Test Condition | Expected Result / Procedure | Pass/Fail | Defect ID# | Browser / Device | Bug / Issue Description | Owner | Comments
--------- | -------------- | --------------------------- | --------- | ---------- | ---------------- | ----------------------- | ----- | --------
Search Form 1.a | Stable 5.00 | Complete | Pass | 325670 | Windows 8 (IE10) | None | Jasmine McCord |
Navigation 1.a | Critical 5.00 | Escalated to testing for fix | Fail | 325672 | Windows 8 (IE10) | Cannot move mouse to menu area without it disappearing (see screenshot) | Jasmine McCord |
Navigation 1.b | Critical 5.00 | Escalated to testing for fix | Fail | 325673 | Windows 8 (IE10) | Cannot scroll on pages 4, 6, 8 | Jasmine McCord |
Search Form 1.b | Stable 5.00 | Complete | Pass | 325671 | Windows 8 (IE10) | None | Jasmine McCord |
Image Carousel 1-3.a | Critical 5.00 | Escalated to testing for fix | Fail | 325674 | Windows 8 (IE10) | The left and right arrows in the box do not scroll | Jasmine McCord |
Customer Support Field 1.a | Critical 5.00 | Escalated to testing for fix | Fail | 325675 | Windows 8 (IE10) | Customer cannot submit form unless all required fields are filled out | Jasmine McCord | Need an extra 10 lead time for resolution
Data Entry Field (feedback) | Critical 5.00 | Escalated to testing for fix | Fail | 325676 | Windows 8 (IE10) | Some fields get stuck when data is entered; customer cannot get past the field without restarting the browser | Jasmine McCord | Need to resolve ASAP; priority level
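As a quick sanity check on the budget estimate above, the five category subtotals in the total-per-task column do sum to the stated $100,000 total project funding requirement. A minimal sketch, using only the subtotal figures taken directly from the budget table:

```python
# Category subtotals from the New Web Based Tool budget estimate
# (total-per-task column, as listed in the budget table).
subtotals = {
    "Project Design": 17_949,
    "Project Development": 57_060,
    "Project Delivery": 5_476,
    "Project Management": 12_860,
    "Other Cost": 6_655,
}

total = sum(subtotals.values())
print(f"${total:,}")  # → $100,000
```

The same pattern extends naturally to verifying each category's subtotal against its individual line items.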
Preliminary Project Charter Worksheet
CPMGT/305
Version 12
University of Phoenix Material

Complete this project charter worksheet according to the instructions in section 4.1 of A Guide to the Project Management Body of Knowledge.

Completed by: Learning Team A    Date: 7/22/2017

1. Project title: New Web Based Tool

2. High-level project scope (fewer than 50 words)
A new performance management process will be created that utilizes a web-based system, providing a convenient way to document and track employee performance goals, deliver regularly updated status reports, and give managers a vehicle for feedback. The system will use quantified data for salary/bonus treatment at year's end.

3. Problem to be solved or opportunity to be realized by this project (fewer than 25 words)
The new performance management system will provide an efficient way to quantify employee performance, providing a fairer, more accurate system for performance appraisals.

4. Project purpose or justification including specific measurable business impacts or results (fewer than 50 words)
The new performance management system is needed because of the lack of accuracy in past employee performance reviews and
merit increases. The new system will provide both qualitative and quantitative metric data, which will result in a more precise analysis of employee work performance.

5. Measurable project objectives and related success criteria including metrics (provide three to four objectives with metrics)
· Determine which performance measuring criteria should be implemented in the new system, and describe how the system utilizes them.
· Develop the performance management system platform.
· Integrate the system into the company website.
· Achieve an increase in employee performance of at least 15% within 12 months of implementation.

6. High-level requirements (fewer than 100 words)
The performance management system project will require a hand-selected project team with extensive backgrounds in human resources and supervisory management. Two human resource clerks will participate, in addition to a single project manager. The team will also require the services of a software development consultant and her team of two software engineers. The project will be staged in a small suite of offices on the third floor of the downtown Seattle office. The project will require the purchase of a new web server and a dedicated internet connection during preliminary stages prior to implementation in a production environment.

7. High-level risks (fewer than 100 words)
Some high-level risks include: developmental delays, scope creep, excessive change requests, technical difficulties, and a lack of employee enthusiasm for the new performance system.

8. Summary of high-level milestones schedule (identify the major deliverables and subtasks)
  • 74. 1. Develop Employee Performance Grading System · Analyze industry standards, averages · Create methodology for quantifying performance goals/results · Develop template for employee performance status reports · Create grading matrix for salary/bonus criteria 2. Develop software platform for Performance Grading System · Develop outline for software development · Create software for Performance Grading System · Pilot test group prior to production rollout · Work out bugs, smooth out feature set · Implementation 3. Company Intranet Integration · Implement Performance Management software into Web space · Develop supervisor feedback system that integrates with PMS · Test and Troubleshoot · Pilot software platform/web site · Full implementation 9. Summary of high-level budget including expense dollars,
capital dollars, and headcount (identify costs for major deliverables and tasks identified in the preceding milestone schedule)

Deliverables | Estimated Cost | Due Date
------------ | -------------- | --------
DEVELOP FUNCTIONAL SPECIFICATIONS | Labor costs = $7,200 | September 9th, 2017
DEVELOP SYSTEM ARCHITECTURE | Labor costs = $7,000 | October 15th, 2017
DEVELOP PRELIMINARY DESIGN SPECIFICATIONS | Labor costs = $4,500 | October 18th, 2017
DEVELOP DETAILED DESIGN SPECIFICATIONS | Labor costs = $1,100 | November 15th, 2017
DEVELOP COMPONENTS | Labor costs = $600 | August 22nd, 2017
PROCURE SOFTWARE | Labor costs = $7,200 | March 28, 2018
DEVELOPMENT ACCEPTANCE TEST PACKAGE | Labor costs = $1,680 | June 6th, 2018
PERFORM UNIT/INTEGRATION TEST | Labor costs = $7,000 | July 6th, 2018
INSTALL SYSTEM | Labor costs = $3,200 | July 10th, 2018
Project Plan Overview & Preliminary Project Plan

Team A: Project Selection Criteria

One of the biggest decisions an organization makes is which projects to take on. Once a scope of work, bid, or proposal has been presented as project material, there are many factors to consider. The most valuable options should be chosen, keeping in mind the goals and objectives of the organization's portfolio. The benefits and objectives of a project are the main reasons for choosing it. Another consideration is feasibility: what is the likelihood of this project getting off the ground and reaching completion? Timing must also be well thought out for each project; some projects are conducted in an emergency state, whereas others are planned far in advance. Budget is one of the biggest reasons a project fails or succeeds, so a chosen project must maintain its allocated funding from stakeholders. The first step in project selection is defining the project and its selection criteria. The second step is to score the project against the other candidate projects. The last step is to inform stakeholders for approvals.

For this particular project, the criterion behind the selection was that the sales force needed a web-based tool that could help in managing client accounts, tracking sales, and maintaining customer proposals. It was chosen on the basis of the value of customer satisfaction versus budget constraints. The benefit-to-cost ratio was analyzed and found to be a profitable advantage for the sales team and its customers. The payback period was also examined; the finding was that within a five-year period the payback would exceed the cost of the project. As the sales team begins to manage its customers from a more organized position, it can take on more customers while maintaining satisfaction, personable service, and production. Stakeholders chose this project for growth of the client base.

Another criterion for selection was time management. With the purchase and implementation of this web-based tool, the sales force will have the means to serve client accounts in a timely fashion. The web-based tool can notify sales staff and clients when their products are low, and it can also perform database functions: the sales team inputs what products the client has, the client uses the tool for production accountability, and when stock gets low, the tool sends a notification to that client's salesperson for further ordering direction. This cost value and customer satisfaction were the criteria behind the stakeholders' decision.

Overview

The nature and scope of this project is to develop a web-based tool for a client organization. Team A works for an information technology organization. A client has asked our organization to help build and implement a new web-based internal management system for its sales department to manage customer accounts. Our client's sales managers and sales executives will use this tool to develop account plans for their customers, to track their own sales results, and to extract a sales report for each sales representative each quarter. The focus of this project is to deliver a program that sales representatives can manage easily from anywhere, with the ability to pull statistical information about their clients, themselves, and their sales criteria. The purpose of this internal management system implementation is to help our client's sales department get more organized, input client information, and build programs specific to the needs of its clients.
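The benefit-cost and payback reasoning in the selection criteria above can be made concrete with a small calculation. Only the $100,000 project cost comes from the plan itself; the yearly benefit figures below are hypothetical, illustrating how a payback period and a simple (undiscounted) benefit-cost ratio would be computed over a five-year horizon.

```python
# Hypothetical cash-flow sketch for the web-based tool decision.
# The project cost matches the budget estimate; the annual benefit
# figures are illustrative assumptions, not numbers from the plan.
cost = 100_000
annual_benefits = [15_000, 25_000, 30_000, 35_000, 40_000]  # years 1-5

# Payback period: first year in which cumulative benefits cover the cost.
cumulative = 0
payback_year = None
for year, benefit in enumerate(annual_benefits, start=1):
    cumulative += benefit
    if payback_year is None and cumulative >= cost:
        payback_year = year

# Simple benefit-cost ratio over the five-year horizon (no discounting).
bcr = sum(annual_benefits) / cost

print(payback_year, round(bcr, 2))  # → 4 1.45
```

With these assumed figures the tool pays for itself in year four and returns $1.45 per dollar spent over five years, consistent with the plan's claim that payback exceeds cost within five years; a fuller analysis would discount the cash flows.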
This implementation will also allow the sales representatives to manage their client base much more easily, track their clients' progress, and keep up-to-date statistical information about their clients in their systems. This system is being customized and developed specifically to fit the needs of our client and the requirements to meet the client's needs internally. The long-term goal is for our client to manage its clients much better and address their needs in a more specific fashion. We will be developing the project based on the following criteria:
· A description of the internal client's requirements for this sales and account management system.
· Identification and engagement of the appropriate stakeholders to define the requirements for the new system.
· Development and testing of the system.
· Implementation of the system.
· A method for obtaining client feedback.

Task | Project Manager | Due Date | Owner
---- | --------------- | -------- | -----
A description of the internal client's requirements for this sales and account management system | TBD | August 14, 2017 | TBD
Identification and engagement of the appropriate stakeholders to define the requirements for the new system | TBD | August 14, 2017 | TBD
Development and testing of the system | TBD | August 14, 2017 | TBD
A method for obtaining client feedback | TBD | August 14, 2017 | TBD
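The low-stock notification behavior described in the selection criteria could work along these lines. Everything here is a hypothetical sketch: the reorder threshold, product names, and inventory figures are made up for illustration, since the plan specifies only that the tool notifies a client's salesperson when stock gets low.

```python
# Sketch of the low-stock alert from the selection criteria: when a
# client's stock falls below a reorder threshold, the tool flags the
# products so that client's salesperson can seek ordering direction.

REORDER_THRESHOLD = 10  # assumed units; not specified in the plan

inventory = {  # client product -> units on hand (illustrative data)
    "toner cartridges": 4,
    "copy paper cases": 25,
    "shipping labels": 7,
}

def low_stock_alerts(stock, threshold=REORDER_THRESHOLD):
    """Return the products a salesperson should be notified about."""
    return sorted(p for p, units in stock.items() if units < threshold)

print(low_stock_alerts(inventory))  # → ['shipping labels', 'toner cartridges']
```

In the real system this check would run against the client's production-accountability data and push a notification to the assigned sales representative rather than print a list.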
Preliminary Project Schedule — see the project schedule presented earlier.

Preliminary Budgetary Plan — see the budget estimate presented earlier.

Reference
Project Management Institute. (2013). A guide to the project management body of knowledge (PMBOK guide) (5th ed.).