A new system for the United Kingdom Passport Agency
This was to replace an aging system that was not millennium compliant, could not handle new security requirements, and could not support the new children's passports
A more accountable examination process was also planned
The outgoing system was implemented in 1989 and resulted in major backlogs in passport issuing, souring staff relations to the point of industrial action
In 1998 the PASS prototype was implemented at the two largest passport offices (out of the then six in the UK) – Liverpool (Oct 1998) and Newport (Nov 1998)
This was in line with the original schedule, and the plan was to roll out to all offices before the 1999 busy season
They started with the larger offices as they did not want to make the switch at these offices during the ‘busy season’
This was the first PFI (Private Finance Initiative) – Siemens Business Services were selected to develop the new computer system and undertake the initial processing of passports, and Security Printing & Systems Ltd was selected for printing and dispatch
Planned output at Liverpool by 16th November 1998 was 30,000 passports; in reality it was 8,000 –
but the Newport rollout went ahead on 16th November anyway. The Agency had reservations, but felt it was too hard to stay with the existing system: the building had been reconfigured, staff had been re-trained, and they would have had to pay Security Printing and Systems Ltd £2 million in compensation for loss of business.
On 18th November the rollout was suspended for all other offices
The automated scanning of application forms (at Siemens) had a high error rate, and manual correction of errors was slower than envisaged
Onsite printing (for high priority cases) did not work as well as expected
Demand was also higher than expected, particularly for children’s passports
There was a significant increase in enquiries from applicants as processing times lengthened – and no plan was in place for increasing capacity to deal with this
These risks had all been identified during planning but there were no responses in place
The strategy adopted for the 1999 summer season was (as usual for the season) to increase overtime and hire casual staff. By March 1999 this was clearly inadequate and the Agency submitted a new plan - to hire even more staff than usual
A further 400 staff (beyond the normal summer average of 1800) were recruited, and many Agency staff were working 7 days a week
In May, free two-year extensions were offered to people in the queues
In late June, free two-year extensions were made available from Post Offices
In late June a new call centre opened to deal with queries
It was not until the end of the busy season that the problem eased
The rollout to the other four offices was delayed, and their existing systems were made Y2K compliant
The cost of additional measures was £12.6 million, including £6 million for additional staffing and £200,000 for making the old system Y2K compliant
The unit cost of passports rose from £12 to £15 (passed on to the consumer).
By late 1999, £161,000 had been paid in compensation for the 500 missed travel dates.
£2.5 million in compensation from Siemens was agreed (but as far as I can tell they ended up only paying a small fraction of this)
The target for 1999 of meeting 99.9% of travel dates was actually achieved, but the Home Office said this target did not represent a meaningful standard of quality to the public. The targets for cost reduction and efficiency were also met over the longer term – so was this a 'failure'?
The examination process for applications was successfully made more accountable
The Agency lost its Charter Mark for public service
Testing was ‘witnessed’ by representatives from the agency
The first stage (factory testing) took place at Siemens test centre
Before Go-Live, the Central Computer and Telecommunications Agency (CCTA) advised that the system had been well-designed. In its view, the Liverpool office should be a controlled pilot, because reliability had been a problem during testing
An internal audit said “Given the tight timescales under which the project has been developed it has been well managed with sound controls in place to ensure the quality of the end products. However there has been a compression of the testing timescales… (we) understand the compelling reasons for wanting to meet the 5 October 1998 start date…”
The test program did not cover productivity testing, although testing at the Siemens test centre had highlighted that the examination stage of passport processing was taking longer under the new system.
Terminals had been set aside at Liverpool for productivity testing onsite but were never used (as a result of time constraints)
Training related to the use of the computer and not to the wider clerical processes needed to support the new system
In order to test the printing system, they needed high volumes of output – and so the staged roll out of the system was seen as a staged test of the printing capacity
Public bodies offering a demand-led public service should be aware of capacity constraints, and have contingency plans in place to cope with any likely surge in demand, taking full account of reasonable public expectations of service standards, the likely cost and the level of risk.
Public bodies providing demand-led services should ensure that their forecasting techniques, though necessarily imprecise, are nonetheless sufficiently robust to enable them to manage their business efficiently, for example to enable them to plan their capacity needs.
The business case drawn up to justify any new computer system should test the likely financial cost of different options on a sufficiently wide range of business volumes, and allow an informed judgment, taking account of the impact of any likely changes in policy.
Public bodies should undertake a formal risk analysis before introducing new computer systems and have realistic plans to maintain services to the public if things go wrong.
Project managers should plan for adequate testing of the new system before committing to live operation, in particular for staff to learn and work the system.
Pilot tests of any new system which is critical to business performance should be on a limited scale so that any shortcomings do not have a major impact on service delivery. Where pilots need to be on a large scale to test operations at high volumes, the risks should be identified and addressed in contingency plans.
Organisations should pay special attention to the interaction between the new system and those expected to use it, and take into account users’ views on the practicability and usability of the new system.
Agencies should make a realistic assessment of whether they have the capacity to deal with potential problems and be prepared to seek early assistance from their parent departments and elsewhere if necessary.
When service delivery is threatened, public bodies should have the capability to keep the public well informed, so as to avoid unnecessary anxiety and relieve pressure on services.
Public bodies should have adequate systems for recording performance, and ensure that they are in a position to claim any compensation due from contractors for failure to meet agreed performance standards, subject to appropriate risk-sharing within the partnership.