
Testing Sociotechnical Systems: Passport Issuing


Published in: Technology, Business

  1. Testing Socio-Technical Systems. Part 2: The 1999 “PASS” Passport Issuing System. John Rooksby, University of St Andrews
  2. The introduction of the PASS Passport System
  3. The PASS Passport System <ul><li>A new system for the United Kingdom Passport Agency </li></ul><ul><li>This was to replace an ageing system that was not millennium compliant, could not handle new security requirements, and could not support the new children’s passport </li></ul><ul><li>A more accountable examination process was also planned </li></ul><ul><li>The outgoing system was implemented in 1989 and resulted in major backlogs in passport issuing, souring staff relations to the point of industrial action </li></ul><ul><li>In 1998 the PASS prototype was implemented at the two largest passport offices (out of the then 6 in the UK) – Liverpool (Oct 1998) and Newport (Nov 1998) </li></ul><ul><li>This was in line with the original schedule, and the plan was to roll out to all offices before the 1999 busy season </li></ul><ul><li>They started with the larger offices as they did not want to make the switch at these offices during the ‘busy season’ </li></ul><ul><li>This was the Agency’s first Private Finance Initiative (PFI) contract – Siemens Business Services were selected to develop the new computer system and undertake the initial processing of passports; Security Printing & Systems Ltd were selected for printing and dispatch </li></ul>
  5. <ul><li>Planned output at Liverpool by 16th Nov 98 was 30,000; in reality it was 8,000 </li></ul><ul><li>But the Newport rollout went ahead on 16th Nov – the Agency had reservations but felt it too hard to stay with the existing system, as the building had been reconfigured, staff had been re-trained, and they would have had to pay Security Printing & Systems Ltd £2 million compensation for loss of business </li></ul><ul><li>On 18th November the rollout was suspended for all other offices </li></ul><ul><li>The automated scanning of application forms (at Siemens) had a high error rate, and manual correction of errors was slower than envisaged </li></ul><ul><li>Onsite printing (for high-priority cases) did not work as well as expected </li></ul><ul><li>Demand was also higher than expected, particularly for children’s passports </li></ul><ul><li>There was a significant increase in enquiries from applicants as processing times lengthened – no plan was in place for increasing capacity to deal with this </li></ul><ul><li>These risks had all been identified during planning, but no responses were in place </li></ul>Failure
  6. <ul><li>By March the delays had started to attract parliamentary and media attention </li></ul><ul><li>Phone lines were jammed and huge queues formed at passport offices </li></ul><ul><li>Callers were given a fax number to write to, but this soon collapsed </li></ul><ul><li>The failure dominated the news, and it is claimed many members of the public panicked and added to the queues </li></ul><ul><li>By May the telephone enquiry service at Liverpool had ‘virtually collapsed’ </li></ul><ul><li>By June there was a backlog of 565,000 applications and an average waiting time of 50 working days </li></ul>The Failure Gets Out of Hand
  7. <ul><li>The strategy adopted for the 1999 summer season was (as usual for the season) to increase overtime and hire casual staff. By March 1999 this was clearly inadequate and the Agency submitted a new plan – to hire even more staff than usual </li></ul><ul><li>A further 400 staff (beyond the normal summer average of 1,800) were recruited, and many Agency staff were working 7 days a week </li></ul><ul><li>In May, free two-year extensions were offered to people in the queues </li></ul><ul><li>In late June free two-year extensions were made available from Post Offices </li></ul><ul><li>In late June a new call centre opened to deal with queries </li></ul><ul><li>It was not until the end of the busy season that the problem eased </li></ul><ul><li>The rollout to the other 4 offices was delayed and their systems were made Y2K compliant </li></ul>Response
  8. <ul><li>The cost of additional measures was £12.6 million, including £6 million for additional staffing and £200,000 for making the old system Y2K compliant </li></ul><ul><li>The unit cost of passports rose from £12 to £15 (passed on to the consumer) </li></ul><ul><li>By late 1999, £161,000 had been paid in compensation for the 500 missed travel dates </li></ul><ul><li>£2.5 million in compensation from Siemens was agreed (but as far as I can tell they ended up paying only a small fraction of this) </li></ul><ul><li>The target for 1999 of meeting 99.9% of travel dates was actually achieved, but the Home Office said this target did not represent a meaningful standard of quality to the public. The targets for cost reduction and efficiency were also met over the longer term (so was this a ‘failure’?) </li></ul><ul><li>The examination process for applications was, as planned, made more accountable </li></ul><ul><li>The Agency lost its Charter Mark for public service </li></ul>Costs and Targets
  9. The Project Plan and Reality
  10. <ul><li>Testing started four months later than planned </li></ul><ul><li>Testing was ‘witnessed’ by representatives from the Agency </li></ul><ul><li>The first stage (factory testing) took place at Siemens’ test centre </li></ul><ul><li>Before go-live the Central Computer and Telecommunications Agency advised that the system had been well designed. In its view the Liverpool office should be a controlled pilot, because reliability had been a problem during testing </li></ul><ul><li>An internal audit said “Given the tight timescales under which the project has been developed it has been well managed with sound controls in place to ensure the quality of the end products. However there has been a compression of the testing timescales… (we) understand the compelling reasons for wanting to meet the 5 October 1998 start date…” </li></ul><ul><li>The test programme did not cover productivity testing, although testing at the Siemens test centre had highlighted that the examination stage of passport processing was taking longer under the new system </li></ul><ul><li>Terminals had been set aside at Liverpool for onsite productivity testing but were never used (as a result of time constraints) </li></ul><ul><li>Training related to the use of the computer and not to the wider clerical processes needed to support the new system </li></ul><ul><li>In order to test the printing system, high volumes of output were needed – and so the staged rollout of the system was seen as a staged test of the printing capacity </li></ul>Testing
  11. <ul><li>Public bodies offering a demand-led public service should be aware of capacity constraints, and have contingency plans in place to cope with any likely surge in demand, taking full account of reasonable public expectations of service standards, the likely cost and the level of risk. </li></ul><ul><li>Public bodies providing demand-led services should ensure that their forecasting techniques, though necessarily imprecise, are nonetheless sufficiently robust to enable them to manage their business efficiently, for example to enable them to plan their capacity needs. </li></ul><ul><li>The business case drawn up to justify any new computer system should test the likely financial cost of different options on a sufficiently wide range of business volumes, and allow an informed judgement, taking account of the impact of any likely changes in policy. </li></ul><ul><li>Public bodies should undertake a formal risk analysis before introducing new computer systems and have realistic plans to maintain services to the public if things go wrong. </li></ul><ul><li>Project managers should plan for adequate testing of the new system before committing to live operation, in particular for staff to learn and work the system. </li></ul>The Passport Delays of 1999 – Ten Lessons
  12. <ul><li>Pilot tests of any new system which is critical to business performance should be on a limited scale so that any shortcomings do not have a major impact on service delivery. Where pilots need to be on a large scale to test operations at high volumes, the risks should be identified and addressed in contingency plans. </li></ul><ul><li>Organisations should pay special attention to the interaction between the new system and those expected to use it, and take into account users’ views on the practicability and usability of the new system. </li></ul><ul><li>Agencies should make a realistic assessment of whether they have the capacity to deal with potential problems and be prepared to seek early assistance from their parent departments and elsewhere if necessary. </li></ul><ul><li>When service delivery is threatened, public bodies should have the capability to keep the public well informed, so as to avoid unnecessary anxiety and relieve pressure on services. </li></ul><ul><li>Public bodies should have adequate systems for recording performance, and ensure that they are in a position to claim any compensation due from contractors for failure to meet agreed performance standards, subject to appropriate risk-sharing within the partnership. </li></ul>The Passport Delays of 1999 – Ten Lessons
  13. The Passport Office, October 2008 <ul><li>The system is working well </li></ul><ul><li>The system has been named as an example of a successful project, and used to justify ID cards </li></ul><ul><li>The Post Office now undertakes some initial processing </li></ul><ul><li>The UK Passport Agency is now the UK Identity and Passport Service </li></ul><ul><li>In 2008 an online system was scrapped after a disastrous go-live </li></ul><ul><li>They are under a lot of pressure with the ID Card system </li></ul><ul><li>The Home Office is trying to learn from mistakes and to be more open about them </li></ul>
  14. <ul><li>Discussion </li></ul><ul><ul><li>What role does testing have when there are fixed deadlines? </li></ul></ul><ul><ul><li>Do the outcomes of these two cases actually tell us anything about testing other than “you should have tested it more” – or are the lessons about design? </li></ul></ul><ul><ul><li>Can decisions about testing / not testing be coupled to a wider understanding of socio-technical systems? </li></ul></ul><ul><ul><li>Should testing be used to answer “what do I need to worry about at go-live?” (given that testing is routinely compromised, and a disastrous go-live does not mean the system is a long-term failure) </li></ul></ul>
  15. <ul><li>Key Documents </li></ul><ul><ul><li>The United Kingdom Passport Agency (1999) The Passport Delays of Summer 1999. Report by the Comptroller and Auditor General. </li></ul></ul><ul><ul><li>National Audit Office (2007) Identity and Passport Service: Introduction of ePassports. Report by the Comptroller and Auditor General. </li></ul></ul>