
When Controls Don’t Work: 4 Unhappy Information Security Scenarios


This is a set of slides from an information assurance course I helped create in 2003. Content remains accurate but may appear dated. Lessons learned still apply. Originally delivered at Norwich University, Vermont, Master of Science in Information Assurance (MSIA) program.



  1. When Production Controls Don’t Work: 4 Unhappy Scenarios
       • Stephen Cobb, CISSP
       • Author, Privacy for Business: Web Sites & Email
       • Contributing author, Computer Security Handbook, 5th Edition
       • MSIA, Norwich University
  2. 4 Unhappy Scenarios
       • Rogue operations: Ziff Davis Media
       • Maintenance problems: Western Union
       • Failure to apply controls: Eli Lilly
       • Compound problems: London Ambulance Service
  3. Rogue Operations: Ziff Davis Media
       • The company used outside services for secure website operations.
       • Marketing staff decided to run a promotion via the website.
       • They bypassed the process in place to ensure that changes to the website did not affect security.
       • The web server accumulated names, addresses, and some credit card numbers in an unprotected text file.
       • The result was lawsuits and fines.
  4. Maintenance Problems: Western Union
       • “The Western Union Money Transfer Service at lets you send cash to almost anywhere in the world. Send money to a loved one in the U.S. and over 150 other countries” (quote from the website)
       • I sent money to my daughter, using my credit card.
       • I got a call some months later advising me to cancel the card because it was “possibly compromised.”
       • A hacker probing the site during a maintenance cycle, which had altered permissions on the site (presumably for convenience), found the credit card file.
       • Impact: no fines, but the stock price took a hit, as did customer confidence (guess who has not used them since).
       • Note that the website now says NOTHING about security.
  5. Failure to Apply Controls: Eli Lilly
       • The Eli Lilly Prozac email snafu.
       • A programmer hand-coded an application to send the same message to 700 people.
       • One character in the code was in the wrong place.
       • Result: the email addresses of all recipients appeared in “To” instead of remaining hidden in “BCC.”
       • Lilly had many mainframe-oriented production controls in place, but the website did not follow them.
       • Testing was inadequate, and there was no peer or management review of the code.
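The failure mode described above can be illustrated with a short, hypothetical Python sketch (the addresses, sender, and server are invented, and this is not Lilly’s actual code): the safe pattern is to keep the recipient list off the visible headers entirely and supply it only as SMTP envelope recipients, so no single slip can expose every address to every recipient.

```python
from email.message import EmailMessage

def build_bulk_message(recipients):
    """Build one message for many recipients without exposing their addresses.

    The bug described in the slide amounted to the full list landing in the
    visible "To" header. Here the list never touches a header at all.
    """
    msg = EmailMessage()
    msg["From"] = "noreply@example.com"        # hypothetical sender
    msg["To"] = "undisclosed-recipients:;"     # group syntax: reveals nothing
    msg["Subject"] = "Reminder"
    msg.set_content("Your reminder message.")
    return msg

recipients = ["alice@example.com", "bob@example.com"]  # hypothetical list
msg = build_bulk_message(recipients)

# Sending (not run here): recipients go only on the SMTP envelope, e.g.
#   import smtplib
#   with smtplib.SMTP("localhost") as smtp:
#       smtp.send_message(msg, to_addrs=recipients)
```

Peer review and a test that inspects the generated headers, the controls the slide says were missing, would have caught the misplaced character before the message went out.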
  6. Compound Problems: LAS CAD Disaster
       • The London Ambulance Service (LAS) covers 600 square miles.
       • It is the only ambulance service for 6.8 million people.
       • In 1992, there were 2,700 staff and 750 ambulances.
       • Dispatch received 2,000-2,500 calls every day.
       • A new Computer Aided Dispatch (CAD) system was introduced for taking emergency calls, accepting incident details, determining which ambulance to send, and communicating details of the incident to the ambulance dispatched.
       • On November 4th, the system crashed; rebooting did not help, and calls in the computer were lost.
       • Some ambulances arrived too late to help the victims.
  7. LAS CAD Disaster: The Reasons
       • The company that won the bid was $1 million cheaper than the next-lowest bidder.
       • The reason for the large margin was not questioned.
       • The system elements were tested, but no test was done on the fully integrated system.
       • The company did not adequately stress-test the system under simulated conditions.
       • Problems found by testing system elements were not fixed.
       • The backup file servers were not tested, and were not fully configured, by the time the system went live.
       • Potentially dangerous problems were overlooked.
  8. LAS CAD Disaster: The Reasons (continued)
       • The system was fully implemented before full confidence in its reliability, accuracy, and speed had been achieved.
       • The system crash was caused by a memory leak: a programming error resulting from carelessness and a lack of quality assurance of program code changes.
       • “What is clear from the Inquiry Team's investigations is that neither the CAD system itself, nor its users, were ready for full implementation…the CAD software was not complete, not properly tuned, and not fully tested…
       • “The resilience of the hardware under a full load had not been tested. The fall-back option to the second file server had certainly not been tested…
       • “There were outstanding problems with data transmission to and from the mobile data terminals.…Staff, both within Central Ambulance Control (CAC) and ambulance crews, had no confidence in the system and was not all fully trained and there was no paper backup…
  9. LAS CAD Disaster: The Reasons (continued)
       • “There had been no attempt to foresee fully the effect of inaccurate or incomplete data available to the system (late status of reporting/vehicle locations etc.)…
       • “These imperfections led to an increase in the number of exception messages that would have to be dealt with and which in turn would lead to more call-backs and enquiries. In particular the decision on that day to use only the computer generated resource allocations (which were proven to be less than 100% reliable) was a high-risk move.”
       • Software for the system was written in Visual Basic and ran on a Windows operating system (“a fundamental flaw in the design”).
       • “The result was an interface that was so slow in operation that users attempted to speed up the system by opening every application they would need at the start of their shift, and then using the Windows multi-tasking environment to move between them as required. This highly memory-intensive method of working would have had the effect of reducing system performance still further.”
  10. Thank You!
       • Note: this is a set of slides from an information assurance course I helped create in 2003.
       • Part of the award-winning Master of Science in Information Assurance (MSIA) program at Norwich University, Vermont.
       • Content remains accurate (but may appear dated).
       • Lessons learned still apply.
       • Stephen Cobb, CISSP