Information Systems Audit & CISA Prep 2010

  1. Donald E. Hester, CISSP, CISA, CAP, CRISC, MCT, MCITP, MCTS, MCSE Security, Security+, CTT+ | Director, Maze & Associates | University of San Francisco / San Diego City College | www.LearnSecurity.org | www.linkedin.com/in/donaldehester | www.facebook.com/LearnSec | www.twitter.com/sobca | DonaldH@MazeAssociates.com | Rev 7/8/2011 | © 2011 Maze & Associates
  2. Course Overview • The Process of Auditing Information Systems • COBIT and Other Standards • Governance and Management of IT • IS Acquisition, Development and Implementation • IS Operations, Maintenance and Support • Protection of Information Assets • CISA Exam
  3. The Process of Auditing Information Systems
  4. NIST RMF: Assess Security Controls • Assessment Preparation – Develop, review, and approve a plan to assess the security controls • Security Control Assessment – Assess the security controls in accordance with the assessment procedures defined in the security assessment plan • Security Assessment Report – Prepare the security assessment report documenting the issues, findings, and recommendations from the security control assessment • Remediation Actions – Conduct initial remediation actions on security controls based on the findings and recommendations of the security assessment report and reassess remediated control(s), as appropriate
  5. Situation • Organizations are becoming increasingly dependent on technology and the Internet • The loss of technology or the Internet would bring operations to a halt • The need for security increases as our dependence on technology increases • Management wants to have assurance that technology has the attention it deserves
  6. Management Questions • Does our current security posture address what we are trying to protect? • Do we know what we need to protect? • Where can we improve? • Where do we start? • Are we compliant with laws, rules, contracts, and organizational policies? • What are our risks?
  7. Reasons for Audits • Provide assurance • Demonstrate due diligence • Make risk-based decisions • Required for compliance (SOX, PCI, FISMA)
  8. Terms • Certification • Assessment • Audit • Review • ST&E = Security Test & Evaluation • Testing • Evaluation • Interviewing
  9. NIST Assessment • Detailed security review of an information system • Comprehensive assessment of – Management security controls – Operational security controls – Technical security controls • To determine the extent to which the controls are – Implemented correctly – Operating as intended – Producing the desired outcome • Providing the factual basis for an authorizing official to render a security accreditation decision
  10. Assessment • "An information security assessment is the process of determining how effectively an entity being assessed meets specific security objectives. Three types of assessment methods can be used to accomplish this—testing, examination, and interviewing. Assessment results are used to support the determination of security control effectiveness over time." – NIST SP 800-115
  11. Audit • "Independent review and examination of records and activities to assess the adequacy of system controls and ensure compliance with established policies and operational procedures." – CNSS Instruction No. 4009
  12. Security Testing and Evaluation (ST&E) • "Examination and analysis of the safeguards required to protect an information system, as they have been applied in an operational environment, to determine the security posture of that system." – CNSSI No. 4009
  13. Testing, Examination and Interviewing • "Testing is the process of exercising one or more assessment objects under specified conditions to compare actual and expected behaviors. Examination is the process of checking, inspecting, reviewing, observing, studying, or analyzing one or more assessment objects to facilitate understanding, achieve clarification, or obtain evidence. Interviewing is the process of conducting discussions with individuals or groups within an organization to facilitate understanding, achieve clarification, or identify the location of evidence." – Source: NIST SP 800-115
  14. Risk Management Framework • "The security certification and accreditation process is designed to ensure that an information system will operate with the appropriate management review, that there is ongoing monitoring of security controls, and that reaccreditation occurs periodically." – NIST SP 800-100
  15. Scope • Scope of the certification should include the controls of the entire system being certified • Using the SSP, the assessor will start by reviewing the listed controls • May identify additional areas of weakness • "Security certification is a comprehensive assessment of the management, operational, and technical security controls in an information system, made in support of security accreditation, to determine the extent to which the controls are implemented correctly, operating as intended, and producing the desired outcome with respect to meeting the security requirements for the system. The results of a security certification are used to reassess the risks and update the system security plan, thus providing the factual basis for an authorizing official to render a security accreditation decision." – NIST SP 800-100
  16. Multiple Levels of Assessment
  17. Program Level • The Office of Inspector General (OIG) will conduct an audit of the agency's FISMA compliance (program level) • The OIG will select a number of individual systems to audit; these systems will already have an ATO • The OIG will conduct the assessment with internal staff or external contractors • A memorandum from the OIG will be given to the head of the agency and the Office of Management and Budget (OMB) • As of 2010, OIGs will use CyberScope for FISMA reporting
  18. OIG • Inspectors General (IGs) are required to assess agency performance in the following programs: – Certification and Accreditation – Configuration Management – Security Incident Management – Security Training – Remediation/Plans of Action and Milestones – Remote Access – Identity Management – Continuous Monitoring – Contractor Oversight – Contingency Planning
  19. System Level • Security Control Assessor (Certification Agent) – The individual, group, or organization responsible for conducting a security control assessment • Conducts an assessment at the system level • The testing and/or evaluation of the management, operational, and technical security controls in an information system to determine the extent to which the controls are – implemented correctly, – operating as intended, and – producing the desired outcome
  20. Assessment Lifecycle: Planning → Information Gathering → Business Process Assessment → Technology Assessment → Risk Analysis & Reporting
  21. Common Types of Assessments • Vulnerability Assessment • Penetration Test • Application Assessment • Code Review • Standard Audit/Review • Compliance Assessment/Audit • Configuration Audit • Wireless Assessment • Physical/Environmental Assessment • Policy Assessment
  22. Determine Your Scope • What will be the scope of the assessment? – Network (pen test, vulnerability scan, wireless) – Application (code review or vulnerability scan) – Process (business or automated) • How critical is the system you are assessing? – High or medium criticality: use an independent assessor – Low criticality: self-assessment • Audit Risk – What is the risk that an auditor will miss a material (important or significant) error? – What level of risk is the auditor willing to accept?
  23. Material • Material, in the context of risk, refers to an error that should be considered significant to any part of the area in question or to anyone with a vested interest – The failure or absence of a control (or combination of controls) leaves the organization highly susceptible to the occurrence of a threat • Materiality is often left to professional judgment. Take into consideration: – Effect on the organization as a whole – What types of errors or irregularities can be expected – Any illegal acts that may arise • Materiality is more difficult for IS auditors than it is for financial auditors
  24. Level of Effort • Testing levels will be dictated by the sensitivity of the data and criticality of the system • Audit risk and system risk are not the same but are tied closely together • Lower risk systems – May allow for a self-assessment (less independence) – Checklist based • Higher risk systems – Will require independent review (more independence) – Testing – Sample sizes increase with risk
  25. Risk & Audit • Inherent Risk – the risk that an error exists assuming there are no compensating controls (this ties directly to the system's criticality and sensitivity) • Control Risk – the risk that a control failure or error will not be prevented or detected in a timely manner during the normal course of business • Detection Risk – the risk that the auditor/assessor will use inadequate testing procedures and conclude that material errors do not exist when they actually do exist • Overall Audit Risk – the combination of risks related to the different categories of controls (a worked example follows)
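These components combine in the classical audit risk model, where overall audit risk is the product of inherent, control, and detection risk (AR = IR × CR × DR). A minimal sketch, assuming purely illustrative numeric values; in practice each factor is a matter of professional judgment:

```python
# Classical audit risk model: AR = IR x CR x DR.
# All values below are hypothetical; real values are professional judgments.

def overall_audit_risk(inherent: float, control: float, detection: float) -> float:
    """Combine the three component risks into overall audit risk."""
    return inherent * control * detection

ir, cr = 0.8, 0.5   # judged from system criticality and the control environment
target_ar = 0.05    # audit risk the auditor is willing to accept

# Rearranged to plan the audit: DR = AR / (IR x CR), i.e. how much
# detection risk the chosen testing procedures can tolerate.
max_dr = target_ar / (ir * cr)
print(f"Detection risk must be held at or below {max_dr:.3f}")          # 0.125
print(f"Overall audit risk: {overall_audit_risk(ir, cr, max_dr):.3f}")  # 0.050
```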
  26. Independence • The level of the system will dictate the level of independence required • Must be independent of the entire process, especially the system owner • Can use internal or external auditors • Often use independent contractors • There should be a level of review with the auditors • May use automated tools for testing such as CAATs (Computer Aided Audit Tools)
  27. Assessment Life Cycle • NIST SP 800-115 describes a three-phase information security assessment methodology: Plan, Execute, and Post-Execution • This process mirrors the three phases of FISCAM (the Federal Information System Controls Audit Manual, available from the U.S. Government Accountability Office (GAO)): Plan, Perform, and Report
  28. Plan (FISCAM) • Understand the Overall Audit Objectives and Related Scope of the Information System Controls Audit • Understand the Entity's Operations and Key Business Processes • Obtain a General Understanding of the Structure of the Entity's Networks • Identify Key Areas of Audit Interest • Assess Information System Risk on a Preliminary Basis • Identify Critical Control Points • Obtain a Preliminary Understanding of Information System Controls • Perform Other Audit Planning Procedures
  29. Perform (FISCAM) • Understand Information Systems Relevant to the Audit Objectives • Determine Which IS Control Techniques Are Relevant to the Audit Objectives • For Each Relevant IS Control Technique, Determine Whether It Is Suitably Designed to Achieve the Critical Activity and Has Been Implemented • Perform Tests to Determine Whether Such Control Techniques Are Operating Effectively • Identify Potential Weaknesses in IS Controls and Consider Compensating Controls
  30. Report (FISCAM) • Evaluate the Effects of Identified IS Control Weaknesses – Financial Audits, Attestation Engagements, and Performance Audits • Consider Other Audit Reporting Requirements and Related Reporting Responsibilities
  31. Certification Process (NIST SP 800-37) • Security Control Assessment – Prepare for Assessment – Conduct Assessment – Document Results • Security Certification Documentation – Provide the certification findings & recommendations – Update the system security plan (SSP) – Prepare the plan of action and milestones (POA&M) – Assemble the accreditation package
  32. Security Control Assessment Tasks • Task 1: "Assemble any documentation and supporting materials necessary for the assessment of the security controls in the information system; if these documents include previous assessments of security controls, review the findings, results, and evidence." • Task 2: "Select, or develop when needed, appropriate methods and procedures to assess the management, operational, and technical security controls in the information system." • Task 3: "Assess the management, operational, and technical security controls in the information system using methods and procedures selected or developed." • Task 4: "Prepare the final security assessment report." – NIST SP 800-37
  33. Security Documentation Tasks • Task 1: "Provide the information system owner with the security assessment report." • Task 2: "Update the system security plan (and risk assessment) based on the results of the security assessment and any modifications to the security controls in the information system." • Task 3: "Prepare the plan of action and milestones based on the results of the security assessment." • Task 4: "Assemble the final security accreditation package and submit to authorizing official." – NIST SP 800-37
  34. Assessment Tasks (NIST SP 800-37 Rev 1) • Task 1: Identify and select the security control assessor(s) and determine if the selected assessor(s) possess the required degree of independence for the assessment. • Task 2: Develop a plan to assess the security controls. • Task 3: Review and approve the plan to assess the security controls. • Task 4: Obtain appropriate documentation, records, artifacts, test results, and other materials needed to assess the security controls. • Task 5: Assess the security controls in accordance with the assessment procedures defined in the security assessment plan. • Task 6: Prepare the preliminary security assessment report documenting the issues, findings, and recommendations from the security control assessment.
  35. Assessment Tasks (NIST SP 800-37 Rev 1, cont.) • Task 7: Review the preliminary security assessment report. • Task 8: If necessary, conduct remediation actions based on the preliminary security assessment report. • Task 9: Assess the remediated security controls. • Task 10: Update the security assessment report and prepare the executive summary. • Task 11: If necessary, prepare an addendum to the security assessment report that reflects the initial results of the remediation actions taken and provides the information system owner or common control provider perspective on the assessment findings and recommendations. • Task 12: Update the security plan based on the findings and recommendations of the security assessment report and any remediation actions taken. • Task 13: Prepare the plan of action and milestones based on the findings and recommendations of the security assessment report.
  36. Assessor/Auditor Selection • The assessor should have the level of independence required for the system being evaluated – The AO or designate determines the level of independence needed (risk based) – Capable of conducting an impartial assessment – Impartial: free from any perceived or actual conflicts – Can be from the public or private sector, and can also be from within the organization • The assessor should have the technical expertise to conduct the assessment – Knowledge of the software, hardware, and firmware being assessed
  37. Assessor KSAs: Knowledge, Skills, and Abilities
  38. Assessor Competence • Priority certifications – Certified Information Systems Auditor (CISA)* – GIAC Systems and Network Auditor (GSNA) • Secondary certifications – Vendor neutral: CISSP, Security+, GIAC, CISM, etc. – Vendor specific: Microsoft, Cisco, etc. • For example, if staff are assessing Windows servers: – Staff should hold a priority certification (e.g., CISA) – Vendor-specific certifications (e.g., MCITP) would also be helpful • *GAO recommends that 65% of audit staff be CISA-certified
  39. Legal Considerations • At the discretion of the organization • Legal review – Reviewing the assessment plan – Providing indemnity or limitation-of-liability clauses (insurance) – Particularly for tests that are intrusive – Nondisclosure agreements – Privacy concerns • Rules of Engagement (ROE) – "Detailed guidelines and constraints regarding the execution of information security testing. The ROE is established before the start of a security test, and gives the test team authority to conduct defined activities without the need for additional permissions." [NIST SP 800-115]
  40. Developing the Test Plan • Assessment procedures – Aka audit approach or ST&E (Security Testing & Evaluation) procedures • A detailed description of the testing methodology that will be used • Will include the following: – Scope – Testing requirements – Testing approach – Tests to be used – Timeline – Responsibilities – Test team – Remediation plan and recommendations
  41. Assessment Methodology • It is important to have repeatable and documented security assessment methodologies • Benefits: – Lower detection and audit risk – Provide consistency and structure to security testing, which can minimize testing risks – Expedite the transition of new assessment staff – Address resource constraints associated with security assessments
  42. Auditors/Assessors • Will want to see prior assessments – Helps with scope – Determine if progress has been made – Determine audit risk – What has changed since the last assessment – Time required – What was the level of independence
  43. Key Definitions • Passive Security Testing: Security testing that does not involve any direct interaction with the targets, such as sending packets to a target. • Active Security Testing: Security testing that involves direct interaction with a target, such as sending packets to a target. • Covert Testing: Testing performed using covert methods and without the knowledge of the organization's IT staff, but with full knowledge and permission of upper management. • Overt Testing: Security testing performed with the knowledge and consent of the organization's IT staff. (Source: NIST SP 800-115)
  44. Key Definitions (cont.) • Internal Security Testing: Security testing conducted from inside the organization's security perimeter. • External Security Testing: Security testing conducted from outside the organization's security perimeter. • Target Identification and Analysis Techniques: Information security testing techniques, mostly active and generally conducted using automated tools, that are used to identify systems, ports, services, and potential vulnerabilities. Target identification and analysis techniques include network discovery, network port and service identification, vulnerability scanning, wireless scanning, and application security testing. (Source: NIST SP 800-115)
  45. How Much Testing? • "Risk assessments should be used to guide the rigor and intensity of all security control assessment related activities associated with the information system to enable cost effective, risk-based implementation of key elements in the organization's information security program" – NIST SP 800-37 Rev 1
  46. Sample Size • Testing a sample – Test a subset of the population rather than the entire population – It may be much more efficient and (statistically) nearly as effective • Feasibility of using a sample for assessment – Selection of the sample size should be based on the risk of not finding a material weakness or control deficiency – If thousands of hosts are centrally managed and similarly configured, you can select a smaller percentage to test • If weaknesses are found, increase your sample size to determine whether the weaknesses are pervasive or isolated incidents
  47. Sampling • Statistical sampling – Selecting an appropriate portion of an entire population based on mathematical calculations and probability – The purpose is to make scientifically and mathematically sound inferences about the entire population – Should group items by characteristics (auditor's judgment) – Each item in the population should have an equal chance of being selected for testing • Need to ensure the sample is unbiased and that the items selected are representative of the whole population – For example, suppose you have 50 computers: 25 Macs and 25 Windows PCs. With a sample size of 6, you would generally want to select 3 Macs and 3 PCs to test (a sketch of this stratified selection follows)
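A minimal sketch of that stratified selection using only the Python standard library; the host names and the 6-item sample are invented for illustration:

```python
import random

# Hypothetical population: 25 Mac and 25 Windows hosts.
population = [f"mac-{i:02d}" for i in range(25)] + [f"win-{i:02d}" for i in range(25)]

def stratified_sample(items, key, total_size, seed=None):
    """Group items by a characteristic, then draw randomly from each
    stratum in proportion to its size, so the sample mirrors the
    population and every item had a chance of selection."""
    rng = random.Random(seed)
    strata = {}
    for item in items:
        strata.setdefault(key(item), []).append(item)
    sample = []
    for group in strata.values():
        n = round(total_size * len(group) / len(items))
        sample.extend(rng.sample(group, n))
    return sample

# Draw 6 hosts: 3 Macs and 3 Windows machines, chosen at random.
print(stratified_sample(population, key=lambda h: h.split("-")[0], total_size=6, seed=1))
```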
  48. Typical Sampling and Evaluation Criteria (populations over 250)

      Control Testing Sample Size Table
      Significance of Control   Inherent Risk   Minimum Sample Size¹
      High                      High            60
      High                      Low             40
      Moderate                  High            40
      Moderate                  Low             25

      Compliance Testing Sample Size Table
      Desired Level of Assurance   Minimum Sample Size¹
      High                         60
      Moderate                     40
      Low                          25

      ¹ No exceptions expected. With populations under 250, the minimum sample size is 10%, no exceptions expected.

      (A lookup helper encoding these tables is sketched below.)
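The tables lend themselves to a simple lookup helper. The sketch below encodes the thresholds exactly as given above; the function name and interface are invented:

```python
# Minimum sample sizes for populations over 250 (no exceptions expected).
CONTROL_TESTING = {
    ("high", "high"): 60,
    ("high", "low"): 40,
    ("moderate", "high"): 40,
    ("moderate", "low"): 25,
}
COMPLIANCE_TESTING = {"high": 60, "moderate": 40, "low": 25}

def control_sample_size(significance: str, inherent_risk: str, population: int) -> int:
    """Return the minimum control-testing sample size per the table."""
    if population < 250:
        # Under 250 items, the criterion is 10% of the population.
        return max(1, round(population * 0.10))
    return CONTROL_TESTING[(significance.lower(), inherent_risk.lower())]

print(control_sample_size("High", "Low", population=1200))  # -> 40
print(control_sample_size("High", "Low", population=120))   # -> 12
```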
  49. Assessment Methods • Objective – Determination of security control existence, functionality, correctness, completeness, and potential for improvement over time • Examine – Checking, inspecting, reviewing, observing, studying, or analyzing • Interview – Conducting discussions with individuals or groups • Test – Compare actual with expected behavior
  50. Assessment Objectives and Guidance • NIST SP 800-53A Rev 1 (Draft), May 2010
  51. NIST SP 800-53A Rev 1 Example
  52. Identify and Select Automated Tools • Computer Assisted Audit Techniques or Computer Aided Audit Tools (CAATs) • Computer Assisted Audit Tools and Techniques (CAATTs) – SQL queries – Scanners – Excel programs – Live CDs – Checklists (an illustrative CAAT query is sketched below)
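As an illustration of the kind of CAAT query an auditor might script, the sketch below loads a hypothetical accounts-payable extract into SQLite and flags potential duplicate payments; all table and column names are invented:

```python
import sqlite3

# Hypothetical extract of payment data loaded for analysis.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE payments (vendor TEXT, invoice_no TEXT, amount REAL)")
con.executemany(
    "INSERT INTO payments VALUES (?, ?, ?)",
    [("Acme", "INV-100", 500.00),
     ("Acme", "INV-100", 500.00),   # a duplicate payment
     ("Globex", "INV-201", 75.25)],
)

# Classic CAAT test: the same vendor/invoice/amount paid more than once.
rows = con.execute("""
    SELECT vendor, invoice_no, amount, COUNT(*) AS times_paid
    FROM payments
    GROUP BY vendor, invoice_no, amount
    HAVING COUNT(*) > 1
""").fetchall()

for vendor, invoice, amount, times in rows:
    print(f"Possible duplicate: {vendor} {invoice} ${amount:.2f} paid {times} times")
```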
  53. Checklists • AuditNet – www.auditnet.org • ISACA & IIA – Member resources • DoD checklists – iase.disa.mil/stigs/checklist/ • NIST Special Publications – csrc.nist.gov/publications/PubsSPs.html
  54. Live CD Distributions for Security Testing • BackTrack • Knoppix Security Tools Distribution • F.I.R.E. • Helix
  55. Review Techniques • Documentation Review • Log Review • Ruleset Review • System Configuration Review • Network Sniffing • File Integrity Checking (a minimal sketch follows)
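File integrity checking boils down to comparing current cryptographic hashes against a trusted baseline. A minimal sketch using the standard library, assuming the monitored directory is a placeholder:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a file in chunks so large files do not exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def build_baseline(root: Path) -> dict:
    """Record a hash for every file under the monitored directory."""
    return {str(p): sha256_of(p) for p in root.rglob("*") if p.is_file()}

def compare(baseline: dict, current: dict) -> None:
    """Report files that are new, modified, or deleted versus the baseline."""
    for path, digest in current.items():
        if path not in baseline:
            print(f"NEW:      {path}")
        elif baseline[path] != digest:
            print(f"MODIFIED: {path}")
    for path in baseline.keys() - current.keys():
        print(f"DELETED:  {path}")

# Usage: snapshot once, store the baseline securely, re-scan later and diff.
# baseline = build_baseline(Path("/etc"))
# compare(baseline, build_baseline(Path("/etc")))
```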
  56. Target Identification and Analysis Techniques • Network Discovery • Network Port and Service Identification – OS fingerprinting • Vulnerability Scanning • Wireless Scanning – Passive Wireless Scanning – Active Wireless Scanning – Wireless Device Location Tracking (site survey) – Bluetooth Scanning – Infrared Scanning (a port-check sketch follows)
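Port and service identification is normally done with purpose-built scanners, but the underlying test is just a TCP connection attempt. A bare-bones sketch; the host address is a placeholder, and this should only ever be run against systems you are authorized to test:

```python
import socket

COMMON_PORTS = {22: "ssh", 80: "http", 443: "https", 3389: "rdp"}

def check_port(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

host = "192.0.2.10"  # placeholder address (TEST-NET range)
for port, service in COMMON_PORTS.items():
    state = "open" if check_port(host, port) else "closed/filtered"
    print(f"{host}:{port} ({service}) {state}")
```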
  57. Target Vulnerability Validation Techniques • Password Cracking – Transmission / Storage • Penetration Testing – Automated / Manual • Social Engineering – Phishing (a toy password-cracking sketch follows)
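Password cracking against stored hashes amounts to hashing candidate passwords and comparing results. A toy dictionary-attack sketch; the unsalted SHA-256 scheme and the wordlist are purely illustrative (real systems should store salted, slow hashes):

```python
import hashlib

# Illustrative only: an unsalted SHA-256 hash of a weak password.
stolen_hash = hashlib.sha256(b"letmein").hexdigest()

wordlist = ["password", "123456", "letmein", "qwerty"]  # stand-in wordlist

def dictionary_attack(target_hash, candidates):
    """Hash each candidate and compare it against the target hash."""
    for word in candidates:
        if hashlib.sha256(word.encode()).hexdigest() == target_hash:
            return word
    return None

found = dictionary_attack(stolen_hash, wordlist)
print(f"Recovered password: {found}" if found else "No match in wordlist")
```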
  58. Checklists / MSAT • Microsoft Security Assessment Tool (MSAT)
  59. GRC Tools • Governance, Risk, and Compliance • Typical capabilities: dashboards, metrics, checklists, reporting, trend analysis, remediation tracking
  60. Test Types • Black Box Testing – Assessor starts with no knowledge • White Box Testing – Assessor starts with knowledge of the system, i.e., the code • Grey Box Testing – Assessor has some knowledge; not completely blind
  61. Testing • Basic Testing (black box) – No knowledge of the internal structure and implementation detail of the assessment object • Focused Testing (grey box) – Some knowledge of the internal structure and implementation detail of the assessment object • Comprehensive Testing (white box) – Explicit and substantial knowledge of the internal structure and implementation detail of the assessment object
  62. Incremental Testing • Testing before the complete implementation of a system • Not testing all the controls in the system security plan, just those that are ready • This may be more cost effective and efficient • Common controls are a good example
  63. Verification Testing • Input – data entry • Data collection – database storage • Output – reports • Verify that input, stored data, and output match (a toy example follows)
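A toy illustration of that match: trace sampled records from data entry through database storage to a report, and confirm the three stages agree (all data, table, and column names are invented):

```python
import sqlite3

# Stage 1: input as captured at data entry (hypothetical records).
entered = [("C001", 100.00), ("C002", 250.50)]

# Stage 2: data collection / database storage.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE balances (customer TEXT PRIMARY KEY, amount REAL)")
con.executemany("INSERT INTO balances VALUES (?, ?)", entered)

# Stage 3: output, e.g. a report query.
report = con.execute("SELECT customer, amount FROM balances ORDER BY customer").fetchall()

# Verification: input, storage, and output should agree for the sampled items.
for (cust_in, amt_in), (cust_out, amt_out) in zip(sorted(entered), report):
    status = "MATCH" if (cust_in, amt_in) == (cust_out, amt_out) else "MISMATCH"
    print(f"{cust_in}: entered {amt_in} / reported {amt_out} -> {status}")
```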
  64. Application Testing • Code review – Automated/Manual • Vulnerability scanning • Configuration review • Verification testing • Authentication • Information leakage • Input/output manipulation
  65. Database Auditing • Native audit (provided by the DB) • SIEM & log management • Database activity monitoring • Database audit platforms – Remote journaling & analytics • Compliance testing • Performance (a trigger-based native-audit sketch follows)
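As a tiny example of what native database auditing can look like, the sketch below uses a trigger so the database itself journals every change to a sensitive table. SQLite syntax is used purely for illustration; the table names are invented:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE salaries (emp TEXT PRIMARY KEY, amount REAL);
    CREATE TABLE salary_audit (
        emp TEXT, old_amount REAL, new_amount REAL,
        changed_at TEXT DEFAULT CURRENT_TIMESTAMP
    );
    -- Native audit: the database records every update on its own.
    CREATE TRIGGER audit_salary AFTER UPDATE ON salaries
    BEGIN
        INSERT INTO salary_audit (emp, old_amount, new_amount)
        VALUES (OLD.emp, OLD.amount, NEW.amount);
    END;
""")
con.execute("INSERT INTO salaries VALUES ('alice', 50000)")
con.execute("UPDATE salaries SET amount = 65000 WHERE emp = 'alice'")

for row in con.execute("SELECT emp, old_amount, new_amount FROM salary_audit"):
    print("Audit trail:", row)
```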
  66. Intrusion Detection/Prevention • Configuration • Verification testing • Log and alert review
  68. EMR Testing • Electromagnetic Radiation • Emissions Security (EMSEC) • Van Eck phreaking • TEMPEST • TEMPEST surveillance prevention • Faraday cage
  69. Green Computing • Assessment of the use of resources • Power management • Virtualization assessment
  70. Business Continuity • Plan Testing, Training, and Exercises (TT&E) • Tabletop exercises – Checklist assessment – Walk-through • Functional exercises – Remote recovery – Full interruption test
  71. Vulnerability Scanning • Vulnerability: Weakness in an information system, or in system security procedures, internal controls, or implementation, that could be exploited or triggered by a threat source. • Vulnerability Scanning: A technique used to identify hosts/host attributes and associated vulnerabilities. (Technical) • Target Vulnerability Validation Techniques: Active information security testing techniques that corroborate the existence of vulnerabilities. They include password cracking, remote access testing, penetration testing, social engineering, and physical security testing. (Source: NIST SP 800-115)
  72. MBSA • Microsoft Baseline Security Analyzer 2.2
  73. Vulnerability Reports • Sample report from Qualys
  74. External and Internal • Where is the best place to scan from? • An external scan found 2 critical vulnerabilities; an internal scan found 15 critical vulnerabilities
  75. Vulnerability Scanners • Source: http://www.gartner.com/technology/media-products/reprints/rapid7/173772.html
  76. Red, White, and Blue Teams • Red team – penetration testers who mimic real-world attacks • Blue team – incident responders • White team – observers and referees • The exercise is unannounced
  77. Red and Blue Teams • Red team – penetration testers who mimic real-world attacks • Blue team – incident responders • The exercise is announced
  78. Penetration Testing • Aka "Red Team" • Assessors attempt to circumvent security controls and features of a given system • Should have agreed-upon rules of engagement • Should have a clear and documented scope • It is dramatic proof of the existence of vulnerabilities and how much effort is required to circumvent them • "A test methodology in which assessors, typically working under specific constraints, attempt to circumvent or defeat the security features of an information system." – CNSSI No. 4009
  79. Penetration Test Phases
  80. Attack Phases (source: NIST SP 800-115)
  81. Penetration Assessment Reports • Sample report from Core Impact
  82. Vulnerability Information • Open Source Vulnerability DB – http://osvdb.org/ • National Vulnerability Database – http://nvd.nist.gov/ • Common Vulnerabilities and Exposures – http://cve.mitre.org/ • Exploit Database – http://www.exploit-db.com/
  83. Physical Assessments • Posture review • Access control testing • Perimeter review • Monitoring review • Alarm response review • Location review (business continuity) • Environmental review (AC / UPS)
  84. The Role of the Host • In order for the auditor to finish on time, he/she will need access to documents, systems, and people • Delays in providing the auditor with needed items will slow the process • The host organization should ensure the auditor has what he/she needs in a timely fashion, preferably before they ask for it
  85. Test Execution • Document the testing procedures • Who conducted the test • What were the results of the test • Signed off by the tester • Reviewed by a supervisor • Rank findings by severity (not required but useful) • Provide interim results before the final report • No need to test controls that are not in place • Document so that another auditor with no knowledge of the system can follow the findings (one way to structure such records is sketched below)
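One lightweight way to capture those elements is a structured record per executed test. A sketch with fields chosen to mirror the list above; the class and field names are invented:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class TestRecord:
    """One executed test, documented so another auditor can re-trace it."""
    control_id: str
    procedure: str          # exactly what was done
    result: str             # what was observed
    severity: str           # finding ranked by severity
    tester: str             # who conducted the test (sign-off)
    reviewer: str = ""      # supervisor review
    performed_on: date = field(default_factory=date.today)

rec = TestRecord(
    control_id="AC-7",
    procedure="Attempted 6 consecutive failed logons to verify account lockout",
    result="Account locked after 5 failures, as configured",
    severity="none",
    tester="J. Auditor",
    reviewer="S. Supervisor",
)
print(rec)
```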
  86. Post-Testing Activities • Assessment findings (assessor) – Security Assessment Report (SAR) • Mitigation recommendations (assessor) – Technical, managerial, or operational • Reporting – Draft and final reports (assessor) – Comments and system owner response (system owner) • Remediation / mitigation (system owner) – It is not enough to find problems; there needs to be a process to fix them
  87. Documenting the Results • Security Assessment Report (SAR) – Conveys the results of the security assessment to appropriate organizational officials • A reliable indication of the overall security state of the information system • Provides authorizing officials with the information necessary to make credible, risk-based decisions
  88. SAR • Executive summary report – Summary – Geared for managers without a technical background – Synopsis of key findings • Detail report – Each control covered, with either pass or fail – Results of the test – Reference – Recommendations for remediation – Level of detail determined by risk to the system
  89. Included in the SAR • Information system name • Security categorization • Site(s) assessed and assessment date(s) • Assessor's name/identification • Previous assessment results (if reused) • Security control or control enhancement designator • Selected assessment methods and objects • Depth and coverage attribute values • Assessment finding summary (indicating satisfied or other than satisfied) • Assessor comments (weaknesses or deficiencies noted) • Assessor recommendations (priorities, remediation, corrective actions, or improvements)
  90. Audit / Assessment Documentation • Audit trail – "A visible trail of evidence enabling one to trace information contained in statements or reports back to the original input source."* • Evidence – supports the audit conclusion • Includes: – Notes taken from interviews – Internal documentation – Correspondence – Results of tests • Remember: some evidence is more reliable than other evidence (*Definition from ISACA)
  91. Audit Papers • The auditor / assessor should document the procedures used and the evidence gathered such that another auditor reviewing that documentation will come to the same conclusion • If another auditor / assessor cannot follow the documentation, your documentation is inadequate
  92. Concurrent Remediation • Remediation taking place during the assessment period • The assessor will communicate results as they perform their assessments • The system owner may take corrective actions during that period of time • The assessor will have to re-test to see if the control deficiency has truly been corrected • Items that cannot be corrected will be entered into the Plan of Action and Milestones (POA&M) for future remediation
  93. Disagreements with Findings • Optional addendum to the Security Assessment Report • This addendum gives the information system owner and common control providers an opportunity to respond to the assessor's findings • Sometimes called management's response • This does not change the assessor's report • Is it a substantive issue? • Each organization or agency may wish to employ an issue resolution process
  94. Organizations That Can Help • Other than DoD and NIST: • U.S. Government Accountability Office (GAO) • Information Systems Audit and Control Association (ISACA) • American Institute of Certified Public Accountants (AICPA) • Institute of Internal Auditors (IIA) • SANS • National State Auditors Association (NSAA)
  95. Resources • COBIT • NIST SP 800-115 • NIST SP 800-53A Rev 1 • Federal Information System Controls Audit Manual (FISCAM) • Open Source Security Testing Methodology Manual (OSSTMM)
  96. Summary • The certification process needs – To be performed by professionals – A level of independence that depends on the level of the system – To be well-planned – To be well-documented – To serve as a basis for remediation • Certification / assessment – Provides assurance that the implemented controls are functioning as expected – "…to determine the extent to which the controls are implemented correctly, operating as intended, and producing the desired outcome…" – NIST SP 800-100
  97. Class Discussion • What are some reasons why an auditor / assessor might increase the amount of testing? • What can you do to ease the assessment pains? • Is an assessment/audit an adversarial process? • What are some things you would consider in selecting an auditor / assessor? • Why should auditors document what tests they conducted and the results? • What do Red Teams do? • What do you do if you disagree with an auditor's / assessor's findings?
