CISSP Chapter 3 - Part IV - System Security Architecture

  1. 1. Security Engineering
  2. 2. Trusted Computing Base • The collection of all hardware, software and firmware components within the system that provide some kind of security control and enforce the system security policy • Any piece of the system that could be used to compromise the stability of the system is part of the TCB and must be developed and controlled effectively • A system with a TCB provides • Trusted path • A secure communication path between the user/program and the TCB • Trusted shell • Actions within the interpreter or shell cannot escape it, and other processes cannot intrude into it
  3. 3. Security Perimeter • An imaginary boundary that divides the trusted and untrusted components in the system • A resource within the boundary is considered part of the TCB • Components on either side of the boundary can interact only via interfaces • The interfaces limit or restrict the commands and data that can be passed across the boundary, creating a security perimeter
  4. 4. Reference Monitor • It is an abstract machine that mediates all access subjects have to objects • It is an access control concept and hence is also referred to as the “reference monitor concept” or “abstract machine” • A fully secure system requires subjects to be fully authorized before access to objects is provisioned • It provides direction on how all access control decisions are made • All access decisions should be made by a core trusted, tamperproof component of the OS that works at the system kernel level ~ the security kernel
  5. 5. Security Kernel • It is made up of hardware, software, firmware components within the TCB • It implements and enforces the reference monitor concept • It is the core of TCB • Security kernel has three main requirements • It must provide isolation for the processes carrying out reference monitor concept • Must be invoked for every access attempt and should be tamperproof • Must be small enough to be tested and verified in a complete and comprehensive manner
  6. 6. Security Models • It is a symbolic representation of a security policy • Maps the security policy into a set of rules to be followed by a system • An abstract term that represents the goals and objectives the system must meet to be deemed secure and acceptable • Usually represented in mathematics and analytical ideas, mapped to system specifications and developed as program code
  7. 7. State Machine Models • Used to describe the behavior of a system in response to different inputs • Provides mathematical constructs that represent sets and sequences • It is based on the state of a system • State: • the snapshot of the system at one moment in time; current permissions and current instances of subjects accessing objects must be captured • This model must identify all the initial states of the system and outline how these values will be changed by various inputs so that the final state is always secure • Uses “if then” evaluation
  8. 8. State Machine model factors • When developing a system using State machine model, the following factors have to be considered: • Define what and where the state variables are • Define a secure state for each of these variables • Define and identify the allowable state transition functions • Test to verify the overall machine state is not compromised and integrity is maintained at all times
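The two slides above can be sketched in code: define the secure states and the allowed transition functions, then verify that every input moves the machine only into a secure state. This is a minimal illustrative sketch; the state names, events and transitions are hypothetical, not from the slides.

```python
# Hypothetical secure states and allowed transitions for a toy login system.
SECURE_STATES = {"logged_out", "authenticated", "session_active"}
TRANSITIONS = {
    ("logged_out", "login"): "authenticated",
    ("authenticated", "open_session"): "session_active",
    ("session_active", "logout"): "logged_out",
}

def next_state(state, event):
    # "if then" evaluation: only a defined transition produces a new state,
    # and the resulting state must itself be a secure state.
    new = TRANSITIONS.get((state, event))
    if new is None or new not in SECURE_STATES:
        raise ValueError(f"insecure or undefined transition: {state} --{event}-->")
    return new
```

Because every reachable state is checked against the secure set, a machine that starts in a secure state can only ever end in one, which is the core claim of the state machine model.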
  9. 9. Model Basics • Multilevel security policy: • Subjects with different clearances use the system, and the system processes data at different classification levels • Information flow security model: • Information does not flow in an insecure manner • Enforces the • Domination relation • The relationship of the subject’s clearance to the object’s classification • Basic Security Theorem • If a system initializes in a secure state and all allowed transitions are secure, then every subsequent state will be secure no matter what inputs occur • Tranquility Principle • Subjects’ and objects’ security labels cannot change in a manner that violates the security policy • Discretionary Security Property [ds-property] • Specific permissions allow named subjects to pass on permissions at their own discretion
  10. 10. Bell-LaPadula Model • State machine access control model that enforces confidentiality only • First mathematical model of a multilevel security policy that defines the secure state of a system • It is a framework for computer systems that store and process sensitive information • A matrix and security levels are used to determine if a subject can access an object • Uses subjects, objects, access operations and security labels • Intended to provide a provably secure and effective operating system • The model provides a secure state and only permits operations that keep the system within that secure state • It is a subject-to-object model • Focuses on ensuring subjects are properly authenticated before accessing an object • MAC systems are based on this model • Uses a lattice of sensitivity levels for access decisions
  11. 11. Bell-LaPadula Principles • 3 main principles • Simple Security Rule – no read up • * Property Rule – no write down • Strong * Property Rule – subjects having read and write permission can perform those operations only at the same security level; the subject’s clearance and the object’s classification have to match
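The three Bell-LaPadula rules can be sketched as simple comparisons on a numeric ordering of sensitivity levels. A minimal sketch; the level names and numbers are illustrative, not part of the model itself.

```python
# Illustrative sensitivity ordering: higher number = more sensitive.
LEVELS = {"public": 0, "confidential": 1, "secret": 2, "top_secret": 3}

def can_read(subject_clearance, object_classification):
    # Simple Security Rule: no read up.
    return LEVELS[subject_clearance] >= LEVELS[object_classification]

def can_write(subject_clearance, object_classification):
    # * (star) Property Rule: no write down.
    return LEVELS[subject_clearance] <= LEVELS[object_classification]

def can_read_write(subject_clearance, object_classification):
    # Strong * Property Rule: read/write only at the same level.
    return LEVELS[subject_clearance] == LEVELS[object_classification]
```

For example, a "secret" subject may read "confidential" data (read down) but may not write to it, since writing down could leak secret information into a less protected object.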
  12. 12. Biba Security Model • State machine model concerned with the integrity component • It does not address confidentiality • Information flow model • Uses a lattice of integrity levels to make decisions • Prevents data at a lower integrity level from flowing to a higher integrity level
  13. 13. Biba Principles • Simple integrity axiom – no read down • Indicates how a subject can read objects • * integrity axiom – No write up • Indicates how a subject can modify objects • Invocation property – subject cannot invoke a subject at a higher integrity level • Indicates how a subject can communicate with and initialize other subjects at run-time
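The Biba axioms are the mirror image of Bell-LaPadula, applied to integrity. The sketch below assumes an illustrative numeric ordering of integrity levels (higher number = higher integrity); the level names are hypothetical.

```python
# Illustrative integrity ordering: higher number = higher integrity.
INTEGRITY = {"low": 0, "medium": 1, "high": 2}

def can_read(subject_level, object_level):
    # Simple Integrity Axiom: no read down (reading low-integrity
    # data could contaminate a high-integrity subject).
    return INTEGRITY[subject_level] <= INTEGRITY[object_level]

def can_write(subject_level, object_level):
    # * Integrity Axiom: no write up (a low-integrity subject must
    # not corrupt higher-integrity objects).
    return INTEGRITY[subject_level] >= INTEGRITY[object_level]

def can_invoke(subject_level, target_level):
    # Invocation Property: a subject cannot invoke a subject at a
    # higher integrity level.
    return INTEGRITY[subject_level] >= INTEGRITY[target_level]
```

Note how each comparison is reversed relative to the Bell-LaPadula confidentiality rules: Biba protects data from being corrupted upward rather than leaked downward.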
  14. 14. Clark-Wilson Model • It is an integrity model developed after the Biba model • Addresses all three integrity goals • Prevent unauthorized users from making modifications (the only goal Biba addresses) • Prevent authorized users from making improper modifications • Maintain internal and external consistency • It focuses on well-formed transactions and separation of duties • Well-formed transaction • A series of operations that transform a data item from one consistent state to another
  15. 15. Elements of Clark-Wilson Model • Users – Active agents • Transformation procedures [TP] – abstract operations, like read/write/modify • Constrained data items [CDI] – can be manipulated only by TP • Unconstrained data items [UDI] – can be manipulated by user • Integrity verification procedures [IVP] – check the consistency of CDI with external reality
  16. 16. Elements of Clark-Wilson Model • The model segregates data into two subsets • CDI: data that needs to be highly protected • UDI: data that does not require a high level of protection • CDI data can be modified only by TPs • UDI data can be modified by users/processes • Uses the access triple concept to protect the integrity of CDI data • IVPs are used to validate that all CDI manipulations follow the application’s defined integrity rules – this ensures a consistent state of the CDIs
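The access triple (user, TP, CDI) mentioned above can be sketched as a lookup: a user may touch a constrained data item only through a transformation procedure certified for that user/CDI pair. The user, TP and CDI names below are purely illustrative.

```python
# Hypothetical certified access triples: (user, TP, CDI).
CERTIFIED_TRIPLES = {
    ("alice", "post_payment", "accounts"),
    ("bob", "read_balance", "accounts"),
}

def run_tp(user, tp, cdi):
    # A CDI may only be manipulated through a certified triple;
    # direct manipulation by the user is never allowed.
    if (user, tp, cdi) not in CERTIFIED_TRIPLES:
        raise PermissionError(f"{user} may not run {tp} on {cdi}")
    return f"{tp} executed on {cdi}"
```

Because the binding is on the whole triple, alice cannot reuse bob's read-only TP, which is how the model enforces separation of duties.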
  17. 17. Noninterference Model • Any actions that take place at a higher security level should not affect or interfere with actions taking place at a lower security level • It is not concerned with the flow of data, but rather with what a subject knows about the state of the system • It addresses two attacks • Covert channel attacks • Inference attacks • Inference is a data mining technique performed by analyzing data in order to illegitimately gain knowledge about a subject or object • It occurs when a user is able to infer more sensitive information about an object from trivial information, without directly accessing it
  18. 18. Brewer and Nash Model • Also called the Chinese Wall model • A subject can write to an object only if it cannot read another object in a different dataset • An information flow model that provides an access control mechanism that can change dynamically depending on a user’s authorization and previous actions • The main goal is to protect against conflicts of interest arising from users’ access attempts
  19. 19. Graham-Denning Model • Defines a set of basic rights in terms of commands that a subject can execute over an object • Has 8 protection rights that detail how these functionalities should take place securely • How to securely create an object • How to securely create a subject • How to securely delete an object • How to securely delete a subject • How to securely provide the read access right • How to securely provide the grant access right • How to securely provide the delete access right • How to securely provide transfer access rights • Each object has an owner that has special rights on it • Each subject has another subject (controller) that has special rights on it. • The model is based on the Access Control Matrix model
  20. 20. Harrison-Ruzzo-Ullman Model • Extends the Graham-Denning Model • Deals with access rights and the integrity of those access rights • A subject can carry out only a finite set of operations on an object • The HRU model defines a protection system consisting of a set of generic rights R and a set of commands C • The HRU model is used by software designers to ensure no unforeseen vulnerabilities are introduced and the stated access control goals are achieved
  21. 21. Lattice Model • Mathematical model built upon partially ordered sets • It is a structure in which every pair of elements has • a least upper bound • a greatest lower bound • The most restrictive access control decision is applied using the least upper bound and greatest lower bound values
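The two lattice operators can be sketched on labels of the form (level, categories): the least upper bound takes the higher level and the union of categories, the greatest lower bound takes the lower level and the intersection. The level names and categories below are illustrative.

```python
# Illustrative sensitivity ordering for lattice labels.
LEVELS = {"public": 0, "secret": 1, "top_secret": 2}
NAMES = {v: k for k, v in LEVELS.items()}

def lub(label_a, label_b):
    # Least upper bound: max of the levels, union of the category sets.
    (la, ca), (lb, cb) = label_a, label_b
    return (NAMES[max(LEVELS[la], LEVELS[lb])], ca | cb)

def glb(label_a, label_b):
    # Greatest lower bound: min of the levels, intersection of the sets.
    (la, ca), (lb, cb) = label_a, label_b
    return (NAMES[min(LEVELS[la], LEVELS[lb])], ca & cb)
```

For instance, the least upper bound of ("secret", {nuclear}) and ("public", {crypto}) is ("secret", {nuclear, crypto}) – the most restrictive label that dominates both inputs.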
  22. 22. Take-Grant Model • Employs a directed graph to dictate how rights can be passed from one subject to another subject or object • Take rule • Allows a subject to take rights over an object • Grant rule • Allows a subject to grant rights to an object • Create rule • Allows a subject to create new rights • Remove rule • Allows a subject to remove rights it has
  23. 23. Goguen-Meseguer Model • It is an integrity model • Based on predetermining the list of objects that a subject can access • Subjects are allowed only to take predetermined actions • This model is based on automata theory and domain separation • Members of one subject domain cannot interfere with the members of another subject domain
  24. 24. Sutherland Model • It is an Integrity Model • Formally based on state machine and information flow models • It focuses on preventing interference in support of integrity • The model is based on the idea of defining set of system state, initial state and state transitions
  25. 25. Lipner Model • Combines elements of the BLP and Biba models to provide confidentiality and integrity • Describes two ways of implementing integrity • The first method separates objects into data and programs • One implementation uses the BLP confidentiality model alone; the other uses both the BLP and Biba integrity models together • In the BLP-only implementation, • subjects are assigned sensitivity levels and specific job categories • Objects are assigned similar sensitivity levels and categories • Most of the subjects and objects are assigned the same level; hence categories become the most significant integrity mechanism • In the combined implementation, • The assignment of levels and categories remains the same, but integrity levels are used to avoid unauthorized modification of system programs • Integrity categories are used to separate domains that are based on functional areas
  26. 26. Covert Channel • An information flow model risk • Unauthorized flow of information • Causes of covert channels • Improper coding • Improper access control implementation within software • Shared resources that are not properly controlled • Two types • Covert storage channel • Covert timing channel
  27. 27. Covert Channel Types • Covert Storage Channel: • Transfers information through the setting of bits by one program and the reading of those bits by another. • Occur when out-of-band data is stored in messages for the purpose of memory reuse. • Eg: Steganography • Covert timing Channel: • Convey information by modulating some aspect of system behavior over time, so that the program receiving the information can observe system behavior and infer protected information. • Methods: • knowing when data is transmitted between parties; • monitoring the timing of operations; • Eg: Monitoring cryptographic functions
  28. 28. Security Evaluation
  29. 29. Trust & Assurance • Trust: • States the level of protection that can be expected from a system • Assurance: • Much deeper than trust, states the system will work in a correct and predictable manner in each and every computing situation • What is evaluated? • Security relevant parts of the system • TCB, Access control, Kernel, Reference monitor, protection mechanism • The relationship and interaction between these components
  30. 30. Common Criteria • Standardized internationally as ISO/IEC 15408 • Developed primarily to reduce the complexity of ratings • Evaluates products against a protection profile • Evaluated products are assigned Evaluation Assurance Levels [EAL] • Addresses both functionality and assurance • ISO/IEC 15408 is used as the basis for evaluation of security properties • 15408-1: Introduction and general evaluation model • 15408-2: Security functional components • 15408-3: Security assurance components
  31. 31. EAL Packages • EAL1: Functionally tested • EAL2: Structurally tested • EAL3: Methodically tested and checked • EAL4: Methodically designed, tested and reviewed • EAL5: Semiformally designed and tested • EAL6: Semiformally verified design and tested • EAL7: Formally verified design and tested • The levels range from simple verification (EAL1) to stringent verification (EAL7)
  32. 32. Protection Profile • Describes a real-world need for a product that is not yet on the market • Contains the set of security requirements, their meaning/reasoning and the corresponding EAL rating to be achieved • Describes the environmental assumptions, the objectives and the functionality/assurance level expectations • Justifies the assurance level and the requirements for the strength of each protection mechanism • Provides the necessary goals and protection mechanisms to achieve the required level of security
  33. 33. Protection Profile sections • Security Problem Description • Lays out the specific problem that any compliant product should address • Security Objective • List the functionalities the compliant product must provide to address the problem • Security Requirements • Very specific requirements for compliant products implementation by system developers and for verification by independent laboratories
  34. 34. Protection Profile Elements • Descriptive Element: • Name of profile and description of the problem statement • Rationale: • Detailed description of the problem statement, environment, usage assumption and threats along with guidance on security policies • Functional Requirements: • Establishes protection boundary; solution must enforce the boundary established at this level • Development assurance requirements: • Identifies specific requirements during development phases • Evaluation assurance requirements: • Establishes the type and intensity of the evaluation
  35. 35. Orange Book • Formally the Trusted Computer System Evaluation Criteria (TCSEC), developed by the U.S. Department of Defense • Addresses the confidentiality principle • Addresses single-system security • Reviews the functionality, effectiveness, and assurance of a product • Bundles functionality and assurance into one rating • Provides a hierarchical division of assurance levels • Placed a great deal of emphasis on the ability to enforce security in ways that could be formally verified to be correct and reliable • Each division and class incorporates the requirements below it • The program for rating security capability is called the Trusted Product Evaluation Program (TPEP) • It introduced the concept of the Trusted Computing Base (TCB)
  36. 36. Orange Book – Evaluation Criteria • The criteria break down into 7 areas • Security policy • The policy must be explicit, well defined and enforced within the system • Identification • Subjects must be uniquely identified • Labeling • Labels should be associated properly with objects • Documentation • Adequate documentation, including design, testing and user guides, should be available • Accountability • Audit logs should be captured and protected • Life-cycle assurance • Each component [software, hardware, firmware] must be able to be tested independently and must provide adequate protection • Continuous protection • The whole system must perform predictably and acceptably in different situations, continuously • The rating is the sum total of all these items
  37. 37. Assurance Requirements • The operational assurance requirements specified in the Orange Book are as follows: • System Architecture • System integrity • Covert channel analysis • Trusted facility management • Trusted recovery • The life cycle assurance requirements specified in the Orange Book are as follows: • Security testing • Design specification and testing • Configuration Management • Trusted Distribution
  38. 38. Orange Book – Evaluation Level • D: Minimal Protection • Products that failed to meet the criteria and requirements of higher divisions • C: Discretionary Protection • C1: Discretionary Security Protection • Discretionary access control is based on individuals or groups • Requires separation of users and information • Requires some kind of access control mechanism • Must provide a protected execution domain for privileged system process • Usability • Used in environment where users are processing information of same sensitivity level • Strict access control and auditing are not required • Trusted environment with low security concerns • Users are trusted but a certain degree of accountability is required
  39. 39. • C2: Controlled Access Protection • Users must be individually identified and granular access control decisions applied • Requires enhanced auditing and audit log protection capabilities • The system architecture must provide process/resource isolation and must invoke the object reuse concept • Must enforce strict logon procedures and provide decision-making capabilities when subjects request access to objects • Usability • Best suited for commercial purposes • B: Mandatory Protection • The labeling concept is introduced • Mandatory access control model • Based on the BLP model; evidence of the reference monitor concept must be available • B1: Labeled Security • Objects must have classification labels & subjects must have clearance labels • Access decisions are taken by comparing the clearance and classification labels • The security policy is based on an informal statement • Design specifications must be reviewed and verified • Usability • Best suited for environments that require systems handling classified data
  40. 40. • B2: Structured Protection • The security policy should be clearly defined and documented • System design and implementation are subjected to more thorough review and testing • Stringent authentication mechanisms and well-defined interfaces among layers • Must have a trusted path for the authentication process • Must not allow covert channels • Covert channel analysis must be done • Covert storage channels are identified • Privileged functions/processes must be isolated • Distinct address spaces must be provided • Operator and administrator roles must be segregated • Usability • Used in environments where systems process sensitive information that requires a higher degree of protection • Used in environments where the systems should be resistant to penetration and compromise • B3: Security Domains • More granularity is provided in each protection mechanism • Design and implementation should be simple • Reference monitor components must be small enough to test properly and must be tamperproof • The security administrator role must be clearly defined • The system must initialize and load its OS and components in an initial secure state • The system must be able to recover from failure without compromising its security level • Covert timing channels are identified • Usability • Used in environments where systems process highly sensitive information • Systems should be highly resistant to penetration attempts
  41. 41. • A: Verified Protection • Formal methods are used • Design, development, implementation and documentation are examined in a formal and detailed way • The system is evaluated in a more structured and stringent way • A1: Verified Design • Designed, developed, implemented and tested in a more formal and stringent manner • Formal techniques are used to prove the equivalence between the TCB specifications and the security policy model • Stringent configuration management is put in place and the overall design is verified • Even the delivery of the system to the customer is scrutinized • Usability • Used in the most secure environments, where systems handle top-secret information • No one is adequately trusted without strict authentication, restrictions and auditing
  42. 42. Rainbow Series • Orange Book (TCSEC) • Deals with stand-alone systems, addressing only confidentiality • Red Book (Trusted Network Interpretation of the TCSEC) • Applies to networked systems • Rates confidentiality and integrity • Addresses DoS protection • Addresses compromise protection • Restricted to a limited class of networks that are labeled as “centralized networks with a single accreditation authority” • Uses 4 rating levels: None, C1, C2, B2 • Green Book (Password Management Guideline) • Provides password creation and management guidelines
  43. 43. ITSEC – IT Security Evaluation Criteria • Developed by European countries • Evaluates two main attributes • Functionality • The services that are provided to the subjects are evaluated and measured • F1 to F10 • Assurance • The degree of confidence in the protection mechanisms, and their effectiveness/capability to function consistently • E0 to E6 • ITSEC uses the concepts of security targets and targets of evaluation
  44. 44. ITSEC vs TCSEC • Government: TCSEC – US; ITSEC – European countries • Security principle: TCSEC – confidentiality; ITSEC – CIA • Rating: TCSEC combines functionality and assurance; ITSEC rates functionality and assurance separately • Addresses: TCSEC – single stand-alone systems; ITSEC – networked systems • Flexibility: TCSEC – rigid; ITSEC – more flexible • ITSEC to TCSEC mapping: E0 = D; F1+E1 = C1; F2+E2 = C2; F3+E3 = B1; F4+E4 = B2; F5+E5 = B3; F5+E6 = A1 • ITSEC-only ratings: F6 – high integrity; F7 – high availability; F8 – integrity during data communication; F9 – high confidentiality; F10 – networks with high confidentiality and integrity • Red Book – used for rating networking devices, software and configurations
  45. 45. Certification & Accreditation • Certification • Technical review that assesses the security mechanisms and evaluates their effectiveness • The process may use safeguard evaluation, risk analysis, verification, testing and auditing techniques • The goal is to ensure the system is right for the customer’s purpose • Certification is often an internal verification, and its results are trusted only within the organization • Accreditation • Management’s formal acceptance of the adequacy of a system’s security and functionality • Evaluation is normally performed by a third-party testing service, and the results are trusted by everyone who trusts the specific testing group involved
  46. 46. Open & Closed Systems • Build: open systems are built upon standards, protocols and interfaces that have published specifications; closed systems are proprietary and do not follow published specifications • Operations: open systems can provide interoperability; closed systems do not employ interoperability and standard interfaces
  47. 47. Distributed System Security • Distributed Computing: • Multiple interconnected computers work together to accomplish a task • Cloud Computing: • Use of distributed remote computing devices to provide services • They provide efficiency, performance, reliability, scalability and security
  48. 48. Cloud Computing Types • SaaS – Software as a Service • Purpose: a specific application hosted in the service provider’s cloud • Access: only application access is provided to the subscriber • Authority: no administrative access to the infrastructure • Responsibility: the service provider is responsible • PaaS – Platform as a Service • Purpose: a computing platform hosted on a server is available to the subscriber • Access: only platform access is provided to the subscriber • Authority: admin access is restricted to platform support; full control still resides with the service provider • Responsibility: the service provider holds much of the responsibility • IaaS – Infrastructure as a Service • Purpose: the IT environment in the cloud is available to the subscriber • Access: the subscriber has access to all the components within their subscription • Authority: complete control rests with the subscriber • Responsibility: the subscriber is responsible
  49. 49. Parallel Computing • Simultaneous use of multiple compute resources to solve a complex task by splitting it into smaller segments and processing them in parallel • 3 levels of parallel computing • Bit level • Each bit is processed separately through the use of parallel gates • Most common in all computing devices • Instruction level • Allows two or more program instructions to be executed simultaneously • Requires that two or more processors be available and synchronized • Only applications designed for multicore processors can take advantage of it • Task level • Each program is divided into tasks/threads that run in parallel • Data parallelism • Distribution of data among different nodes for parallel processing • Enables advancements in big data environments
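Task-level parallelism, as described above, can be sketched by splitting a job into independent chunks and processing them concurrently before combining partial results. A minimal sketch using the standard library; the chunking strategy and worker count are illustrative.

```python
from concurrent.futures import ThreadPoolExecutor

def chunk_sum(chunk):
    # One independent task: sum a slice of the data.
    return sum(chunk)

def parallel_sum(data, workers=4):
    # Split the input into roughly equal chunks, one task per chunk,
    # then combine the partial results.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(chunk_sum, chunks))
```

The result is identical to the sequential sum; only the work is distributed, which is the essence of task-level (and, for partitioned data, data-level) parallelism.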
  50. 50. Database Security • Two key security issues in databases are • Aggregation • Inference • Aggregation • The act of combining information from separate sources; the combination generates new information which otherwise would not be available • Prevention: • Content-dependent access control: subjects are prevented from accessing any information and its associated components beyond their clearance level • Context-dependent access control: a subject’s previous actions are recorded and access is provisioned based on them
  51. 51. Database Security • Inference • The ability to derive information not explicitly available • It is the intended result of aggregation • Prevention: • Content- and context-dependent access control • Cell suppression, partitioning, noise and perturbation • Cell suppression: • Hiding specific cells that may contain sensitive information • Partitioning: • Dividing the database into different parts and applying access control to each • Noise and perturbation: • The technique of inserting bogus information to confuse an attacker
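Noise and perturbation can be sketched by adding small random noise to aggregate query results so exact values cannot be inferred. A minimal sketch; the noise scale and the clamping to non-negative counts are illustrative choices, not a specific published mechanism.

```python
import random

def noisy_count(true_count, scale=2.0, rng=None):
    # Perturb an aggregate count with bounded uniform noise so an
    # attacker cannot infer the exact underlying value; clamp at zero
    # because a count can never be negative.
    rng = rng or random.Random()
    return max(0, round(true_count + rng.uniform(-scale, scale)))
```

The returned value is always within `scale` of the true count, so the result stays statistically useful while individual-level inference becomes unreliable.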
  52. 52. Web Application Security • First step in securing a web application is to review and understand the architecture • User-generated input should be considered unsafe and scrutinized • System generated output should be filtered to ensure sensitive data is not disclosed • Encryption should be used for securing input/output operations • Application should fail securely • It should behave in a predictable and non-compromising manner during failure • Implementing Web Application Firewall (WAF) is an effective approach to web application security
  53. 53. Embedded System Security • A cyber-physical computing device that is part of an electrical or mechanical system • These are small, cheap, rugged and use very little power • Ensuring the security of the software is the biggest challenge in protecting these devices
  54. 54. Industrial Control Systems • IT that is specifically designed to control physical devices in Industrial processes • ICS Categories • Programmable Logic Controllers (PLC) • Computers designed to control electromechanical process within a factory • Devices connect to PLCs via standard RS-232 interfaces • Distributed Control Systems (DCS) • Network of control devices that are part of one or more industrial processes within close distance • Protocols are not optimized for WAN communications • DCS consists of devices within a single plant
  55. 55. Industrial Control Systems • Supervisory Control and Data Acquisition (SCADA) • Controls large-scale physical processes involving nodes across significant distances • Involves 3 kinds of devices • Endpoints: • Remote terminal units that connect directly to sensors or actuators • Data acquisition server: • Backend that receives all data from the endpoints and performs correlation or analysis • User station: • Human-machine interface that displays data from the endpoints and allows users to issue commands to the actuators
  56. 56. Threats to Review • Maintenance hook • A type of backdoor • Instructions within software that are known only to the developer • Useful during the development phase • Countermeasures • Code review • Unit/quality testing • Patching • Preventive measures • Use HIDS • Use file system encryption • Implement auditing
  57. 57. Threats to Review • Time-of-check/time-of-use (TOC/TOU) attacks • Deal with the sequence of steps a system uses to complete a task • The attacker jumps in between the steps and makes modifications to control the result • Countermeasure: • use of software locks • Race condition • Occurs when a shared resource is used by multiple processes • The attacker makes the processes execute out of sequence to control the result • It is also known as a state attack • Caused by poorly written code, and adoption of applications without assessing their security posture • Countermeasure: • not splitting up critical tasks
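The time-of-check/time-of-use gap above can be sketched with file access: the vulnerable version checks a condition and then acts on it in two steps, leaving a window in which an attacker could swap the file; the safer version performs the operation in a single step and handles failure instead of pre-checking. A minimal sketch; the file-reading scenario is illustrative.

```python
import os
import tempfile

def read_if_exists_vulnerable(path):
    if os.path.exists(path):      # time of check
        with open(path) as f:     # time of use -- race window between the two
            return f.read()
    return None

def read_if_exists_safer(path):
    # Single atomic open: there is no separate check step for an
    # attacker to race against; failure is handled, not predicted.
    try:
        with open(path) as f:
            return f.read()
    except FileNotFoundError:
        return None
```

Collapsing check and use into one operation is the same idea as the "not splitting critical tasks" countermeasure listed for race conditions.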
  58. 58. Threats to Review • Incremental attacks • Attacks that occur in slow, gradual increments rather than obvious, recognizable attempts • Data diddling: • An active attack where the attacker gains access and makes small, random, incremental changes to data • It is performed more often by insiders than outsiders • Encryption and integrity verification can help detect this attack • Salami attack: • Systematic deduction of very small (financial) values regularly and routinely • Segregation of activities, proper access control, and financial transaction monitors can help detect this attack
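The transaction-monitor countermeasure for salami attacks can be sketched as a rule that flags accounts accumulating many tiny deductions. A minimal sketch; the amount threshold, count threshold, and transaction format are illustrative assumptions.

```python
from collections import Counter

def flag_salami(transactions, max_amount=0.05, min_count=10):
    # transactions: iterable of (account, deduction_amount) pairs.
    # Count tiny positive deductions per account; flag any account
    # whose tiny-deduction count crosses the threshold.
    small = Counter(acct for acct, amount in transactions
                    if 0 < amount <= max_amount)
    return {acct for acct, n in small.items() if n >= min_count}
```

Each individual deduction is too small to notice, which is exactly why the monitor aggregates over time instead of inspecting single transactions.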
  59. 59. Karthikeyan Dhayalan MD & Chief Security Partner