Portakal Teknoloji Otc Lyon Part 1




  1. 1. Open Trusted Computing Part 1 – What is Trusted Computing? Bora Güngören Portakal Teknoloji [email_address]
  2. 2. Information on the Course <ul><li>These lecture notes have been prepared as part of EU FP6 project Open Trusted Computing (OpenTC) by Portakal Teknoloji (PORT). </li></ul><ul><li>Major author is Emre Yüce (PORT). Contributors include: </li></ul><ul><ul><li>Bora Güngören (PORT) </li></ul></ul><ul><li>Quality assurance team includes many OpenTC partners, listed individually: </li></ul><ul><ul><li>Görkem Çetin (TUB) </li></ul></ul>
  3. 3. Major Course Goal for Part 1 <ul><li>The major goal of this part is to enable the student to </li></ul><ul><ul><li>Explain what Trusted Computing is using TCG definitions. </li></ul></ul><ul><li>This includes </li></ul><ul><ul><li>How to define security and under which conditions it can be defeated, while being able to classify these aspects with respect to hardware, OS and application software. </li></ul></ul><ul><ul><li>What cryptography can offer in order to enhance security, </li></ul></ul><ul><ul><li>Use OpenSSL and certificates to implement security, </li></ul></ul><ul><ul><li>Define what Trusted Computing is, </li></ul></ul><ul><ul><li>Define major components of a TC-enabled environment, such as an endorsement key, TPM sealing, Direct Anonymous Attestation (DAA) or enforced policies. </li></ul></ul>
  4. 4. Contents (Not detailed) <ul><li>Security and violations </li></ul><ul><li>Cryptography </li></ul><ul><li>Two terms: trust and trustworthy </li></ul><ul><li>Trusted Computing </li></ul><ul><li>Memory protection </li></ul><ul><li>Attestation and anonymity </li></ul><ul><li>Design of a trusted platform </li></ul><ul><li>TPM capabilities overview: 1.0, 1.1 and finally 1.2 </li></ul><ul><li>Enforced policies </li></ul><ul><li>An example : electronic voting </li></ul>
  5. 5. Security of a system “The only truly secure system is one that is powered off, cast in a block of concrete and sealed in a lead-lined room with armed guards - and even then I have my doubts. ” Eugene H. Spafford [6]
  6. 6. Security and violations <ul><li>What is a security violation? </li></ul><ul><li>An attack that exploits hardware or software flaws to take control of the system or to make the system malfunction. </li></ul>
  7. 7. What is a Security Flaw? <ul><li>Security Flaw </li></ul><ul><ul><li>An error of commission or omission in a system that may allow protection mechanisms to be bypassed. [7] </li></ul></ul><ul><li>Malicious software is used to take control of users' computers </li></ul><ul><ul><li>Infected zombie computers are used </li></ul></ul><ul><ul><ul><li>To send email spam </li></ul></ul></ul><ul><ul><ul><li>To engage in distributed denial-of-service attacks </li></ul></ul></ul><ul><li>Security Flaws </li></ul><ul><ul><li>Hardware Security Flaws </li></ul></ul><ul><ul><li>Software Security Flaws </li></ul></ul>
  8. 8. Hardware Security Flaws <ul><li>Measuring the precise time and power requirements of certain operations (e.g. power analysis) </li></ul><ul><li>Freezing the device (e.g. Cold Boot Attacks) </li></ul><ul><li>Physical attack of various forms (microprobing, drills, files, solvents, etc.) </li></ul><ul><li>Applying out-of-spec voltages or power surges </li></ul><ul><li>Applying unusual clock signals </li></ul><ul><li>Inducing software errors using radiation </li></ul>
  9. 9. Hardware Security Flaws <ul><li>Power analysis: </li></ul><ul><ul><li>A form of side channel attack in which the attacker studies the power consumption of a cryptographic hardware device (such as a smart card, tamperproof &quot;black box&quot;, microchip, etc). </li></ul></ul><ul><ul><li>Power consumption can reveal what the device is doing, including cryptographic keys and other secrets. </li></ul></ul>
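Power analysis has a close software analogue in timing side channels: an operation whose running cost depends on secret data leaks that data through a measurable quantity. The illustrative stdlib-only Python sketch below (not part of the original slides; the secret value is made up) contrasts a comparison that leaks with a constant-time one:

```python
import hmac

SECRET = b"correct-mac-value---"

def naive_compare(a: bytes, b: bytes) -> bool:
    # Returns at the first mismatching byte, so the running time leaks
    # how many leading bytes of the attacker's guess were correct.
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False
    return True

def constant_time_compare(a: bytes, b: bytes) -> bool:
    # hmac.compare_digest examines every byte regardless of mismatches,
    # so its running time reveals nothing about the secret.
    return hmac.compare_digest(a, b)

# A guess sharing a long prefix with the secret is rejected measurably
# later by naive_compare; constant_time_compare gives no such signal.
good_prefix = SECRET[:-1] + b"X"
```

The same principle drives power analysis: the physical cost of an operation on a smart card varies with the secret bits it processes.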
  10. 10. Hardware Security Flaws <ul><li>Cold Boot Attacks: [8] </li></ul><ul><ul><li>The attack exploits the fact that a computer’s memory is NOT erased immediately when it loses power. The authors state: </li></ul></ul><ul><ul><ul><li>“Contrary to popular assumption, DRAMs used in most modern computers retain their contents for seconds to minutes after power is lost, even at room temperature and even if removed from a mother- board.” </li></ul></ul></ul><ul><ul><li>The paper also presents new algorithms for finding cryptographic keys (DES, AES and RSA) in memory images. </li></ul></ul>
  11. 11. Hardware Security Flaws <ul><li>Cold Boot Attacks [8]: </li></ul><ul><ul><li>An example of data read from memory. </li></ul></ul>
  12. 12. Software Security Flaws <ul><li>What is a software security flaw? </li></ul><ul><li>Main sources of software security flaws: </li></ul><ul><ul><li>Operating System </li></ul></ul><ul><ul><li>Application </li></ul></ul>
  13. 13. Software Security Flaws <ul><li>Buffer overflow </li></ul><ul><li>Denial-of-service attack </li></ul><ul><li>Trojan horse </li></ul><ul><li>Viruses or worms </li></ul>
  14. 14. Software Security Flaws <ul><li>Buffer Overflow: </li></ul><ul><ul><li>A programming error which may result in abnormal program behavior, a memory access exception and program termination, or ― especially if deliberately caused by a malicious user ― a possible breach of system security. </li></ul></ul>
  15. 15. Software Security Flaws <ul><li>Buffer Overflow: </li></ul><ul><ul><li>Overwrite the function return address with a pointer to attacker-controlled data: </li></ul></ul>
  16. 16. Software Security Flaws <ul><li>Virus: </li></ul><ul><ul><li>A self-replicating piece of computer code that can partially or fully attach itself to files or applications, and can cause your computer to do something you don't want it to do. </li></ul></ul><ul><ul><li>Spreads via e-mail, downloads, infected floppy disks, or hacking. </li></ul></ul><ul><ul><li>Self-replicates (makes copies of itself) to spread. </li></ul></ul><ul><ul><li>May format your hard drive, overwrite its boot sector, or delete files, rendering your machine inoperable. </li></ul></ul><ul><ul><li>Antivirus software can detect nearly all types of known viruses, but it must be updated regularly to maintain effectiveness. </li></ul></ul>
  17. 17. Software Security Flaws <ul><li>A common virus example: Knight.exe </li></ul><ul><ul><li>Spreads over removable storage devices. </li></ul></ul><ul><ul><li>Affected operating system is Windows. </li></ul></ul><ul><ul><li>Installs itself in the registry. </li></ul></ul>
  18. 18. Secure System <ul><li>Security of a system can be maintained by the measures below: </li></ul><ul><ul><li>Controlled loading of an OS </li></ul></ul><ul><ul><li>Confidentiality, i.e. ensuring that information is accessible only to those authorized to have access. </li></ul></ul><ul><ul><li>Verifying each level by data integrity measurements </li></ul></ul><ul><li>Integrity refers to the validity of data, i.e. no malicious or accidental altering of data. </li></ul><ul><li>One can refer to [15] to get more information about obtaining a secure system and the main security specifications. </li></ul>
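"Verifying each level by data integrity measurements" is what a TPM-style measured boot does: each stage is hashed into a register before control passes to it. A minimal stdlib-only Python sketch (the stage names are hypothetical, and SHA-1 is used because the TPM 1.2 PCRs are SHA-1 sized):

```python
import hashlib

def extend(pcr: bytes, measurement: bytes) -> bytes:
    # TPM-style extend: new PCR = SHA-1(old PCR || measurement).
    # The order of measurements matters, and no value can be "un-extended".
    return hashlib.sha1(pcr + measurement).digest()

pcr = b"\x00" * 20  # PCRs start zeroed at platform reset

# Each boot stage measures (hashes) the next one before handing over control.
for stage in [b"bios", b"bootloader", b"kernel"]:
    pcr = extend(pcr, hashlib.sha1(stage).digest())

# Any change to any stage, or to their order, yields a different final value,
# so a tampered boot chain cannot reproduce the expected PCR.
```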
  19. 19. Two terms: trust and trustworthy <ul><li>Trustworthy System [9]; </li></ul><ul><ul><li>Defined as one that will not fail. </li></ul></ul><ul><li>Trusted System [9]; </li></ul><ul><ul><li>Defined as one whose failure can break the security policy. </li></ul></ul><ul><ul><li>Warns the user if there is a security flaw, so one can tell whether the system is still trustworthy. </li></ul></ul>
  20. 20. Trusted Computing <ul><li>Definition of Trusted Computing: hardware and software behave as expected, i.e. </li></ul><ul><ul><li>For software; source code must not be tampered with. </li></ul></ul><ul><ul><li>For hardware; all elements should operate according to the initially defined sequence. </li></ul></ul>
  21. 21. Trusted Computing <ul><li>Trusted Computing ( TC ) is mainly an authentication problem. </li></ul><ul><li>TC has five key concepts: </li></ul><ul><ul><li>Endorsement Key </li></ul></ul><ul><ul><li>Secure Input and Output </li></ul></ul><ul><ul><li>Memory curtaining / Protected execution </li></ul></ul><ul><ul><li>Sealed storage </li></ul></ul><ul><ul><li>Remote attestation </li></ul></ul>
  22. 22. Endorsement Key <ul><li>The Endorsement Key </li></ul><ul><ul><li>2,048-bit RSA public and private key pair. </li></ul></ul><ul><ul><li>Created randomly on the chip at manufacture time and cannot be changed. </li></ul></ul><ul><ul><li>The private key never leaves the chip, while the public key is used for attestation and for encryption of sensitive data sent to the chip, as occurs during the TPM_TakeOwnership command. </li></ul></ul><ul><li>EK ensures that the used TPM is genuine. </li></ul>
  23. 23. Secure I/O <ul><li>Secure input and output (I/O) </li></ul><ul><ul><li>A protected path between the computer user and the software with which they believe they are interacting. </li></ul></ul><ul><ul><li>A hardware and software protected and verified channel using checksums to verify that the software used to do the I/O has not been tampered with. </li></ul></ul><ul><li>Malicious software injecting itself in this path could be identified. </li></ul>
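The checksum verification in the slide above can be sketched as follows (illustrative Python; the "driver" bytes and the reference digest are made up for the example):

```python
import hashlib

# Reference digest recorded while the I/O software was known to be good
# (computed here from the sample "driver" bytes below).
KNOWN_GOOD = hashlib.sha256(b"keyboard-driver-v1.0").hexdigest()

def verify_io_software(binary: bytes) -> bool:
    # Recompute the checksum and compare against the trusted reference;
    # any software injected into the I/O path changes the digest.
    return hashlib.sha256(binary).hexdigest() == KNOWN_GOOD

print(verify_io_software(b"keyboard-driver-v1.0"))             # True
print(verify_io_software(b"keyboard-driver-v1.0+keylogger"))   # False
```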
  24. 24. Memory curtaining <ul><li>Memory curtaining extends common memory protection techniques to provide full isolation of sensitive areas of memory. </li></ul><ul><li>For example, locations containing cryptographic keys. Even the operating system doesn't have full access to curtained memory, so the information would be secure from an intruder who took control of the OS. </li></ul>
  25. 25. Sealed storage <ul><li>Sealed storage protects private information by binding it to platform configuration information including the software and hardware being used. </li></ul><ul><li>This means the data can be read only by the same combination of software and hardware. </li></ul><ul><ul><li>For example, changing a hardware slot changes the platform configuration, so data sealed under the previous platform configuration cannot be unsealed in the new one. </li></ul></ul>
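The binding can be sketched conceptually in Python. This stdlib-only sketch only records and checks a digest of the platform configuration; a real TPM additionally keeps the sealing key inside the chip and encrypts the data, and the component names here are invented:

```python
import hashlib

def platform_digest(components: list[bytes]) -> bytes:
    # Stand-in for the TPM's PCR values: a digest of hardware/software state.
    return hashlib.sha256(b"|".join(components)).digest()

def seal(data: bytes, config: list[bytes]) -> tuple[bytes, bytes]:
    # Conceptual sealing: bind the data to the current configuration digest.
    return platform_digest(config), data

def unseal(sealed: tuple[bytes, bytes], config: list[bytes]) -> bytes:
    bound_to, data = sealed
    if platform_digest(config) != bound_to:
        raise PermissionError("platform configuration changed; cannot unseal")
    return data

old = [b"bios-v7", b"os-v2.6", b"gpu-slot-1"]
new = [b"bios-v7", b"os-v2.6", b"gpu-slot-2"]  # a hardware slot changed
blob = seal(b"disk encryption key", old)
```

Unsealing `blob` under `old` succeeds; under `new` it fails, which is exactly the slide's hardware-slot example.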
  26. 26. Sealed storage <ul><li>Sealed storage introduces a platform management problem. </li></ul><ul><ul><li>i.e. Changing the platform configuration (by updating the operating system or changing hardware) will make sealed storage unreachable. </li></ul></ul><ul><li>Microsoft proposed a solution that says every software company should be certified in order to update their software products. </li></ul><ul><ul><li>Then what will be the position of open source developers? </li></ul></ul>
  27. 27. Remote Attestation <ul><li>Remote attestation allows changes to the user's computer to be detected by authorized parties. </li></ul><ul><li>That way, software companies can avoid users tampering with their software to circumvent technological protection measures. </li></ul><ul><li>It works by having the hardware generate a certificate stating what software is currently running. The computer can then present this certificate to a remote party to show that its software hasn't been tampered with. </li></ul>
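The certificate mechanism can be sketched as follows. One hedge up front: a real TPM signs the quote with an asymmetric attestation key whose private part never leaves the chip; HMAC with a shared key stands in here purely to keep the sketch dependency-free, and the key and measurement values are invented:

```python
import hashlib
import hmac

TPM_KEY = b"stand-in for the TPM's private attestation key"

def attest(measurements: list[bytes]) -> tuple[bytes, bytes]:
    # The "certificate": a digest of what software is currently running,
    # plus a signature over that digest.
    quote = hashlib.sha256(b"".join(measurements)).digest()
    signature = hmac.new(TPM_KEY, quote, hashlib.sha256).digest()
    return quote, signature

def verify(quote: bytes, signature: bytes,
           expected_measurements: list[bytes]) -> bool:
    # The remote party checks both that the signature is genuine and that
    # the reported software state matches what it expects.
    expected = hashlib.sha256(b"".join(expected_measurements)).digest()
    good_sig = hmac.new(TPM_KEY, quote, hashlib.sha256).digest()
    return quote == expected and hmac.compare_digest(signature, good_sig)
```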
  28. 28. Privacy and Anonymity <ul><li>Privacy; </li></ul><ul><ul><li>Ability to seclude yourself or some information about yourself. </li></ul></ul><ul><ul><li>Importance: Flaws in privacy will result in identity theft. Information may be used for marketing purposes. </li></ul></ul><ul><li>Anonymity; </li></ul><ul><ul><li>The wish to remain unnoticed or unidentified in the public realm. </li></ul></ul><ul><ul><li>Anonymity can be provided by Direct Anonymous Attestation (DAA). </li></ul></ul>
  29. 29. Attestation and Anonymity <ul><li>Debate over attestation and anonymity </li></ul>
  30. 30. Direct Anonymous Attestation <ul><li>TCG Definition: A protocol for vouching for an AIK using zero-knowledge-proof technology. </li></ul><ul><li>Its main purpose is to authenticate a user without revealing his identity, i.e. preserving the user's anonymity. </li></ul>
  31. 31. Digital Rights Management (DRM) <ul><li>Refers to access control technologies used by publishers and copyright holders to limit usage of digital media or devices. </li></ul><ul><li>Example: Music files that can be played on specific music players, predefined times. </li></ul>
  32. 32. Digital Rights Management (DRM) <ul><li>Ethical Problems </li></ul><ul><ul><li>Copyright holder will acquire a huge amount of power. </li></ul></ul><ul><li>Example: Sony Rootkit </li></ul><ul><ul><li>Installed with CD autorun. </li></ul></ul><ul><li>DRM should be installed if user requests, i.e. not with operating system or not with any autorun. </li></ul><ul><li>DRM should protect anonymity of the user. </li></ul>
  33. 33. Identity Theft <ul><li>Identity Theft: </li></ul><ul><ul><li>Illegal usage of another individual's identity. </li></ul></ul><ul><li>Most common method is phishing. </li></ul><ul><ul><li>Sending fake e-mails that redirect clients to a fake web page in order to obtain their private information. </li></ul></ul><ul><li>Online banking is not secure if somebody has obtained private information. </li></ul><ul><ul><li>One Time Password (OTP) systems can also be cracked. </li></ul></ul><ul><li>Remote attestation can be used to protect against identity theft. </li></ul>
  34. 34. Viruses and Worms <ul><li>Virus: </li></ul><ul><ul><li>A self-replicating piece of computer code that can partially or fully attach itself to files or applications, and can cause your computer to do something you don't want it to do. </li></ul></ul><ul><li>Protection in Trusted Computing: </li></ul><ul><ul><li>Digital signature of software will allow users to identify applications modified by third parties that could add spyware to the software. </li></ul></ul>
  35. 35. Viruses and Worms <ul><li>Worm: </li></ul><ul><ul><li>A self-replicating computer program which uses a network to send copies of itself to other nodes (computer terminals on the network) and it may do so without any user intervention. </li></ul></ul><ul><li>Example: </li></ul><ul><ul><li>Worms spreading over IIS. </li></ul></ul><ul><ul><li>Remote attestation protects network against worms. </li></ul></ul>
  36. 36. Viruses and Worms <ul><li>Difference between viruses and worms: </li></ul><ul><ul><li>Unlike a virus, a worm does not need to attach itself to an existing program. </li></ul></ul><ul><ul><li>Worms almost always cause harm to the network, if only by consuming bandwidth, whereas viruses almost always corrupt or modify files on a targeted computer. </li></ul></ul>
  37. 37. Biometrics <ul><li>Biometrics are used to identify specific people by certain characteristics such as fingerprint, hand, signature, voice etc. </li></ul><ul><li>A threshold determines whether biometric data is accepted or rejected. </li></ul><ul><li>Biometric data of a person should be stored more securely than a password, because one can't change his biometric data once it's compromised. </li></ul>
  38. 38. Platform and Trusted Platform <ul><li>Platform </li></ul><ul><ul><li>A collection of resources that provides a service. </li></ul></ul><ul><li>A Trusted Computing Platform </li></ul><ul><ul><li>A computing platform that can be trusted to report its properties. </li></ul></ul>
  39. 39. Root of Trust <ul><li>Root of Trust </li></ul><ul><ul><li>A component that must always behave in the expected manner, because its misbehavior cannot be detected. </li></ul></ul><ul><li>The complete set of Roots of Trust has at least the minimum set of functions to enable a description of the platform characteristics that affect the trustworthiness of the platform. </li></ul><ul><li>More than one root of trust can be necessary for practical reasons: </li></ul><ul><ul><li>If one of them is compromised, only the applications depending on that root of trust are affected. </li></ul></ul>
  40. 40. Trusted Platform Module <ul><li>TPM (Generic Trusted Platform): </li></ul><ul><ul><li>A secure cryptoprocessor that can store secured information. </li></ul></ul><ul><li>TPM (TCG): An implementation of the functions defined in the TCG TPM Specification; </li></ul><ul><ul><li>The set of Roots of Trust with shielded locations and protected capabilities. </li></ul></ul><ul><ul><li>Normally includes just the RTS (Root of Trust for Storage) and the RTR (Root of Trust for Reporting). </li></ul></ul>
  41. 41. TPM Capabilities Overview
  42. 42. TPM Capabilities Overview [10] <ul><li>Hardware Random Number Generator (HRNG) </li></ul><ul><li>Hash </li></ul><ul><ul><li>SHA-1 is used and it is implemented as defined by FIPS 180-1 [11]. </li></ul></ul><ul><li>HMAC </li></ul><ul><li>Cryptographic co-processor </li></ul><ul><ul><li>(RSA key generation, RSA Encrypt/Decrypt) </li></ul></ul><ul><li>Power Detection </li></ul><ul><ul><li>Manages TPM power states and platform power states. </li></ul></ul><ul><li>Volatile Memory </li></ul><ul><ul><li>Used for storing keys in use by the TPM (active TPM keys). </li></ul></ul><ul><li>Non Volatile Memory </li></ul><ul><ul><li>Holds persistent state and identity information (e.g. EK, SRK) </li></ul></ul>
  43. 43. TPM Capabilities Overview [10] <ul><li>Hardware Random Number Generator (HRNG): </li></ul><ul><ul><li>The entropy source provides as much unpredictable data, either inserted into or generated inside the TPM, as possible. </li></ul></ul><ul><ul><li>The entropy collector collects entropy and removes any bias. </li></ul></ul><ul><ul><li>State registers hold the most recent HRNG state. </li></ul></ul><ul><ul><li>Volatile and non-volatile registers are used to save the state and the data from entropy sources. </li></ul></ul><ul><ul><li>Output must conform to FIPS 140-1 [12] PRNG requirements. </li></ul></ul>
  44. 44. TPM Capabilities Overview [10] <ul><li>HMAC: </li></ul><ul><ul><li>The HMAC engine provides two pieces of information to the TPM: </li></ul></ul><ul><ul><ul><li>Proof that the request arriving is indeed authorised. </li></ul></ul></ul><ul><ul><ul><li>Proof that the command has not been modified in transit. </li></ul></ul></ul><ul><ul><li>TPM must support HMAC calculation according to RFC 2104 [13]. </li></ul></ul><ul><ul><li>The key size must be 20 bytes. </li></ul></ul><ul><ul><li>The block size must be 64 bytes. </li></ul></ul>
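With Python's standard library, the slide's parameters look like this (the command payload is a made-up placeholder; a real TPM computes the HMAC over specified fields of the command structure):

```python
import hashlib
import hmac

key = b"\x0b" * 20                        # 20-byte HMAC key, as required
command = b"example TPM command payload"  # hypothetical payload

# RFC 2104 HMAC; SHA-1 has the 20-byte digest and the 64-byte block
# size that the TPM specification assumes.
mac = hmac.new(key, command, hashlib.sha1)

# The receiver recomputes the HMAC with the shared secret. A match
# proves the command was authorised and unmodified in transit.
expected = hmac.new(key, command, hashlib.sha1).digest()
print(hmac.compare_digest(mac.digest(), expected))  # True
```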
  45. 45. TPM Capabilities Overview [10] <ul><li>Cryptographic co-processor: </li></ul><ul><ul><li>RSA engine </li></ul></ul><ul><ul><ul><li>Used for digital signature generation and encryption. </li></ul></ul></ul><ul><ul><ul><li>RSA 512, 768, 1024 and 2048 bit keys are supported; use of RSA 2048 is recommended. [14] </li></ul></ul></ul><ul><ul><li>TPM uses RSA for the generation and verification of digital signatures. </li></ul></ul><ul><ul><li>Symmetric encryption </li></ul></ul><ul><ul><ul><li>Encrypt authentication information. </li></ul></ul></ul><ul><ul><ul><li>Provide confidentiality in transport sessions. </li></ul></ul></ul><ul><ul><ul><li>Provide internal encryption of blobs stored off the TPM. </li></ul></ul></ul><ul><ul><li>The TPM does not support bulk symmetric encryption. </li></ul></ul>
  46. 46. Additional Features of a TPM 1.2 <ul><li>Additional features of a TPM 1.2 [2,10] </li></ul><ul><ul><li>Direct Anonymous Attestation (DAA) and the ability to run and generate a new Attestation Identity Key (AIK) </li></ul></ul><ul><ul><li>Time-stamping </li></ul></ul><ul><ul><li>Locality </li></ul></ul><ul><ul><li>Delegation </li></ul></ul><ul><ul><li>Non-volatile storage </li></ul></ul><ul><ul><li>Optimized transport protection </li></ul></ul>
  47. 47. Additional Features of a TPM 1.2 <ul><li>Direct Anonymous Attestation (DAA) </li></ul><ul><ul><li>Based on cryptographic techniques known as zero-knowledge proofs. [3] </li></ul></ul><ul><ul><li>Allows a TPM to convince a remote 'verifier' that it is indeed valid without the disclosure of the public Endorsement Key (EK), i.e. protecting the privacy of the user. </li></ul></ul>
  48. 48. Additional Features of a TPM 1.2 <ul><li>Time-stamping </li></ul><ul><ul><li>Not a universal clock time but a representation of the number of ticks the TPM has counted. </li></ul></ul><ul><ul><li>A basic tick stamp result consists of a TPM digital signature computed over: </li></ul></ul><ul><ul><ul><li>The data to be time-stamped (a digest of the data to be time-stamped) </li></ul></ul></ul><ul><ul><ul><li>The current tick counter value </li></ul></ul></ul><ul><ul><ul><li>The Tick Session Nonce </li></ul></ul></ul><ul><ul><ul><li>Some fixed text </li></ul></ul></ul><ul><ul><li>An example protocol for associating the tick counter with a universal time clock is given in the specifications; it is merely illustrative. </li></ul></ul>
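The four signature inputs listed above can be sketched as follows. Caveat: a real TPM signs with an RSA key; the HMAC here stands in for that signature only to keep the sketch stdlib-only, and the key and fixed text are invented:

```python
import hashlib
import hmac
import os

SIGNING_KEY = b"stand-in for a TPM signing key"
FIXED_TEXT = b"TICK"  # "some fixed text" from the slide

def tick_stamp(data: bytes, tick_count: int, session_nonce: bytes) -> bytes:
    # Signature input mirrors the slide: digest of the data, current tick
    # counter value, the tick session nonce, and some fixed text.
    payload = (hashlib.sha1(data).digest()
               + tick_count.to_bytes(8, "big")
               + session_nonce
               + FIXED_TEXT)
    return hmac.new(SIGNING_KEY, payload, hashlib.sha1).digest()

nonce = os.urandom(20)
stamp = tick_stamp(b"contract.pdf contents", 123456, nonce)
```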
  49. 49. Cryptographic Key Management <ul><li>Cryptographic Key Management: </li></ul><ul><ul><li>Includes all of the provisions related to the generation, exchange, storage and use of keys. </li></ul></ul>
  50. 50. Cryptographic Key Management <ul><li>TPM Key Hierarchy: </li></ul><ul><ul><li>Endorsement Key (EK) </li></ul></ul><ul><ul><ul><li>Created at manufacture time. </li></ul></ul></ul><ul><ul><ul><li>Used to protect Storage Root Key (SRK). </li></ul></ul></ul><ul><ul><li>Storage Root Key (SRK) </li></ul></ul><ul><ul><ul><li>Created when TakeOwnership command executed. </li></ul></ul></ul><ul><ul><ul><li>The only TPM storage key to be permanently loaded in the TPM. </li></ul></ul></ul><ul><ul><ul><li>The public (private) portion of SRK is used to wrap (unwrap) the first layer of TPM key objects. </li></ul></ul></ul>
  51. 51. Cryptographic Key Management <ul><li>TPM Key Hierarchy: </li></ul><ul><ul><li>EK and SRK private parts never leave the TPM. </li></ul></ul><ul><ul><li>Signing keys are in the leaf nodes. </li></ul></ul><ul><ul><li>If a key is compromised, keys under it will be revoked. </li></ul></ul><ul><ul><li>Every key is encrypted with parent's public part and stored. </li></ul></ul>
  52. 52. Cryptographic Key Management <ul><li>TPM Key Hierarchy </li></ul><ul><li>Keys are loaded into Key Slots </li></ul><ul><li>Signing keys are in the leaf nodes </li></ul><ul><li>If a key is compromised, the keys under it will be revoked. </li></ul><ul><li>EK and SRK private parts never leave the TPM. </li></ul><ul><li>Every key is encrypted with its parent's public part and stored. </li></ul><ul><li>For detailed key management refer to [16]. </li></ul>
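The wrap/unwrap chain can be sketched in a few lines. One hedge: a real TPM wraps a child key with the parent's RSA public key; the hash-derived keystream below is only a stdlib stand-in for that encryption, and all key values are randomly generated for the example:

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Simple hash-based keystream; stands in for encryption under the
    # parent's public key, which a real TPM uses for wrapping.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def wrap(parent_key: bytes, child_key: bytes) -> tuple[bytes, bytes]:
    nonce = secrets.token_bytes(16)
    ks = keystream(parent_key, nonce, len(child_key))
    return nonce, bytes(a ^ b for a, b in zip(child_key, ks))

def unwrap(parent_key: bytes, blob: tuple[bytes, bytes]) -> bytes:
    nonce, wrapped = blob
    ks = keystream(parent_key, nonce, len(wrapped))
    return bytes(a ^ b for a, b in zip(wrapped, ks))

srk = secrets.token_bytes(32)          # Storage Root Key (never leaves the TPM)
storage_key = secrets.token_bytes(32)  # first-layer storage key
signing_key = secrets.token_bytes(32)  # leaf signing key

# Each key is stored outside the TPM wrapped under its parent, so only
# a TPM holding the SRK can unwrap its way down to the leaves.
blob1 = wrap(srk, storage_key)
blob2 = wrap(storage_key, signing_key)
```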
  53. 53. Cryptographic Key Management <ul><li>Types of keys defined by TCG; </li></ul><ul><ul><li>Storage keys, </li></ul></ul><ul><ul><ul><li>Used to wrap or unwrap other keys in the protected storage hierarchy. </li></ul></ul></ul><ul><ul><li>Signature keys, </li></ul></ul><ul><ul><ul><li>Used for signing operations. </li></ul></ul></ul><ul><ul><li>Identity keys, </li></ul></ul><ul><ul><ul><li>Used by TPM-aware applications to prove that data came from a genuine TPM. </li></ul></ul></ul>
  54. 54. Cryptographic Key Management <ul><li>Types of keys defined by TCG; </li></ul><ul><ul><li>Binding keys, </li></ul></ul><ul><ul><ul><li>Used for TPM bind and unbind operations. </li></ul></ul></ul><ul><ul><li>Legacy keys </li></ul></ul><ul><ul><ul><li>For systems that wish to use the same key for signing and encryption. </li></ul></ul></ul><ul><ul><li>Change authorisation keys </li></ul></ul><ul><ul><ul><li>Short-lived keys used during the process of changing authorisation information. </li></ul></ul></ul>
  55. 55. Enforced policies <ul><li>A policy: </li></ul><ul><ul><li>The rules that define how a resource may be accessed. </li></ul></ul><ul><li>Policy examples </li></ul><ul><ul><li>UNIX file permissions. </li></ul></ul><ul><ul><li>Filtering a mailbox by sender address. </li></ul></ul><ul><li>A policy is not necessarily related to security, but Trusted Computing focuses especially on security policies. </li></ul><ul><li>An enforced policy: </li></ul><ul><ul><li>A policy that is embedded in the resource and secured with cryptographic methods to assure that the integrity of the resource is protected. </li></ul></ul>
  56. 56. Enforced policies <ul><li>Example: UNIX </li></ul><ul><ul><li>The tree structure of the file system </li></ul></ul><ul><ul><ul><li>Files ( executable files and data files ) </li></ul></ul></ul><ul><ul><ul><li>Directories </li></ul></ul></ul><ul><ul><ul><li>Links (pointers to other files) </li></ul></ul></ul><ul><ul><li>Permissions </li></ul></ul><ul><ul><ul><li>Read ( r ) </li></ul></ul></ul><ul><ul><ul><li>Write ( w ) </li></ul></ul></ul><ul><ul><ul><li>Execute ( x ) </li></ul></ul></ul><ul><ul><li>Subjects in UNIX </li></ul></ul><ul><ul><ul><li>Owner </li></ul></ul></ul><ul><ul><ul><li>Group, the owner is in </li></ul></ul></ul><ul><ul><ul><li>Universe, all other users </li></ul></ul></ul>
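The permission bits above are directly visible from Python's standard library. This small sketch creates a temporary file and grants the owner read/write, the group read, and the universe nothing:

```python
import os
import stat
import tempfile

fd, path = tempfile.mkstemp()
os.close(fd)

# rw-r-----: owner may read and write, the owner's group may read,
# the universe (all other users) has no access.
os.chmod(path, stat.S_IRUSR | stat.S_IWUSR | stat.S_IRGRP)

mode = os.stat(path).st_mode
print(stat.filemode(mode))  # -rw-r-----

os.unlink(path)
```

The kernel enforces this policy on every access: whether a request succeeds depends on which subject (owner, group member, or anyone else) makes it.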
  57. 57. Hard and Soft Enforced policies <ul><li>Hard Enforced Policies [10]: </li></ul><ul><ul><li>Verification is deterministic. </li></ul></ul><ul><ul><li>The platform state that must exist when the information is revealed is dictated. </li></ul></ul><ul><ul><ul><li>Example: Sealing using TPM. </li></ul></ul></ul><ul><li>Soft Enforced Policies [10]: </li></ul><ul><ul><li>Verification is subjective. </li></ul></ul><ul><ul><li>Cannot be verified by the platform or is dependent upon some future event. </li></ul></ul>
  58. 58. Access Control <ul><li>Access Control: </li></ul><ul><ul><li>Determining whether the user requesting a resource has the privileges to use it. </li></ul></ul><ul><li>Access Control List (ACL): </li></ul><ul><ul><li>A list of permissions attached to an object. </li></ul></ul>
  59. 59. Access Control <ul><li>UNIX ACL: </li></ul><ul><ul><li>Objects </li></ul></ul><ul><ul><ul><li>Simple objects (such as files) </li></ul></ul></ul><ul><ul><ul><li>Complex objects called containers (such as directories) </li></ul></ul></ul><ul><ul><li>Permissions </li></ul></ul><ul><ul><ul><li>Read ( r ) </li></ul></ul></ul><ul><ul><ul><li>Write ( w ) </li></ul></ul></ul><ul><ul><ul><li>Execute ( x ) </li></ul></ul></ul><ul><ul><ul><li>Change-ACL ( c ) </li></ul></ul></ul><ul><ul><ul><li>Container-insert ( i ) </li></ul></ul></ul><ul><ul><ul><li>Container-delete ( d ) </li></ul></ul></ul><ul><ul><ul><li>Test ( t ) </li></ul></ul></ul>
  60. 60. An example: Electronic Elections <ul><li>Stages of the whole election process: </li></ul><ul><ul><li>Registration </li></ul></ul><ul><ul><li>Casting ballots (voting) </li></ul></ul><ul><ul><li>Counting votes </li></ul></ul><ul><ul><li>Displaying results </li></ul></ul><ul><li>Difficulties that must be overcome: </li></ul><ul><ul><li>Ballots must be authentic yet untraceable. </li></ul></ul><ul><ul><li>Each voter must be able to check whether his/her vote is counted without compromising his/her privacy. </li></ul></ul><ul><ul><li>The election protocol must be protected against the illegal activity of both eligible voters and dishonest outsiders. </li></ul></ul>
  61. 61. An example: Electronic Elections <ul><li>Requirements for a secure electronic election protocol: </li></ul><ul><ul><li>Completeness </li></ul></ul><ul><ul><ul><li>All valid votes must be counted correctly. </li></ul></ul></ul><ul><ul><li>Soundness </li></ul></ul><ul><ul><ul><li>Dishonest voters cannot disrupt the voting process. </li></ul></ul></ul><ul><ul><li>Privacy </li></ul></ul><ul><ul><ul><li>All ballots must be secret. </li></ul></ul></ul><ul><ul><li>Unreusability </li></ul></ul><ul><ul><ul><li>No voter can cast their ballot more than once. </li></ul></ul></ul><ul><ul><li>Verifiability </li></ul></ul><ul><ul><ul><li>Nobody can falsify the result of the voting process. </li></ul></ul></ul><ul><ul><li>Fairness </li></ul></ul><ul><ul><ul><li>Nothing must unfairly affect the voting. </li></ul></ul></ul>
  62. 62. An example: Electronic Elections <ul><li>Utilizing Trusted Computing </li></ul><ul><ul><li>Anonymity can be provided by Direct Anonymous Attestation. </li></ul></ul><ul><ul><li>The voting machine proves to the server, by remote attestation, that it is still trusted. </li></ul></ul><ul><ul><li>A voter receives a hash of his/her vote, encrypted with the public portion of an asymmetric key pair. </li></ul></ul><ul><ul><ul><li>The voter can check his/her vote anytime he/she wants. </li></ul></ul></ul><ul><ul><ul><li>The TPM can be used to produce the asymmetric key pair and store it. </li></ul></ul></ul>
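A much-simplified receipt check illustrates the "check without compromising privacy" requirement. This sketch only hashes the vote with a voter-chosen nonce; a real protocol additionally encrypts the receipt under an asymmetric key, as the slide notes, and the candidate and nonce values are invented:

```python
import hashlib

def receipt(vote: bytes, voter_nonce: bytes) -> str:
    # The voter keeps this hash. If the tally publishes all receipts,
    # anyone can confirm their ballot was counted, while the ballot
    # itself is not linkable to a voter without the secret nonce.
    return hashlib.sha256(voter_nonce + vote).hexdigest()

nonce = b"chosen secretly by this voter"
my_receipt = receipt(b"candidate-B", nonce)

# Published alongside the results after counting.
published = {
    receipt(b"candidate-A", b"nonce-1"),
    my_receipt,
    receipt(b"candidate-B", b"nonce-2"),
}
print(my_receipt in published)  # True
```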
  63. 63. References <ul><li>[1] TCG Architecture Overview </li></ul><ul><li>( https://www.trustedcomputinggroup.org/groups/TCG_1_3_Architecture_Overview.pdf ) </li></ul><ul><li>[2] TCG TSS v1.2 Specifications </li></ul><ul><li>( https://www.trustedcomputinggroup.org/specs/TSS/TSS_Version_1.2_Level_1_FINAL.pdf ) </li></ul><ul><li>[3] Trusted Computing Group . Main specification: specification changes, version 1.2, October 2003. </li></ul><ul><li>( https://www.trustedcomputinggroup.org/groups/tpm/TPM_1_2_Changes_final.pdf ) </li></ul><ul><li>[4] Trousers FAQ </li></ul><ul><li>( http://trousers.sourceforge.net/faq.html ) </li></ul><ul><li>[5] Direct Anonymous Attestation: Achieving Privacy in Remote Authentication, Jan Camenisch, IBM Zurich Research Laboratory </li></ul><ul><li>( http://www.zisc.ethz.ch/events/ISC2004Slides/folien-jan-camenisch.pdf ) </li></ul><ul><li>[6] Director of the Purdue Center for Education and Research in Information Assurance and Security. </li></ul><ul><li>( http://homes.cerias.purdue.edu/~spaf/quotes.html ) </li></ul><ul><li>[7] Definition of Security Flaw </li></ul><ul><li>( http://packetstormsecurity.org/docs/rainbowbooks/NCSC-TG-004.txt ) </li></ul><ul><li>[8] Lest We Remember: Cold Boot Attacks on Encryption Keys, Princeton University, February 21 2008 </li></ul><ul><li>( http://citp.princeton.edu/memory ) </li></ul><ul><li>[9] R. Anderson. Security Engineering – A Guide to Building Dependable Distributed Systems. John Wiley and Sons, New York, 2001. </li></ul>
  64. 64. References <ul><li>[10] Chris Mitchell, Trusted Computing, IEE Professional Applications of Computing Series 6, 2005. </li></ul><ul><li>[11] NIST. Secure Hash Standard. Federal Information Processing Standards Publication FIPS PUB 180-1, National Institute of Standards and Technology (NIST), April 1997. </li></ul><ul><li>[12] NIST. Security requirements for cryptographic modules. Federal Information Processing Standards Publication FIPS PUB 140-1, National Institute of Standards and Technology (NIST), January 1994. </li></ul><ul><li>[13] H. Krawczyk, M. Bellare, and R. Canetti. HMAC – keyed hashing for message authentication, Internet request for comments 2104, RFC 2104, February 1997. </li></ul><ul><li>[14] A. Menezes, P. van Oorschot, and S. Vanstone. Handbook Of Applied Cryptography, volume 6 of Discrete Mathematics and its Applications, CRC Press, Boca Raton, FL, 1997. </li></ul><ul><li>[15] James P. Anderson, Computer Security Technology Planning Study Volume II, October 1972 </li></ul><ul><li>[16] D10.5 Intermediate Training Documentation, IST-027635 /D10.5/V1.0 Final, WP10, April 2007 (M18) </li></ul>
  65. 65. <ul><li>The information in this document is provided “as is”, and no guarantee or warranty is given that the information is fit for any particular purpose. The user thereof uses the information at its sole risk and liability. </li></ul>Open_TC EC Contract No: IST-027635 The Open-TC project is co-financed by the EC. If you need further information, please visit our website www.opentc.net or contact the coordinator: Technikon Forschungs- und Planungsgesellschaft mbH Richard-Wagner-Strasse 7, 9500 Villach, AUSTRIA Tel. + 43 4242 23355 – 0 Fax. + 43 4242 23355 – 77 Email coordination@opentc.net