Security in CI/CD Pipelines: Tips for DevOps Engineers
The document outlines best practices for integrating security testing in CI/CD pipelines, emphasizing the importance of automating security processes to adapt to agile and DevOps environments. It discusses various testing methodologies such as static, dynamic, and interactive application security testing, their advantages, challenges, and recommends tools for each. The content also highlights the significance of prioritizing vulnerabilities and implementing effective workflows to address security issues in development processes.
Whoami
Stepan Ilyin
● Co-founder and Chief Product Officer of Wallarm
● Based in SF
● Working on several products for F500 and web-scale companies to
○ protect cloud-native applications and APIs
○ automate security testing in CI/CD pipelines
Agenda
● It’s not a vendor talk!
● Different approaches to automate security testing in CI/CD
● A recommended set of DevOps-friendly tools you can adopt
● Best practices for implementing them. How to make them work?
● Examples of the workflows you can apply
Trends and challenges
Trends
● Agile and DevOps
○ Short timelines and frequent changes
○ Automated pipelines
● Containers
● Cloud-hosted applications
● Open source
● APIs
Challenges
● Application Security Testing (AST) is too slow and requires too many manual steps
● False positives
● Hard to achieve complete testing coverage
● Limited remediation advice
● Hard to prioritize issues
AST criteria — What to keep in mind
● Integration
○ How easy is it to integrate into CI/CD?
● Accuracy
○ How many false positives?
● Speed
○ How fast is it? Can it affect pipeline execution?
● Actionability
○ Signal-to-noise ratio. Clear remediation guidance
Static testing (aka SAST)
● Scan code to identify insecure patterns and potential vulnerabilities
● Challenges
○ False positives and a lot of noise; requires tuning
○ Hard to distinguish exploitable issues from non-exploitable issues
○ Doesn’t have any runtime context (connection with other services, DBs, etc.)
● Deployment
○ Developer machine (as far left as possible)
■ IDE checks as spell-checker
○ As part of CI
■ Scan diffs
■ Run full scans of the source code
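At its very simplest, static scanning means matching source lines against known insecure patterns. The sketch below is a toy illustration in Python; the rule set and the scanned snippet are invented, and real SAST tools work on ASTs and data flow rather than regexes.

```python
import re

# Hypothetical minimal rule set -- real SAST tools use ASTs and
# data-flow analysis; bare regexes are only for illustration.
RULES = {
    "eval-call": re.compile(r"\beval\s*\("),            # arbitrary code execution
    "shell-injection": re.compile(r"os\.system\s*\("),  # unsanitized shell use
    "hardcoded-md5": re.compile(r"hashlib\.md5\s*\("),  # weak hash algorithm
}

def scan_source(source: str):
    """Return (line_number, rule_name) pairs for every insecure pattern."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for rule, pattern in RULES.items():
            if pattern.search(line):
                findings.append((lineno, rule))
    return findings

# Invented snippet: the kind of diff a CI job would scan
snippet = "import os\nos.system(user_input)\nresult = eval(expr)\n"
for lineno, rule in scan_source(snippet):
    print(f"line {lineno}: {rule}")
```

Scanning only the diff of a pull request, as suggested above, keeps runs fast; a full-repo scan uses the same logic over every file.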
Static testing (aka SAST) — Pros and Cons
● Integration
○ Easy
● Accuracy
○ A lot of false positives
● Speed
○ Minutes to hours
● Actionability
○ Points to the exact line of code, but it’s hard to tell which issues are real.
Dynamic testing (aka DAST)
● Sends HTTP requests to the application under test
○ Library of payloads (SQL injections, XSS, etc)
○ Fuzzing
● Good stuff
○ Finds exploitable stuff (I mean really exploitable)
○ Has runtime context (application is running as it is with connections to DBs, etc)
● Challenges
○ Takes more time than SAST
○ Most of the products can’t scan API and single-page apps (Wallarm FAST can)
○ Most DAST tools are hard to integrate into CI/CD
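The core DAST loop is simple: cross a payload library with the application's inputs, send the requests, and look for signs the payload was interpreted. This Python sketch shows that loop in miniature; the endpoint, parameter name, payloads, and error signatures are all assumptions for illustration, not a real scanner.

```python
from urllib.parse import urlencode

# Tiny payload library -- real DAST tools ship thousands of payloads
# plus fuzzing engines; these entries are illustrative only.
PAYLOADS = {
    "sqli": ["' OR '1'='1", "1; DROP TABLE users--"],
    "xss": ["<script>alert(1)</script>"],
}

def build_attack_urls(base_url: str, param: str):
    """Cross every payload with the target parameter."""
    urls = []
    for category, payloads in PAYLOADS.items():
        for payload in payloads:
            urls.append((category, f"{base_url}?{urlencode({param: payload})}"))
    return urls

def looks_vulnerable(category: str, response_body: str) -> bool:
    """Naive response check: error signatures hint the payload was interpreted."""
    signatures = {
        "sqli": ["sql syntax", "sqlite error"],          # DB error leaked
        "xss": ["<script>alert(1)</script>"],            # payload reflected unescaped
    }
    body = response_body.lower() if category == "sqli" else response_body
    return any(sig in body for sig in signatures[category])

# Hypothetical target: a local staging instance with a search endpoint
attacks = build_attack_urls("http://localhost:8000/search", "q")
print(len(attacks), "attack requests generated")
```

Because the requests run against a live instance with its real databases, a hit here is evidence of an exploitable issue rather than a theoretical pattern match, which is exactly the runtime-context advantage listed above.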
Dynamic testing (aka DAST) — Requirements for CI/CD
● The longest-established AST category on the market
● Most of the tools were developed
○ For pentesters (designed to be driven manually)
○ For old-fashioned apps (when crawling a website was easy; not anymore with SPAs)
● Requirements
○ Does it support integration into CI?
○ Can it test APIs (and SPAs)?
○ Speed
Dynamic testing (aka DAST) — CI/CD tool landscape
● OWASP Zap (OSS)
○ Integration: Console
○ API Testing: Challenging
● Burp Enterprise (Commercial)
○ Integration: API
○ API Testing: Challenging
● Wallarm FAST (Commercial) — DAST + Fuzzing
○ Integration: API
○ API Testing: Strong
DAST uses traffic of your existing tests
Improves security test coverage
● Tests SPAs and APIs
● Detects security issues, including the OWASP Top 10
● Expandable without coding
● Fine-grained control via policy
Automates security testing
● Auto-generates tests using unit and functional tests as baselines
● Application-specific fuzzing
● Testing cycles optimized for time
● Configured and run by CI/CD
Dynamic testing (aka DAST) — Pros and Cons
● Integration
○ Test Automation
● Accuracy
○ High; less configuration required
● Speed
○ Usually hours
● Actionability
○ Findings are usually relevant
○ Need to pinpoint the issues in the code
Interactive Application Security Testing (IAST)
● Runtime code analysis using instrumentation
● Looks at the code as it’s executed
● Can be deployed for 1-10% of your traffic
● Challenges:
○ Coverage is limited to what is executed
(Test automation scripts needed to drive application behavior)
○ Requires integration into CI/CD
○ Bound by source programming language and runtime environment
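The instrumentation idea behind IAST can be sketched as a runtime hook around a sensitive sink. Everything below is invented for illustration: the `run_query` stand-in, the decorator, and the crude taint heuristic. Real agents instrument the runtime itself and do full source-to-sink taint tracking.

```python
import functools

findings = []

def iast_monitor(func):
    """Instrument a sink (here: a SQL query function) the way an IAST
    agent hooks runtime calls, recording suspicious invocations."""
    @functools.wraps(func)
    def wrapper(query, *args, **kwargs):
        # Crude heuristic: a literal quote with no parameter placeholder
        # suggests user input was concatenated straight into the SQL.
        if "'" in query and "%s" not in query and "?" not in query:
            findings.append(("possible-sqli", query))
        return func(query, *args, **kwargs)
    return wrapper

@iast_monitor
def run_query(query):  # stand-in for a real DB driver call
    return f"executed: {query}"

# Unsafe: user input concatenated into SQL -- flagged while the code runs
run_query("SELECT * FROM users WHERE name = '" + "alice" + "'")
# Safe: parameterized query -- not flagged
run_query("SELECT * FROM users WHERE name = ?")
print(findings)
```

Note that the flag only fires because the unsafe call actually executed, which is the coverage limitation above: behavior your test automation never drives is never observed.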
Interactive Application Security Testing (IAST) — Tools for CI/CD
● Most of the solutions are commercial
○ Synopsys Seeker
○ Contrast Security Assess
Interactive testing (aka IAST) — Pros and Cons
● Integration
○ Quick, but requires support for your language/stack; driven by test automation
● Accuracy
○ High. Runtime context gives real benefits
● Speed
○ Quick
● Actionability
○ Findings are usually relevant
Software Composition Analysis (SCA)
● Use SCA to reduce risk from third-party dependencies
● Map the dependency tree and find known vulnerabilities (CVEs) in all OSS dependencies
● Tools
○ Snyk
○ GitHub Security Alerts
○ SourceClear
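Conceptually, SCA is a lookup: resolve your pinned dependencies and match them against an advisory feed. This Python sketch shows the matching step only; the advisory map and package names are invented for illustration, whereas real tools pull from feeds like the NVD or the GitHub Advisory Database.

```python
# Advisory database invented purely for illustration.
ADVISORIES = {
    "examplelib": {"fixed_in": (2, 5, 0), "id": "CVE-XXXX-0001"},
}

def parse_requirements(text):
    """Parse 'name==x.y.z' pins from a requirements.txt-style file."""
    deps = {}
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "==" in line:
            name, version = line.split("==", 1)
            deps[name] = tuple(int(p) for p in version.split("."))
    return deps

def audit(deps):
    """Report dependencies pinned below the advisory's fixed version."""
    issues = []
    for name, version in deps.items():
        advisory = ADVISORIES.get(name)
        if advisory and version < advisory["fixed_in"]:
            issues.append((name, advisory["id"]))
    return issues

reqs = "examplelib==2.4.1\nothertool==1.0.0\n"
print(audit(parse_requirements(reqs)))
```

The hard part the tools above solve is not this comparison but resolving the full transitive dependency tree, where most vulnerable packages actually hide.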
Secret detection
● Scans source code to find secrets hard-coded by developers
○ API Keys
○ AWS Keys
○ OAuth Client Secrets
○ SSH Private Keys
○ …
● Tools:
○ detect-secrets from Yelp (github.com/Yelp/detect-secrets)
○ git-secrets from awslabs (github.com/awslabs/git-secrets)
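At their core, these tools match file contents against known secret formats. A minimal Python sketch of the idea, using two well-known patterns (the AWS access key ID format and PEM private key headers); real tools like detect-secrets add many more patterns plus entropy checks to cut false positives. The sample key is AWS's documented fake example key.

```python
import re

# Two well-known secret formats; a tiny subset of what real tools match.
SECRET_PATTERNS = {
    "aws-access-key-id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "private-key-header": re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
}

def find_secrets(source: str):
    """Return (line_number, pattern_name) pairs for every match."""
    hits = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for name, pattern in SECRET_PATTERNS.items():
            if pattern.search(line):
                hits.append((lineno, name))
    return hits

# AKIAIOSFODNN7EXAMPLE is AWS's documented example (non-functional) key
config = 'aws_access_key_id = "AKIAIOSFODNN7EXAMPLE"\nregion = "us-east-1"\n'
print(find_secrets(config))
```

Running a check like this in a pre-commit hook, as both tools above do, stops the secret before it ever reaches the repository history, where removing it is far harder.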
detect-secrets from Yelp
● Integration:
○ Pre-commit hook
○ CI to scan all repos
● Language agnostic
○ python, puppet, javascript, php, java, etc.
Containers testing
● Performs detailed analysis of container images
● Lists all packages, files, and software artifacts, such as Ruby gems and Node.js modules
○ Package lists
○ Software installed manually (pip, rake, ...)
○ Credentials left behind in images
○ Hashes of known vulnerabilities
○ Static binaries
Containers testing
● Anchore Engine (https://github.com/anchore/anchore-engine)
○ Jenkins plugin
○ REST API
○ CLI
● Clair from CoreOS team (https://github.com/coreos/clair)
● Banyan Collector (https://github.com/banyanops/collector)
● Klar (https://github.com/optiopay/klar)
○ Clair && Docker registry
● Snyk
● Red Hat OpenSCAP
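The matching step these scanners perform can be sketched as: extract the image's package inventory, then look each (name, version) pair up in a vulnerability feed. The `dpkg -l`-style sample below is invented data, but the OpenSSL entry reflects a real advisory: versions up through 1.0.1f are affected by Heartbleed (CVE-2014-0160).

```python
# One real advisory used as sample data; scanners like Anchore Engine
# or Clair match extracted package lists against full CVE feeds.
VULNERABLE = {
    ("openssl", "1.0.1f"): "CVE-2014-0160",  # Heartbleed
}

def parse_dpkg_list(output: str):
    """Parse 'ii  name  version  ...' rows from `dpkg -l`-style output."""
    packages = []
    for line in output.splitlines():
        parts = line.split()
        if len(parts) >= 3 and parts[0] == "ii":
            packages.append((parts[1], parts[2]))
    return packages

def scan_image_packages(packages):
    """Return every (name, version, advisory) hit in the inventory."""
    return [(name, ver, VULNERABLE[(name, ver)])
            for name, ver in packages if (name, ver) in VULNERABLE]

# Sample inventory as a scanner might extract it from an image layer
dpkg_output = """\
ii  openssl  1.0.1f  amd64  Secure Sockets Layer toolkit
ii  curl     7.68.0  amd64  command line tool for transferring data
"""
print(scan_image_packages(parse_dpkg_list(dpkg_output)))
```
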
Prioritize. Or how to avoid backlog overload?
● Prioritize which vulnerabilities represent the highest risk and which may
be acceptable risks
● Avoid duplicate tickets → use vulnerability correlation and security orchestration tools to filter and deduplicate findings
○ DefectDojo (OSS)
○ Altran (Aricent), Code Dx, Denim Group, we45, ZeroNorth
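The deduplication step these correlation tools perform can be sketched as: compute a stable fingerprint per finding so the same issue reported by several scanners collapses into one ticket, then sort by severity for triage. The finding field names below are assumptions for illustration, not any specific tool's schema.

```python
import hashlib

def fingerprint(finding):
    """Stable fingerprint: same file + rule means same underlying issue,
    regardless of which scanner (or which run) reported it."""
    key = f"{finding['file']}:{finding['rule']}".encode()
    return hashlib.sha256(key).hexdigest()[:12]

SEVERITY_RANK = {"critical": 0, "high": 1, "medium": 2, "low": 3}

def correlate(findings):
    """Deduplicate by fingerprint, then sort by severity for triage."""
    unique = {}
    for f in findings:
        unique.setdefault(fingerprint(f), f)
    return sorted(unique.values(), key=lambda f: SEVERITY_RANK[f["severity"]])

# Sample findings as two scanners might report the same SQLi issue
findings = [
    {"file": "app.py", "rule": "sqli", "severity": "critical", "tool": "dast"},
    {"file": "app.py", "rule": "sqli", "severity": "critical", "tool": "sast"},
    {"file": "util.py", "rule": "weak-hash", "severity": "low", "tool": "sast"},
]
for f in correlate(findings):
    print(f["severity"], f["file"], f["rule"])
```
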
Red flags vs Orange flags
● A security issue was found. Now what?
● Establish Red Flags and Orange Flags
Red Flag — really severe (e.g. SQL injection found by DAST)
● Stop the pipeline (fail).
● Do not deploy.
Orange Flag — less severe (e.g. a potential issue from SAST)
● Continue pipeline execution.
● Pull issue details into the backlog.
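The red/orange policy above maps naturally onto a CI gate step: orange findings become backlog tickets, red findings make the step return a nonzero exit code so the pipeline fails and the deploy is blocked. A minimal Python sketch; the severity thresholds are an assumption you would tune to your own risk model.

```python
# Severity policy is assumed for illustration -- tune to your risk model.
RED = {"critical", "high"}    # fail the pipeline, block the deploy
ORANGE = {"medium", "low"}    # continue, but file a backlog ticket

def gate(findings):
    """Return the exit code a CI step would use: nonzero stops the pipeline."""
    backlog = [f for f in findings if f["severity"] in ORANGE]
    red_flags = [f for f in findings if f["severity"] in RED]
    for f in backlog:
        print(f"ORANGE: ticket -> {f['title']}")
    for f in red_flags:
        print(f"RED: blocking deploy -> {f['title']}")
    return 1 if red_flags else 0

# Sample findings mirroring the two examples above
findings = [
    {"severity": "critical", "title": "SQL injection (DAST)"},
    {"severity": "low", "title": "Potential issue (SAST)"},
]
print("exit code:", gate(findings))
```

In a real pipeline the script would end with `sys.exit(gate(findings))`, and the ORANGE branch would call your ticketing system's API instead of printing.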
Infrastructure as Code
● Immutable instances / infrastructure
● Replace instead of patching
● CloudFormation and Terraform
○ Everything (infrastructure stack, network, subnets, instances inside subnets, bridges, NAT gateways) defined in JSON/text
● Servers / instance — Chef, Ansible, Salt
● Containers — Dockerfiles