Security
in CI/CD pipelines
Tips for DevSecOps
engineers
Stepan Ilyin
Wallarm, co-founder
@wallarm
Whoami
Stepan Ilyin
● Co-founder and Chief Product Officer of Wallarm
● Based in SF
● Working on several products for F500
and web scale companies to
○ protect cloud-native applications and APIs
○ automate security testing in CI/CD pipelines
Agenda
● It’s not a vendor talk!
● Different approaches to automate security testing in CI/CD
● Recommended set of the DevOps friendly tools you can take
● Best practices of implementing them. How to make them work?
● Examples of the workflows you can apply
Shifts in org structure and processes
Trends and challenges
● Agile and DevOps
○ Short timelines and frequent changes
○ Automated pipeline
● Containers
● Cloud-hosted applications
● Open source
● APIs
● Application Security Testing
(AST) is too slow and requires
too many manual steps
● False positives
● Hard to achieve complete testing
coverage
● Limited remediation advice
● Hard to prioritize issues
Security testing tools zoo
● SAST (Static analysis)
● DAST (Dynamic analysis, Fuzzing)
● IAST (Interactive analysis)
● SCA (Software composition analysis)
● ...
● Secret detection
● Licensing violation detection
● ...
● Integration
○ How easy is it to integrate into CI/CD
● Accuracy
○ Amount of false positives?
● Speed
○ How fast is it? Can it affect the pipeline execution?
● Actionability
○ Signal to noise ratio. Clear guidance
AST criteria — What to keep in mind
Static testing (aka SAST)
● Scan code to identify insecure patterns and potential vulnerabilities
● Challenges
○ False positives and a lot of noise; requires tuning
○ Hard to distinguish exploitable issues from non-exploitable issues
○ Doesn’t have any runtime context (connection with other services, DBs, etc.)
● Deployment
○ Developer machine (as far left as possible)
■ IDE checks as spell-checker
○ As part of CI
■ Scan diffs
■ Run full scans of the source code
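To make the diff-scanning step concrete, here is a minimal, hypothetical sketch of a pattern-based SAST check over the added lines of a diff. Real tools such as Bandit use much richer rule sets and AST-level analysis; the rules below are illustrative only.

```python
import re

# Hypothetical insecure-pattern rules, in the spirit of Bandit's checks.
RULES = {
    "use of eval()": re.compile(r"\beval\s*\("),
    "shell=True in subprocess": re.compile(r"shell\s*=\s*True"),
    "hard-coded password": re.compile(r"password\s*=\s*['\"]"),
}

def scan_diff(added_lines):
    """Return (line_no, rule_name) findings for the added lines of a diff."""
    findings = []
    for no, line in enumerate(added_lines, start=1):
        for name, pattern in RULES.items():
            if pattern.search(line):
                findings.append((no, name))
    return findings

diff = [
    'result = eval(user_input)',
    'conn = connect(password="s3cret")',
    'print("hello")',
]
print(scan_diff(diff))  # [(1, 'use of eval()'), (2, 'hard-coded password')]
```

Running only over the diff keeps the check fast enough for every commit; the full-repo scan can run on a schedule instead.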
Static testing (aka SAST) — Pros and Cons
● Integration
○ Easy
● Accuracy
○ A lot of false positives
● Speed
○ Minutes to hours
● Actionability
○ Exact line of code. But hard to say which of the issues are real issues.
Commercial
● Checkmarx
● Microfocus
● Synopsys
● etc
OSS
● Ruby (Brakeman, Cane), Python (Bandit), ..
● https://github.com/mre/awesome-static-analysis
IDE
● Mostly from commercial vendors
Static testing (aka SAST) — Tools landscape
Dynamic testing (aka DAST)
● Sends HTTP requests to test application
○ Library of payloads (SQL injections, XSS, etc)
○ Fuzzing
● Good stuff
○ Finds exploitable stuff (I mean really exploitable)
○ Has runtime context (application is running as it is with connections to DBs, etc)
● Challenges
○ Takes more time than SAST
○ Most of the products can’t scan API and single-page apps (Wallarm FAST can)
○ Most DAST tools are hard to integrate into CI/CD
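The payload-library idea above can be sketched as follows. The vulnerable `handler` is a toy stand-in for a real HTTP endpoint, and the error signatures are illustrative; a real scanner sends actual HTTP requests and inspects responses.

```python
# Toy stand-in for a vulnerable endpoint: naive string-built SQL.
def handler(user_id: str) -> str:
    query = f"SELECT * FROM users WHERE id = {user_id}"
    if "'" in user_id:  # crude simulation of a DB syntax error
        return "500: SQL syntax error near " + query
    return "200: OK"

# Small payload library in the spirit of DAST scanners.
PAYLOADS = ["1' OR '1'='1", "<script>alert(1)</script>", "1"]
ERROR_SIGNATURES = ["SQL syntax error", "Traceback"]

def fuzz(endpoint, payloads):
    """Replay each payload against the endpoint; report suspicious responses."""
    findings = []
    for p in payloads:
        resp = endpoint(p)
        if any(sig in resp for sig in ERROR_SIGNATURES):
            findings.append(p)
    return findings

print(fuzz(handler, PAYLOADS))
```

Because the finding comes from an actual (simulated) response, it is exploitable by construction, which is exactly the DAST advantage described above.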
Dynamic testing (aka DAST) —
Requirements for CICD
● Longest-established tool category on the market
● Most of the tools were developed
○ For pentesters (meant to be driven manually)
○ For old-fashioned apps (when websites were easy to crawl; not anymore with SPAs)
● Requirements
○ Does it support integration to CI?
○ Can it test APIs (and SPAs)?
○ Speed
Dynamic testing (aka DAST) — CI/CD tool landscape
● OWASP Zap (OSS)
○ Integration: Console
○ API Testing: Challenging
● Burp Enterprise (Commercial)
○ Integration: API
○ API Testing: Challenging
● Wallarm FAST (Commercial) — DAST + Fuzzing
○ Integration: API
○ API: Strong
DAST uses traffic of your existing tests
Improves security test coverage
● Tests SPAs and APIs
● Detects security issues including
OWASP Top 10
● Expandable without coding
Fine-grained control via policy
Automates security testing
● Auto-generates tests using unit
and functional tests as baselines
● Application-specific fuzzing
● Testing cycles optimized for time
● Configured and run by CI/CD
Dynamic testing (aka DAST) — Pros and Cons
● Integration
○ Test Automation
● Accuracy
○ High. Less configuration
● Speed
○ Usually hours
● Actionability
○ Findings are usually relevant
○ Need to pinpoint the issues in the code
Interactive Application Security Testing (IAST)
● Runtime code analysis using instrumentation
● Looks at the code as it’s executed
● Can be deployed for 1-10% of your traffic
● Challenges:
○ Coverage is limited to what is executed
(Test automation scripts needed to drive application behavior)
○ Requires integration into CICD
○ Bound by source programming language and runtime environment
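A sketch of the core IAST idea, taint tracking from untrusted source to dangerous sink at runtime. The `concat` helper here is a stand-in for the instrumentation a real agent injects into string operations; actual products hook the runtime far more deeply.

```python
class Tainted(str):
    """Marks data that originated from an untrusted source (e.g. an HTTP param)."""

def concat(a, b):
    out = str(a) + str(b)
    # Propagate the taint mark through the operation.
    return Tainted(out) if isinstance(a, Tainted) or isinstance(b, Tainted) else out

ALERTS = []

def sql_sink(query):
    # A hooked DB-driver call: report when tainted data reaches the sink.
    if isinstance(query, Tainted):
        ALERTS.append(str(query))

user_id = Tainted("1 OR 1=1")  # arrived from an HTTP parameter
sql_sink(concat("SELECT * FROM users WHERE id = ", user_id))
print(ALERTS)  # the finding comes with full runtime context
```

This also shows the coverage limitation from the slide: the sink only fires for code paths your tests actually execute.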
Interactive Application Security Testing (IAST) —
Tools for CICD
● Most of the solutions are commercial
○ Synopsys Seeker
○ Contrast Security Assess
Interactive testing (aka IAST) — Pros and Cons
● Integration
○ Quick, but requires support for your language / stack; test automation needed
● Accuracy
○ High. Runtime context gives benefits
● Speed
○ Quick
● Actionability
○ Findings are usually relevant
Software Composition Analysis (SCA)
● SCA to reduce risk from third-party dependencies
● Map dependency tree and find vulnerabilities (CVEs)
in all OSS dependencies
● Tools
○ Snyk
○ GitHub Security Alerts
○ SourceClear
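What these tools do can be sketched in a few lines: parse the declared dependencies and match them against a vulnerability database. The advisory entry below is made up for illustration; real SCA tools query curated databases of actual CVEs.

```python
# Hypothetical advisory database (illustrative, not real CVE data).
ADVISORIES = {
    ("requests", "2.5.0"): ["CVE-XXXX-YYYY (example)"],
}

def parse_requirements(text):
    """Extract (name, version) pairs from pinned requirements.txt lines."""
    deps = []
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "==" in line:
            name, version = line.split("==", 1)
            deps.append((name.lower(), version))
    return deps

def audit(text):
    return {dep: ADVISORIES[dep]
            for dep in parse_requirements(text) if dep in ADVISORIES}

reqs = """
requests==2.5.0
flask==2.0.1
"""
print(audit(reqs))
```

The hard part in practice is not the lookup but mapping the full transitive dependency tree, which is why dedicated tools are worth using.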
Secret detection
● Scan source code to find secrets hard-coded by developers
○ API Keys
○ AWS Keys
○ OAuth Client Secrets
○ SSH Private Keys
○ …
● Tools:
○ detect-secrets from Yelp (github.com/Yelp/detect-secrets)
○ git-secrets from awslabs (github.com/awslabs/git-secrets)
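The core of these tools is pattern matching, which a short sketch can illustrate. The patterns below are a tiny subset (the AWS key sample is Amazon's documented placeholder); real tools add many more patterns plus entropy-based detection.

```python
import re

# Regex-based secret scan in the spirit of git-secrets / detect-secrets.
SECRET_PATTERNS = {
    "AWS access key ID": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "private key header": re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
}

def scan(text):
    """Return (line_no, pattern_name) hits for a source file's contents."""
    hits = []
    for no, line in enumerate(text.splitlines(), start=1):
        for name, pat in SECRET_PATTERNS.items():
            if pat.search(line):
                hits.append((no, name))
    return hits

source = 'aws_key = "AKIAIOSFODNN7EXAMPLE"\nprint("ok")'
print(scan(source))  # [(1, 'AWS access key ID')]
```

Run as a pre-commit hook, a check like this stops the secret before it ever reaches the repository history.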
Detect secrets from Yelp
● Integration:
○ Pre-commit hook
○ CI to scan all repos
● Language agnostic
○ python, puppet, javascript,
php, java, etc
Containers testing
● Performs detailed analysis of container images
● Lists all packages, files, and software artifacts,
such as Ruby GEMs and Node.JS modules
○ Package lists
○ Software installed manually (pip, rake, ...)
○ Leaked credentials
○ Hashes of known vulnerabilities
○ Static binaries
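The package-inventory part of image analysis can be sketched like this: extract the package list from the image and match it against known-vulnerable versions. Parsing `dpkg -l`-style output is an assumption for illustration; tools like Anchore and Clair read package databases from the image layers directly.

```python
# Known-vulnerable versions (openssl 1.0.1f is the Heartbleed-era release,
# used here purely as an example).
VULNERABLE = {("openssl", "1.0.1f")}

def parse_dpkg_list(output):
    """Parse lines like 'ii  openssl  1.0.1f  amd64' from `dpkg -l` output."""
    pkgs = []
    for line in output.splitlines():
        parts = line.split()
        if len(parts) >= 3 and parts[0] == "ii":
            pkgs.append((parts[1], parts[2]))
    return pkgs

def audit_image(dpkg_output):
    return [p for p in parse_dpkg_list(dpkg_output) if p in VULNERABLE]

sample = "ii  openssl  1.0.1f  amd64\nii  curl  7.68.0  amd64"
print(audit_image(sample))  # [('openssl', '1.0.1f')]
```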
Containers testing
● Anchore Engine (https://github.com/anchore/anchore-engine)
○ Jenkins plugin
○ REST API
○ CLI
● Clair from CoreOS team (https://github.com/coreos/clair)
● Banyan Collector (https://github.com/banyanops/collector)
● Klar (https://github.com/optiopay/klar)
○ Clair && Docker registry
● Snyk
● Red Hat OpenSCAP
Infrastructure as Code
License analysis
● Automated license compliance
● Scan source code for OSS license violations
● Tools:
○ Whitesource
○ BlackDuck
○ Snyk
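A minimal sketch of the idea, assuming a marker-string approach: match well-known license text in dependency files and flag anything outside an allow-list. Real tools use full license databases and fuzzy matching rather than single markers.

```python
# Marker phrases for a few common licenses (illustrative subset).
LICENSE_MARKERS = {
    "GPL-3.0": "GNU GENERAL PUBLIC LICENSE",
    "MIT": "Permission is hereby granted, free of charge",
    "Apache-2.0": "Apache License",
}
ALLOWED = {"MIT", "Apache-2.0"}  # policy: copyleft licenses need review

def detect_license(text):
    for spdx, marker in LICENSE_MARKERS.items():
        if marker in text:
            return spdx
    return "unknown"

def violations(files):
    """files: {path: license-file contents}; return paths violating policy."""
    return [name for name, text in files.items()
            if detect_license(text) not in ALLOWED]

deps = {
    "liba/LICENSE": "GNU GENERAL PUBLIC LICENSE Version 3",
    "libb/LICENSE": "Apache License Version 2.0",
}
print(violations(deps))  # ['liba/LICENSE']
```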
Prioritize. Or how to avoid backlog overload?
● Prioritize which vulnerabilities represent the highest risk and which may
be acceptable risks
● Avoid duplicate tickets → use tools that correlate and de-duplicate findings
(vulnerability correlation and security orchestration tools)
○ DefectDojo (OSS)
○ Altran (Aricent), Code Dx, Denim Group, we45, ZeroNorth
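The de-duplication these tools perform can be sketched as fingerprinting: give every finding a stable identity so the same issue reported by two scanners files only one ticket. The fingerprint fields below are an assumption; real correlation engines use richer matching.

```python
import hashlib

def fingerprint(finding):
    """Stable identity for a finding, independent of which scanner found it."""
    key = f"{finding['type']}|{finding['file']}|{finding['line']}"
    return hashlib.sha256(key.encode()).hexdigest()[:12]

def dedupe(findings):
    seen, unique = set(), []
    for f in findings:
        fp = fingerprint(f)
        if fp not in seen:
            seen.add(fp)
            unique.append(f)
    return unique

raw = [
    {"type": "sqli", "file": "app.py", "line": 42, "source": "SAST"},
    {"type": "sqli", "file": "app.py", "line": 42, "source": "DAST"},
    {"type": "xss",  "file": "ui.py",  "line": 7,  "source": "DAST"},
]
print(len(dedupe(raw)))  # 2 tickets instead of 3
```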
Red flags vs Orange flags
● A security issue was found. Now what?
● Establish Red Flags and Orange Flags
Red Flag
Really severe
(e.g. SQL injection from DAST)
● Stop the pipeline (Fail).
● Do not deploy.
Orange Flag
Less severe (potential issue from
SAST)
● Continue pipeline execution.
● Pull issue details into the backlog
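A sketch of the red/orange gate as the last step of the pipeline: red findings fail the build, orange findings only file backlog items. The severity mapping here is an assumption to be tuned per team and per tool.

```python
RED = {"sql_injection", "rce"}          # fail the build, block deploy
ORANGE = {"possible_xss", "weak_hash"}  # keep going, file to backlog

def gate(findings):
    """Return the CI exit code for a list of finding types."""
    for f in findings:
        if f in ORANGE:
            print(f"orange flag, filed to backlog: {f}")
    reds = [f for f in findings if f in RED]
    if reds:
        print(f"red flag(s) {reds}: failing the pipeline")
        return 1  # non-zero exit code fails the CI stage
    return 0

print(gate(["weak_hash", "sql_injection"]))  # 1
```

Wiring this as the scanner's exit code is what lets any CI system (Jenkins, GitLab, etc.) enforce the policy without custom plugins.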
Infrastructure as Code
● Immutable instances / infrastructure
● Replace instead of patching
● Cloud Formation and Terraform
Everything — infrastructure stack,
network, subnets, instances inside
subnets, bridge, NAT gateway — defined
in JSON/text files
● Servers / instance — Chef, Ansible, Salt
● Containers — Dockerfiles
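Because the whole stack is defined as text, it can be linted before deploy. A sketch, assuming template data already parsed into dicts (the field names are illustrative, not the actual Terraform/CloudFormation schema): flag security groups open to the world on SSH.

```python
def find_open_ssh(security_groups):
    """Return names of security groups allowing SSH from anywhere."""
    risky = []
    for sg in security_groups:
        for rule in sg.get("ingress", []):
            if rule.get("cidr") == "0.0.0.0/0" and rule.get("port") == 22:
                risky.append(sg["name"])
    return risky

groups = [
    {"name": "web", "ingress": [{"cidr": "0.0.0.0/0", "port": 443}]},
    {"name": "bastion", "ingress": [{"cidr": "0.0.0.0/0", "port": 22}]},
]
print(find_open_ssh(groups))  # ['bastion']
```

Checks like this pair naturally with the immutable-infrastructure model: fix the template, redeploy, never patch in place.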
Protection of Cloud Workloads
Questions?
@Wallarm: Twitter | LinkedIn | Facebook
My email: si@wallarm.com
