7. Key Trends
• While individual rule compliance is up, testing of security systems is down
• Sustainability is low. Fewer than a third of companies were found to be still fully compliant less than a year after successful validation.
8. Why?
• Inability to keep up with a moving target
  Requirements change by an average of 18% over a year
  Line-of-business initiated changes
• Inability to continuously monitor environments for compliance
25. Compliance Rule Types
(Quadrant labels: Now / Later, How / What)
Sequence
• Authentication before action
• Authentication in AD and ITSM
• Security review before production deployment
State
• Customer data and Form data not logically co-resident
• NTP installed
• SELinux enforcing AND Centrify Agent
• Digital Guardian and NOT sudo
Supervision
• Audit trail of changes and approval
Scope
• Third-party access via named accounts.
• Splunk access to global logs only.
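A "state" rule like the ones above is just a boolean predicate over a machine's observed configuration. As a minimal sketch (plain Ruby, not InSpec; the attribute and package names here are hypothetical stand-ins, since a real tool inspects the system directly):

```ruby
# Minimal sketch: model "state" compliance rules as boolean predicates
# over a hash of observed node attributes. Attribute and package names
# are hypothetical; a real scanner gathers this data from the host.

StateRule = Struct.new(:name, :predicate) do
  def compliant?(node)
    predicate.call(node)
  end
end

RULES = [
  StateRule.new('NTP installed',
                ->(n) { n[:packages].include?('ntp') }),
  StateRule.new('SELinux enforcing AND Centrify Agent',
                ->(n) { n[:selinux] == 'enforcing' && n[:packages].include?('centrifydc') }),
  StateRule.new('Digital Guardian and NOT sudo',
                ->(n) { n[:packages].include?('dgagent') && !n[:packages].include?('sudo') })
]

# Evaluate every rule against one node, returning name => pass/fail.
def evaluate(node)
  RULES.map { |r| [r.name, r.compliant?(node)] }.to_h
end

node = { selinux: 'enforcing', packages: %w[ntp centrifydc sudo] }
results = evaluate(node)
# This node passes the first two rules but fails the third (sudo is present).
```

The point of the sketch is that each rule composes with plain AND/OR/NOT logic, which is exactly what makes these rules expressible, and continuously checkable, as code.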
It seems like the last few years have been increasingly bad for information security.
Office of Personnel Management hack
Premera Blue Cross (11M customers affected, May 2015)
Anthem Health (80M customers affected)
Home Depot (56M credit cards compromised)
Target
And the list goes on. You don’t want to end up on this one: http://www.nytimes.com/interactive/2015/07/29/technology/personaltech/what-parts-of-your-information-have-been-exposed-to-hackers-quiz.html
As someone who works in IT, this is both frightening and appalling to me.
How many of you are in regulated industries?
It's so bad that you don't have to go past the first page of this report…
80% of companies still fail at interim assessment
CAGR of 66% in security incidents since 2009.
Security requirements change by an average of 18% over a year
And of course, LoB initiates changes
Anyone ever read Michael Lewis's books? (Liar’s Poker, Moneyball, etc.)
Brad Katsuyama is one of the main characters in "Flash Boys"; he's an ex-RBC trader now in charge of a dark pool that aims to bring fairness to the public markets (that's a whole other story)
But I was fortunate enough to hear Brad speak recently and he spoke of the "regulated industry death spiral"
(Talk about that, then talk about how it applies to security)
How did we get to this horrendous state of affairs? And PCI 1.0 was ratified in 2004, so over ten years ago – predating Facebook, Twitter, Instagram, mobile apps, etc.
Clearly if we continue down whatever road we're on, things are going to get worse, not better, right?
What is the current state and what are the market forces that are driving this continued worsening of security and compliance?
The big problem, I think, is the manner in which we do business – and do “security”.
This is how I think of “security reviews” – they slow down the flow and change backs up. The more changes back up, the more we need to “expedite” or “force” things through the dam in order to satisfy LoB needs. Which leads to…
Ok, so we still need to assure ourselves that things are "secure", right? So what we do is replace this security "gate" with "scanning"
At some later date
Not with the same cadence as we're flowing changes through the system
Usually only when the auditors are on site
Which is why sustainability is low – the same thing the Verizon PCI report noted
What are the outputs of this scanning?
Thousands of pages of reports
No idea whether the errors are important or not
What happens to these reports?
Routinely ignored
Or, it's a fire drill later because the auditors have flagged an error repeatedly
Moreover, half the time these reports are useless because they're outside-looking-in.
I think we need to stop whatever we’re doing and go back to first principles for a second.
First I want to start defining security and compliance. These are not the same thing, although obviously they are related.
Up until now we have been primarily concerned with compliance activities. Actually in many cases you could be “compliant” without achieving the intended aims of standards, which is to assure security.
The objective isn't to pass the compliance audit, the objective is to actually make our systems more secure.
You can spend a lot of time negotiating with auditors and getting them to accept "compensating controls" and the like in order to pass a compliance audit, but this is what Jez Humble would call "compliance theatre". How far can you pull the wool over your auditors' eyes while still running a company that carries huge risks?
Before you can be compliant, you first have to truly be secure.
At its core, compliance is about two things:
Don’t get hacked
Don’t lose data
Compliance activities aren’t just about “making auditors go away,” which is how we’ve treated them.
This is a great book primarily about the quest for "quality" and it's based on Pirsig's real-life experiences as a professor.
Pirsig says quality is an attribute that is inherent to a work product. It is not possible to take something that has low quality and add quality later.
In software, this is why industry “QA teams” are problematic: you cannot charge a single team, and only that team, with accountability for software quality. What about the developers who wrote the code in the first place?
If you regard information security as just one aspect of quality, it therefore follows that you cannot design a system and simply bolt on security.
We always talk about how DevOps is not about the tools, it's about the culture. But you want to choose tools that help reinforce the culture you want. I've spent the first part of this presentation arguing that the prevailing culture and mindset around security and compliance are what's problematic today. Now I'll talk about how you can use tools and processes to get to the culture you want: building in security as part of the quality of getting a thing to production.
One of our beliefs is that the world of humans acting directly on computers is over – it’s not scalable, it’s not reproducible, it’s not testable
So the future is clearly humans acting on code, and code acting on machines – that’s the era that Chef grew up in.
For the purposes of compliance, we actually wanted a common language, in code, on which all audiences – compliance, security, and DevOps – could collaborate. That code then acts on systems.
And for this I’d like to turn it over to my colleague Galen Emery to show you Chef Compliance, and that language we invented, called InSpec.
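To make that concrete, here is what a simple "state" rule (like "NTP installed" from the rule types earlier) looks like as an InSpec control. This is a profile fragment meant to be run with `inspec exec`; the control id, impact value, and service name are illustrative choices, not something from the talk:

```ruby
# Illustrative InSpec control: a "state" rule expressed in code that
# compliance, security, and DevOps can all read and review together.
control 'ntp-01' do
  impact 0.7
  title 'NTP must be installed and running'
  describe package('ntp') do
    it { should be_installed }
  end
  describe service('ntp') do
    it { should be_running }
  end
end
```

Because the control reads almost like the English requirement, it can live in version control and run on every change, rather than only when the auditors are on site.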
This all goes into centralized reporting, and in particular you can export the live, cleaned-up data stream to something like Splunk and run arbitrary reports.
Traditionally, compliance requirements have flowed one way.
Requirements from regulatory authorities become company policy once executives and the people responsible for governance interpret them
Implementation then flows down to business units and the people charged with security
But there's often a giant disconnect, as anyone who has tried to implement any of these compliance frameworks knows.