Attack detection for webapps
Type and weight analysis
The PHPIDS and some of its inner workings
Generic attack detection vs. plain blacklisting
4. Current Situation
Webapps grow in numbers and complexity
User generated input of all possible kinds
Securing new apps is hard
Securing existing apps is even harder
Difficult to manage the split between usability and security
5. Approaches to deal with it
Total ignorance (yep – that sometimes happens...)
Drastic filtering, escaping or senseless validation
Backup & Restore (for real!!1)
WAFs and IDSes
Training and Consulting
Spending a lot of money for useless stuff
6. The open source tools
mod_security, JWall, HTMLPurifier, Anti-Samy and others
Either very specialized...
...or entirely based on blacklisting
Sometimes generating vulnerabilities themselves
And sometimes crippling users' input
7. Our approach
Say yes to blacklisting!
Use it to detect, categorize and weight attack patterns
User input won't be touched
Total freedom of choice for the developer
and... generic attack detection
8. Let's have a look
One of the 70 regex rules to detect XSS, SQLi, RCE and
many other attack patterns
<description>finds unquoted attribute breaking in...</description>
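The real rule lives in the PHPIDS filter XML; as a rough illustration of what such a rule does, here is a Python sketch of a regex catching unquoted-attribute breakouts. The pattern below is invented for illustration and is not the actual PHPIDS filter:

```python
import re

# Illustrative pattern (NOT the real PHPIDS rule): an attacker breaking out
# of an unquoted HTML attribute value, e.g. injecting `x onmouseover=alert(1)`
# into <a href=USER_INPUT>, introduces a new event-handler attribute.
rule = re.compile(r'(?:^|\s)on\w+\s*=|[^\s"\'>]+\s+on\w+\s*=', re.IGNORECASE)

def matches(payload: str) -> bool:
    """Return True if the payload looks like an unquoted attribute breakout."""
    return rule.search(payload) is not None

print(matches("x onmouseover=alert(1)"))  # suspicious
print(matches("harmless search term"))    # benign
```

In the real filter file each such regex is paired with a description, the attack tags it covers (xss, sqli, rce, ...) and an impact weight.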
9. Step by step
User generated input coming in
First test to check if the whole detection process is necessary at all
Reporting and optional logging
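The steps above can be sketched in Python (PHPIDS itself is PHP; the pre-check pattern, rule set and weights below are made up for illustration):

```python
import re

# Cheap pre-check: characters that rarely appear in harmless input.
SUSPICIOUS = re.compile(r'[<>"\'(){};=\\]|%[0-9a-f]{2}', re.IGNORECASE)

# Hypothetical mini rule set with per-rule impact weights.
RULES = [
    (re.compile(r'<\s*script', re.I), 4),
    (re.compile(r'on\w+\s*=', re.I), 3),
    (re.compile(r'union\s+select', re.I), 5),
]

def scan(value: str) -> int:
    # 1) pre-check: skip the full rule set for clearly harmless input
    if not SUSPICIOUS.search(value):
        return 0
    # 2) match every rule and sum the weights into an overall impact
    impact = sum(weight for rx, weight in RULES if rx.search(value))
    # 3) reporting and optional logging would happen here when impact > 0
    return impact

print(scan("hello world"))                # 0 – pre-check short-circuits
print(scan("<script>alert(1)</script>"))  # weighted impact
```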
10. Btw converting...
The converter is capable of normalizing the user's input
from several formats
JS Oct, Hex, Unicode and Charcode
UTF7-Shmootf7 (no idea why this is still an issue)
Loads of entities - be they hex, dec, named or others
SQL-, obfuscation- and concatenation patterns...
Evil chars, nullbytes, RTL/LTR chars
Comments, special numeric formats etc. etc. ...
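A heavily simplified Python sketch of such a converter chain (the real PHPIDS converter is far more thorough; the decoding passes below only cover a few of the formats listed above):

```python
import html
import re

def normalize(value: str) -> str:
    """Run decoding passes repeatedly until the input stops changing."""
    previous = None
    while value != previous:
        previous = value
        # JS hex and unicode escapes: \x41, \u0041
        value = re.sub(r'\\x([0-9a-f]{2})',
                       lambda m: chr(int(m.group(1), 16)), value, flags=re.I)
        value = re.sub(r'\\u([0-9a-f]{4})',
                       lambda m: chr(int(m.group(1), 16)), value, flags=re.I)
        # HTML entities – named, decimal or hex
        value = html.unescape(value)
        # nullbytes and SQL-style inline comments used for concatenation tricks
        value = value.replace('\0', '')
        value = re.sub(r'/\*.*?\*/', '', value, flags=re.S)
    return value

print(normalize(r'\x3cscript\x3e'))       # decodes to <script>
print(normalize('UNI/**/ON SEL/**/ECT'))  # decodes to UNION SELECT
```

Looping until a fixpoint matters: attackers nest encodings, so a single decoding pass is not enough.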
11. Easy implementation
Not so hard, is it?
The "doing something smart" part might be, though...
and no – replacing the comment with echo $result; or a redirect is not the cleverest way...
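As a sketch of what "doing something smart" could look like, here is a hypothetical threshold-based reaction policy in Python (the thresholds and reactions are invented for illustration and should be tuned per application):

```python
def react(impact: int) -> str:
    """Map an overall attack impact to a reaction. Thresholds are made up."""
    if impact == 0:
        return "pass"     # clean request: do nothing
    if impact < 10:
        return "log"      # low impact: log it, let it through
    if impact < 25:
        return "captcha"  # medium: degrade gracefully, e.g. re-authenticate
    return "block"        # high impact: block the request and alert

print(react(0), react(5), react(15), react(40))
```

Escalating instead of hard-blocking keeps false alerts from locking out legitimate users.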
12. But there were
Exotic vectors omfg noez!!
Superdynamic languages as a basis for attack vectors
Ternary obfuscation on acid
Rules getting bloated over time
More false alerts than necessary
Performance going down
14. Let's go generic!
Plain blacklisting based detection must be extended
Currently two plain (some may call 'em weird) but effective approaches exist
The ratio calculation with a prepended normalization
The centrifuge – normalizing and weighting standard
programming language elements
Code and thresholds are result of intense testing
Tests are based on about 500 vectors plus several
random regular texts to avoid false alerts
Since programming languages have similarities, the centrifuge results do either
Still space left for optimization
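As a hedged illustration of the ratio idea: attack payloads lean heavily on meta characters regardless of the language used, so after normalization the share of such characters is a useful signal. The character class and threshold below are invented for illustration and are not the tuned PHPIDS values:

```python
import re

def special_char_ratio(value: str) -> float:
    """Share of punctuation/meta characters in the (normalized) input."""
    if not value:
        return 0.0
    specials = re.findall(r'[(){}\[\];=<>"\'`~!&|\\]', value)
    return len(specials) / len(value)

def looks_like_code(value: str, threshold: float = 0.1) -> bool:
    # Threshold is illustrative; PHPIDS derives its values from intense
    # testing against known vectors and random regular texts.
    return special_char_ratio(value) > threshold

print(looks_like_code("');alert(String.fromCharCode(88,83,83))//"))  # True
print(looks_like_code("Please reset my password, thanks!"))          # False
```

The centrifuge takes this one step further by reducing input to normalized language elements and weighting those, which is what lets one detector cover several programming languages at once.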
19. The future...
Optimization of the existing code
More detection routines
More granular and statistics-based weighting and string analysis
Cooperation with several universities and other institutions
More verbose demo and result object
Suggestions and other input are always welcome
Contact us at any time via our Google Group or forum
or via Email or IM or whatever way you feel like