
IP PIER solution


IP PIER Web protection solution

Published in: Internet


  1. Intro
     • 56% of Internet traffic is generated by bots
     • 95% of site breaches and infections are automated
     • 300% annual growth in DDoS attacks at the Application layer
     • 30% annual growth in the total number of DDoS attacks
     • The average DDoS attack measures 9.7 Gb/s and 19 Mpps
     • The largest attacks now exceed 600 Gb/s
  2. Intro 2
     • Growing number of users behind NAT and proxies
     • Growing number of mobile users
     • Mass migration from HTTP to HTTPS
     • PCI DSS prohibits transferring SSL certificates to third parties
     • CAPTCHA is no longer effective
  3. New breach paradigms
     • It is necessary to block ALL bot queries
  4. Requirements for security systems
     • High service reliability
     • Wide channels for protection against L3&4 DDoS attacks
     • Protection against DDoS attacks at the Application layer
     • Ability to detect even singular bot queries
     • Protection against bots without blocking IP addresses
     • Ability to filter HTTPS without disclosing traffic
  5. Capacities of the Cloud
     Protection:
     • Active Bot Protection (ABP)
     • Protection from Layer 7 DDoS
     • Protection from Layer 3 DDoS
     • Protection of HTTPS
     • Bot detection without CAPTCHA
     • WAF
     • Zero Day
     • White and black lists
     Increased site availability:
     • Site boost (caching, optimization, SPDY)
     • Site balancing (including across multiple platforms)
     • Optimization for mobile clients through traffic compression
     • Site monitoring and statistics
     • IPv6
     • Always Online
     • Custom error pages
     Cloud fail safety:
     • 2 Tb/s total capacity of communication channels from different operators
     • 2N backup of all Cloud components
  6. General working principles of the clearing cloud
     [Diagram: the client's platforms behind ISP 1 … ISP N, with traffic routed through the cloud]
     Cloud connection:
     • Change the site's DNS A record, or
     • Announce the network via BGP (a prefix no more specific than /24)
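The two connection methods can be sanity-checked programmatically. A minimal Python sketch, assuming a hypothetical scrubbing-centre prefix (the documentation range 203.0.113.0/24 stands in for the real one):

```python
import ipaddress

# Hypothetical scrubbing-centre prefix (assumption, documentation range).
SCRUBBING_PREFIX = ipaddress.ip_network("203.0.113.0/24")

def is_behind_cloud(resolved_ip: str) -> bool:
    """Check whether a site's A record already points into the clearing cloud."""
    return ipaddress.ip_address(resolved_ip) in SCRUBBING_PREFIX

def prefix_is_announceable(prefix: str) -> bool:
    """BGP transit typically rejects prefixes more specific than /24."""
    return ipaddress.ip_network(prefix).prefixlen <= 24
```

The /24 limit mirrors the slide's "not less than /24" requirement: most transit providers filter out more specific announcements.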
  7. Basic protection principles
     Traffic passes through successive filtering stages:
     border packet filter → hardware packet filter → software packet filter → stateful analyzer → Application-layer verification
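The staged pipeline can be sketched as a chain of predicates, where a packet is dropped at the first stage that rejects it. The stage logic below is purely illustrative, not the actual filter rules:

```python
from typing import Callable, List

# A stage inspects a packet (modeled as a dict) and returns True to pass it on.
Stage = Callable[[dict], bool]

def run_pipeline(stages: List[Stage], packet: dict) -> bool:
    """Drop the packet at the first stage that rejects it."""
    return all(stage(packet) for stage in stages)

# Illustrative stages mirroring the slide's order (rules are assumptions).
def border_filter(p):   return p.get("proto") in {"tcp", "udp"}
def hardware_filter(p): return p.get("len", 0) <= 1500
def software_filter(p): return p.get("src") not in {"192.0.2.66"}  # example blacklist
def stateful_check(p):  return p.get("syn_only") is not True       # half-open floods
def app_layer_check(p): return "Host" in p.get("headers", {})

PIPELINE = [border_filter, hardware_filter, software_filter,
            stateful_check, app_layer_check]
```

Ordering cheap stages first is the point of the design: the border and hardware filters shed bulk floods before the expensive stateful and Application-layer checks run.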
  8. Active Bot Protection (basic principles)
     Implementation features:
     • Detection of some attacks through L3&4 traffic analysis using original mathematical algorithms
     • Active interaction with bots
     • Automated control of security levels
     • Different security levels can apply to different URLs simultaneously
     • Interaction with a bot costs only 0.2–64 KB of traffic
     • Counter-bot system: we make an attack resource-intensive and economically unsound
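One common way to make an attack "resource-intensive and economically unsound" is a proof-of-work challenge; the slide does not say IP PIER uses this exact mechanism, so the sketch below only illustrates the counter-bot idea:

```python
import hashlib
import itertools

def make_challenge(seed: str, difficulty: int = 2) -> dict:
    """Issue a challenge the client must solve before its queries are proxied."""
    return {"seed": seed, "difficulty": difficulty}

def solve(challenge: dict) -> int:
    """Client-side work: brute-force a nonce whose hash has a required prefix."""
    target = "0" * challenge["difficulty"]
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge['seed']}:{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce

def verify(challenge: dict, nonce: int) -> bool:
    """Server-side check is a single hash: cheap for us, costly for the bot."""
    digest = hashlib.sha256(f"{challenge['seed']}:{nonce}".encode()).hexdigest()
    return digest.startswith("0" * challenge["difficulty"])
```

The asymmetry is the economics: each query costs the bot thousands of hash attempts on average, while the server spends one hash to verify.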
  9. Active Bot Protection for the client
     Benefits:
     • Protection from DDoS at the Application layer
     • Protection from scanning
     • Protection from automated content replication
     • Protection from spam bots in comments and forums
     • No need for CAPTCHA
     • Protection from the very first query over HTTPS, both with and without traffic disclosure
  10. Operation principles of traffic filtration at the Application layer
      Operation modes of the complex:
      ✓ Disabled — filtration at the Application layer is switched off.
      ✓ "DDoS protection" — we analyze every query but do not interfere with user-application interaction until a user looks suspicious. This is the most common mode and suits most sites. If doubts arise about a user's legitimacy, we enable additional verification mechanisms before proxying his queries and watch his reaction; if everything checks out, the query is allowed.
      ✓ "Active Bot Protection" — we test every user regardless of prior activity, with analytics still enabled. This mode is used when maximum protection is required, even against a singular bot query, and can rid a site of bots entirely. Testing methods are selected based on personal-account settings and the user's activity.
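The three modes reduce to a small decision rule about when to run the extra verification step. A minimal sketch (mode names paraphrased from the slide):

```python
from enum import Enum

class Mode(Enum):
    DISABLED = "Application-layer filtration disabled"
    DDOS_PROTECTION = "verify only suspicious users"
    ACTIVE_BOT_PROTECTION = "verify every user"

def needs_verification(mode: Mode, user_is_suspicious: bool) -> bool:
    """Decide whether to run the extra bot checks before proxying a query."""
    if mode is Mode.DISABLED:
        return False
    if mode is Mode.ACTIVE_BOT_PROTECTION:
        return True          # test everyone, even a singular bot query
    return user_is_suspicious  # DDOS_PROTECTION: interfere only on suspicion
```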
  11. HTTPS traffic filtration (with disclosure)
      The SSL certificate and key are transferred to the security system, which decrypts the traffic for inspection.
      Benefits:
      • No integration with the security system required (apart from the certificate transfer)
      • Easy setup
      Drawbacks:
      • The certificate must be handed over
      • PCI DSS requirements are not met
  12. HTTPS filtration (without traffic disclosure, with log transfer)
      Access logs are transferred for analysis, and detected bots are registered in blacklists.
      Benefits:
      • No certificate transfer required
      • PCI DSS requirements are met
      Drawbacks:
      • Integration with the security system is necessary
      • Time lag before protection activates
      • Only IP addresses can be blocked, not sessions
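A minimal sketch of log-driven blacklisting, assuming a combined-log export format (an assumption about the client's web server). Note how the encrypted traffic leaves only the source IP as an identifier, which is exactly the drawback the slide names:

```python
import re
from collections import Counter

# Minimal combined-log pattern: source IP, method, path, status.
LOG_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(\S+) (\S+)[^"]*" (\d{3})')

def blacklist_from_logs(lines, threshold=100):
    """Register an IP in the blacklist once its request count crosses the threshold.

    Because the traffic stays encrypted, sessions cannot be told apart:
    the only identifier available in the logs is the source IP.
    """
    hits = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if m:
            hits[m.group(1)] += 1
    return {ip for ip, n in hits.items() if n >= threshold}
```

The batch nature of log shipping is also the source of the "time lag" drawback: an attacker is blocked only after the next log export is analyzed.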
  13. HTTPS filtration (without traffic disclosure, with token)
      The user is redirected to the security system for verification and granted a token; for a certain period afterwards the user is not re-verified.
      Benefits:
      • No certificate transfer required
      • PCI DSS requirements are met
      • No time lag on protection activation
      • Sessions are blocked, not IPs
      Drawbacks:
      • Integration with the security system is necessary
      • While a token is valid, an attack using that token is possible
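The token scheme can be illustrated with an HMAC-signed, time-limited token. This is a generic sketch, not IP PIER's actual format; the shared secret and TTL are assumptions:

```python
import base64
import hashlib
import hmac
import time

SECRET = b"shared-secret"  # assumption: key shared between site and security system

def issue_token(session_id: str, ttl: int = 300, now=None) -> str:
    """Grant a token after verification; valid for `ttl` seconds."""
    expires = int(now if now is not None else time.time()) + ttl
    payload = f"{session_id}:{expires}".encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(payload).decode() + "." + sig

def token_is_valid(token: str, now=None) -> bool:
    """While the token verifies and has not expired, skip re-checking the user."""
    try:
        payload_b64, sig = token.rsplit(".", 1)
        payload = base64.urlsafe_b64decode(payload_b64)
    except Exception:
        return False
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, sig):
        return False
    expires = int(payload.rsplit(b":", 1)[1])
    return (now if now is not None else time.time()) < expires
```

The expiry field is the trade-off the slide points at: a stolen token is replayable, but only until it expires, so the TTL bounds the attack window.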
  14. HTTPS filtration (without traffic disclosure, with validation service)
      The client provides only query metadata: URL, IP, timestamp (t), and User-Agent (UA). The service replies either that the user is legitimate or that additional verification is required.
      Benefits:
      • No certificate transfer required
      • PCI DSS requirements are met
      • No time lag on protection activation
      • Sessions are blocked, not IPs
      Drawbacks:
      • Integration with the security system is necessary
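A sketch of the validation exchange, using only the four fields the slide lists (URL, IP, t, UA); the verdict logic here is invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class QueryMeta:
    """The only data the client's server shares: URL, IP, timestamp, User-Agent."""
    url: str
    ip: str
    t: float
    ua: str

def validate(meta: QueryMeta, known_bots: set) -> str:
    """Return a verdict: 'legitimate' (serve directly) or
    'verify' (redirect the user to the security system for testing)."""
    if not meta.ua or meta.ip in known_bots:
        return "verify"
    return "legitimate"
```

Because only metadata crosses the boundary, the certificate and the decrypted traffic never leave the client, which is how this variant satisfies PCI DSS.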
  15. Protection from manual breaches (WAF)
      WAF capabilities:
      • Protection from SQL injection
      • Protection from cross-site scripting
      • Protection from illegal resource access
      • Protection from remote file inclusion
      • Self-learning mechanisms
      • Custom rules can be added
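A toy rule engine shows the shape of signature-based filtering plus client-supplied custom rules; the patterns are deliberately simplified and nowhere near production-grade:

```python
import re

# Deliberately simplified signatures; real WAF rule sets are far larger.
RULES = {
    "sql_injection": re.compile(r"(\bunion\b.+\bselect\b|'\s*or\s+'1'\s*=\s*'1)", re.I),
    "xss": re.compile(r"<\s*script\b", re.I),
    "remote_file_inclusion": re.compile(r"=(https?|ftp)://", re.I),
}

def add_custom_rule(name: str, pattern: str) -> None:
    """Clients can extend the rule set with their own patterns."""
    RULES[name] = re.compile(pattern, re.I)

def inspect(query_string: str):
    """Return the names of all rules the request matches (empty list = clean)."""
    return [name for name, rx in RULES.items() if rx.search(query_string)]
```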
  16. Balancing across multiple platforms
      Traffic is distributed among platform 1 … platform N.
      Balancing modes:
      • Round robin
      • Weighted ratio
      • Active-passive
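The three balancing modes in a few lines of Python (platform names and weights are examples):

```python
import itertools

def round_robin(platforms):
    """Plain round robin: cycle through the platforms in order."""
    return itertools.cycle(platforms)

def weighted(platforms):
    """Weighted ratio: repeat each platform according to its weight."""
    expanded = [p for p, w in platforms for _ in range(w)]
    return itertools.cycle(expanded)

def active_passive(primary, backup, primary_alive):
    """Send everything to the primary; fail over only when it is down."""
    return primary if primary_alive() else backup
```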
  17. Caching
      The complex can cache queried URLs for a set period of time. This lets a client:
      • Reduce channel load
      • Reduce hardware load
      • Smooth out the "Habra effect" (sudden traffic spikes from popular links)
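A TTL cache captures the behavior described: a URL is fetched from origin once, then served from cache until the configured period expires. An injectable clock makes the sketch testable; the real implementation is not specified in the slides:

```python
import time

class TTLCache:
    """Cache a queried URL's response for a configurable period of time."""

    def __init__(self, ttl_seconds: float, clock=time.monotonic):
        self.ttl = ttl_seconds
        self.clock = clock
        self._store = {}  # url -> (response, stored_at)

    def get(self, url, fetch):
        """Return the cached response, refreshing from origin only on expiry."""
        entry = self._store.get(url)
        if entry is not None and self.clock() - entry[1] < self.ttl:
            return entry[0]                    # cache hit: origin is not touched
        response = fetch(url)                  # cache miss: one trip to origin
        self._store[url] = (response, self.clock())
        return response
```

Every hit served from `_store` is a request that never reaches the client's channel or hardware, which is where the load reduction comes from.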
  18. Always Online
      The complex can store static copies of a client's site and update them at set intervals. This lets a client:
      • Serve users the static part of the site if the client's infrastructure fails
      • Retain clients
      • Improve rankings in search engines
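The fallback behavior can be sketched as: try the origin, and on failure serve the stored static copy (the error page text is an example, not the product's):

```python
def serve(url, origin_fetch, static_copy):
    """Serve from origin; if the client's infrastructure fails, fall back
    to the stored static copy so the site stays reachable."""
    try:
        return origin_fetch(url)
    except ConnectionError:
        return static_copy.get(url, "<h1>Temporarily offline</h1>")
```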
  19. Competitors