White Paper: Six-Step Competitive Device Evaluation

This paper presents a six-step methodology for conducting competitive product evaluations that provide advance insight into the performance, security, and stability of devices within production network and data center environments.



A Six-Step Plan for Competitive Device Evaluations
How to Evaluate and Select the Best Content-Aware Network or Security Devices for Enterprise, Federal, and Carrier Infrastructures

BreakingPoint Enterprise IT Resiliency Series
www.breakingpoint.com

© 2005 – 2011. BreakingPoint Systems, Inc. All rights reserved. The BreakingPoint logo is a trademark of BreakingPoint Systems, Inc. All other trademarks are the property of their respective owners.
Executive Summary

IT organizations are upgrading to faster and more intelligent content-aware IT infrastructures to support customers, users, and business operations as a whole. The sophisticated high-performance network and security devices within these infrastructures require a more comprehensive approach to pre-deployment testing than traditional testing tools can provide. This paper presents a six-step methodology for conducting competitive product evaluations that provide advance insight into the performance, security, and stability of devices within production network and data center environments. By following the methodology presented in this paper, organizations will:

• Select the right firewall, IPS, UTM, load balancer, virtualized server, or other device to meet business and IT objectives.
• Understand device capabilities to improve infrastructure planning and resiliency.
• Save up to 50 percent on IT investments.
• Eliminate hundreds of man-hours in post-purchase configuration, troubleshooting, and tuning.
Introduction

IT organizations embarking upon a network, security, or data center infrastructure upgrade need new methodologies and tools for testing and validating the performance, security, and stability of today’s content-aware devices. To make purchase decisions about firewalls, intrusion prevention systems (IPS), servers, load balancers, and so on, CIOs, CISOs, and other IT leaders need better information than traditional testing tools can provide.

Why? Because today’s content-aware and application-aware devices employ deep packet inspection (DPI) capabilities to examine traffic in ways that legacy testing approaches were never designed to validate. Such devices—and the complex traffic they handle—demand a new and deeper approach to comparative device testing that uses real application, attack, and malformed traffic at ever-increasing speeds. Without this improved approach, content-aware equipment cannot be stressed thoroughly enough to determine the true limits of its capabilities.

Recent high-profile performance and security failures are bringing renewed focus to the importance of sufficient testing to ensure content-aware network devices can perform under real-world and peak conditions... Network equipment providers, service providers, and other organizations require testing solutions capable of rigorously testing, simulating, and emulating realistic application workloads and security attacks at line speed. Equally important, these tools must be able to keep pace with emerging and more innovative products as well as thoroughly vet complex content-aware/DPI-capable functionality by emulating a myriad of application protocols and other types of content at ever-increasing speeds to ensure delivery of an outstanding quality of experience (QoE) for the customer and/or subscriber.
IDC Report: “The Inevitable Failure of Content-Aware/DPI Network Devices — and How to Mitigate the Risk”

This paper explains the six steps that organizations must follow to validate DPI-enabled equipment and make fully informed purchase decisions:

1. Create and prioritize specifications for products to be evaluated.
2. Develop a testing plan around repeatable, quantitative principles.
3. Use standardized scores to separate pretenders from contenders.
4. Create individual test scenarios that mirror the production environment and are repeatable yet random.
5. Execute a layered testing progression that includes load, application traffic, security attacks, and other stress vectors.
6. Lay the groundwork for successful deployment and maintenance.

Why Marketing Claims Are Not Sufficient

Vendor performance claims are based on generic conditions within a vendor’s lab, and will never be sufficient for making sound decisions. They can never accurately portray the resiliency—the performance, security, and stability—of devices as they handle the unique mix of traffic within a customer’s network.

Test lab reports are equally inadequate. These labs follow a “vacuum” or “clean room” approach, in which device testing is done in isolation, without regard to the unique environments of customers. Also, test labs are often funded by device manufacturers, which invariably calls into question the objectivity of test results.

The providers of complex network security devices frequently make marketing claims that are unsupported by hard evidence and, in any case, do not reflect the real-world requirements of specific enterprises. The only solution for prospective buyers is to define their own enterprise-specific business, security, and operational requirements, and test devices rigorously against those requirements... Security professionals must be prepared to test in-line security products to confirm their security effectiveness and performance capabilities under real-world conditions.
Gartner Report: “Guidelines for CISOs: A 10-Step Program for Selecting the Right Network Security Devices” (February 2011)
Companies need an approach that allows them to impose their own conditions during pre-purchase evaluations—also known as “bakeoffs”—so that they can rigorously validate device capabilities under real-world scenarios at line rate. Only by conducting this type of bakeoff will IT buyers acquire the actionable answers needed to make informed purchase decisions and eliminate time-consuming post-deployment troubleshooting.

Benefits of a Successful Device Evaluation

IT professionals responsible for choosing content- or application-aware network and security equipment should follow the steps outlined here to ensure that devices are fully evaluated before purchase and deployment. Using the findings of pre-purchase and pre-deployment testing, they can refine their understanding of the unique real-world conditions affecting their networks. By following the six steps for device bakeoffs explained in this paper, purchasers will ensure that they:

• Select the right products to meet their business objectives. Doing this requires a clear knowledge of device resiliency when handling a mix of real application traffic, security attacks, and malformed traffic under heavy load.
• Understand device capabilities to improve infrastructure planning and resiliency. The information gained during the bakeoff process allows IT planners to rightsize network and data center infrastructures to meet business needs for resiliency while controlling costs.
• Save up to 50 percent on IT investments. Customers that perform independent testing are able to rightsize infrastructures and pay for only the performance they actually get from each device. As this paper will show, buyers are negotiating better vendor discounts when armed with detailed information about the capabilities of devices under their own network conditions.
• Eliminate hundreds of man-hours in post-purchase configuration and tuning. Performing thorough pre-purchase validation gives purchasers advance knowledge of device capabilities and prevents weeks of delays caused by post-deployment troubleshooting and vendor finger-pointing. This insight also helps IT organizations configure devices appropriately to avoid surprises and disruptions.

Six Steps to the Perfect Competitive Device Evaluation

1. Create and Prioritize Specifications for Products to Be Evaluated

As with any project, it is wise to “begin with the end in mind” when planning a device bakeoff. Before considering any piece of equipment, IT decision makers should clearly define and prioritize the organization’s needs for current and future infrastructure build-out. Otherwise, it is too easy to dive into questions of speeds and feeds without taking into account broader objectives. A good way to start is by asking fundamental questions such as:

• How should the infrastructure support key business objectives? For example, what are the transaction latency requirements?
• How important is the security of transactions in comparison to their speed?
• Which services are most sensitive, requiring the highest levels of security?
• Is application inspection necessary or not?

The answers to questions like these may not be as obvious as they initially appear. Obvious or not, they help establish the priorities for the infrastructure, which are then used to generate specific evaluation criteria for selecting the right product.

Making this selection is about more than finding the right make and model of device; it also means choosing the right amount of equipment to rightsize the infrastructure while meeting business needs.

Testing is not necessarily about proving that the most-capable, most-expensive product is the best choice. A well-designed testing plan may actually show that a lower level of performance is acceptable at certain points on the network, and this can reduce purchase and deployment costs. IT organizations that do not perform relevant tests in-house may introduce serious security and performance issues to their networks by purchasing underspecified devices, or may overspend significantly on higher levels of performance and coverage that are not required.
Gartner Report: “Protecting the Enterprise: Verifying the Performance of Complex Network Security Products” (January 2011)
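The prioritized specifications that come out of Step 1 can be captured in a simple machine-readable form so that later steps consume them directly. A minimal Python sketch; the requirement names are illustrative, and the weights mirror the sample Planning Matrix that appears later in this paper:

```python
# Hypothetical Step 1 artifact: the organization's prioritized device
# requirements, each with a weight reflecting its business priority.
requirements = {
    "attacks_blocked_pct":  0.35,  # security of transactions comes first
    "flows_per_second":     0.30,  # transaction latency / setup rate
    "max_concurrent_flows": 0.20,  # peak user load
    "throughput_mbps":      0.15,  # raw speed
}

def validate_priorities(reqs):
    """Weights must sum to 1 so that later scoring (the Planning Matrix
    in Step 2) is well defined; return the criteria ranked by priority."""
    total = sum(reqs.values())
    if abs(total - 1.0) > 1e-9:
        raise ValueError(f"priority weights sum to {total}, expected 1.0")
    return sorted(reqs.items(), key=lambda kv: kv[1], reverse=True)

ranked = validate_priorities(requirements)
```

Writing the priorities down in one place, with an explicit consistency check, prevents the “speeds and feeds” drift the paper warns about: every later test parameter can be traced back to a weighted requirement.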
To enable rightsizing, it is important to suspend assumptions about how many devices will be required, because the approach to bakeoffs described here often leads to surprising insights that overturn assumptions. In one recent example, a large financial services organization had planned to purchase two sets of redundant firewalls for a particular installation. But the company was working from a false assumption about how many devices it would actually need. A scientific bakeoff enabled the firewalls to be validated and properly tuned using the firm’s actual network conditions instead of canned traffic or estimates. This process revealed that the firewalls performed better than expected. The company needed to buy only one set of redundant devices, not two, which cut its firewall bill in half.

To facilitate the capture and rigorous analysis of performance results, organizations may want to build out a Planning Matrix. In a spreadsheet, each important feature of devices to be evaluated is given its own row, with the rows arranged in priority order. Weighted values are then assigned to each row. Each device being considered is given its own column, and the columns are filled in as results are gathered from the testing processes described below. When the matrix is fully populated with performance details, it should provide objective clarity about how well each device performed across all criteria.

Metric                  Weight    Device A     Device B     Device C
Security Coverage
  Attacks Blocked (%)     35%           65           72           73
Performance
  Flows per Second        30%      111,374       97,764      119,384
  Max Concurrent Flows    20%    2,000,000    2,350,000    1,850,000
  Throughput (Mbps)       15%       11,542       13,127        9,842

2. Rethink Testing around Repeatable, Quantitative Principles

Purchasers should use the specifications generated in Step 1 to create a plan for stressing each device under test (DUT) with real-world application, attack, and malformed traffic under load. Doing so is not as simple as taking older, ad hoc approaches to testing and injecting authentic traffic. The entire plan must embrace a scientific methodology to accurately validate the capabilities of DPI-enabled devices, which means it must use repeatable experiments that yield clear, quantitative results. Only this approach ensures that the evaluation criteria established in Step 1 will translate into the specific parameters evaluated during the testing itself.

Elements to Include in the Plan:

• Controlled Variables — Throughout the planning and execution stages, the bakeoff must rigorously control variables. The goal should always be to isolate device capabilities and problem areas, which—as in any scientific investigation—requires repeating tests exactly and changing only one input at a time.
• Accurate Baselines — As a further control, each test process should include an initial run through a simple piece of cable to establish what the traffic looks like without intermediation by any device. Doing so creates a valid basis for comparison against the subsequent run of the same traffic through the DUT.
• Uniform Configurations — Consistent collection of data requires that devices be set up uniformly throughout the bakeoff. This means that each vendor should configure its device to match standard settings established by the purchaser. It also means that settings for the testing equipment should be maintained across all DUTs.
• Escalating Complexity — To achieve a comprehensive understanding of device capabilities, bakeoff testing should proceed by stages, evolving in complexity until it fully reflects the purchaser’s unique mix of traffic. The first of these stages, explained in Step 3 below, uses standards-based application, attack, and malformed traffic to evaluate a longer list of devices and quickly eliminate obviously unsuitable choices. The second stage, addressed in Steps 4 and 5, uses custom traffic mixes and progressive rounds of testing to precisely mirror the actual conditions that shortlisted devices will face once they are deployed in the purchaser’s infrastructure.
• Precision Tools — Bakeoffs must use testing tools that create precise real-world network conditions again and again and enable variables to be changed one at a time. These tools must also capture exact measurements of device behavior to enable accurate comparisons among devices.

In the past, network and security professionals have lacked the precision tools necessary to enforce truly consistent, scientific standards across their testing processes.
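The Planning Matrix reduces to a single weighted score per device once results are in. Here is a sketch using the sample values from the table above; normalizing each metric against the best observed result is our assumption, since the paper does not prescribe a formula:

```python
# Weights and results taken from the sample Planning Matrix above.
WEIGHTS = {"attacks_blocked_pct": 0.35, "flows_per_second": 0.30,
           "max_concurrent_flows": 0.20, "throughput_mbps": 0.15}

RESULTS = {
    "Device A": {"attacks_blocked_pct": 65, "flows_per_second": 111_374,
                 "max_concurrent_flows": 2_000_000, "throughput_mbps": 11_542},
    "Device B": {"attacks_blocked_pct": 72, "flows_per_second": 97_764,
                 "max_concurrent_flows": 2_350_000, "throughput_mbps": 13_127},
    "Device C": {"attacks_blocked_pct": 73, "flows_per_second": 119_384,
                 "max_concurrent_flows": 1_850_000, "throughput_mbps": 9_842},
}

def weighted_scores(results, weights):
    """Normalize each metric against the best result across devices
    (so percent, flows, and Mbps can be combined), then apply weights."""
    best = {m: max(dev[m] for dev in results.values()) for m in weights}
    return {name: sum(weights[m] * dev[m] / best[m] for m in weights)
            for name, dev in results.items()}

scores = weighted_scores(RESULTS, WEIGHTS)
ranking = sorted(scores, key=scores.get, reverse=True)
```

With these sample numbers, Device B edges out Device C despite C’s better attack blocking, because B leads on concurrency and throughput; this is exactly the kind of non-obvious tradeoff the matrix is meant to expose.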
This lack of precision tools has hampered their ability to make decisions based on hard quantitative data and forced them to make estimates about device resiliency based on whatever performance numbers they could gather. Today, however, superior tools create authentic application traffic and capture precise measurements of its effects, even for the complex interactions common in 10GigE content-aware environments. Companies that lack such tools can employ them for the duration of a bakeoff by contracting with a third party for on-demand device evaluation services.

3. Use Standardized Scores to Separate Pretenders from Contenders

Purchasers can turn a long list of candidate devices into a short list without performing comprehensive validation on each product by using standardized scoring methods. These scores can quickly eliminate from consideration equipment that clearly does not meet an organization’s needs.

For example, BreakingPoint has developed a Resiliency Score that is calculated using industry standards from organizations such as US–CERT, IEEE, and the IETF, as well as standard sets of security strikes and real-world traffic mixes from the world’s largest service providers. This scientific, repeatable process is designed to enable meaningful comparisons without partiality to any vendor. It uses a battery of simulations to evaluate a DUT’s capabilities in terms of throughput, sessions, robustness in the face of corrupted traffic, and security. The resulting score is presented as a numeric grade from 1 to 100. Devices may receive no score if they fail to pass traffic at any point or degrade to an unacceptable performance level. The Resiliency Score takes the guesswork and subjectivity out of validation and allows administrators to quickly understand the degree to which system security will be impacted under load, attack, and real-world application traffic.

The product certification firm Underwriters Laboratories has recently announced a similar standard, UL 2825, that uses a scientific evaluation system to validate network and security equipment. Once the standard is implemented, it will serve as a vendor-neutral benchmark for the performance, security, and stability of devices. UL intends to publish certifications for all equipment that meets the standards set forth in UL 2825.

Purchasers can use one of these standardized scores to evaluate a list of perhaps six to 10 candidate devices. Working from the scores, they can then choose the three or four most suitable devices for deep, customized testing. Using standardized scores at this stage saves time and money by quickly establishing which devices are the most likely to fulfill the business needs set out earlier in the process. Formulation of the custom tests for shortlisted devices is covered in Step 4, while execution of them is addressed in Step 5.

4. Create Individual Test Scenarios That Mirror the Production Environment and Are Repeatable yet Random

With this step, the bakeoff process moves into comprehensive testing to fully validate the capabilities of DPI-enabled devices. Authentic validation requires an accurate understanding of the application, network, and security landscape in which devices will be operating. Therefore, purchasers should review their own traffic mix and the mixes of service providers before designing individual tests; this will ensure that their testing equipment reflects the latest versions and types of application traffic that traverse their network. They should also consult independent security research as well as the findings of their own in-house network or security operations centers for the latest information on security attacks, including malware and evasions. Companies that need help in collecting this information can turn to on-demand services that specialize in network security.

It is important to note that packet captures (PCAPs) of network traffic are inadequate for this survey of the landscape, since they attempt to substitute a tiny slice of real traffic for a steady flow of it. Modern application-aware devices typically come equipped with huge cache memory, allowing them to ignore repetitive traffic such as that found in PCAPs. Simplistic traffic such as plain UDP or HTTP packets, IMIX, or a blend of a few homogenous protocols is likewise inadequate, because it does not reflect the complexities under which DUTs will operate once they are put into production. For these reasons, real stateful application traffic, along with live attacks and malformed traffic, must be used to push devices to their limits.

Generating real stateful traffic, however, is not enough: Validation processes must also be repeatable yet random.
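A seeded pseudo-random number generator delivers exactly this repeatable-yet-random property: the seed pins down the entire sequence, so every DUT sees identical traffic, while the individual values remain unpredictable to the device. A minimal Python sketch; the parameter fields are illustrative, not a real test-tool schema:

```python
import random

# Generate a run of pseudo-random traffic parameters from a fixed seed.
def traffic_variants(seed, n_flows=5):
    rng = random.Random(seed)  # private PRNG; global random state untouched
    return [{"src_port": rng.randint(1024, 65535),
             "payload_len": rng.randint(64, 1460),
             "think_time_ms": rng.randint(0, 200)}
            for _ in range(n_flows)]

run_for_dut_a = traffic_variants(seed=42)
run_for_dut_b = traffic_variants(seed=42)  # identical traffic: fair comparison
assert run_for_dut_a == run_for_dut_b
assert traffic_variants(seed=7) != run_for_dut_a  # new seed, new randomness
```

Recording the seed alongside each archived test is what makes a run exactly reproducible later, whether against a second DUT during the bakeoff or against the purchased device during deployment.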
Repeatability demands that the testing equipment generate the same traffic in the same way for each DUT to ensure accurate “apples to apples” comparisons. Randomization makes test traffic behave like real-world traffic, creating unexpected patterns that force DUTs to work harder. Randomization prevents vendors from relying on self-published performance numbers achieved in sterile environments designed to show their wares in a favorable light. Creating repeatable yet random traffic requires the use of a pseudo-random number generator (PRNG). Using a PRNG, the purchaser sets a seed value, which the testing equipment uses to create standardized tests by generating all data variants in the same way for each test executed, whether for a single DUT or several.

Creating test scenarios around these guidelines reinforces the quantitative, scientific principles laid down in Step 2 and prepares the way for the actual battery of customized tests to be performed in Step 5.

5. Execute a Layered Testing Progression That Includes Load, Application Traffic, Security Attacks, and Other Stress Vectors

This stage is the “main event” of a competitive device evaluation and, as such, deserves more detailed treatment here. During this stage, the wisdom of a progressive, scientific approach to testing will become clear. By changing only one variable at a time and testing the parameters set forth at earlier stages, this progression will reveal the specific strengths and weaknesses of each product, replacing guesswork or uncertainty with verifiable results.

Once deployed, a device will not be subjected to one type of stress at a time; instead, it must deal with application traffic, heavy user load, security attacks, and malformed traffic all at once. That is why the ultimate test in this progression will bring together all of those elements into a single battery of tests. But to develop a proper understanding of how a DUT handles specific types of stress, its ability to handle load and attacks will be tested separately first. Subsequent processes will combine validation of load, security, and stress vectors. During the bakeoff, customers should archive all of these tests so that they can be repeated exactly during the deployment phase explained in Step 6.

Load

A device’s specialized capabilities—to block malicious traffic, detect trigger keywords, and so on—are not meaningful unless they perform adequately under heavy load. The processes described in this section ensure that the device being evaluated can easily handle the load it will face, in terms of both sessions and application throughput. If the device cannot pass these tests with traffic known to be free of attacks, there is no way it will process enough traffic once its security features are turned on or when it must also handle other stress vectors such as malformed traffic.

Sessions

This set of tests uses TCP traffic to validate the DUT’s ability to (1) create and tear down TCP sessions at a prescribed rate and (2) handle a prescribed maximum number of concurrent sessions. Each of these tests can be run in stair-step fashion, ramping up the degree of stress by steady increments until the device fails. This will determine whether the device achieves its advertised limits and how much headroom it has to handle peak traffic.

Application Traffic

These tests determine a device’s ability to handle real stateful application traffic at high levels of load. BreakingPoint, for example, offers a standard Enterprise application traffic mix that includes more than a dozen of the protocols most commonly found traversing Global 2000 corporate networks. That mix can then be customized by changing the weighting of various protocols or by adding other protocols that better reflect the customer’s unique network environment.

The session and application traffic processes should all be run three times. The first pass is a baseline run, using only a piece of cable and no DUT. The second pass is performed with the DUT in place but with no security or inspection policies turned on. This should result in the purest measure of the DUT’s maximum ability to relay traffic. The third pass is performed with the device’s default security or inspection policies turned on. Since the device will be handling traffic that includes no attacks, evasions, or malformed packets, the policies should yield no positive results. But running this process will indicate the basic impact on performance that comes from having the target device’s application-aware features engaged.
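The three passes described above reduce to two numbers that keep the effects separate: forwarding loss (pass 2 versus pass 1) and policy cost (pass 3 versus pass 2). A sketch with invented throughput figures, purely for illustration:

```python
# Pass 1 (bare cable) bounds the test rig itself; pass 2 (DUT, policies
# off) isolates raw forwarding; pass 3 (default policies on) reveals the
# cost of engaging application-aware inspection.
def inspection_overhead(baseline_mbps, dut_open_mbps, dut_policies_mbps):
    """Return (forwarding loss, policy cost), each relative to the
    previous pass, so the two effects are not conflated."""
    forwarding_loss = 1 - dut_open_mbps / baseline_mbps
    policy_cost = 1 - dut_policies_mbps / dut_open_mbps
    return forwarding_loss, policy_cost

fwd, pol = inspection_overhead(baseline_mbps=10_000,
                               dut_open_mbps=9_600,
                               dut_policies_mbps=7_200)
# fwd ≈ 0.04: 4% lost just relaying traffic
# pol ≈ 0.25: a further 25% once inspection policies are engaged
```

Measuring policy cost against pass 2 rather than pass 1 matters: it attributes the slowdown to the DPI features themselves, not to the device’s basic forwarding path.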
Security

Having probed the DUT’s ability to handle load without the complications of security attacks, it is time to try the opposite case: security without load. A firewall, IPS, or unified threat management (UTM) device will never be better at blocking attacks than when it has no background traffic to contend with, so this portion of the testing will reveal how a DUT’s security features perform under ideal conditions.

Keeping the device’s default security policies in place, the customer runs a standard list of security attacks to see how well the DUT catches known malicious traffic. The purchaser then customizes the tests in two ways: (1) tailoring the strike list to exercise particular security policies within the device and then (2) tailoring the device’s security policies to handle particular strikes relevant to the customer’s network environment. As with all of the other processes in the bakeoff, these variables should be changed one at a time so that each test run can be used to isolate particular device capabilities and problem areas.

Besides establishing the basic security capabilities of a firewall, IPS, or UTM, the customization in this portion of the bakeoff will also give IT staff members an idea of what level of support they can expect from a manufacturer. Vendors will likely never be more responsive than when they are trying to close a sale, so the customer support during this phase should be excellent.

Evaluate shortlisted network security devices against a realistic range of potential live attacks... Testing can expose performance-related problems caused by inappropriate security products, including high latency and frequent “fail closed events.” This, in turn, may result in active devices being deployed passively or blocking being disabled, making the devices significantly less effective.
Gartner Report: “Guidelines for CISOs”

Combining Load and Security

This phase of the bakeoff combines the ultimate tests from the preceding Load and Security sections. While this does not complete the range of authentic conditions that will be included in the next testing phase, bringing these two validation processes together may be a watershed for some devices that simply cannot handle the combination of load and security attacks.

All Stress Vectors

The layering process concludes by adding other stress vectors that the DUT will encounter in a production environment.

Malformed Traffic

This traffic can appear maliciously, or simply from device malfunction. Either way, malformed traffic is a fact of life on every network and must be included in the bakeoff plan. This portion of the bakeoff progressively determines how a DUT responds to malformed traffic, including frame impairments at Layer 2, session fuzzing at Layers 3 and 4, and application fuzzing at Layer 7.

Evasions

At a minimum, this part of the bakeoff should include TCP segmentation and IP fragmentation evasions. Depending on the customer’s network conditions, custom lists of evasions can be included as well.

Adding these stress vectors to load and attacks completes the picture. Performing a bakeoff in this way ensures that the device being considered can cope with the entire set of challenges it will face when deployed in the real world.

6. Lay the Groundwork for Successful Negotiation, Deployment, and Maintenance

Deploying untested network and security devices creates serious problems for IT professionals, network and data center infrastructures, and organizations as a whole. Untested equipment requires weeks of post-deployment troubleshooting that is frustrating and time-consuming for staff members and that often leads to both finger-pointing and costly remediation steps. This is particularly true when device outages, security breaches, or unplanned bottlenecks impact the resiliency of entire infrastructures; such failures damage reputation and business value while leading to serious, even career-limiting, embarrassment for individuals. By contrast, conducting a rigorous bakeoff minimizes the risk of all these problems and saves hundreds of hours of staff time by eliminating surprises and guesswork.

Bakeoffs can also lower equipment prices. Before a purchase is completed, customers should use the information generated during the bakeoff to negotiate a discount with the chosen vendor.
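The negotiating leverage comes from quantifying the gap between advertised and measured performance. A hypothetical sketch; the function name and figures are ours, and the 70-percent scenario echoes the paper’s own firewall example:

```python
# Turn bakeoff data into a negotiating position by measuring how far
# the device fell short of the vendor's advertised number.
def performance_shortfall(advertised, measured):
    """Fraction of advertised performance the device failed to deliver
    (never negative: exceeding the claim is not a shortfall)."""
    return max(0.0, 1 - measured / advertised)

# e.g., a firewall advertised to block attacks at 10 Gbps that only
# sustains blocking at 7 Gbps in the customer's own traffic mix:
gap = performance_shortfall(advertised=10.0, measured=7.0)
# gap ≈ 0.3: grounds for arguing a 30 percent discount
```

Applying this per metric, weighted by the Step 1 priorities, turns the bakeoff report into a concrete, defensible price argument rather than a vague complaint about vendor claims.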
The information generated during the bakeoff demonstrates the actual capabilities of the device under the customer’s own network conditions—not in the vendor’s lab. The purchaser can use that data to argue for what the device should cost based on demonstrated performance rather than marketing claims. For example, a company might select a firewall that meets the specifications established in Step 1 of the bakeoff process but that blocks attacks at only 70 percent of its advertised top speed. In that instance, it would be much easier for the customer to make the case that the vendor should offer a 30 percent discount on the price of the firewall.

Do not limit these testing procedures to the purchasing cycle alone; make them an integral part of the ongoing security maintenance regime by implementing a solid, continuous testing initiative.
Gartner Report: “Guidelines for CISOs”

Once a device is purchased, the tests archived during Step 5 should all be run again to enable proper configuration and ensure that the device is production-ready. The detailed information created by these tests gives customers the advance insight needed to configure equipment to remediate weaknesses and achieve the optimal balance between performance and security. Using real-world traffic to tune the device also promotes rightsizing, because it allows engineers to build in enough of a performance cushion to handle peak traffic, but without creating waste by overbuilding that cushion. The exact data collected from pre-deployment tests also makes it easier to work with vendors to remediate problems. Customer engineers can share definitive test results, forestalling arguments and allowing vendors to correct problems more quickly. All of these benefits enable staff members to deploy equipment smoothly, without wasting time and money on remediating problems after the fact.

The benefits of pre-deployment testing extend to entire infrastructures as well. Advance simulation with real-world conditions gives IT staff visibility into how device deployment will impact other infrastructure elements and how those elements will affect the device. These insights allow companies to install a new device with confidence that it will not disrupt the production environment, but without requiring the trouble and expense of deploying the device first in a test lab or disaster recovery backup environment. Beyond that, pre-deployment testing enables predictive modeling across a range of use-case scenarios, allowing the IT professional to understand how devices and infrastructures will perform under different configurations and network conditions. It also enables customers to hold vendors accountable for supporting and improving their products over time.

Ultimately, the benefits of pre-deployment testing extend to the entire organization. Bakeoffs help IT organizations control costs and reduce risks while optimizing the performance, security, and stability of each device. Proper pre-deployment testing enables an organization to meet the key objectives outlined in Step 1 to deliver higher value and meet business objectives. This approach mitigates the risks of outages and vulnerabilities, promotes rightsizing, and drastically reduces the time, money, and frustration required to deploy new devices.

Summary

Purchasers of network and security devices should follow the scientific, quantitative progression of testing described here to fulfill the unique needs of their network and data center infrastructures. Without following this approach, they will be unable to accurately assess the DPI capabilities of today’s content-aware devices operating in 10GigE environments. It is particularly important that they clearly define necessary device specifications, use standard testing methodologies, and validate devices against network conditions that mirror reality. CISOs and other IT leaders can follow the technical recommendations laid out in this paper, or they can outsource the work to testing experts using on-demand professional services.

About BreakingPoint

BreakingPoint provides the turnkey services that organizations need to gain advance insight into how devices, networks, and data centers will perform under their unique traffic mixes. BreakingPoint professional services provide actionable results in only days by leveraging the company’s patented product, dedicated security research team, and best practices from the Global 2000. Unlike any other offering, these services enable IT professionals to create the real-world simulations required to quickly and cost-effectively harden IT resiliency, minimize IT risk, and train their own cyber warriors.
BreakingPoint Device Evaluation Service

The BreakingPoint Device Evaluation Service provides a complete comparative evaluation, or “bakeoff,” of content-aware network, security, and data center devices using a customer’s own network conditions. The service includes the setup and execution of high-performance stateful application and attack simulations that mirror real-world traffic for each device. In less than a week, the customer will receive detailed analysis of the performance, stability, and security of devices such as application servers, load balancers, firewalls, IDS/IPS devices, virus and spam filters, and more. A BreakingPoint Device Evaluation can be conducted as a one-time project, providing the advance insight needed to confidently benchmark, select, and negotiate the purchase of IT devices. Or a yearly subscription service includes ongoing Device Evaluations to ensure that equipment remains resilient over time.

Further Reading

BreakingPoint has published a series of step-by-step test methodologies to help enterprises understand how to validate the performance, security, and stability of devices, networks, and data centers. Methodologies include:

• How to Test Dual Stack IPv4/IPv6
• How to Test Server Load
• How to Test Firewalls
• How to Test IPS Devices
• How to Test DPI Devices
• How to Test Server Load Balancers

To access these resources, visit www.breakingpoint.com/services/.

Contact BreakingPoint

Learn more about BreakingPoint products and services by contacting a representative in your area.
1.866.352.6691 U.S. Toll Free
www.breakingpoint.com

BreakingPoint Global Headquarters
3900 North Capital of Texas Highway
Austin, TX 78746
email: salesinfo@breakingpoint.com
tel: 512.821.6000
toll free: 866.352.6691

BreakingPoint Government Solutions Group
Washington, D.C.
email: government@breakingpoint.com
tel: 703.443.1501

BreakingPoint EMEA Sales Office
Paris, France
email: emea_sales@breakingpoint.com
tel: + 33 6 08 40 43 93

BreakingPoint APAC Sales Office
Suite 2901, Building #5, Wanda Plaza
No. 93 Jianguo Road
Chaoyang District, Beijing, 100022, China
email: apac_sales@breakingpoint.com
tel: + 86 10 5960 3162

www.breakingpoint.com