Low-Cost ICS Network Performance Testing
Network performance testing for devices and systems can be a daunting task for vendors and end-users, given the cost of test equipment and the investment companies must make in developing relevant tests and understanding the results. Over the last couple of years, a group of low-cost computing systems has been introduced that are very capable from a functional point of view, but how well do they actually perform? Can they be used in a low-cost performance-testing lab to validate ICS devices before they go into production? Can end-users use them to capture live traffic on their networks and get reliable performance results? This talk will discuss how and when different types of equipment can be used to develop a low-cost network performance testing lab. It will also show results from a series of performance tests conducted on some of the equipment and with different testing architectures.

Transcript

  • 1. Low-Cost ICS Network Performance Testing Jim Gilsinn Kenexis Consulting June 6, 2014 SCADASides 1
  • 2. How This Got Started • In 2001 while I worked @ NIST my boss said: • Industrial Ethernet is the next big wave for manufacturing, so say our customers (auto manufacturers) • There are still a lot of questions about how well it performs • Is it deterministic enough for the factory floor? Yes, but… • Are there standardized metrics to show performance? Yes, but… • Are there test tools available? Yes, but… • Can companies put performance requirements into their procurements yet? Yes, but…
  • 3. Determinism • Vendors were building industrial Ethernet products that claimed certain performance • End-users were finding quirky performance • End-users would complain • Vendors would say, it works in our lab, there must be a problem in your system • End-users learned not to trust performance claims from vendors • Some built labs to approve devices before implementing them
  • 4. Standardized Metrics • Vendors would describe their performance in many different ways and with varying definitions • With ODVA, I helped to create a standard set of metrics for end-point devices based upon IETF definitions • Throughput • Jitter/Variability • Latency (action latency, response latency)
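The three metrics above can be computed directly from packet timestamps. A minimal sketch (the function names and the max-deviation definition of jitter are illustrative, not taken from the ODVA/IETF documents):

```python
from statistics import mean

def throughput_pps(timestamps):
    """Packets per second over the capture window."""
    span = timestamps[-1] - timestamps[0]
    return (len(timestamps) - 1) / span

def jitter(timestamps):
    """Variability: worst deviation of an inter-packet gap from the mean gap."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    m = mean(gaps)
    return max(abs(g - m) for g in gaps)

def latency(sent, received):
    """Per-packet latency between matched send/receive timestamps."""
    return [r - s for s, r in zip(sent, received)]
```

In practice the timestamps would come from a capture file; the reliability of all three numbers depends on the timestamping accuracy of whatever did the capturing, which is the crux of the rest of this talk.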
  • 5. Test Tools • After creating the metrics, NIST helped ODVA develop a set of performance tests • We built the ODVA Performance Testing Laboratory, which ODVA uses to certify companies' stated performance for a fee • No one has run the test since no one wants to fail • ODVA charges each time a company tests and retests • NIST went on to develop a free capture file analysis tool • Available on SourceForge (1st gen is IENetP, 2nd gen is FENT) • Both of these are dormant • NIST also worked with the ODVA Interoperability Workshop to develop a series of 5 tests that could be conducted quickly
  • 6. Procurement Language • Big auto manufacturers have tried to get their vendors to use the ODVA performance lab • Hasn’t worked out well • Have convinced vendors to go through PlugFest testing • Vendors and end-users have started using a common language • I guess that’s as good as it gets for now
  • 7. Low-Cost Performance Testing • Uses low-cost/readily-available equipment • Low-cost is relative, $15 – $3k • Readily-available, like laptops, switches, etc. • Uses open-source/low-cost/readily-available software • Open-source, like Linux, Wireshark, background traffic, and analysis tool • Low-cost analysis tool (Kenexis, in development) • Readily-available, like Windows, Office, browsers • Additional useful tools • Protocol-dependent master/scanner (software will get you ~2ms)
  • 8. Testing Equipment • Laptops x2 • Alienware M14x-R2 • Ubuntu 14.04 native • Windows VM • Backtrack 5r3 USB • DreamPlug • Raspberry Pi • Model B, rev 1 • Netgear GS108E Switch • Throwing Star LAN Tap • Hilscher netANALYZER
  • 9. Testing Software • Linux (Ubuntu 14.04, Backtrack 5r3, Kali) • Wireshark (apt-get and compiled) • PlugFest background traffic captures and scripts • NIST Analysis Tool • 1st Generation = IENetP – http://www.sourceforge.net/projects/ienetp • 2nd Generation = FENT – http://www.sourceforge.net/projects/fent • Kenexis Analysis Tool • Follow-on, in development
  • 10. PlugFest Background Traffic • Traffic Captures • Generated by Ixia network analyzer and packet generator • Assembled into different sets (editcap & mergecap) • tcpreplay Scripts • Generated Linux scripts to replay capture files • Conducted Analysis of Results • Packet generator transmitting • Laptop transmitting • Laptop receiving
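The capture-assembly and replay workflow above can be scripted. A sketch that only builds the command lines (the file and interface names are hypothetical, and editcap, mergecap, and tcpreplay must be installed to actually run them):

```python
def trim_cmd(infile, outfile, start, stop):
    # editcap -A/-B keep only packets inside a time window
    return ["editcap", "-A", start, "-B", stop, infile, outfile]

def merge_cmd(outfile, *infiles):
    # mergecap -w interleaves several captures by timestamp into one file
    return ["mergecap", "-w", outfile, *infiles]

def replay_cmd(pcap, iface, loops=1):
    # tcpreplay resends a capture onto a live interface at its original timing
    return ["tcpreplay", "--intf1", iface, "--loop", str(loops), pcap]

# e.g. subprocess.run(replay_cmd("background.pcap", "eth0"), check=True)
```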
  • 11. PlugFest Background Traffic
    Test scenarios: Baseline, Steady-State Managed, Steady-State Unmanaged, Burst Managed, Burst Unmanaged

    Traffic Type                      Rate (pps)
    ARP Request Broadcasts            180
    Gratuitous ARP Broadcasts         180
    DHCP Request Broadcasts           100
    ICMP (ping) Request Broadcasts    100
    NTP Multicasts                    10
    EtherNet/IP ListIdentity Req.     10
    EtherNet/IP Class 1               1800
    ARP Burst Requests                240 pkts @ 4 kHz
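As a back-of-envelope check, the steady-state rates in the table sum to a few thousand packets per second (this assumes all flows run concurrently; which flows appear in which test scenario did not survive extraction from the slide):

```python
steady_rates_pps = {
    "ARP request broadcasts": 180,
    "Gratuitous ARP broadcasts": 180,
    "DHCP request broadcasts": 100,
    "ICMP (ping) request broadcasts": 100,
    "NTP multicasts": 10,
    "EtherNet/IP ListIdentity requests": 10,
    "EtherNet/IP Class 1": 1800,
}
total_pps = sum(steady_rates_pps.values())  # 2380 pps if every flow is active
burst_duration_s = 240 / 4000               # 240-packet ARP burst at 4 kHz = 60 ms
```

The EtherNet/IP Class 1 traffic dominates the steady-state load, which matches its role as the cyclic I/O traffic an ICS device actually has to keep up with.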
  • 12. PlugFest Testing Architecture
  • 13. Eye Chart Slides Ahead
  • 14. Example PlugFest Testing (Hilscher)
  • 15. Example PlugFest Testing (Switch Mirror)
  • 16. Low-Cost Testing Architecture
  • 17. Low-Cost Testing • Laptop → Laptop • Laptop → DreamPlug • DreamPlug → Laptop • Laptop → Raspberry Pi • Raspberry Pi → Laptop
  • 18. [results chart]
  • 19. What The Data Shows • Hilscher Capture Card • 10ns resolution time stamping • Hardware assisted • Good enough for hard real-time performance testing (1s of µs) • High-End Laptop • Backtrack/Kali better than Ubuntu • Running from USB stick works • Good enough for soft real-time performance testing (~100 µs)
  • 20. What The Data Shows • DreamPlug • Good enough for most process control • Offset of mean (~5-10 ms) • Random delays occur (~5-20 ms, sometimes 100+ ms) • On-par with Windows performance • Raspberry Pi • Good enough for slow process control • Offset of mean (~5-25 ms) • Random delays occur (100-1000 ms)
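One way to read these numbers: treat a capture platform as usable when its worst-case timing error is a small fraction of the control cycle being measured. The 10% budget below is a rule of thumb of mine, not a figure from the talk:

```python
def usable_for_cycle(worst_case_error_s, cycle_time_s, budget=0.1):
    """True if the platform's worst-case timing error fits within
    the given fraction of the control cycle time (10% by default)."""
    return worst_case_error_s <= budget * cycle_time_s

# Approximate worst-case errors quoted on the last two slides:
# Hilscher card ~1 µs, laptop ~100 µs, DreamPlug ~0.1 s, Raspberry Pi ~1 s
assert usable_for_cycle(100e-6, 0.010)   # laptop vs a 10 ms cycle: OK
assert not usable_for_cycle(0.1, 0.100)  # DreamPlug vs a 100 ms cycle: not OK
```

By this measure the occasional 100+ ms delays, rather than the few-millisecond mean offsets, are what push the DreamPlug and Raspberry Pi out of fast-control testing and into the "slow process control" territory the slide describes.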
  • 21. More Information • Jim Gilsinn, Kenexis Consulting • Email: Jim.Gilsinn@Kenexis.com • Phone: 614-323-2254 • Twitter: @JimGilsinn • SlideShare: http://www.slideshare.net/gilsinnj • Kenexis GitHub • https://github.com/kenexis/LowCostPerformance