This document discusses the performance-testing challenges faced by an agile development team working on a performance-critical Java application. The team estimated that manually executing its performance tests against 9 configurations would take more than one man-month. To address this, the document evaluates options such as adding more performance engineers, limiting the number of tests and configurations, or automating performance testing, and recommends automation for its benefits: tests can run continuously, and a small team can test performance efficiently. The case study details how the team automated testing with JMeter, built a process integrated with TeamCity, and upgraded its infrastructure to support concurrent test runs. Automation reduced the testing cycle from over one man-month to 4 days, freeing time for analysis and additional testing, and uncovered 17 issues.
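
The document does not reproduce the team's actual build configuration, but a minimal sketch of the approach it describes is a TeamCity command-line build step that launches a JMeter test plan in non-GUI mode. The file paths and parameter names below are hypothetical; the JMeter flags (`-n`, `-t`, `-l`, `-e`, `-o`, `-J`) are standard CLI options:

```shell
# Hypothetical TeamCity "Command Line" build step: run a JMeter test plan
# in non-GUI mode (-n) so it can execute unattended on a build agent.
#   -t     test plan to run (path is hypothetical)
#   -l     raw sample log, published as a build artifact
#   -e -o  generate the HTML dashboard report into a directory
#   -J     pass properties the plan reads via ${__P(threads)} etc.
jmeter -n -t perf-tests/checkout-load.jmx \
       -l results/checkout-load.jtl \
       -e -o results/report \
       -Jthreads=50 -Jrampup=60
```

Running each configuration as its own build step (or build configuration) is one way such a setup could cover all 9 configurations concurrently once the infrastructure supports parallel agents.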