Sunday, June 17, 2012

Introduction to Agile Grinding

Hello everyone - this is hopefully the first of many posts exploring the features of the performance test tool called The Grinder. I've used a lot of different performance test tools in my career, and I've come to realise that The Grinder strikes an elegant balance between functionality, usability, and flexibility. It doesn't have all the bells and whistles of an enterprise performance test tool such as HP LoadRunner or Borland SilkPerformer, but then again it also doesn't have the five- or six-figure price tag. What it does have is a very rich set of features out of the box and, through the embedded Jython interpreter, a comprehensive and easily extensible set of programming tools that can leverage all the power of Java directly from within Jython and The Grinder. Need a JSON parser? Simply import the Codehaus Jackson JSON stream reader. Need to integrate with ActiveMQ? Simply import the ActiveMQ client libraries and start processing messages from a JMS queue.
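To give a flavour of that Java interop, here is a rough sketch of parsing a JSON response with the Codehaus Jackson library from inside a Jython script. It assumes Jython and the Jackson 1.x jars are on the worker process CLASSPATH; the JSON payload and field names are invented for the example.

```python
# Sketch only: needs Jython plus the Jackson 1.x jars on the CLASSPATH.
# The payload below is a made-up example response body.
from org.codehaus.jackson.map import ObjectMapper

mapper = ObjectMapper()

# readTree() parses the JSON text into a tree of JsonNode objects,
# which we can then navigate field by field.
node = mapper.readTree('{"status": "ok", "latencyMs": 42}')
print node.get("status").getTextValue()
```

In a real test script the JSON text would come from an HTTP response body rather than a string literal, but the parsing code is identical.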

At the same time, the basic workflow of creating and running test scripts is easily learned by non-specialist programmers. Once you get your head around the Python/Jython syntax, you'll find it hard to go back to languages such as Java or C#. For a start, you'll keep forgetting to terminate statements with ";" and to wrap blocks in "{}" instead of indentation! Jython makes for clean, succinct scripts, and it is a pleasure to "hit the button" and watch the script spin up within The Grinder framework and start producing hands-on, relevant, and reliable performance test results.
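As a tiny, self-contained illustration of that style (the function and the timing figures are invented for the example), note there is not a semicolon or brace in sight; indentation alone delimits the blocks:

```python
# Pure-Python example: blocks are marked by indentation, statements need
# no terminating ";". The numbers are made-up response times in ms.
def summarise(response_times):
    """Return (mean, max) of a list of response times in milliseconds."""
    if not response_times:          # indentation, not {}, marks this block
        return (0.0, 0.0)
    mean = sum(response_times) / float(len(response_times))
    return (mean, max(response_times))

print(summarise([120.0, 80.0, 100.0]))
```

The equivalent Java would need a class declaration, type annotations, braces, and semicolons before it did any work at all.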

My background spans over 18 years of Linux systems administration, performance testing, and performance analysis. I'm going to start with a general introductory tutorial covering all the main features of The Grinder, and then begin developing a framework of Open Source tools around The Grinder that focuses on Agile Methods. I have noticed that almost all areas of software development are adopting agile principles in part or in full. This includes functional testing, with such paradigms as Test Driven Development. In my experience, however, performance testing still tends to operate in a very "once it is completely developed, then we'll do a complete performance test" waterfall approach. This needs to change: performance testing must become more flexible and provide much earlier feedback so it can integrate properly into agile methodologies. I don't see the commercial enterprise test tools being capable of entering this space, because what is needed is small, fast, simple iterative feedback that is directly relevant to the developers cutting the code. Too often performance test results are remote in time and space from the very people who need to understand and respond to them, namely the developers who need to know immediately: "if I check in this one-line change, what will be the performance implications?".
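As a taste of how small such a feedback loop can be, here is roughly what a minimal Grinder 3 Jython script looks like: one numbered, timed test against a single URL. The URL is a placeholder for whatever endpoint the developer just changed, and the sketch assumes the standard Grinder 3 HTTP plug-in.

```python
# Minimal Grinder 3 script: one timed, numbered test hitting one URL.
# Run it under the Grinder framework; it is not a standalone program.
from net.grinder.script import Test
from net.grinder.plugin.http import HTTPRequest

# Test number 1, with a description that appears in the results output.
test1 = Test(1, "Front page")
request = test1.wrap(HTTPRequest())   # wrap() makes the call timed and recorded

class TestRunner:
    # The Grinder instantiates TestRunner once per worker thread
    # and calls it once per run.
    def __call__(self):
        request.GET("http://localhost:8080/")   # placeholder URL
```

A script this size can run in seconds against a development build, which is exactly the kind of immediate, check-in-level feedback the enterprise tools are too heavyweight to provide.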
