Friday, August 28, 2009

Rethink Your Test Plan

When we're testing for a release, our test plan breaks down about like this:
  • 90-95% regression tests
  • 3-8% new twists on existing stuff
  • 2-7% new stuff
Of course this varies a bit across releases. Every once in a while we get a huge new feature, and every once in a while we'll have a release with nothing truly new ("it's everything you had before... better!"). But in general we have a system that has been around a while, is fairly featureful, and has an awful lot of areas that should still work just like they did before.

(TANGENT) In general I like existing features to stay for the most part. I tend to be suspicious of products that are always doing "clean sweeps" or huge changes; I think it implies you didn't really think the product through the first time and you're probably not really thinking it through this time, either. When you have something good, keep it. (END TANGENT)

The downside for testing is that things start to get stale. You've started a new release and you have to test file-level enforcement of ACLs... again. To a certain extent you can automate away some of the boring parts, but not everything. So you fall into a rut, and then you start missing stuff.

So you change it up. You reformat your test plan. You move sections around. We've been doing this for about a year now and it's helped us find things. Some of the things have been new bugs, others have been areas that we weren't covering well at all.

The next step is to rethink the technique your test plan implies. One of our discoveries was that our test plans made a smoke test very hard. The testing provided by the automated checkin suite was fine, and that gave us an initial smoke test. The nightly automated tests provided some level of coverage in a guaranteed amount of time. But the human testing... that went deep quickly and gained coverage slowly. Basically, the test plan was encouraging us to test the heck out of, say, volume creation, even as we managed to ignore, say, NIS integration until later in the process. We eventually hit it all, but we could do it better.

So now we're reformatting again. This time we're explicitly calling out a "smoke test" of each section. We'll do all the smoke tests early, and then go deep.
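To make the ordering concrete, here's a minimal sketch of that "all smoke tests first, then go deep" scheduling over a checklist-style plan. The section names and checklist items below are hypothetical illustrations, not the actual plan:

```python
# A checklist-style test plan: each section explicitly calls out its own
# smoke test alongside its deeper checklist. Section names and items are
# made-up examples for illustration.
TEST_PLAN = {
    "volume creation": {
        "smoke": ["create one volume and write to it"],
        "deep": ["create maximum number of volumes",
                 "create volume with invalid name"],
    },
    "NIS integration": {
        "smoke": ["look up one NIS user"],
        "deep": ["NIS server failover",
                 "stale NIS cache handling"],
    },
}

def execution_order(plan):
    """Return (section, phase, item) tuples with every section's smoke
    checklist scheduled before any section's deep checklist."""
    order = []
    for section, checklists in plan.items():       # breadth first: all smoke
        for item in checklists["smoke"]:
            order.append((section, "smoke", item))
    for section, checklists in plan.items():       # then depth: all deep
        for item in checklists["deep"]:
            order.append((section, "deep", item))
    return order
```

The point of the structure is that no feature area waits until late in the cycle for its first pass; every section gets touched before any section gets exhausted.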

Test plans are living documents. They're a record of what you're doing, but they also guide your thinking. So when your results get stale, change the test plan. When your thinking gets stale, change the test plan. It will change your tests... see what you can find now!


  1. Catherine,

Well-placed post; you have elaborated the importance of test plans quite well. As you said, it's a living document, but nowadays agile folks or Exploratory testing discourage having test plans or other documents. I want to know what's your point of view in this regard?

Nice post again :) and I'm really thankful for your kind reply. :)

  2. I'm not sure that I totally agree that Exploratory testing discourages documentation. I personally am very into documentation - I think sometimes that if I don't write it down I'll forget it!

    There is a distinction between different kinds of documentation. Checklists, for example, are a hugely different thing from a document containing 1000 scripted test cases. I tend to have a couple different kinds of documents around:
    - an overview of testing at Permabit, generally intended for new testers and new developers. This tends to be a basic "what each part is supposed to be doing" and "so you wanna run a test" type things.
    - a test strategy describing Permabit's approach to testing in general. This one is mostly used for customers, analysts, etc. who want to understand how we ensure product quality.
    - a test plan that basically consists of a series of checklists for things we're going to do in human testing. This is the one we live in while we're doing testing preparatory to a release. To keep it fresh, we change this one a lot. It's generally feature oriented (e.g., a "WORM" section and a "replication" section), but lately we've been moving to more integrated "real world-ish scenario" tests to try to change up our coverage and increase cross-feature testing.

    That was a really long way to say that I think you have to write something down eventually, in order to remember what you've done, to avoid repeating each others' work, and to show people outside the group what you've done.

  3. Thanks for the response, Catherine. I got your point now :)