Things start off simply enough: testers set up a system (or several) with the candidate code running, and go through finding stuff. Combine that with the bugs that just didn't quite get fixed during feature development, and dev's busy knocking down their bug queues. All well and good so far.
Shortly after that, though, we start to have a bit of a testing dilemma. The list of bugs awaiting verification is growing, and the list of areas of the product you still have to cover is shrinking but still definitely far from zero. What do you do first? Prove the fixes, or go looking at areas of the product you haven't touched yet?
In other words, how do you balance coverage and verification?
As with most things we talk about, the answer is "it depends" and "probably not 100% of either". There are good arguments for both actions.
Benefits of increasing coverage:
- If there's a scary issue you haven't yet found because you haven't looked in that area, well, better to find it sooner rather than later.
- The less that is untested, the lower your risk overall (I tend to believe that minimizing unknowns generally minimizes risk).
Benefits of verifying bugs:
- It closes out the lifecycle nicely - you really know it's done.
- If there are any bugs hiding behind the bugs you're verifying, you're likely to catch them here. You are still improving coverage by increasing your depth of coverage in that area.
Typically, I'm going to start off a release really worrying about coverage, and over time we'll increase the amount of verification we do, finishing with a coverage-oriented pass. The schedule looks something like this:
- Beginning of the cycle: 80% of the team's testing time is spent on extending coverage. Until we've done one pass through pretty much everything, I'll err on getting that one pass done.
- About 35% of the way through: Now that we've hit the basics, we start worrying more about finishing out some of this work - say 50% on verification and 50% on more general coverage increases.
- About 60% of the way through: We've flipped and are spending most of our time verifying defects; we do get some incidental increased coverage but it's less of a focus.
- About 90% of the way through: There's a big flip at the end of the release cycle. By now you should have verified pretty much everything that's coming in, and your find rate is probably pretty darn low, so we're just looking at coverage again. 95% coverage increase, 5% rounding out those last few bugs.
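To make the schedule above concrete, here's a minimal sketch of it as a lookup function. The phase boundaries and splits come straight from the list; the one assumption is the third phase, where "most of our time verifying" is pinned down (hypothetically) as a 20/80 split. The function name and shape are mine, not anything standard.

```python
def allocation(progress):
    """Return (coverage_share, verification_share) of the team's testing
    time, given the fraction of the release cycle elapsed (0.0 to 1.0).

    Splits mirror the schedule described above; the 0.20/0.80 split for
    the third phase is an assumed reading of "most of our time verifying".
    """
    if progress < 0.35:
        return (0.80, 0.20)   # beginning: err on getting one full pass done
    elif progress < 0.60:
        return (0.50, 0.50)   # basics covered: start finishing out verification
    elif progress < 0.90:
        return (0.20, 0.80)   # mostly verifying; coverage is incidental
    else:
        return (0.95, 0.05)   # end of cycle: back to coverage, round out last bugs
```

In practice you'd treat these numbers as a starting point and adjust them release by release, as discussed below.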
This is really just a general guideline. Based on what we're finding and what's getting fixed, we'll change it up a little. If our reopen rate is pretty high, we'll err more on the side of defect verification. If this release has touched some deep underlying parts of the code, we may err on the side of increasing coverage sooner to pick up side effects. The point is more to introduce some predictability - depending on where you are, it's okay that bugs are piling up a little bit in QA, or it's okay that you haven't touched an area yet - and to make sure that you're setting expectations correctly.
In the end, you're going to need both defect verification and some level of coverage before you're through with a release cycle. Think in advance about how you're going to achieve both, and you're much more likely to actually do so in a manner that gives you the feedback you need as early as possible.