Here's the thing, though: I'm not just verifying bugs. I'm performing lots of other tests at the same time.
For example, a bug I verified was a display problem with replication progress. This is a small issue, but, hey, it's fixed, so we'll verify it.
To verify it, I had to:
- install two systems
- create a volume on one system
- configure replication between the two systems
- configure replication on the volume
So, just to verify one bug, I had to do an installation test, a volume creation test, and a replication test. Each of those only needed a quick check to confirm it wasn't throwing errors invisible to the end user, but each one is another small test done. Repeat that enough times, and it adds up to rather a lot.
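The bookkeeping behind "this adds up" can be sketched in a few lines. Everything here is hypothetical: the bug names and test names are invented for illustration, not taken from any real tracker.

```python
from collections import Counter

# Hypothetical example: a few bug verifications and the setup tests each one
# implicitly exercises along the way.
verifications = {
    "replication progress display": ["install", "volume creation", "replication"],
    "snapshot schedule typo": ["install", "volume creation", "snapshot"],
    "login banner wording": ["install"],
}

# Tally how many times each implicit test got exercised across all verifications.
implicit_coverage = Counter(
    test for tests in verifications.values() for test in tests
)

print(implicit_coverage["install"])  # every verification started with an install
```

Three small bug verifications quietly produced three installation tests and two volume creation tests, none of which appeared on anyone's test plan.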
So next time you're verifying a simple bug, ask yourself what other tests you're doing. You may be accomplishing more than you think!
Are you familiar with Replay's recording and replaying tool for software debugging? It's like a DVR for applications - defects can be reproduced without the need to replicate the environment.
Check it out: replaysolutions.com
(Disclosure: I work for Replay)
I noticed that when verifying bugs I usually stumble on a 10-20% reopen-and-introduce rate. Wondering if you've experienced the same.
Basim: That reopen rate is much higher than I've experienced. Generally I see maybe 1 or 2 bugs reopen per release (so.... 1% or so?), and usually those are cases where there's been some sort of human error in checking in or merging (a missing file kind of thing). In terms of bugs introducing other bugs, that's a bit higher, but I don't have a great feel for the amount.
Vered: I've heard of the tool but not worked with it. I'll have to check it out.
Thanks, Catherine. Let us know what you think when you have some extra time.