"Requirements should be detailed, and should be set well before testers ever see the software."
I can see where that impulse comes from; it sure sounds like a nice world where we all - from product management to development to test to support - know what we're going to do and then do it and then go home and have a nice supper. News flash:
It's not going to happen.
This idea of preplanning is at best idealistic. It can also be dangerous, since it leaves you with no way to handle changes that arrive after that early detailed planning. In the real world, you can expect to see some or all of the following:
- Some behavior that is entirely correct according to the requirements but has an odd consequence.
- Use cases in the field that you never imagined and that want something just a bit different from the requirements you specified.
- Requirements that seem detailed but leave out some point of clarity that nobody notices until development or until test.
- Requirements that change due to some late-breaking information.
- Requirements that someone thought were going to satisfy a customer's need but don't quite actually do so.
All of these add up to two things: you will need to resolve ambiguity, and you will need to handle change in requirements. Fortunately, there are simple things you can do here:
- Communicate. Make sure there is a project meeting including everyone from product management to support at least once a week where you can talk through ambiguities and changes. Don't be shy about talking with other disciplines outside this meeting, too.
- Give Yourself Some Space. Changes and ambiguities generally mean re-testing or expanding your tests. Give yourself a little padding time for that.
- Test Use, Not Requirements. Make sure you test based on customer use cases rather than on the requirements derived from those cases. Requirements can be tested directly, but users ultimately care about how the system works as a whole more than about any single requirement.
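To make that last point concrete, here is a minimal sketch (the discount feature and every name in it are hypothetical, invented for illustration) of how a requirement-level check can pass while a use-case-level test surfaces one of those "entirely correct but odd" consequences:

```python
# Hypothetical example: the feature and function names below are
# invented for illustration, not taken from any real system.

def discount(total: float) -> float:
    """Requirement: orders over $100 get a 10% discount."""
    return total * 0.9 if total > 100 else total

def checkout(prices: list[float]) -> float:
    """Use case: a customer buys several items and pays once."""
    return round(discount(sum(prices)), 2)

# Requirement-level test: passes; the rule works in isolation.
assert discount(200.0) == 180.0

# Use-case-level test: two $60 items bought together earn the
# discount, but the same items bought as two separate orders do
# not -- an odd consequence the requirement alone never surfaces.
together = checkout([60.0, 60.0])
separately = checkout([60.0]) + checkout([60.0])
assert together != separately
```

The requirement is implemented exactly as written, yet only the test that walks through what a customer actually does raises the question of whether splitting an order should cost more.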