I've been on a "simplify, simplify" trend recently. Complex systems are a fact of life, and sometimes we have to address that complexity head on. However, for many tests and for many types of tests, addressing a complex system isn't really necessary. We can often break a system into its constituent parts and test those.
Breaking a system into its constituent pieces gives us direct access to each part of the system for testing. We can better exercise things we can interact with directly; inserting abstractions - other methods or entire layers - makes it harder to actually touch the thing we're testing. Take this notion far enough and we've got a unit test. Take it a bit less far and we've got a manageable test that exercises what we want it to, and nothing extra.
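A minimal sketch of what this looks like in practice. The names here (`format_row`, `build_report`) are invented for illustration, not from any particular system:

```python
# Hypothetical example: a small report "system" broken into pieces
# we can touch directly, rather than only through the whole pipeline.

def format_row(name, total):
    """One small, directly testable piece: formats a single report row."""
    return f"{name}: ${total:.2f}"

def build_report(entries):
    """The larger system composes the pieces; it gets its own tests."""
    return "\n".join(format_row(name, total) for name, total in entries)

# Because format_row is exposed, we can exercise it directly --
# no need to construct an entire report just to check one row's formatting.
assert format_row("widgets", 3.5) == "widgets: $3.50"
```

If `format_row` were buried inside `build_report`, every formatting test would have to go through report assembly, dragging in variables that have nothing to do with formatting.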
How simple is simple enough? The short answer is "as simple as is relevant and no simpler". So our challenge is to figure out how small a piece of the system can be while still producing a test with all the relevant variables. There is no magic tool for this; it requires pretty good knowledge of your system. Start with the entire system and keep removing cruft (i.e., things in your system that are great and relevant, but not for this particular test) until removing anything more would change your test's results. I find it easier to go through this paring-down process with just a few test cases; that makes it much easier to see when your results change and you've pared down too much.
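As an illustrative sketch of that paring-down process (the discount rule and field names below are made up for this example), suppose we're testing a rule that only depends on order quantity:

```python
def unit_price(quantity):
    """Hypothetical bulk-discount rule: orders of 10+ get a lower per-unit price."""
    return 0.90 if quantity >= 10 else 1.00

# Cruft-laden version: customer name, address, and date are great and
# relevant to the wider system, but not to this particular rule.
order = {"customer": "Ada", "address": "1 Main St", "date": "2024-01-01",
         "quantity": 12}
assert unit_price(order["quantity"]) == 0.90

# Pared-down version: only the variable the rule actually depends on remains.
# If this test's results change, we know exactly which variable to look at.
assert unit_price(12) == 0.90
assert unit_price(9) == 1.00
```

Notice that paring down any further - dropping the quantity itself - would change the test, so this is "as simple as is relevant and no simpler".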
So, when complexity is needed - integration tests, full-system tests, etc. - tackle the complexity of your system head on. When complexity is not needed, though, test the system simply and directly. You'll get more precise results... without pulling your hair out!