"This takes the imperfect parts of our simulation of production and attempts to move them into memory size, disk space, and other hopefully less interesting areas."
He's talking about irrelevant imperfections, which is a key concept in testing.
Assumption: No matter how hard you try, you will not completely mimic the production environment. Something will be different, whether it's number of users, amount of data in your database, machine configuration, use of self-signed vs purchased certs, etc.
The further goal of successful testing is to ensure that the differences between your test system and your production system are irrelevant. It's the tester's job to make this as true as possible.
Analysis of your system can help determine which imperfections are irrelevant:
- For many systems, the hardware resources won't limit most functional testing. If this is a standard webapp, for example, a machine with 2GB RAM is probably about as good as a machine with 4GB RAM.
- In general, the patch level of the underlying OS won't matter unless your product is particularly tied to the OS. Sure, Windows 2003 vs Windows 2008 Server will be different, but Windows 2008 Server patched vs unpatched probably wouldn't be relevant.
- The particular network configuration is probably safe to change in many cases. If you have a 192.168.x network internally and a 215.16.x network in production, that's hopefully irrelevant to any product issues.
All of these are imperfections, but for many systems they are irrelevant. Other imperfections are more relevant (e.g., moving the database server to a separate box might matter greatly to your application, or testing with 1 client when production usually has 10 clients connected simultaneously).
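This triage can be made explicit. As a minimal sketch (the attribute names, values, and the "relevant" set below are hypothetical examples, not from the original text), you can enumerate the differences between test and production and flag which ones your analysis says could actually change product behavior:

```python
# Sketch: enumerate test-vs-production differences and flag which are
# relevant. All attributes and the RELEVANT set are hypothetical.

TEST_ENV = {
    "ram_gb": 2,
    "os_patch_level": "unpatched",
    "network_prefix": "192.168",
    "db_on_separate_box": False,
    "concurrent_clients": 1,
}

PROD_ENV = {
    "ram_gb": 4,
    "os_patch_level": "patched",
    "network_prefix": "215.16",
    "db_on_separate_box": True,
    "concurrent_clients": 10,
}

# Judgment call from analyzing the system: these differences could
# change the product's behavior, so they must be tracked or closed.
RELEVANT = {"db_on_separate_box", "concurrent_clients"}

def environment_diff(test, prod):
    """Return (relevant, irrelevant) lists of differing attributes."""
    relevant, irrelevant = [], []
    for key in sorted(test.keys() & prod.keys()):
        if test[key] != prod[key]:
            (relevant if key in RELEVANT else irrelevant).append(key)
    return relevant, irrelevant

relevant, irrelevant = environment_diff(TEST_ENV, PROD_ENV)
print("Relevant differences:", relevant)
print("Irrelevant (accepted):", irrelevant)
```

The point of writing this down, even informally, is that the relevant list becomes an explicit work item (close the gap or test around it) rather than an unstated risk.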
Accepting that you won't completely mimic production, the onus is on you as a tester to figure out what is different and make sure that those differences are as irrelevant as possible.