Friday, August 31, 2007

Testing for Typos

We've been having a debate lately about how much QA should test marketing output.

The History
Although we're a software company, there are things that (gasp!) engineering doesn't touch. One of the biggest of these is our corporate website. This is the domain of marketing. They've got a content management system for entering content, their own graphic designer for logos, screenshots, and the like, and the ability to publish straight to production.

Updating the corporate site is something that happens two or three times a week. It's mostly small stuff (a new news article, for example), but it changes a lot.

The Trigger
Sure, we're not SUPPOSED to test the corporate site, but we do use it to get to our downloadable software (log in, click a download link). Being testers, and being slightly picky about the English language, we noticed some typos on the site.

Nothing major, so we dropped a nice note to marketing whenever we noticed things. Then one day it happened. Our "public offering" was missing a critical letter "l", right on the front page of the site. No one caught it until a tester happened to be on the site the next day.

The Debate
Yes, marketing should have caught this. But they didn't, so how much should we test the corporate site? Sure, a tester found things, but that doesn't necessarily mean that QA should have to test the entire site. After all, QA is busy testing the actual software and shouldn't be blocking corporate website updates. However, quality control on the marketing side was clearly lacking.

The Resolution
We finally wound up approaching this problem two ways:
1. Marketing now runs the site through a link validation tool (WebLink Validator), which provides spelling and grammar checks and, as a bonus, catches dead links. (A rough sketch of what such a check boils down to is below.)
2. A marketing intern now comes over to QA for two hours a day and is learning how to test a text-heavy website. Just the intern, a dictionary, and lessons in close reading.
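For anyone curious what a tool like that is doing under the hood, here's a minimal sketch in Python using only the standard library. To be clear, this is not WebLink Validator (that's a commercial tool); the URL and dictionary path are placeholders, and a real checker would also handle encodings, redirects, rate limiting, and a whitelist of product names.

    # Minimal sketch of the two checks: dead links and possible typos.
    # Placeholder URL and dictionary path; not the actual tool we use.
    import re
    import urllib.request
    from html.parser import HTMLParser
    from urllib.parse import urljoin

    class PageScanner(HTMLParser):
        """Collects link targets and visible text from one HTML page."""
        def __init__(self):
            super().__init__()
            self.links = []
            self.text = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

        def handle_data(self, data):
            self.text.append(data)

    def check_page(url, dictionary_path="/usr/share/dict/words"):
        # Load a word list; this path is a common Unix location.
        with open(dictionary_path) as f:
            known = {line.strip().lower() for line in f}

        html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
        scanner = PageScanner()
        scanner.feed(html)

        # Dead-link check: report any http(s) link that fails to open.
        for link in scanner.links:
            target = urljoin(url, link)
            if not target.startswith(("http://", "https://")):
                continue
            try:
                urllib.request.urlopen(target, timeout=10)
            except Exception as exc:
                print(f"DEAD LINK: {target} ({exc})")

        # Naive spell check: flag words missing from the dictionary.
        words = re.findall(r"[A-Za-z]+", " ".join(scanner.text))
        for word in sorted({w.lower() for w in words} - known):
            print(f"POSSIBLE TYPO: {word}")

    if __name__ == "__main__":
        check_page("http://www.example.com/")  # placeholder URL

A script like this produces plenty of false positives (proper nouns, product names, jargon), which is exactly why resolution #2 matters: a human with a dictionary still has to read the output and the page.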
