When you're at war, your software needs to work -- not in the lab, but in reality. In the field! You don't have time to test your software in the lab, and you don't care whether it works in the lab. You need field-tested software. Software that works -- in the field -- where you need it to work.
Normal Software QA
Normal software QA pays lots of attention to the process of defining, building and deploying software. You hear phrases like "Do it once and do it right;" "quality by design;" "we don't just test, quality is part of our process." There are lots of them. They all, one way or another, promote the illusion that mistakes can somehow be avoided, and that we can -- finally -- have a software release that works, and works the way it's supposed to. This time we're going to take the time, spend the money, and do it right!
How did that work out for you? Most often, it's like predictions of the end of the world. The date comes, the world is still here, and people try to avoid talking about it. Similarly with that great this-time-we're-doing-it-right release: it ships, there are roughly the usual problems, and people try to avoid talking about it. Or there were fewer problems, but the expense and time were astronomical. Or there were fewer problems, but not much got released. Whatever.
Here are some favorite phrases: "It worked in the lab!" "How could we have anticipated that case?" "The test database just wasn't realistic enough." "Joey So-and-So really let us down on test coverage." "We had the budget to do it right, but not enough time." "We didn't have enough [tools] [training] [experienced people] [cooperation from X] [lab equipment]..." Excuses, every one. Perhaps there's a fundamental reason why we always fail?
This is a subject your CTO and your chief architect should stop ignoring and pay serious attention to. It isn't the only subject, but it sure should be #1 on the list.
QA Should be Field-Based
Who cares how the software operates anywhere except in production? The lab environment is always different from the production environment, and the most embarrassing problems are the ones where it worked in the lab but failed when deployed. The potential causes are endless: different machines, different loads, different network delays, different database contents, different user behavior, different practically-anything!
Given this, why wouldn't you test your software on the actual machines it will run on in production? Of course, you don't load-balance normal user traffic to the test build as though everything were hunky-dory. That's just asking for trouble. But it's not hard to send a copy of the traffic to the test machine. That alone tells you huge amounts. Did it crap out with a normal load? Now there's a real smoke test. And there's lots more you can do as well.
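The copy-the-traffic idea is often called shadow or mirror traffic, and the core of it is small: serve every request from production as usual, fire a duplicate at the test build on the side, and record (but never return) the test build's response. Here's a minimal Python sketch of that logic; all the names (`mirror`, `send_to_prod`, `send_to_shadow`, `shadow_log`) are illustrative stand-ins, not any real API.

```python
import threading

def mirror(request, send_to_prod, send_to_shadow, shadow_log):
    """Serve the request from production; fire a copy at the shadow build.

    The shadow call runs on its own thread, so a slow or crashing test
    build can never delay the real user. Its outcome goes into shadow_log
    for later comparison against production behavior.
    """
    def shadow_call():
        try:
            shadow_log.append(("ok", send_to_shadow(request)))
        except Exception as exc:
            # The test build blowing up must never affect production.
            shadow_log.append(("error", repr(exc)))

    t = threading.Thread(target=shadow_call, daemon=True)
    t.start()
    response = send_to_prod(request)  # the user only ever sees this
    t.join(timeout=2.0)               # bounded wait, just for this demo
    return response
```

In a real deployment this logic would live in the load balancer or a proxy in front of the servers rather than in application code, but the invariant is the same: the shadow copy is fire-and-forget, and its response is logged, never served.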
Conclusion
Your customers don't care how your software worked in the lab. They only care how it works for them. Yes, that's awfully self-centered, but that's just how they are, and no one is likely to talk them out of it. So live with it, and shift from pointless lab testing and back-office quality methods to actual field-testing of your software. Yes, it's messy, dirty and uncontrolled -- but it's real life! It's where your software has to run! Better it should get used to it sooner rather than later.