There are aspects of the theory and practice of computer software that drive me nuts.
I look at a widely accepted theory or practice, and am aghast that the cancer spread and became part of the mainstream. I look at a genuinely good practice and am shocked at the way everyone lies and slithers to make it sound to the unsuspecting that whatever lousy thing they do is a shining example of a good thing. I look at a nearly-universal approach to problem solving that may have made sense many Moore's-Law generations ago, but has long since been rendered irrelevant by advances in hardware. And I look at proven, understandable techniques that improve productivity by whole-number factors that are spurned and/or ignored by most people in the field.
How significant is this stuff I'm complaining about? A minimum of a factor of 10 in productivity. Sometimes a great deal more. So it matters! This may sound theoretical, but it has real-world, practical implications. It's the only way, for example, that small companies can beat large, established ones that have software staffs 10 to 1,000 times larger.
I think I understand some of the causes of this deplorable state of affairs. But that doesn't make me feel better. And it definitely doesn't cure the sick or empower the unproductive.
A number of my private-distribution papers go into these subjects in considerable depth. But I thought it might be interesting to summarize a couple of the main observations here.
Among the causes of bad software are:
The blind and deaf leading the blind. In the majority of cases, the people in charge have little to no personal experience of creating software, and no interest in how it's done. This is about as sensible as having someone in charge of a baseball team who not only can't play baseball, but can't see onto the field where it's being played; all he can do is see the score at the end and hear reports from the players about how things are going. So everything turns into politics -- convincing the blind, clueless boss that you're the great contributor, everyone else is a chump. The larger the organization, the more likely it is to be led by such a genius.
Process over substance. The larger and older the organization, the more likely it is to elevate the importance of process, and the more elaborate and all-consuming that process is likely to be. The more process there is, the less code gets written, and the productivity of the innocent few who actually want to work gets ground down to the abysmal norm.
Lawyers. Shoot the lawyers! Shoot them! They and their legal ways are a plague. When lawyers see a problem, they want to write a law or create a regulation that makes the problem go away. Except that instead of saying this in simple, results-oriented terms (e.g., "programmers should not be allowed to die unnecessarily while writing code"), they say it in terms of incredibly elaborate, micro-managing, this-is-how-thou-shalt-do-it regulations (e.g., "programmers should be offered a nutritional meal no later than 12 noon local time each day consisting of no less than..." and 538 similar instructions, constantly growing). If regulations of this kind actually led to good results, it would be one thing. All I should have to say at this point is "credit card theft" and PCI.
Fashion over function. Programmers are supposed to be nerds, aren't they? How can programmers let their decisions be made by "fashion," whatever that's supposed to be in the world of software? I refer you to the first point, about software teams being led by people who are clueless. These people want to seem smart in the eyes of their bosses and peers, who know even less than they do. So they make decisions that are guided by C-office fashion trends, which are usually laughably out of step with truly optimal decisions.
A preference for bad new ideas over good older ones. How do bad ideas catch on? Some company promotes them. Books are written. Appealing rhetoric is created and repeated. Suddenly a fashion trend is born. Someone can appear smart and modern just by advocating the cool new stuff, without actually knowing anything or doing anything. Everyone involved thinks they're helping advance the state of software, when in fact they're digging the hole of despair deeper. It's remarkably like when the dumbest guy in Scotland moves to England, thereby increasing the average intelligence of both places.
A preference for bad old ideas over good newer ones. There aren't many good new ideas; mostly there are old good ideas that are still good. But because conditions change (like processors getting faster and memory getting cheaper), ideas emerge that respond intelligently to the new state of affairs (like this one), instead of blindly and stupidly continuing on as though nothing had changed. Like most people do.
Wrong scope; usually too narrow. This is classic optimization over too narrow a range. Much bad software results from compartmentalism, or simply from failing to look at things through the eyes of the ultimate consumer. This is an incredibly common mistake. The trouble gets really bad when the practices that result from the little "silos of excellence" get elevated into industry "best practices." (Whenever I hear the phrase "best practice," it's really hard for me to avoid putting on a serious look and pronouncing in deep tones, "you folks had best practice a bit longer before you inflict yourselves on the world of paid, professional programming.")
I'm sure there are many more causes of bad software than the ones I've listed here -- we haven't even gotten into bad programmers in their innumerable incarnations! But every item on the list above thoroughly deserves inclusion, all the more because most of them are rarely named as causes of software malaise.