How often do software projects fail? While it’s rarely talked about and almost never makes the news, the fact is that software projects fail with alarming frequency. Exactly how often do they fail? Despite august professors of Computer Science and Computer Engineering falling out of the woodwork, doing research, pronouncing their deep knowledge in papers and teaching the next generation of students, the truly shocking answer is that no one knows! Or cares! Oh, they say they care about teaching people how to create great software; all you have to do is learn the intricacies of Object-Oriented Programming (or whatever). But none of them studies the results. There is no science in what is called “Computer Science.” They all declare they know how to create good software – so long as no one takes the trouble to measure and count, judge and evaluate, to see how often it actually happens.
In the past, I’ve compared software failures to bridges falling down. We can understand this from another angle by looking at … baseball.
Baseball Science
Aaron Judge is an amazing professional baseball player with the New York Yankees. He’s an excellent fielder, but he’s best known as a hitter. In his nine seasons, he’s already hit over 300 home runs, including 50 this season as of this writing. His batting average this season is an awesome .331, with a .288 average for his career.
Let’s think about that enviable .331 batting average he’s got this year. Any player would love to have such an excellent average. But that’s in the context of professional baseball. Simple arithmetic tells you that this “excellent” average means he gets a hit in only a third of his at-bats! More than two-thirds of the time he strikes out or otherwise fails to get a hit! What would you think of someone who managed to drive safely to work just a third of the time, getting into accidents or otherwise screwing up two-thirds of the time? What would you think of a car that only worked a third of the time you tried to drive it? And so on … you get the idea.
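The arithmetic behind that comparison is trivial, which is part of the point. Here’s a quick sketch in Python; the numbers are round illustrations of a .331 average, not official statistics:

```python
def batting_average(hits: int, at_bats: int) -> float:
    """Fraction of at-bats that produced a hit."""
    return hits / at_bats

# Illustrative numbers: 331 hits in 1,000 at-bats is a .331 average.
avg = batting_average(331, 1000)
print(f"batting average: {avg:.3f}")       # batting average: 0.331
print(f"failure rate: {1 - avg:.0%}")      # failure rate: 67%
```

A "great" hitter, measured the way we measure cars or commutes, fails about two-thirds of the time.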
Why is this? Are all these highly paid pro ball players really losers? Of course not. You can see what’s going on if you watch the home run derby, when top hitters go to the plate and get soft, easy pitches from a pitcher standing behind a screen so he doesn’t get hit. Those guys nearly always hit the ball, and lots of the hits are home runs! But we all know that’s a once-a-year event.
Most of the time, there’s a highly skilled pitcher on the mound whose goal in life is to strike out the batter or sucker him into hitting an out. Pitchers like Gerrit Cole, who is so good that he’s being paid $36 million this year.
When you’re a batter and walk up to the plate facing Gerrit Cole, you know you’ve got someone who’s in an excellent position to lower your batting average.
So is creating software like trying to get a hit with a fearsomely skilled pitcher on the mound? Or is it more like driving your car to work, or designing and building a reliable car that just works, with remarkably few exceptions?
The sad fact is that building software resembles the home run derby, except that instead of trying to hit the ball out of the park, all the batter has to do is … not miss. With a rule like this, you’d expect something close to a 1.000 batting average. They try to make it even easier in the software world by messing with requirements, tripling estimates and doing everything they can to make the project a “success.”
Is software really that bad? Yup. Just for fun, I’m going to share a couple of the rarely publicized stories of software failures I made note of a dozen years ago. With things like ransomware exposing the ugly underside of most software operations -- not to mention the awfulness of computer security in general -- you can be sure things haven’t gotten better.
Sample Failures
In 2014, the VA admitted that it had over 57,000 patients waiting for their first visit. What was its excuse?
"The official also said the VA needs to update its scheduling software package, which the department has been using since 1985. “It predates the internet and Blockbuster’s rise and fall,” he said."
Does that count as a software failure? It's more like the software department died years ago and no one noticed.
Here's a good one from 2012:
The U.S. Air Force has decided to scrap a major ERP (enterprise resource planning) software project after spending US$1 billion, concluding that finishing it would cost far too much more money for too little gain.
Dubbed the Expeditionary Combat Support System (ECSS), the project has racked up $1.03 billion in costs since 2005, “and has not yielded any significant military capability,” an Air Force spokesman said in an emailed statement Wednesday. “We estimate it would require an additional $1.1B for about a quarter of the original scope to continue and fielding would not be until 2020. The Air Force has concluded the ECSS program is no longer a viable option for meeting the FY17 Financial Improvement and Audit Readiness (FIAR) statutory requirement. Therefore, we are cancelling the program and moving forward with other options in order to meet both requirements.”
The Air Force will instead need to use its “existing and modified logistics systems for 2017 audit compliance,” the statement adds.
They started spending money in 2005, spent over a billion dollars by 2012, got nothing of value, estimated they'd need to spend the same again for eight more years to get about a quarter of the original plan done. If there were anyone paying attention to batting averages in software, that would likely be a winner.
Here is some rarely found information on the frequency of failures, from a book:
The odds of a large project finishing on time are close to zero. The odds of a large project being canceled are an even-money bet (Jones 1991).
In 1988, Peat Marwick found that about 35 percent of 600 firms surveyed had at least one runaway software project (Rothfeder 1988). The damage done by runaway software projects makes the Las Vegas prize fights look as tame as having high tea with the queen. Allstate set out in 1982 to automate all of its office operations. They set a 5-year timetable and an $8 million budget. Six years and $15 million later, Allstate set a new deadline and readjusted its sights on a new budget of $100 million. In 1988, Westpac Banking Corporation decided to redefine its information systems. It set out on a 5-year, $85 million project. Three years later, after spending $150 million with little to show for it, Westpac cut its losses, canceled the project, and eliminated 500 development jobs (Glass 1992). Even Vegas prize fights don't get this bloody.
If you care to look, you will find loads more examples of failures the industry has been unable to keep secret. The failures keep rolling in spite of the huge efforts to reduce requirements, inflate estimates, extend time lines, increase staff and everything else. You have to ask the question: how many software successes are really failures in disguise? If anyone were serious about calculating software batting averages, this would be a key factor.
This pattern has resulted in some fairly widespread humor that you can be sure isn't mentioned in project management meetings. For example, here are the stages of a software development project:
- Enthusiasm
- Disillusionment
- Panic and Hysteria
- Search for the Guilty
- Punishment of the Innocent
- Praise and Honor for the Nonparticipants
Why do software projects fail, and how do you win?
When lots of human beings work at something for a long time, they tend to figure out how to do it. Building software appears to be a huge exception to that rule. With decades of experience under our belt, why is software the exception?
This is a long subject. I have gone into great detail spelling out the causes ... and the cures!
Start with history and evolution:
https://www.blackliszt.com/2023/08/summary-computer-software-history-and-evolution.html
Everyone knows that software project management is essential to producing software that works, on time and on budget. In spite of decades of "innovation," it doesn't get better. The winners follow a different set of rules.
https://www.blackliszt.com/2023/04/summary-software-project-management.html
Software quality assurance is an important specialty within the non-science of computing, but in spite of all the time and money spent, quality continues to be a major issue. There are solutions that have been proven in practice that are ignored by the experts and authorities.
https://www.blackliszt.com/2023/04/summary-software-quality-assurance.html
How do you win with software? Nearly everyone starts with requirements, makes estimates and is judged whether they deliver on time and on budget. This optimizes for expectations and is a proven path to failure. The winning path optimizes for speed, customer satisfaction and continuous quality. It's what the people who need to win do.
https://www.blackliszt.com/2023/07/summary-wartime-software-to-win-the-war.html