Prescription drugs are important elements of our lives. There is a strict, scientific, testing-based process to ensure that drugs that become widely used are safe and effective, with known side effects. Computers and software are also important elements of our lives. Yet there is a chaotic, fashion-driven process for selecting the mixture of tools and techniques used to build, maintain and operate our IT systems, resulting in widespread failures, along with cost and quality problems. Worse, there is no recognition that this is the state of affairs, and no movement to correct it.
Everyone knows that drugs are important, and that they're a major part of our economy. Here are some numbers from the CDC.
In 2013, we spent about $271 billion on prescription drugs. That's quite a bit, but only about 10% of national health spending.
I won't recount the process drugs go through to get approval from the FDA, but I think everyone knows it's an elaborate, multi-year and multi-stage process, with testing at each step to ensure that we know how a proposed new drug will work in human beings. While I have my complaints, there is a process, and it's scientific and evidence-based.
The IT industry is also a large one. Here's a breakdown of it worldwide.
There are conflicting estimates of its size in the US, but here's a representative one.
Note that the definition of IT does not include the activities of well-known IT-centric companies like Google.
I was fascinated to see that in 2013, IT was three times the size of the entire pharmaceutical industry. Amazing.
Drugs and IT
Drugs are developed by scientists. They are vetted by a strict scientific process. Only drugs that make it through all the tests are widely used. As a result, the vast majority of drugs are used safely and effectively by the vast majority of patients, with a few experiencing side effects that have already been identified.
IT is run by professionals and staffed with computer scientists and engineers, using tools and techniques developed over many years by scientists and engineers. Yet no matter how high-profile and important the project, and regardless of whether government or private companies are involved, a shocking fraction of IT projects end up late, too costly, ineffective or worse. Industry-accepted certifications seem to make no difference. New methods and techniques emerge, become talked about and are deployed widely without any evidence-based process being used to assess their safety and effectiveness. The industry is rife with warring camps, each passionately committed to the effectiveness of its own set of tools and techniques. But there isn't even postmortem testing to see which ones were better at gaining their adherents admittance into IT "heaven."
I think the FDA-run drug acceptance process could be much better than it is. But the important thing is, everyone involved in prescription drugs understands and acts scientifically about the process. No one, including me, wants that to change.
The IT industry is at least three times the size of the drug industry. There are computer science and computer engineering departments in every major university, and their graduates staff the industry. It's hard to imagine that they don't understand science, scientific process and evidence-based reasoning. And yet they adhere to faith-based processes and vendor-driven products that yield horrible results year after year. None of them say, "hey, this stinks, maybe we can apply that thing that Galileo, Newton and Einstein did, what's it called, science?"
The last thing I want is government involvement in IT, given how horribly government handles its own IT affairs, and I'm not suggesting it here. But it's a sign of just how bad things are in IT that the bureaucratic, government-run FDA does a more scientific job with drugs than anyone does with IT.