Nothing we encounter in our daily lives changes or evolves as quickly as computing. All our habits of thinking are geared toward things that evolve slowly compared to computing. This is a simple concept, and it is disputed by no one. But its implications are vast and largely undiscussed and unexplored. It clearly deserves to be a fundamental concept of computing, along with a few corollaries.
Normal evolution speeds
Our planet evolves. Most of the changes are slow, with peak events mixed in. Life forms evolve slowly, over tens of thousands of years. Human culture evolves more quickly -- too fast for some people, and not nearly fast enough for others. Human capabilities also evolve, but for something like speed to double over a period of decades is astounding.
Take people running as an example. Here are the world records for the mile run over the last 150 years or so.
The time has been reduced by roughly 25% over all those years. Impressive for the people involved, and amazingly fast for human change.
Once you shift to things made by humans, the rate increases, particularly as science and technology have kicked in. We've invented cars, and they've gotten much faster since the early ones. Here is a lady in a race car in 1908. She was called the "fastest lady on earth" for driving at 97 mph in 1906.
In 2013, Danica Patrick won pole position during qualifying rounds at the Daytona Speedway by averaging over 196 mph.
In this case, speed roughly doubled over about a hundred years.
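To put these examples on a common footing, here is a small sketch that converts each one into a compound annual growth rate, using only the approximate figures quoted above (25% less time over ~150 years, a doubling over ~100 years, and, looking ahead, computing's doubling every 18 months):

```python
# Rough annualized growth rates for the examples in the text.
# All input figures are the approximations quoted above, not
# precise historical data.

def annual_rate(factor, years):
    """Compound annual growth rate implied by an overall factor."""
    return factor ** (1 / years) - 1

# A 25% reduction in time is a 1/0.75x increase in speed.
mile_runners = annual_rate(1 / 0.75, 150)
race_cars = annual_rate(2, 100)        # speed doubled in ~100 years
computing = annual_rate(2, 1.5)        # power doubles every 18 months

print(f"mile runners: {mile_runners:.3%} per year")
print(f"race cars:    {race_cars:.3%} per year")
print(f"computing:    {computing:.1%} per year")
```

The point of the arithmetic: the impressive human achievements compound at a fraction of a percent per year, while computing compounds at well over 50% per year.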
Computer evolution speeds
Computers are different. Moore's Law is widely known: power doubles roughly every 18 months. And then doubles again. And again. And again. Every time someone predicts an end to the doubling, someone else figures out a way to keep it going. This is a fact, and it's no secret. It's behind the fact that my cell phone has vastly more computational power and storage than the room-sized computer I learned on in 1966.
This chart should blow anyone's mind, even if you've seen it before. It shows processor transistor count (roughly correlates with power) increasing by a factor of 10, then another, then another. In sum, it shows that power has increased about one million times over the last forty years.
It would be mind-blowing enough that we have something in our lives that increases in speed at such an incomprehensible rate. But that's not all! Everything about computers has also gotten less expensive! For example, the following chart shows how DRAM storage prices have gone down over the last twenty years, from over $50,000 per GB to around $10. In other words, to about 2% of 1% of the price twenty years ago.
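The DRAM arithmetic is easy to check. A brief sketch, using the round figures quoted above ($50,000 per GB falling to about $10 per GB over twenty years):

```python
# Checking the DRAM price arithmetic from the text. The dollar
# figures are the round numbers quoted above, not exact data.
import math

old_price, new_price, years = 50_000.0, 10.0, 20

ratio = new_price / old_price                 # fraction of the old price
halvings = math.log2(old_price / new_price)   # number of price halvings
halving_time = years / halvings               # years per halving

print(f"new price is {ratio:.2%} of the old one")
print(f"price halved roughly every {halving_time:.1f} years")
```

The ratio works out to 0.02%, i.e. 2% of 1%, and the implied halving time of roughly a year and a half lines up neatly with the Moore's Law cadence mentioned earlier.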
That's faster and cheaper in a nutshell.
So what? Everyone knows that things change quickly with computers, what's the big deal? It's just the way things are!
Here's what: this simple fact has profound implications.
Everything about human beings is geared for people and things that evolve at "normal" speed. Our patterns of thought, the things we do, most of our behaviors were developed for "normal-speed" evolvers, i.e., everything but computers (EBC). A surprising number of them break, are wrong or yield crappy results when applied to computers. There is no reason at all to be surprised by this; in fact, anything else would be surprising. What is surprising and interesting is that the implications are rarely discussed.
Computers evolve more quickly: it matters!
When you apply patterns of thought and behavior that may be appropriate for EBC to fast-evolving computers, those thoughts and behaviors typically fail miserably. This "impedance mismatch" explains failure patterns that persist for decades. Of course, everyone knows that computers are different from other things -- just as they know that Indian food is different from Chinese food. What they tend not to know is that computers are different in a different way (because of their speed of evolution) from EBC.
Here are a couple examples.
The mainstream "wisdom" for software project management is essentially the same, with minor modifications, as the wisdom for managing anything else, from building a house to launching a new brand of toothpaste. It's not! That's one of the many reasons why the larger organizations that depend on those techniques fail to build software effectively.
We treat software programming skills like any other kind of specialized knowledge, like labor law. They're not! Thinking that they are is one of the many reasons why great software people are 10 times better than really good ones, who are themselves 10 times better than average ones.
The normal way people go about hiring software engineers is crap. They think that hiring folks who deal with a subject matter that changes so dramatically is the same as hiring anyone else. It's not! They also think that extended, in-depth experience with a particular set of technologies is really valuable. It's not! It's actually a detriment!
Software engineers tend to learn to program in a certain way, using a given set of tools, techniques and thought patterns. Those tools were designed to solve the set of problems that existed with computers at a certain point in their evolution. But computers evolved from that point! Quickly! The programmers are doing the equivalent of hunting rabbits with weapons designed for hunting mastodons, blissfully unaware of what's appropriate for computers as they are today!
Software is hard and isn't getting easier
Software isn't a problem that gets solved -- oh, now we know how to do it, finally. It doesn't get solved because the underlying reality (computers) evolves more quickly than anything else. The examples I mentioned are things that "everyone" is sure must apply to software. How could they not? The fact that they yield consistently horrible results seems not to break the widespread faith in the mainstream approach to software.
These and many other broadly accepted falsehoods explain why so many things about software are broken and (worse) don't seem to get fixed, decade after decade, when superior methods have been proven in practical application. Why is everyone so resistant to change? Is everyone stupid?
Aw shucks, I admit it: "everyone is stupid" was my working hypothesis for explaining things like this. But I now have a more satisfying hypothesis: everyone is so used to dealing with "normal-speed" things, EBCs, that they just can't help applying to computers the methods, patterns of thought and behaviors that work reasonably well in most of their lives. Since nearly everyone gets the same lousy results, the conclusion everyone draws is that there's something about computers that's just miserable.
Conclusion
There is nothing comparable to computers in the rest of our human experience. Nothing evolves at anywhere close to the speed of computers, getting more powerful while getting cheaper at hard-to-comprehend rates. We apply methods and patterns of thought that work well with practically everything, and those methods fail when applied to computers. But they fail for everyone! The conclusion everyone draws from this is that computers are just nasty things, best to stay away from them and avoid blame. It's the wrong conclusion.
Computers are understandable. The typical failures are completely explained by the mismatched methods we bring to them, like trying to catch butterflies with a lasso. When people use methods that are adapted to the unprecedented evolutionary speed of computers, things go well.