In software, history is ignored and its lessons spurned. What little software history we are taught is often simply wrong. Everyone who writes or uses software pays for this, and pays big.
But we know about history in software -- there's Babbage, the ENIAC, etc.
Yes, we've all heard about various people who are said to have invented modern computing. A shocking amount of what we are taught is WRONG.
Babbage is a case in point. People just love to go on and on about him. There are problems, though. I'll just mention a couple.
One problem is that his machines simply didn't work, even after decades of work, and huge amounts of skilled help and money. He must have known they wouldn't; although he was personally wealthy, it was other people's money he spent on his famous dalliance.
Another problem is that his best idea wasn't his. The idea of using punched cards to contain the program was invented in France and was a key aspect of the Jacquard Loom -- a machine that pre-dated all his work, and a machine that actually worked and was in widespread use.
The ENIAC is another good example of what appears to be the typical pattern in computing: someone invents a good thing and makes it work, and then someone else steals it, takes credit for it, and tries to cover up the theft, often without delivering results as good as the original.
If you only read the standard literature, you would still be convinced that the ENIAC and its inventors were giants of the field. Once you read everything, you discover that reality is more interesting. It turns out that the inventors of the ENIAC were "inspired" by prior inventions, much like Babbage and the Jacquard Loom. In this case, the inspiration was the Atanasoff-Berry Computer.
Here is an account of the ruling in the patent dispute that settled the issue:
Judge Larson had ruled that John Vincent Atanasoff and Clifford Berry had constructed the first electronic digital computer at Iowa State College in the 1939-1942 period. He had also ruled that John Mauchly and J. Presper Eckert, who had for more than twenty-five years been feted, trumpeted, and honored as the co-inventors of the first electronic digital computer, were not entitled to the patent upon which that honor was based. Furthermore, Judge Larson had ruled that Mauchly had pirated Atanasoff's ideas, and for more than thirty years had palmed those ideas off on the world as the product of his own genius.
Other fields don't need history -- why should software?
Not true. Other fields are saturated with history.
Politicians study history in general and the last election in particular. Fiction writers frequently read fiction, both current and historical. Generals study old battles for their lessons; even today at West Point, they read about the Civil War. Learning physics is like going through the history of physics, from Galileo and Newton through Planck and Einstein to the present. Even the terms used in physics remind you of its history: hertz, joules, and Brownian motion.
Software, by contrast, is almost completely ahistorical. Not only are most people in the field uninterested in what happened ten years ago; even the last project is dismissed as unworthy of consideration – it’s “history.”
Consequences of the lack of history
War colleges study past wars for the highly pragmatic purpose of finding out how they were won or lost. What was it the winner did right? Was it better weapons? Better strategy? Better people? Some combination? And how exactly did the loser manage to lose? Was it a foregone conclusion, or was defeat snatched from the jaws of victory? People who conduct wars are serious about their history -- they want to win!!
In software, no one is interested in history. Everyone thinks they know the "right" way to build software, and thinks that the only possible source of loss is failing to do things the "right" way -- the requirements weren't clear; the requirements were changed; I wasn't given enough time to do a proper design; there was no proper unit testing; the lab for testing was insufficiently realistic. The list of complaints and excuses is endless, and their net effect is always the same: crappy software and whining for more people, more time, and more money. Because studying history is so rare, few are exposed to the software "wars" that were fought and won by teams that didn't follow those rules.
There is only one conclusion to be drawn: software people would rather lose with lots of excuses than win by doing things the "wrong" way. Ignoring history is a great way to stay in this comfortable cocoon.
When software history becomes as important a part of computer science education as the history of physics is of physics education, we'll know the field is approaching credibility. Until then, everything about computer science, in education and in practice, will continue to be a cruel joke.