Computers are so exact and numbers-based that we tend to think of the people in charge of them as white-jacketed scientists with esoteric knowledge driving towards calculated optimal results, much the same way we imagine that the scientists and engineers calculated the path of the Apollo capsule to the Moon. If the Apollo program were run the way most computer projects are, it would have been a miracle if the capsule made it off the launch pad, much less traced an incredibly exact path from Cape Canaveral to the chosen landing spot on the Moon, a journey of over a quarter million miles.
The reality is that the application of computers and software to automation is painfully slow, usually failing to achieve optimal results by large margins, and often lagging behind what is achievable by decades. The reasons appear to be a semi-random mixture of commercial interests, technology fashion trends, ignorant and risk-averse managers, and technical specialists rooted in the past and wearing blinders. That’s all!
Here’s the story of a typical example.
Microfilm, document imaging and laser disks
Long before computers, microfilm was the preferred medium of librarians and archivists to preserve paper documents for future use. Microfilm was more compact than paper, and lasted much longer. What happened when computers came along is a curious and educational story, a prime example of how computer use and software evolve.
Disclosure: I lived and worked as a participant in this history during the late 1980’s and early 1990’s. The story I will tell is not systematic or comprehensive, but more like a traveler’s tale of what he did and saw.
A side show in the long evolution of computer storage is the laser disk. A laser disk has data recorded onto it either as a whole, during manufacturing, or incrementally, as part of a computer system. Laser disks had quite a run in the consumer market in the form of CDs and then DVDs; they were an improvement on existing methods for distributing digital content of all kinds.
In the computer industry, they took the form of WORM (write-once-read-many) drives, and were used to record archival data that should not be updated, but could be read as often as needed. The technology, as often happens, went searching for problems it could solve. In an effort to reduce human labor more than microfilm could, jukeboxes were invented. Each jukebox had a disk reader and many shelves for disks, with a mechanism to load a selected disk into the reader. FileNet, now part of IBM, built one of the first of these systems in the mid-1980’s.
At around the same time that WORM drives became practical, paper document scanners evolved from the systems used to capture images of paper onto microfilm. In each case, light was shone onto the paper and focused onto a target; a scanner replaced the film with an optical sensing device.
How could we get people to buy WORM drives and jukeboxes, the creative marketing people wondered? The departments that put documents onto microfilm weren’t going for it – the new systems were much more expensive, microfilm wasn’t accessed often enough to make it worth connecting it to computers, and the records retention people were understandably skeptical that any computer disk would be readable 100 years from now, as microfilm will be.
Adding workflow to document imaging
I have no idea who made the audacious leap, but some creative person or group came up with the idea of doing computer document scanning at the start of the process, when paper arrived at a location for processing, instead of at the end, when documents were microfilmed for archival purposes. But WHY??? The creative types scratched their heads until they were losing hair rapidly. Why would anyone make responding to loan requests or anything else take even longer than it already did by introducing a scanning step?
GENIUS TIME: LET’S INVENT “WORKFLOW”!!!
Everyone has an image for what “workflow” is. For documents, it’s the path of processing steps from arrival to filing. In larger organizations, there are often multiple departments involved in processing a document, so there are in-boxes and out-boxes and clerks transporting documents from one to the other.
Rubbing their hands and cackling, the scammers worked out their strategy: We’ll scan the documents and convert them to images so that we can move images from place to place electronically instead of piles of paper! We’ll call it “document imaging workflow.” It will deliver massive benefits to the whole organization, to everyone who touches the paper, not just to the tiny group who files and archives at the end – and storing the scanned documents onto WORM drives will do their job too! Hooray! We will leap-frog the old-fashioned, paper-flooded back office into the modern electronic age. What’s not to like?
It was audacious and brilliant. It was a scam that would wither if confronted with just a little common sense or cost accounting. Which, true to the pattern of such fashion-driven trends, it never had to confront!
Implementing workflow
A typical target environment for implementing workflow was large offices with large numbers of documents, often forms of some kind, arriving every day. The forms would go from the mail room to the relevant people’s desks for processing. The person handling the form would often have an IBM 3270 terminal for interacting with a mainframe computer. When the person was done with the form, they would place it in one or more outboxes for further processing at other desks, or for filing. Sometimes documents would go into file cabinets for a period of time, but all would end up being archived, and sometimes microfilmed for permanent storage.
Implementing workflow required scanning the documents as a first step, storing them immediately on normal computer disks, and ultimately on WORM drives, thought to be the equivalent of microfilm. Once scanned, the documents would be managed by workflow software, which would route them to the appropriate desk in the appropriate department for processing. Every desk needed its old computer terminal replaced or augmented with a large, high-resolution image terminal, capable of displaying at least the document image, and sometimes the mainframe screen as well. The old, slow connections between terminals and mainframe had to be replaced with a fast local area network, and substantial servers were needed to handle local image storage and workflow routing.
Of course, lots of analysis and programming was involved in addition to the equipment. Workflows had to be analyzed and programmed, and all the management, reporting and exception handling taken care of.
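The routing logic at the heart of such a system is simple at its core. Here is a minimal sketch of the idea in Python; the document types, queue names, and routes are all hypothetical, not taken from FileNet or any actual product:

```python
from collections import deque

# Hypothetical routing table: document type -> ordered list of
# department queues the scanned image must visit.
ROUTES = {
    "loan_application": ["data_entry", "credit_review", "approval", "archive"],
    "address_change":   ["data_entry", "archive"],
}

# One work queue per department (the electronic in-box).
queues = {name: deque() for route in ROUTES.values() for name in route}

def submit(doc_id, doc_type):
    """Scan-time entry point: place the document image at the first step."""
    first_step = ROUTES[doc_type][0]
    queues[first_step].append((doc_id, doc_type, 0))

def complete_step(department):
    """A clerk finishes the current document; route it to the next desk."""
    doc_id, doc_type, step = queues[department].popleft()
    route = ROUTES[doc_type]
    if step + 1 < len(route):
        next_dept = route[step + 1]
        queues[next_dept].append((doc_id, doc_type, step + 1))
        return next_dept
    return None  # end of the workflow

submit("DOC-001", "loan_application")
complete_step("data_entry")  # image moves on to "credit_review"
```

The real systems wrapped this skeleton in exception handling, management reporting, and per-step programming, which is where most of the analysis and coding effort went.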
The benefits of workflow
When a movement like document imaging workflow has a head of steam up, no one seems to ask whether it’s a good idea, and if so, how good an idea it is – quantitatively. When I was involved, everyone threw around the claim that there would be at least a 30% productivity improvement due to workflow, which would make all the expense and disruption worth it. I never encountered a single study that measured this.
Think about it for a minute. Would looking at an image of a document make you work faster than you would looking at the original paper? Would picking the next paper from the in-box be slower than getting the system to show you the next image? What about the labor of the clerks moving stacks of paper from one person’s outbox to the next one’s inbox? It would certainly be saved, but chances are a single clerk could handle the paper movement for many dozens of people. And remember, there’s the scanning and indexing that’s been added and the massive computer and software infrastructure that has to be justified.
It’s obvious why no one EVER did a cost-benefit analysis of workflow – the back-of-the-envelope version easily shows that it’s a couple zeros away from making sense.
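To make the back-of-the-envelope version concrete, here is one way the arithmetic could run. Every number below is an illustrative assumption of mine – staff counts, salaries, and equipment prices are invented for the sketch, not measured data from any actual installation:

```python
# Back-of-the-envelope check on the workflow pitch.
# All figures are illustrative assumptions, not measured data.

clerks = 100                   # document-processing staff in one office
salary = 30_000                # assumed loaded annual cost per person

# Claimed benefit: the oft-repeated "at least 30%" productivity gain.
claimed_benefit = clerks * salary * 0.30          # per year

# Plausible benefit: the labor actually eliminated is the paper transport,
# where a single clerk could serve many dozens of desks.
transport_clerks = 2
plausible_benefit = transport_clerks * salary     # per year

# New ongoing cost: scanning and indexing staff added at the front.
scanning_staff = 5
annual_cost = scanning_staff * salary             # per year

# Upfront cost: image terminal per desk, LAN, servers, WORM jukeboxes,
# plus the analysis and programming work.
upfront = clerks * 10_000 + 1_000_000

net_plausible = plausible_benefit - annual_cost
print(f"Claimed annual benefit:   ${claimed_benefit:,}")
print(f"Plausible annual benefit: ${plausible_benefit:,}")
print(f"Added annual cost:        ${annual_cost:,}")
print(f"Upfront cost:             ${upfront:,}")
print(f"Net, plausible case:      ${net_plausible:,}/year")
```

Under the claimed 30% gain the project looks defensible; under the plausible assumption that only the paper-transport labor is saved, the annual benefit doesn’t even cover the new scanning staff, and the upfront millions never pay back – which is roughly the "couple zeros" gap.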
I remember a couple of out-of-it, tech-fashion-disaster know-nothings mildly thinking out loud about how workflow could possibly be worth it financially. The immediate response from the workflow fashion leaders was to up the ante – don’t you know, the wonderful workflow tool from (for example) FileNet lets you program all sorts of efficiencies into each stage of work; it’s something we really need to dive into once we get the basic thing installed. End of subject!
So who fell for this stuff? Just a who’s-who list of major organizations. Each one that fell for it increased the pressure for the rest to go for it. None of them did the numbers – they just knew that they couldn’t fall too far behind their peers.
Conclusion
It’s hard to think of a field that’s more precise and numbers-oriented than computers. Who has a clue what actually goes on inside those darn things? And the software? There are millions of lines of code; if a single character in any of those lines is wrong, the whole thing can break. The impression nearly everyone has, understandably, is of a field that is impossibly precise and unforgiving of error. When the software experts in an organization say that some new technology should be adopted, sensible people just agree, particularly when there’s lots of noise and other major organizations are doing it. It can be even more gratifying to get known as pioneering, as one of the first organizations to jump on an emerging tech trend!
Somehow, the smoke and noise from the software battlefield is so intense that the reality is rarely seen or understood. The reality, as I’ve illustrated here, should result in everyone involved being sent to the blackboard by a stern teacher, and being made to write, over and over: “I pledge to try harder to rise above the level of rank stupidity next time.”