Right Answer, Wrong Question

Wasn't it George Carlin who defined "vuja de" as the strange feeling you get when you know you've never been somewhere before? I had a serious case of vuja de recently when I went back to my favorite Hallmark card shop, looking for a sympathy card for a friend whose front porch had collapsed.
I didn't find the card. But lord almighty, I did discover something else: Hallmark had bought a copy of Card Shark! People who read this column regularly may recall my October 1991 musings on a "personal vertical application" called Card Shark, which would make customized greeting cards on a typical PC system. I suggested that computer stores were not the places to sell such a product, but that Hallmark card shops would eat it up.
They did--just not quite as I had predicted. There on the wall was a PC in a pastel plywood coffin beside a rack of "blank inside" cards of various designs. For a not-so-nominal fee, shoppers could create a customized message (such as "Sympathy on the collapse of your front porch") and have it laserprinted on the inside of a card with a picture of a front porch collapsing.
Well, not quite. The art on the cards isn't customizable yet. I suspect, however, that they're working on it.
I find it interesting that Hallmark decided to sell cards rather than card-creation software. The system is very much what I had envisioned (and I honestly, truly had never seen the Hallmark system when I wrote October's column last summer!) except that it hadn't been taken quite as far. I was right that Hallmark liked the system--so much so that they decided to keep it to themselves. Was this smart? We'll find out when hordes of hungry programmers decide to clone the system for the home marketplace. (You get one guess which side I'm betting on....)
Hallmark's somewhat shortsighted action isn't difficult to understand when you remind yourself that Hallmark is in the card business, not the software business. The Hallmark honchos use computers to do things. The granularity level of their computer thinking is the system; that is, a machine with software to do a specific thing. To them, buying a whole bunch of computers to create cards in their stores is a simple extension of the ordinary business practices they had been using all along. Cutting a deal with a programmer to resell a software package that creates cards on a home computer would just plain smell wrong.
The failure of Hallmark Cards lay not in their software design, which (from what I saw while making a card) is quite nice. Their failure lay in the analysis that produced the system. It was plainly the right answer to the wrong question. So I think it's time we went back to the issues of software design for a while; for now, to the difference between analysis and design.
The Inkblot Effect
The difference between analysis and design is a lot like those inkblots that look like John F. Kennedy. Until some right-brain insight pops into your head, they're just inkblots. But once you see (or once someone points out) the president's face in the inkblots, you can never understand how other people don't see it immediately.
It's this simple: Analysis is the process of describing the problem to be solved. Design is the process of describing the solution. Programming (the necessary third leg of a three-legged stool) is the process of implementing that solution.
The Nature of the Question
Self-taught programmers have a lot of trouble with this triad, in part because they have no one to point out the faces in the inkblots, and in part because of an insidious sort of square-cubed law of program complexity. When you write your first useful programs, they're often little utilities that do one thing and one thing only. The problem to be solved can be stated in one sentence. The design of the program is a one-page sort-of-a-flowchart in Burnt Sienna crayon. The implementation is three hundred lines of Turbo Pascal.
As programming projects grow more ambitious, the three legs of the stool grow unevenly. The design grows faster than the number of lines of code, because smart programmers learn early how to create general-purpose software tools that can contribute to many aspects of an application's functionality. However, the leg that grows the fastest is neither the code nor the design, but the statement of the problem. The interconnectedness of the problem's elements grows as the cube of the number of problem elements, and the assumptions underlying that interconnectedness are part of the problem.
It took me years to figure this one out.
Simply put (and in most consultant-style programming projects), it's the analyst's job to describe the way things are done now in the area to be automated (that is, without taking any future automation into account). This includes obvious things such as the nature of the information that passes from hand to hand in the course of getting things done. But it has to include a whole lot of less obvious things, such as why things are done the way they're done. This "why" includes all of the subtle assumptions that people on the inside often take for granted--if they truly understand them at all. Getting everything is the essence of analysis--and it is murderously difficult.
On the flip side, the analyst has to avoid "optimizing" current processes, consciously or subconsciously. In any process there may be activities that don't seem necessary or connected in any way with anything else. A hasty, egotistical, or insufficiently observant analyst may assume that these puzzling activities are unnecessary, or else part of some unrelated process, and simply write them out of the picture.
The end result of an analysis is a document that will allow an intelligent outsider to obtain a correct understanding of the current state of a process. As I'll say again later, it's a how-to-do-it book for the business being automated.
The Toughest Part of the Job
Analysis is plainly the toughest part of software development. It is the least amenable to automation and provides very little feedback or self-validation. You can totally blow an analysis, and nobody will suspect until the company goes belly-up. It's often very hard to see the wrongness of the question beyond the glow of a correctly implemented answer.
Programmers are supposed to make lousy analysts. This is true only in that design and programming are more fun, and we'd rather be doing that (or probably anything) than analysis. Apart from this very human failing, however, programmers make ideal analysts.
Analysis Methodologies
So. Do I recommend analysis methods such as Structured Analysis (Yourdon) or Object-Oriented Analysis (Yourdon/Coad)? Emphatically not. My reason is that both methods end up imposing sets of computer-scented biases on the description of a process or a situation. The problem with centering an analysis around data flow diagrams is the unstated assumption that data is the predominant and most important element being described. Worse, those vague, subtle, human-y interactions that can make or break an automation project don't fit into bubbles very well. Human systems are best described in wholly human terms.
This problem exists in Yourdon-style Structured Analysis, but given some care and restraint it can be dealt with. Not so with Object-Oriented Analysis, which is as wrong-minded an analysis philosophy as I've ever seen. Right from the start, OOA imposes a template on a description that assumes a particular software design methodology, and even a programming paradigm and coding scheme. Sheesh, OOA encourages analysts to impose classes on a system description, and insists that everything in a system be assigned to a computer-friendly cubbyhole such as data or method. Danger, danger, Will Robinson! Anything that doesn't look like data or methods will either be pounded into one or the other or else swept behind a file cabinet, to emerge months later with fangs and brass knuckles.
The whole purpose of OOA (as the authors hint in their book of the same name) is to fold the analysis stage into the design stage in the interest of productivity; plainly (if not explicitly) to do away with it altogether. OOA is an impatient method, and impatience is its own punishment.
Finally, 200 pages of bubble charts can hide a lot of incompetence. They can be an excellent smokescreen, and to many phony analysts, cranking out endless incomprehensible charts becomes a substitute for observing objectively, probing insightfully, and describing things well.
The Duntemann Analysis Method
I'll accept nastygrams on the above subject gracefully, but I'm unlikely to be swayed. I have my own method, and while I can't present it rigorously in a single magazine column, I'll try to state it in broad if informal terms.
My method rests on two fundamental principles. Here's Principle #1: Write an analysis as though it were a book. That is, make your description a text description. Keep the diagrams to no more than 20-25 percent of the bulk of the document. Writing up an analysis as a text description means you have the medium to convey all elements of the process being described, whether they have computer analogs or not.
Important corollary: If you can't write clearly, you have no damned business being an analyst. Analysis is fundamentally a communications skill. Programmers can make terrific analysts, but only if they know how to write in English as well as C.
And Principle #2: See only the computers that are already there. Where analysts get in trouble is where they begin mixing analysis and design. The interface between analysis and design is a tricky one, to which I'll return at the end of the column. There is a place in an analysis for the analyst to suggest the shape of a solution, but that can't be done until the problem has been separately and completely described.
A subtler hazard is to subconsciously ignore aspects of an analysis that don't plug well into a computer framework. The Updates Clerk may tell you, "We batch updates for phone verification on Fridays, because on Fridays the salesmen are in the weekly sales meeting and we can always get an outside line." You may laugh in sympathy, shake your head, and ignore that vital piece of information. After the new office automation system is in place, you discover that there aren't enough phone lines to make it happen, except (sometimes) on Fridays. You missed a clue because it wasn't really a data flow, and it wasn't really a process....
The Analysis Document
A good analysis document has these parts: an overview, a structured description, a recommendations summary, a warnings summary, and a glossary. I separate the document into these parts for a number of reasons, but perhaps the least obvious is that when the project as a whole is designed and implemented, the overview, structured description, and glossary can be readily edited into the system documentation.
The overview is just that: Look at the problem from a height, give some history of the evolution of things up to the current day, and describe in the broadest possible terms what the process being described involves. This is "orientation" for the outsider. Make it plain that there is a glossary and that any jargon in the overview will be covered in the glossary. Then enumerate the several (rule of thumb: no more than ten) largest elements of the process being described, and the broad relationships between them.
Considered as a process to be analyzed, a small magazine publishing company might have the following major elements: ad sales, circulation marketing, circulation fulfillment, editorial planning, art and production, office management, personnel, and accounting. Your overview would explain briefly what each of these elements is, how they are different, and how they relate to one another. Circulation marketing, for example, is the process of gathering subscribers, whereas circulation fulfillment is the process of getting magazines into their hands.
Many magazine people lump these two areas together, but I consider them separate because whereas all magazines are distributed in essentially the same way, there are radically different kinds of magazine circulation (paid and controlled, primarily) that require very different mechanisms. This is the sort of thing you would learn while doing your analysis. Needless to say, you can't write the overview until you've done a great deal of looking, listening, and probing.
The Structured Description
The structured description is the largest single part of the analysis document. Here, you break down each of the major elements of the process into smaller elements, in hierarchical fashion, describing in text, with figures only where necessary.
Your skills at structuring program code can come in handy here, as long as you keep in mind that what you are structuring is human activity and not program code. And as always in analysis, you must resist the temptation to design the system while you're analyzing the problem.
A lot of the descriptions will deal with inputs and outputs and processes. Describing these should be straightforward.
That's not enough, however: You must also state why things are done the way they are. The "why" material is in many respects the real value-added of an analysis--it contains the constraints that will critically shape the design later on.
A somewhat simpleminded example involves the shipping class of magazines versus other mailed materials. Bills and renewal notices are mailed first class, magazines are mailed second class, and subscription offers are mailed third class. A naive system designer might think that these divisions exist strictly for cost reasons, when in fact second class postage is limited to magazines alone, and magazines with a certain minimum number of editorial pages to boot. He might include a menu option to mail subscription offers second class, when in fact the Post Office would forbid the mailing. Postal rules, not simply mailing costs, dictate the division of mailing materials by postal class.
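The point of capturing the "why" is that it turns a designer's apparent preference into a hard constraint. A minimal sketch of the distinction, in C (the material names and the table itself are invented for illustration; real postal eligibility rules are far more involved):

```c
#include <assert.h>
#include <string.h>

/* Hypothetical sketch: the postal class of each mailing is dictated
   by postal rules, not by cost, so it is a fixed lookup rather than
   a user-selectable menu option. */

typedef enum { FIRST_CLASS, SECOND_CLASS, THIRD_CLASS, NOT_MAILABLE } PostalClass;

/* The single class the rules permit for each kind of material. */
PostalClass required_class(const char *material)
{
    if (strcmp(material, "bill") == 0)               return FIRST_CLASS;
    if (strcmp(material, "renewal notice") == 0)     return FIRST_CLASS;
    if (strcmp(material, "magazine") == 0)           return SECOND_CLASS;
    if (strcmp(material, "subscription offer") == 0) return THIRD_CLASS;
    return NOT_MAILABLE;
}

/* A mailing is legal only at its required class; no menu option can
   move a subscription offer to second class. */
int mailing_is_legal(const char *material, PostalClass chosen)
{
    return required_class(material) == chosen;
}
```

The naive design exposes the class as a free parameter; the analysis-informed design makes the illegal combination unrepresentable.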
Another thing to watch out for in terms of "why" material is "industry-customary" practice that may not be a matter of law but is still outside the control of the process being analyzed. For example, in analyzing a small magazine publishing company, you might note that newsstand distributors are given dollar-for-dollar credit on covers torn from unsold magazines and returned. A naive system designer might consider this primitive, and may put together a form to be filled out by the newsstand distributor in lieu of returning actual torn covers. Some distributors might comply and some might not--but the important fact is that newsstand distributors have their ways of doing things and they do not generally take orders from small publishers!
A good outline processor works very well in creating the structured description, although in a large analysis you may have to break the file into a couple of chunks. The mental process I use is an observational sort of "stepwise refinement" not unlike that championed by Niklaus Wirth. You begin with one of the major functional areas of the process being described, and make it a major heading. Then you discern all of the next-level components within it, and make those minor headings. Then you begin with the first of the minor headings and discern all of its next-level components, and so on.
Under each heading, describe the nature of the element being described, at its level only. In other words, until you hit the "bottom" level in the outline, you're going to be writing summary information of some kind. My rule is that a header with subheads should describe only those things held in common by all of the subheads. Details always sink to the bottom.
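The outline discipline above can be sketched as a simple tree, where only leaf headings carry detail text. This is an illustrative C sketch (the `Node` type and example titles are my own invention, not part of any analysis tool):

```c
#include <assert.h>

/* Hypothetical sketch of the outline rule: a heading either has
   subheads (and then its body is summary material shared by all of
   them) or it is a leaf at the bottom, where the details live. */

typedef struct Node {
    const char *title;
    const char *body;              /* summary if children exist, detail otherwise */
    const struct Node *children;
    int n_children;
} Node;

/* Details sink to the bottom: only leaves hold detail text. */
int is_detail_level(const Node *n) { return n->n_children == 0; }

/* Count the bottom-level elements that stepwise refinement
   eventually discerns under a heading. */
int count_details(const Node *n)
{
    if (is_detail_level(n)) return 1;
    int total = 0;
    for (int i = 0; i < n->n_children; i++)
        total += count_details(&n->children[i]);
    return total;
}
```

The structure enforces the rule mechanically: anything written at a heading with subheads is, by construction, summary rather than detail.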
Some analysts are pathologically afraid of duplicating description. You'll find that there are common threads in many elements of a process, and sometimes it seems you're covering ground you've covered before. And why not? If a subprocess happens as a separate entity, describe it--even if it's largely identical to numerous other processes. Identify what differences there are, but describe the whole thing.
The Recommendations Summary
An analyst's job is to describe the current problem, not to design a solution. On the other hand, once the analysis is done, the analyst has a pretty fair grasp of things, and may well have had some inspirations on what's to be done. If the analyst is a programmer, he or she may well be champing at the bit to suggest things, and this part of the document exists for that purpose.
I insist on this section of the document because it short-circuits the temptation to design within the structured description, which can be fatal. The recommendations summary can be as freeform as the analyst wishes. It acts mostly as an idea cauldron, and gives the designer some seeds to crystallize a design around.
The Warnings Summary
The warnings summary is a lot more important. In my design philosophy, a system design is shaped most potently by constraints; that is, things that cannot be done. Major constraints are often obvious, but minor ones can hide very well. The ones that hide the best are human-founded constraints, often imposed informally, sometimes imposed solely by coincidence. Infrastructure constraints are very important (phone lines, power service, noise, network access) and are often ignored because they're "outside the bounds of the system." This is exactly why they are constraints.
Constraints should be mentioned in the structured description, but I feel they are important enough to be gathered together in a separate portion of the document. Organize them as you like. A simple list may be enough.
The Glossary
Most processes have acronyms and insider jargon, and the more corporate the culture, the more acronyms and jargon you're going to find. Back at Xerox we had TRICCs, TRDRs, RDCs, IMOs, FWSSes, LRSs, and lord knows what else. Beware of insider biases: Even jargon that might seem obvious within an industry should be defined for the sake of outsiders who might have a hand in designing and implementing the system. Every magazine person knows what CPM means. (Cost Per Thousand.) In a physical therapy office, however, CPM is just as well-known but stands for Continuous Passive Motion. And if I recall there was once an operating system...
As you encounter them, place acronyms and jargon in a file, along with a short description of each. Each time you add some explanatory material on a phrase or acronym to the structured description, add to the appropriate glossary entry a pointer back to the subsection number containing the explanatory material:
CPM: Cost Per Thousand. A measure of relative value of advertising media or mailing lists, given as the cost in dollars per thousand readers or list names. See 9.2.4.12 and 4.7.7.
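An entry like the one above could be kept as a record in a small text database. Here's a minimal C sketch of the idea; the `GlossaryEntry` type, the second entry, and its section numbers are invented examples, not from the article:

```c
#include <assert.h>
#include <string.h>

/* Hypothetical sketch of the glossary-as-database idea: each entry
   carries the term, a short definition, and "See ..." pointers back
   to the subsections where the term is explained. */

typedef struct {
    const char *term;
    const char *definition;
    const char *sections;   /* subsection numbers in the structured description */
} GlossaryEntry;

static const GlossaryEntry glossary[] = {
    { "CPM", "Cost Per Thousand. A measure of relative value of "
             "advertising media or mailing lists, given as the cost "
             "in dollars per thousand readers or list names.",
             "9.2.4.12, 4.7.7" },
    { "controlled circulation", "Subscriptions given free to qualified "
             "readers and paid for by advertisers.", "3.1.2" },
};

/* Look a term up the way a book index is consulted: by name. */
const GlossaryEntry *glossary_find(const char *term)
{
    size_t i;
    for (i = 0; i < sizeof glossary / sizeof glossary[0]; i++)
        if (strcmp(glossary[i].term, term) == 0)
            return &glossary[i];
    return 0;
}
```

Keeping the entries as structured records means the same file can later be sorted, cross-checked against the section numbering, or edited into a book-style index.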
If I were called upon to do another analysis, I think I would keep the glossary in a text database of some sort, and handle it the same way I would handle the index of a book. After all, this is a book!
The Three Skills of Analysis
That's the nature of the document. Now, how do you create it? There are three major skills involved:
Observe objectively. In Stranger in a Strange Land, Robert Heinlein gave us the concept of a Fair Witness. As he put it, a Fair Witness would look at a red barn and say, "The side of the barn facing me is painted red. I cannot comment on the color of the other three sides." In other words, as you observe a process, see what's there. Don't make too many assumptions, and when you do, verify those assumptions by probing; in other words, when you can't see clearly, go take a walk around to the other side of the barn.
Probe insightfully. The only dumb question is the one you didn't ask. Still, a clumsy questioner will receive clumsy answers, or, worse, answers to questions that weren't asked. Frame questions in terms of things you already know, in the hope that the answer will extend the set of facts you already have. Don't jump around. Lord knows, take good notes.
Also, have some empathy for the people who provide you with information. The people at the bottom of a corporate hierarchy are generally overworked, underpaid, harassed, and without authority or sufficient time to get their jobs done. Try to get in their way as little as possible, keeping in mind that they're the only ones in a company who really know how anything works.
Describe things well. Writing well is essential. Don't try to make the analysis sound weighty or important. Just make it clear. Write as though you were describing something to a client across your desk. Tell it straight. Leave out the legalese and academic weasel-talk. Keep a light heart if it won't get you fired. (And if it does, you were too good to be working there!)
I can't tell you precisely how to put all these elements together. You have to gather information, organize it, and write it down. The only process that will work is the one that mirrors the way you think, organize, and express.
Market Analysis
My earlier criticism of Hallmark is in fact a little unfair. What they had to do before they created their system was market analysis, which is something like the process analysis I've described here, except that the process doesn't exist yet. Market analysis requires market research, which is something I've never had to do.
But at the core of it, I feel that Hallmark's market analysis failed by being insufficiently detached from their current way of operating. It's a little like the difference between being a railroad company and a transportation company. The key is getting stuff from here to there, not the shape of the thing that carries it.
In the card business, the key is getting a card into the hands of the consumer. You can sell them a finished card, or else a cardmaker and supplies. Selling cards may be more profitable now--but you can never discount the possibility that somebody else will begin selling cardmakers down the road. Hallmark's in-store card customizer system is an interesting answer. But it's only one answer, shaped by the choice of the question. That's what analysis is: choosing the question. We'll soon see how well they chose.
Copyright © 1992, Dr. Dobb's Journal