Column Tag: Modula-2 and the Macintosh, PART II
The Caverns of Computing
By John R. Bogan, Microcomputing Consultant
S: January 26, 1985
E: February 14, 1985
Often I think of computing as an extraordinary, real-life Adventure game beckoning the computer warrior to descend into a yawning cavern riddled with many dark and intriguing passageways, all begging to be explored. At the end of each corridor there lies a treasure, usually small and mostly ordinary but sometimes great and wondrous. These treasures are all jewels of knowledge and enlightenment about the marvels of logic.
I honestly believe that we (the computing community) have barely scratched the surface of our particular Adventure, and some of the passageways that extend in front of us obviously go on for miles, twisting and turning, with many small and alluring side alleys to explore or ignore as we choose.
It is up to each of us to decide which of these passageways we will pursue and to sift through the myriad of clues we discover along the way to decide which treasures to seek. Last year about this time, several of my favorite and well-trod corridors of computing converged, and I found at their juncture a wealth of riches. These corridors, which I have eagerly explored for more than half a decade, are low-cost professional microsystems, the Motorola 68000, and Pascal. All the clues I had accumulated these past five years told me there would be something special waiting when these paths crossed. The glittering jewel of logic I found at their confluence now occupies the position of honor on my desk. It is the Apple Macintosh.
The reason the Mac is such a valuable jewel is that it sheds a piercing light on such gloomy corridors as user friendliness, software integration, and computing for the masses. And as any Adventure fan knows, any bright light is a most valuable tool in the caverns of computing. As an early Macintosh enthusiast who has eagerly watched the hi-res graphics, mice, windows, icons, pull-down menus, and cut-and-paste integration passageways blossom into well-mapped, well-trod aisles of acceptance and imitation over the past year, I could argue the wisdom and majesty of the Apple strategy and direction long after I ran out of rational things to say. After all, the Mac is clearly the product of wizards and elfin folk who possess great magic (one look at Burrell Smith proves that). But the purpose of this column is not to look back and congratulate Apple that it made it this far, but to use our new bright light to illuminate some of the dark passages that lie ahead of us.
It is one of the best kept secrets in Computer Science and Software Engineering (CSSE) that the proper role of High Level Languages (HLLs) is to permit and encourage wide-ranging experimentation and evaluation of new ideas in a cost-efficient and timely fashion. Since the time I learned the difference between FORTRAN and COBOL, I have followed and participated in the religious wars over which computer language is Best, and I have concluded that most of the time the arguments are largely irrelevant, since they concentrate on the question: What language should our software be written in when it is shipped to the consumer? As someone who is more interested in exceptional software requiring extensive R&D, I have consigned that question to the trash can. The microscope people who can focus only on maximizing speed and space will insist on Assembler (ASM). The puzzle fanatics will choose Forth. The compromisers who can't quite give up the chance to get their hands dirty with a little register optimization will probably choose C. But the thoughtful scientists and engineers in the R&D labs who desire to conserve the scarce R&D dollar and even scarcer R&D minute will choose Modula-2.
HLLs are invaluable because they permit the Software Engineer to build complex and innovative programs, which break new ground, in the shortest possible time and for the fewest R&D dollars. This means that within a given budget and deadline more ideas can be explored and evaluated, and thus more progress can be made toward achieving an ideal piece of software. This is not a theory; it is an observation of the history of microcomputing. Let's look at some of the most important software breakthroughs in the past few years and examine the language influences on the R&D of these milestone products.
o CP/M. Designed in large part in PL/M, a structured HLL. Responsible for the early success of 8080 and Z-80 S-100 small business systems.
o VisiCalc. Designed largely in BASIC. Responsible for the immense success of the Apple //.
o Lotus 1-2-3. Cloned from Context MBA, which was written in UCSD Pascal. Responsible for the flood of integrated software and the dominance of the IBM PC.
o UCSD Pascal P-system. Contributed to pull down menus and menu driven Operating Environments.
o Xerox Alto testbed. This system never made it into the commercial world at all, yet it has to be considered the Grandfather of the graphic, iconic, windowed personal workstation. This system was developed in the HLL Mesa, a Pascal derivative.
o Apple LISA. The first commercial microcomputer to make it to the marketplace bearing the fruit of the Xerox Alto. LISA was designed and written in Pascal.
o Apple Macintosh. Designed in Pascal and translated into hand-optimized ASM. The third milestone microcomputer, the most successful introduction of any professional micro at any time.
o Kildall's GEM and Tramiel's Jackintosh. Cloned from the Mac.
These are not idle opinions; this is not some weird hallucination that can be dismissed with a snort and a wave of the hand. These are verifiable historical facts which cannot be ignored. They clearly illustrate the major role of HLLs, and in particular the structured language Pascal, in the software R&D labs throughout the micro world. The reason for this success is obvious and worth restating: with a structured HLL it is easy to break new ground, and if this new ground is genuinely helpful and makes small computers more productive and easier to master, then consumers will eat it up.
Why are structured languages going to be the bright light of choice in the R&D labs of the micro world for the indefinite future? The answer lies in history and economics and logic.
Programming as Engineering
Back in the mid-1960s, computer specialists noticed a troubling trend. While hardware was getting exponentially cheaper, the cost of producing software was taking off as programs got more complex to take advantage of the improved hardware. The following graphic illustrates this point.
Following the realization that software costs were getting out of hand, the computer industry gradually started to divert resources from the hardware side of the industry to the software side. Along with this attention came some discipline ... the discipline of engineering. A rough definition of engineering is that it is the art of measuring and optimizing resources.
Thus the productivity boys came into the Data Processing departments with their legal pads and stopwatches. The metric (unit of measure) they decided on was the number of lines of code per day per programmer, and to their horror, here is what they found.
o The average programmer could crank out only ten lines of debugged code a day!
o It didn't matter what language the programmers were coding in: if they were programming in Assembly, the daily output was ten ASM statements; if the language was COBOL or FORTRAN, ten statements!
The first, and obvious, conclusion that was drawn was that the COBOL and FORTRAN programmers were ten to one hundred times more productive than the ASM programmers since each line of FORTRAN could generate ten lines of ASM and some lines of COBOL could generate 100 ASM statements. I believe that these widely reported results are in no small measure responsible for the fact that most large DP departments are now and forever locked into COBOL as their language of choice. So much for the benefits of Software Engineering!
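The arithmetic behind that ten-to-one-hundred-fold claim is simple to sketch. The following Python fragment is purely an illustration of the column's numbers (the function name is my own, not anything from the original studies): it multiplies the fixed ten-statements-per-day rate by each language's assumed ASM expansion ratio.

```python
# The 10-lines/day figure and the expansion ratios come from the text;
# the helper function itself is hypothetical, for illustration only.
LINES_PER_DAY = 10  # debugged statements per programmer per day, any language

def asm_equivalent_per_day(expansion_ratio):
    """Daily output measured in equivalent assembler statements."""
    return LINES_PER_DAY * expansion_ratio

print(asm_equivalent_per_day(1))    # Assembler: 10
print(asm_equivalent_per_day(10))   # FORTRAN: 100
print(asm_equivalent_per_day(100))  # COBOL, best case: 1000
```

By this metric the language with the largest expansion ratio wins automatically, which is exactly why the finding pushed DP shops toward COBOL.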
But before we dismiss Software Engineering as merely the tool of the devil, let's look at what else was learned in their productivity studies.
When you ask a talented hacker how long he took writing a 1,000-line program, don't be surprised if he scratches his beard and replies, "Oh, I did that one night a couple of months ago." This would seem to contradict the ten-statement/day finding ... by orders of magnitude. But then ask your hacker how long he spent making his program bug-free. He might just reply, "Well, I expect to get the last bug out of it anytime now." One night writing the code and two months debugging it; this is more common than you might think. Anyway, this was the second major conclusion of the Software Engineers. The pie chart below illustrates the traditional division of labor, and hence cost, in creating significant programs.
Careful study of the above graphic reveals just how obscene it is. In traditional programming, two-thirds of the effort, and money, goes into the toilet known as maintenance.
Below is a short Modula-2 program: a recursive Towers of Hanoi solver.

MODULE Towers;

(* Declare I/O from Modula-2 Standard Library *)
FROM Terminal IMPORT ClearScreen;
FROM InOut IMPORT WriteLn, WriteString, WriteCard, ReadCard, Write;

CONST Start  = 'a';
      Int    = 'b';
      Finish = 'c';

VAR DiskCount: CARDINAL;
    Done: BOOLEAN;

(* Get number of disks or set terminate flag *)
PROCEDURE GetInput (VAR NumberOfDisks: CARDINAL;
                    VAR Quit: BOOLEAN);
BEGIN
  WriteString('Enter number of disks (between 3 and 9)'); WriteLn;
  WriteString('To quit - enter a number out of range'); WriteLn;
  ReadCard(NumberOfDisks);
  IF (NumberOfDisks < 3) OR (NumberOfDisks > 9)
    THEN Quit := TRUE
    ELSE Quit := FALSE
  END
END GetInput;

(* The recursive guts of the program ... calculate moves. *)
PROCEDURE Hanoi (n: CARDINAL; StartNeedle, IntNeedle,
                 FinishNeedle: CHAR);
BEGIN
  IF n > 0 THEN
    Hanoi(n-1, StartNeedle, FinishNeedle, IntNeedle);
    WriteString('Move disk '); WriteCard(n, 1);
    WriteString(' from '); Write(StartNeedle);
    WriteString(' to '); Write(FinishNeedle); WriteLn;
    Hanoi(n-1, IntNeedle, StartNeedle, FinishNeedle)
  END
END Hanoi;

(* Mainline ... control main loop ... get input & do it. *)
BEGIN
  ClearScreen;
  GetInput(DiskCount, Done);
  WHILE NOT Done DO
    Hanoi(DiskCount, Start, Int, Finish);
    GetInput(DiskCount, Done)
  END
END Towers.
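As a cross-check of the recursion in the Modula-2 program above, here is a minimal Python sketch of the same move-generating logic (the names and list-based structure are my own, not the column's); it confirms that a tower of n disks takes 2^n - 1 moves.

```python
# Python mirror of the Modula-2 Hanoi procedure, illustrative only.
# Move n disks from 'start' to 'finish', using 'inter' as the spare
# needle, recording each move as (disk, from_needle, to_needle).
def hanoi(n, start, inter, finish, moves):
    if n > 0:
        hanoi(n - 1, start, finish, inter, moves)  # clear the top n-1 disks
        moves.append((n, start, finish))           # move the largest disk
        hanoi(n - 1, inter, start, finish, moves)  # restack the n-1 disks
    return moves

moves = hanoi(3, 'a', 'b', 'c', [])
# A tower of n disks always takes 2**n - 1 moves, so len(moves) is 7 here.
```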