PROGRAMMER'S BOOKSHELF

The Theory and Practice of Computer Design and Implementation

Ray Duncan

John L. Hennessy and David A. Patterson's new book, Computer Architecture: A Quantitative Approach, is a uniquely lucid and accessible presentation of the theory and practice of computer design and implementation. First, the authors explain basic principles of computer performance measurement and cost/benefit analysis. They then apply these principles to instruction-set design, processor implementation, pipelining, vectorization, memory-hierarchy design, and input and output. Each chapter contains real-life examples drawn from four computer architectures -- the IBM 360/370, the DEC VAX, the Intel 8086, and the "DLX" (a sort of hypothetical composite of the currently popular RISC machines) -- and each chapter concludes with three entertaining sections entitled "Putting It All Together," "Fallacies and Pitfalls," and "Historical Perspective."

It's almost impossible to convey the authority, scope, and vigor of this book in a brief review. The book is highly technical, of course; nevertheless, it's as engrossing as a novel because of the incredible depth and breadth of the authors' knowledge and experience. It's literally packed with fascinating architectural vignettes, on topics as diverse as the use and misuse of benchmarks, interrupts, microcode, silicon die yields, cache strategies and trade-offs, and the IBM 3990 I/O subsystem.

In addition, virtually every point in the book is supported and clarified by historical anecdotes which reveal how the important features of modern computer architectures were invented and refined. For example:

IBM brought microprogramming into the spotlight in 1964 with the IBM 360 family. Before this event, IBM saw itself as many small businesses selling different machines with their own price and performance levels, but also with their own instruction sets. (Recall that little programming was done in high-level languages, so that programs written for one IBM machine would not run on another.) Gene Amdahl, one of the chief architects of the IBM 360, said that managers of each subsidiary agreed to the 360 family of computers only because they were convinced that microcoding made it feasible -- if you could take the same hardware and microprogram it with several different instruction sets, they reasoned, then you must also be able to take different hardware and microprogram them to run the same instruction set. To be sure of the viability of microprogramming, the IBM vice president of engineering even visited Wilkes [who built the first microprogrammed CPU in 1958] surreptitiously and had a "theoretical" discussion of the pros and cons of microcode. IBM believed the idea was so important to their plans that they pushed the memory technology inside the company to make microprogramming feasible.

Stewart Tucker of IBM was saddled with the responsibility of porting software from the IBM 7090 to the new IBM 360. Thinking about the possibilities of microcode, he suggested expanding the control store to include simulators, or interpreters, for older machines. Tucker coined the term emulation for this, meaning full simulation at the microprogrammed level. Occasionally, emulation on the 360 was actually faster than the original hardware. Emulation became so popular with customers in the early years of the 360 that it was sometimes hard to tell which instruction set ran more programs.

In spite of such diversions, the book is so well-organized, and the material is developed so logically, that each conclusion as it is reached seems self-evident -- even inevitable.

One of the particular strengths of the authors is the deceptive ease with which they transform seemingly simple rules of thumb into razor-sharp analytical tools. For example, they introduce Amdahl's Law within the first few pages: "The performance improvement to be gained from using some faster mode of execution is limited by the fraction of the time the faster mode can be used." At first glance, this law appears to be as trivial (and as useless) as the classic syllogism "Socrates is a man, men are mortal, therefore Socrates is mortal." But the authors demonstrate otherwise; they return to Amdahl's Law again and again throughout the book to demonstrate why well-intentioned changes in hardware or software don't always yield the hoped-for benefits. For example:

Suppose we could improve the speed of the CPU in our machine by a factor of five (without affecting I/O performance) for five times the cost. Also assume that the CPU is used 50% of the time, and the rest of the time the CPU is waiting for I/O. If the CPU is one-third of the total cost of the computer, is increasing the CPU speed by a factor of five a good investment from a cost/performance standpoint?

The speedup obtained is

                
Speedup = 1 / (0.5 + 0.5/5) = 1 / 0.6 = 1.67

The new machine will cost

2/3 * 1 + 1/3 * 5 = 2.33 times the cost of the original machine

Since the cost increase is larger than the performance improvement, this change does not improve cost/performance.

At this point, the thoughtful reader might be inclined to reflect on the application of Amdahl's Law to more mundane, practical matters, such as the installation of 80386 accelerator boards into classic IBM PCs with 4.77-MHz, 8-bit I/O buses.
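For readers who would like to play with the numbers themselves, here is a minimal sketch of Amdahl's Law in C. The first case reproduces the book's 5x-CPU example; the accelerator-board case uses assumed figures (an 8x-faster CPU that is busy 60 percent of the time), chosen purely for illustration rather than taken from any measurement.

    /*
     * A minimal sketch of Amdahl's Law.  The first case reproduces the
     * book's example above; the accelerator-board case uses assumed
     * figures, purely for illustration.
     */
    #include <stdio.h>

    /* Speedup = 1 / ((1 - f) + f / s), where f is the fraction of time
     * the enhancement can be used and s is the speedup of the enhanced
     * mode of execution.                                               */
    static double amdahl(double fraction_enhanced, double speedup_enhanced)
    {
        return 1.0 / ((1.0 - fraction_enhanced)
                      + fraction_enhanced / speedup_enhanced);
    }

    int main(void)
    {
        /* The book's example: a 5x-faster CPU, busy 50% of the time.   */
        printf("5x CPU, 50%% CPU-bound: speedup = %.2f\n", amdahl(0.5, 5.0));

        /* Hypothetical accelerator board: assume it runs CPU-bound work
         * 8x faster while the slow 8-bit bus still accounts for 40% of
         * the run time -- assumed numbers, not measurements.            */
        printf("8x CPU, 60%% CPU-bound: speedup = %.2f\n", amdahl(0.6, 8.0));

        return 0;
    }

Running the sketch prints speedups of 1.67 and 2.11, respectively -- a reminder that under these assumptions it is the slow bus, not the new processor, that sets the ceiling.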

In the last chapter, the authors speculate on future directions in computer architecture, with particular emphasis on multiprocessors and compiler technology. The book ends with a set of appendices that would more than justify the book's price in themselves: a succinct but comprehensive essay on computer arithmetic by David Goldberg of Xerox's Palo Alto Research Center; detailed profiles of instruction-set frequencies and execution times for the VAX, IBM 360, and Intel 8086; and a comparative survey of four of the most popular RISC architectures (the Intel i860, MIPS, Motorola 88000, and SPARC).

Patterson, a professor at the University of California-Berkeley, was responsible for the design and implementation of RISC I -- the direct ancestor of Sun's SPARC processor. Hennessy, a professor at Stanford University, was one of the founders of MIPS Computer Systems and is still that company's chief scientist. The authors are, consequently, legendary figures in the RISC movement, but their discussion of RISC technology in this book is balanced and dispassionate. Much of their profiling and cost/performance data supports RISC concepts, but it's strictly a soft sell -- the reader is left in peace to draw his own conclusions. Thus, the book's advocacy of RISC is quite indirect, but is made all the more powerful by the authors' command of every facet of CISC architectures.

Computer Architecture: A Quantitative Approach is a tour-de-force on several levels. The book is a masterpiece of technical writing -- Hennessy and Patterson's clear, direct style is absorbing and effective, and their enthusiasm for their subject is contagious. The design and production, too, are impeccable. Furthermore, because the book presents a hardheaded and pragmatic approach to computer design, based on real examples, real measurements, and lessons learned from the successes and misadventures of the past, it should revolutionize the teaching of computer architecture and implementation.

Although this book was not written primarily for programmers, it is a thorough and extraordinarily wide-ranging education in that magical interface between the programmer's intentions and the electron's actions. It should be read by every software craftsman who cares about wringing the last drop of performance from his machine.