I believe that the pace of change in programming methodology is increasing, and that those who learn new paradigms will keep up, while those who do not will be left behind.
I am encouraged in this belief by a rereading of Robert Floyd's acceptance lecture on receiving the 1978 Turing Award. Floyd contended then that "continued advance in programming will require the continued invention, elaboration, and communication of new paradigms."
Floyd took the term paradigm from Thomas Kuhn's The Structure of Scientific Revolutions and used it to refer to general models of problem solving and the shared conventions and traditions of a discipline. Structured programming is a general paradigm, recursion a narrower one.
In his lecture, Floyd described how he invents new paradigms. Having solved a problem, he next re-solves it from scratch, then looks for the general rule for solving problems of this sort, ultimately deriving a broad problem-solving paradigm. "Most of the classical algorithms to be found in texts on computer programming can be viewed as instances of broader paradigms," Floyd said. "Simpson's rule is an instance of extrapolation to the limit. Merge sorting is an instance of the divide-and-conquer paradigm. For every such classic algorithm, one can ask, 'How could I have invented this?' and recover what should be an equally classic paradigm."
And that's what you need to do if you want to advance the field and your place within it, according to Floyd: "I believe that the best chance we have to improve the general practice of programming is to attend to our paradigms."
I believe that attending to our paradigms is more imperative today than a decade ago, for two reasons.
First, there are simply more paradigms that we must understand today. Consider this list of vogue topics: logic programming, production systems, expert systems, blackboard systems, functional programming, object-oriented programming, event-driven programming, neural nets, associative memory models, machine learning paradigms, MIMD, SIMD, and data-flow programming. Many of these topics overlap and some may be synonyms, but how many programmers can sort them out? Or predict which will be important two years from now?
Second and ultimately more far-reaching, I believe that a very deep paradigm, the von Neumann model of sequential processing, is in the process of being supplanted by many parallel-processing paradigms. If this is true, it will be the most fundamental change in programming since the development of high-level languages, and it will radically affect the way we think about the process of writing software. With certain limited and constrained exceptions, all software is written within the von Neumann paradigm. As programmers we scarcely know how to think in parallel terms. Our algorithms will not transfer. We will need a paradigmatic approach in order to find our way in a parallel world.
The parallel world is bigger, and hairier, than the sequential world.
True, there are grounds for skepticism about parallel processing. There is no standard architectural platform for parallel programming outside certain specialized areas, such as numerical analysis and graphics. The kind of parallel processing I am talking about--multiple instruction, multiple data (MIMD) programming--is difficult, with significant unsolved problems.
True, there is no parallel equivalent of the IBM PC. But desktop multiprocessor architectures based on the transputer chip exist as commercial products today; they are just not yet cost-effective. The biggest cost factor is the price of the transputer, which is a function of demand and competition. This barrier could start falling within the year. It is not too early to imagine what you could do with a true parallel-processing system.
True, decomposition of a problem into MIMD parallelizable components is hard. That is precisely why a paradigmatic approach is necessary: finding the right paradigm can give you the solution to a broad class of problems. The divide-and-conquer paradigm, for example, is well adapted to MIMD parallelization, so if you can cast your problem in that form, you should be able to find a good parallel solution. And a good parallel solution is one that increases throughput radically.
I believe that the most successful programmers in the next decade will be those who carry in their toolkits, among their shiny metric and nonmetric algorithms, a rich set of paradigms.
Michael Swaine
editor-in-chief