Chicken Little is Always Right

Michael Swaine

Life is a series of catastrophes, punctuated by the occasional accident. The best-laid plans of mice and men always gang agley, given enough time. Nothing rises but to fall. The solid ground on which we walk is fractured plates floating on molten magma. What we call walking is really falling and catching yourself, over and over and over again. Until you miss. Nature keeps her forests tidy with lightning bolts and forest fires, tweaks her designs via typos in the genetic code, and is the inventor of the concept of planned obsolescence. We exist between the last giant meteor and the next. We're overdue for the next ice age, but the next plague'll probably get here first. The sky is falling, but it's also burning off into space. Anything that can go wrong, already has. Anything that can't go wrong, will. Murphy was an optimist.

In May, I wrote about self-deluded programmers who feel in their bones that they ought to be able to write error-free code on the first pass. That column prompted Peter Pavlovic in Sydney, Australia, to share what happens when self-deluded managers try to enforce error-free coding:

Funny you should say that programmers seek perfection first time 'round.... [It reminds me of] a project I worked on for IBM Australia as a consultant in 1994.

The boys in blue were feverish about a methodology...named "Zero Defect Development," in which programmers were not allowed to have compilers or debuggers installed on their machines. They were required to write native code in C...and then present the code to a tester. The tester...compiled the code, with all compiler output going to an ASCII file. This file was then printed out and returned to the programmer. If there were any errors, these were added up and marked against the consultant as poor performance.

I was engaged on this project after they'd slugged it out for two years with no code working. Unbeknownst to them, I installed a compiler and debugger....

They thought I was the chosen one.

Pavlovic's chosen-one status didn't last, though:

I was dismissed from the project after some months for my refusal to wear a white business shirt with my suit...so it was with some glee I learned, just the other day, that they've spent over AU$14 million and now they're going to have another go.

Peter Pavlovic has seen the bump on Chicken Little's head.

The Millennium Chicken

Let me give you another example.

I saw this analyst from the Gartner Group on C-SPAN testifying before a Senate committee about a $6-billion crisis in the making. You know about the crisis; it's recently been in the news, even making the New York Times, but programmers have seen it coming for 25 years-it's the millennial date crisis. Any software that uses a six-digit date format will fail on January 1, 2000, when the year rolls over from 99 to 00.

The failures won't be spectacular. The programs will mostly just start quietly generating erroneous data. In most cases, this data will not immediately make itself evident. The 00 date may get interpreted as a valid date of 1900, or as an invalid date, or as missing data. The software will do what it's supposed to do with a date of 1900 or an invalid date or a missing date value, and go on about its business.
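To make the quiet-failure mode concrete, here is a toy sketch of my own (in Haskell; the names are invented, and no real system looks exactly like this) of what a six-digit-date assumption does at the rollover:

```haskell
-- Hypothetical two-digit-year logic: the century is silently 1900.
interpretYY :: Int -> Int
interpretYY yy = 1900 + yy          -- 99 -> 1999, but 00 -> 1900

-- An age computation built on that assumption produces bad data,
-- not a crash: someone born in '70 is "-70 years old" in '00.
ageIn :: Int -> Int -> Int
ageIn bornYY nowYY = interpretYY nowYY - interpretYY bornYY
```

Evaluating `ageIn 70 0` yields -70: no error message, no halt, just a nonsense value the program carries happily onward.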

The effects of the failures will be spectacular. We can predict some of them. Too-large checks may be cut, and some may get sent out and be cashed and spent before the error is noticed. Too-small checks may be issued, and people may suffer financial difficulties while they try to clear up the error. Financial records may contain hidden errors that do not surface for years. Medical records may be produced with dates that make no sense, and advice based on the erroneous data in the records may threaten lives. Critical equipment may fail. I keep saying "may," but catastrophes like these will happen, without doubt.

And the effect will proliferate, thanks to the Internet, and to other networks of computers. World financial information, in particular, is deeply networked, and bad data anywhere in the network can spread like a virus. Software that does not use this six-digit date format will not be immune: There is no way, from looking at a piece of data, to determine that it was computed from erroneous input. There is no blood test for this infection. The only safe sex among programs is abstinence, and it's too late for that. Lawyers are already preparing for this crisis as their big litigation opportunity of the new millennium.

This is not news to the software development community or to readers of DDJ; both Jon Erickson and Al Stevens have discussed it in the past. And the action required is pretty obvious: Clean up that code. Get the word out. Monitor progress in getting everyone to weed their legacy code of six-digit dates and six-digit date assumptions. And it's equally obvious that it won't work. Even if Congress passed a law making the use of six-digit dates a felony, half of the offending software is not in the United States. Not all of the six-digit date code is going to get fixed, and when January 1, 2000, rolls around, because of the aforementioned proliferation effect, even a few bad dates will make a very big mess. We can reduce the size of the catastrophe, but we can't prevent it.

The End of Work as We Know It

Jeremy Rifkin, author and president of the Foundation on Economic Trends, in Washington, D.C., has made a career of warning people about this coming catastrophe or that impending crisis. But just because you're paranoid, it doesn't mean they're not after you; and just because Rifkin's a professional Chicken Little, it doesn't mean the sky isn't falling. Rifkin's latest book is The End of Work (G.P. Putnam's Sons, 1995), which may not sound like a catastrophe to you, but Rifkin makes it seem pretty scary.

It's your fault, of course. Rifkin maintains that technological innovations and other forces "are moving us to the edge of a near workerless world." Current changes in opportunities for employment are nothing short of revolutionary, he says, and they are also unprecedented. Past revolutions in work don't prepare us for the one we're experiencing now. Rifkin says:

In the past, when new technologies have replaced workers in a given sector, new sectors have always emerged to replace the displaced laborers. Today, all three of the traditional sectors of the economy-agriculture, manufacturing, and service-are experiencing technological displacement.... The only new sector emerging is the knowledge sector, made up of a small elite of entrepreneurs, scientists, technicians, computer programmers, professionals, educators, and consultants.

And that's an exclusive and relatively small group.

Outside of the relatively few new knowledge-worker jobs, any new jobs being created are McJobs-low-paying, entry-level positions. Rifkin confidently predicts over the coming decades "massive unemployment of a kind never before experienced." He quotes The Wall Street Journal's prediction that corporate re-engineering will continue to eliminate 1 to 2.5 million jobs per year for the foreseeable future.

The small business job boom is a myth, he says. Don't count on it. Nor on the Chinese market.

The cause is not more women in the labor force or cheap foreign labor, Rifkin claims, citing statistical studies; it really is automation. The fears of the 1950s that automation would steal jobs from human workers were accurate.

Well, that sounds pretty scary, but the end of work through automation needn't be a catastrophe; it could be heaven on earth. The End of Work recognizes this. Rifkin presents both the catastrophic scenario and the idyllic one in which humans are freed from labor to pursue the pleasures of life in a new Eden with robotic slaves freeing us of all drudgery. Sure, sign me up.

Trouble is, Rifkin is a lot more convincing when presenting the former (more practice, I guess), and his ideas about how to avoid the catastrophe and steer toward heaven-on-earth are not terribly convincing. To some, nothing Rifkin has to say is all that convincing. Rifkin has presented some of the ideas in this book in speeches and in magazine articles, and is always met with objections to some of his bolder predictions. Not everybody buys his catastrophic vision. But he does marshal a lot of data to support his conclusions, and there is surely something of major proportions happening in the job market. Tell those people losing jobs and not finding new ones that there's no catastrophe. Well, at least we knowledge workers will have jobs.

Earth Died Screaming

I love the Tom Waits song "Earth Died Screaming" from his Bone Machine album, and wanted to use it for a subhead in this column, but I can't really justify it here. I guess that's good, huh? The following brief observation and its implications may send you screaming, though. The Zero Defect Development methodology that Peter Pavlovic described seems obviously inappropriate when applied to human programmers, but what if the programmers and the tester were artifacts?

What if the "programmers" are programs that produce simple algorithms to solve simple problems, and the "tester" is a program that tests simple algorithms against various data sets and chooses the best? That's not unlike some work that's been done in recent years under the name of neural nets or one of several other labels.
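Such a generate-and-test pairing can be caricatured in a few lines of Haskell (a sketch of my own with made-up names, not any particular research system): the "programmers" enumerate trivial candidate functions, and the "tester" scores each against a data set and keeps the winner.

```haskell
import Data.List (minimumBy)
import Data.Ord (comparing)

-- The "programmers": every small linear function x -> a*x + b.
candidates :: [(Int, Int)]
candidates = [(a, b) | a <- [-5 .. 5], b <- [-5 .. 5]]

run :: (Int, Int) -> Int -> Int
run (a, b) x = a * x + b

-- The "tester": total error of a candidate over (input, expected) pairs.
score :: [(Int, Int)] -> (Int, Int) -> Int
score dataset c = sum [abs (run c x - y) | (x, y) <- dataset]

-- Build them all, throw away all but one.
best :: [(Int, Int)] -> (Int, Int)
best dataset = minimumBy (comparing (score dataset)) candidates
```

Given the data set [(1,3),(2,5),(3,7)], `best` recovers (2,1)-the coefficients of x -> 2*x + 1-without anyone having "programmed" the answer.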

Okay, I guess I haven't sent you screaming from the room in fear that programmers are going to be made obsolete by Zero Defects Development Methodology in Neural Networks Choosing Among Automatic Algorithm Generators. But does it make you just a little nervous that such algorithm generators could be produced in huge numbers and that they work for essentially nothing? And that the tester program can borrow Mother Nature's favorite algorithm: Build a million and throw away 999,999? Nah, you're right. Couldn't possibly happen.

I Was Confused Because I Thought Haskell's Little Buddy Was the Beaver

Murphy struck here in May. In my efforts to bring somewhat obscure languages to light, I obscured the facts surrounding the language Haskell. What I described was Gofer, a Haskell variant created by Mark Jones. Gofer-not to be confused with the Gopher Internet protocol, or that guy from Love Boat who's now a congressman, or Jerry Mathers, or any other small mammal or marsupial-is reputedly an acronym for GOod For Equational Reasoning. Don't blame me for that.

The Gofer system provides an interpreter for a small language closely based on the current version of the Haskell report. It supports lazy evaluation, higher-order functions, polymorphic typing, pattern matching, and overloading. The latest version can be found at ftp://ftp.cs.nott.ac.uk/nott-fp/languages/gofer. Mark Jones is now at the University of Nottingham; his Web page is at http://www.cs.nott.ac.uk/Department/Staff/mpj. Haskell itself was invented by a committee. Now, conventional wisdom doesn't care much for committees, holding that a camel is a horse designed by a committee, but this horse flies in the face of that conventional wisdom. Conventional wisdom holds that any language designed by a committee will have all the elegant simplicity and purity of form of Ada, but this horse laughs at that conventional wisdom. Okay, I'll stop now.
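To give a flavor of the features just listed, here is a minimal illustration (toy definitions of my own, not taken from the Gofer distribution) of pattern matching and class-based overloading:

```haskell
-- Pattern matching on list structure:
len :: [a] -> Int
len []       = 0
len (_ : xs) = 1 + len xs

-- Overloading via a type class: one name, many types.
class Describable a where
  describe :: a -> String

instance Describable Bool where
  describe True  = "yes"
  describe False = "no"

-- Instances compose: a list is describable if its elements are.
instance Describable a => Describable [a] where
  describe xs = unwords (map describe xs)
```

Here `describe [True, False]` yields "yes no"; the same name does different work at different types, resolved by the class system rather than by ad hoc special cases.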

Apparently the committee was set up, and Haskell was designed, precisely because there was no standard, non-strict, purely functional programming language. The language-design committee was established in 1987 and released its report on April Fool's Day-I am told with the e-mail equivalent of a straight face-1990. A revised "Version 1.2" was published in SIGPLAN Notices 27(5) (May 1992), along with a tutorial. The full Haskell language is supported by three compilers: hbc/lml from Chalmers in Sweden (ftp://ftp.cs.chalmers.se/pub/haskell), ghc from Glasgow (ftp://ftp.dcs.glasgow.ac.uk/pub/haskell), and the Yale Haskell compiler (ftp://ftp.cs.yale.edu/pub/haskell). (Mark Jones formerly worked in the Haskell group at Yale.)

The factorial function I presented would have to look like this to exemplify lazy evaluation and infinite lists as I intended: fact n = (products [1..])!!n.
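Spelled out-with a `products` helper, which is my assumption here, since it isn't a standard prelude function-the lazy version looks like this:

```haskell
-- Running products: products [1..] = [1,1,2,6,24,120,...], so the
-- n-th element is n factorial. Laziness guarantees that only the
-- first n+1 elements of the infinite list are ever built.
products :: [Integer] -> [Integer]
products = scanl (*) 1

fact :: Int -> Integer
fact n = products [1 ..] !! n
```

So `fact 5` forces just six elements of an infinite list and returns 120, which is the whole point of the exercise.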

I also tossed off a glib crack about a "LET statement" in Haskell. A LET statement-in the spirit of Basic's LET A=3-would certainly be a serious breach of purity of concept in Haskell, but Haskell doesn't have a LET statement in that or any other spirit. What it has is a LET expression, which does nothing more than bind a name to a value. This, of course, is something else entirely. Thanks to Stephen J. Bevan and Duke Briscoe at Harlequin for setting me straight on Haskell.

Paradigms Past

Haskell didn't make it into History of Programming Languages II, edited by Thomas J. Bergin, Jr. and Richard G. Gibson, Jr. (ACM Press, 1996). It's too new. This book contains the proceedings of the second ACM Special Interest Group on Programming Languages Conference on the History of Programming Languages (HOPL), held in Cambridge, Massachusetts, in April 1993. Kim King reported on the 1993 conference for DDJ in his article, "The History of Programming Languages" (DDJ, August 1993). The first HOPL conference was held way back in 1978, so there was a lot of history to cover in the intervening 15 years.

There is some fascinating material in this 864-page volume. Languages covered include ALGOL 68, Pascal, Concurrent Pascal, Ada, Lisp, Prolog, FORMAC, CLU, Smalltalk, Icon, Forth, C, and C++. In general, for each language there is the original paper by the language's author, the actual transcript of the talk given and of the question and answer session that followed, and a biography of the author. In other words, this book can list as its authors Niklaus Wirth, Guy Steele, Richard Gabriel, Alain Colmerauer, Alan Kay, Adele Goldberg, Chuck Moore, Dennis Ritchie, and Bjarne Stroustrup.

There are also some introductory and closing papers on language design and history by the likes of Fred Brooks and Jean Sammet. Historian of Science Michael S. Mahoney, the conference's consulting historian, contributed a nice essay on what makes history, telling a story of Piaget's work with children to illustrate how quickly our understanding of how we learned something disappears, leaving us with only the knowledge and no mental history of how it was acquired. Here's Mahoney again:

Several years ago I asked someone at Bell Labs responsible for maintaining software what sort of documentation she would most like to have but did not. "I'd like to know why people didn't do things," she said.

I don't know about you, but I think that's profound.

And then there is this from Noel Nyman at Microsoft:

Your mention of the Geniac brought back fond memories of my lost childhood. I actually owned one of those devices when I was in junior high.

They were advertised in small ads in the back of Scientific American. For months I tried to convince my family that Geniac would be a wise investment in my future-to no avail. I don't remember what it cost, but the price seemed astronomically high in the 1950s.

Then Sputnik was launched and stunned politicians decried our lack of engineering resources. In an effort to set me on the fast track to helping our country regain lost prestige, the Geniac was purchased.

The entire Geniac was just a series of multipole switches.... The device was 'programmed' by wiring the switches together. To anyone clever enough to understand how the device was programmed, operating it was trivial and unnecessary.

It must have done the job, though. We beat the Russians to the moon and now they work cooperatively with us in space. And I work for Microsoft, making the future of America secure by bringing Windows NT to the marketplace.

Things Fall Apart

This final note on the inevitability of things falling apart and the center's inability to hold: As I was finishing this column, the May issue of Scientific American arrived, with a cover blurb above the title announcing "Crash-proof Computing." The article, it turned out, was not about crash-proof computing, but, rather, about designing networks to survive the inevitable crash of a program on one of the networked machines.