Dear DDJ,
DDJ readers might be interested in knowing that Rod Price, the big winner of the Abbadingo DFA Learning Competition (http://abbadingo.cs.unm.edu), learned about the contest in DDJ's May 1997 "News & Views." A paper on the results of the competition is being reviewed by Kluwer's Machine Learning journal, and a shortened version of it was accepted by the ICGI-98 conference. (Both are available from the aforementioned web page.) Thanks for mentioning the contest. It made a difference.
Barak A. Pearlmutter
bap@cs.unm.edu
Dear DDJ,
In the July 1998 issue of DDJ, Greg Wilson reviewed Robert Glass' book Software Runaways: Monumental Software Disasters. Glass attributes much of the problem to the lack of "a conscientious effort to develop histories of past project costs."
I have not yet read this book, but I may have to take a look at it, even if it doesn't offer any solutions (as Greg Wilson points out). However, the question it raises (why can't we estimate accurately?) is one that I have pondered for most of my 20+ year career in programming.
I believe there are several factors that interact to put us in this predicament. The first is that we are eternally optimistic. In the book Controlling Software Projects (Yourdon Press, 1982), Tom DeMarco states that the default definition of "estimate" is "...the most optimistic prediction that has a nonzero probability of coming true." Obviously, we will be disappointed time after time using this method, but we continue to do so. Sometimes we base that optimism on the latest and glitziest technology in an eternal (and hopeless) quest for the "silver bullet" that will solve all our problems. The fact that these problems persist, whether we are using Cobol, VB, C, C++, or whatever, shows that the problem is not rooted in technology but in our understanding of the software development process itself.
A second factor is the reluctance of management to understand and accept the cost of software development. If an accurate estimate were given for a project, the project would be rejected. But management will accept an unreasonably low estimate, approve the project, and then continue to sink many more dollars into it. In the end, after numerous budget revisions and schedule slippages, the project will be completed and touted as a great success. This pattern persists, decade after decade, because it is unacceptable to admit failure.
A third factor is the reluctance of IS people to compare final budgets and schedules with the original estimates after a project is completed. I believe part of the reluctance to do this is political. Once a project has been deemed a "success," it wouldn't be acceptable to go back and [examine the] mistakes. Since we fail to learn from our history, we are condemned to repeat it.
Fourth is the lack of a good paradigm for measuring systems and their functionality. The use of Function Points may hold some promise here. At least it is an attempt to measure functionality delivered to the user, rather than lines of code, number of programs, or some other equally meaningless measure. We quite often measure that which is easy to measure, not that which is meaningful.
Contributing to all this is a fundamental misunderstanding of the use and value of data and systems in a business. Most companies persist in viewing computing systems as overhead rather than recognizing their true nature. See "Information, Computer Systems and Manufacturing" at http://www.serv.net/~glasgow/techsoc.html#comp&mfg for more on this topic.
Even if we did keep better histories as Glass suggests, and even if we did review them, the fundamental question would remain unanswered: "Knowing what we now know, how would we estimate differently the next time?"
Gordon Glasgow
glasgow@serv.net
Dear DDJ,
Following up on Jonathan Erickson's July 1998 "Editorial," I'd like to add: So Windows CE is a bona fide RTOS? I'll certainly keep that in mind when I decide to hook up my PalmPilot to my oil refinery. In the old days, it was either "real time" or it wasn't. Perhaps Microsoft has been studying fuzzy logic for too long, and is applying the concepts to its entire product line. When we think "WinCE," we might well lowercase the "CE."
Regarding Open Source and Richard Stallman: Perhaps O'Reilly's problem with Richard Stallman is that his notion of "Open Source" includes "free for the taking," which cuts into somebody's potential profits. But O'Reilly has been a major supporter of Perl, which is also free. I agree with Erickson's assessment of Stallman's place in the Hall of Fame (well, there should be one. After all, Berners-Lee finally got a MacArthur genius grant).
Finally, regarding the Software Success Study: So much for the "Microsoft stifles competition and drives out the little guys" argument. Companies come and go (like the East India Tea Company). With the incredible acceleration in technology these days, it's only natural that they will come and go more rapidly than before. Another measure of software economics is a chart of software package price versus year, from the '60s to the present (using package categories: an office suite from the '70s compared to one from today). I think you'll find that prices have come down considerably.
Mike Zorn
zorn@dms-1.ana.bna.boeing.com
DDJ responds: Thanks for your comments, Mike. It should be noted that Richard Stallman has also received a MacArthur grant.
Dear DDJ,
Regarding Jonathan Erickson's July 1998 "Editorial" concerning the open-source movement, Donald Pederson, the mentor of SPICE, was adamant about keeping the source for SPICE in the public domain in the early '70s. "Learning GNU Emacs" states that Richard Stallman wrote the first version in 1975. My first exposure to SPICE, however, was in early 1974. For more information, the June 1998 issue of IEEE Spectrum includes an article on Pederson's work.
One of the back issues of DDJ that I've decided to keep is March 1985 with Stallman's "The GNU Manifesto." Figured it is almost as historic as the January 1975 issue of Popular Electronics, which introduced the Altair 8800.
Finally, Jerry Pournelle was ragging on the high cost of software back in the early '80s, suggesting that eventually people would give away the software and make money on the documentation. Seems to be happening with GNU/Linux.
Erik Magnuson
erik@cts.com
Dear DDJ,
I was reading Al Stevens' "C Programming" column about hacking assignment of reference data members (DDJ, February 1998) and another C++ solution occurred to me. In the assignment operator, call the object's destructor (yes, I know this is generally bad). After calling the destructor, use a placement new to construct a new object in the same memory since the constructor can set the reference data member.
Dennis Payne
payned@rpi.edu
Dear DDJ,
Robert Moore and Gregory Foley suggest several schemes in their article "Date Compression and Year 2000 Challenges" (DDJ, May 1997). Here's one that can maintain a certain backward-compatibility with existing data: In the standard MMDDYY format, the most significant month digit uses only a single bit (for months 10, 11, or 12). Even in packed BCD format, that leaves another three bits in which century information can be encoded. This scheme is simple to decode: The upper three bits are taken as a century code (where 0 = 1900) and used to adjust the usual MMDDYY data, which can be easily extracted by masking.
The beauty of this compression format is that humans can still read the compressed data when displayed by normal BCD display routines. If the month value is 21 to 32, subtract 20 to get the true month and interpret the two-digit year as 20YY instead of 19YY. Months 41 to 52 (subtract 40) are for years 21YY, 61 to 72 (subtract 60) for years 22YY, and so on.
For example, 01-01-2000 would be encoded as 21-01-00, while 12-31-2099 would become 32-31-99. Even in the worst case where only digits 0-9 can be displayed, this scheme will handle dates up to 12-31-2399 = 92-31-99. Hopefully, the old mainframes will have blown their last tubes by then.
Bob Masta
tech@daqarta.com
Dear DDJ,
It appears Michael Swaine's "Programming Paradigms" April fool's joke went further than even he designed. It included a neat bit of revision of mathematical history. While soberly explaining the joke, Michael repeated of Vieta: "...he is the father of algebra, he did hate the word..." Perhaps he hated the word because it carries within it the fact that he most certainly is not the father of Algebra, which predates him by over a millennium.
A far more likely choice for that title is the ninth-century Muslim mathematician Muhammad ibn Mūsā al-Khwārizmī, whose name was Latinized into "algorithm," which at first was used to mean "arithmetic" in many European languages, and eventually gained its current meaning. His most famous work was "Al-Jabr," which was Latinized to "algebra" in the west, and which was the first work in the west and near east on the topic. But even the works of al-Khwārizmī owed much to previous Muslim scholarship and his study of Indian sciences. It is through his work that the Indian numerals, and the concept of "zero," came to the Islamic world, and so to the west, and this is why they are called "Arabic numerals."
Uche Ogbuji
uche@poetic.com
Dear DDJ,
The irony was delicious in the March 1998 "skunkworks" letter from Lockheed Martin's attorney. The people who "derived" the term from a comic strip and made it exclusively their own (talk about stealing candy from a baby!) now feel wronged by its generic use. Perhaps the rest of us should start using the term "sk{}nkworks."
Achal Shah
minal@interlog.com
DDJ