Dear DDJ,
Jeff Duntemann's October 1991 "Structured Programming" column about programming for small vertical markets really hit the target. I started designing and programming computers in 1955. Now I am an antiques dealer. The two careers meet in a column in an antiques trade journal telling fellow dealers how to use computers. Software for antiques dealers is submitted to me for review almost monthly. None of these products has gained general acceptance, and most are just bad.
Much of this software is a gussied-up version of a program originally commissioned by one dealer. Invariably these products represent a thin stripe of the vertical market. The rest of it is written by people who do not know Chippendale from Limoges but think they know how our businesses ought to work. Either approach is bound to fail in an industry that has as many individualists, ways of selling, and bookkeeping methods as this one does. It is especially insulting to see a program that does not properly account for inventory value when the antiques industry is completely inventory-driven. Give us a break, guys: Get out in the field.
Jeff's support of Clarion for application development is on the mark too. Several programs I have reviewed and use regularly are written in it. One communicates with an on-line database service that has no more than 600 users. The programming costs have to be recovered from a $100 annual fee. With a modest development cost, Clarion provides a good looking piece of software for the basically nontechnical users.
Clarion lacks one thing: a GUI or text-mode GUI look-alike. Of seventeen horizontal applications I use regularly in my business, eleven have gone to the common look in the last year. That is another thing vertical market developers need to think about.
John P. Reid
Bear, Delaware
Dear DDJ,
I am a programmer specializing in real-time financial information and price charting. I really enjoyed Jeff Duntemann's June 1991 "Structured Programming" column about the new Turbo Pascal for Windows. I've been using the product for a month now and have already put up a 5000-line MDI application. I agree with your first impressions about it. Before TPW, I was looking for something with which to develop Windows applications. I tried C, but I can't really think using !, *, &, and ->. Begin and end just feel better. I was very excited the first time I saw TPW advertised in your magazine, and it surpassed my expectations.
TPW makes it easy to port code from Pascal 6.0 but also allows porting of C code. I have done both. I ported SNAP3 C code (see DDJ, February 1991) to OWL in about an hour. I also ported 3000 lines from my Pascal 6.0 charting program. TPW lets you use both OWL and the conventional C Windows structure.
I have only two complaints about TPW: First, I would like to see the same rich set of code examples as in TP 5.5 and 6.0. Of course I understand Borland was in a hurry to release this hot product. But an example like MICROCALC would be a very good source for reference. Examples are the ultimate source of information for complicated environments like Windows. In particular, I think they should have provided some example of a DDE server. This is a good subject for the "Structured Programming" column. DDE is just too complicated for average programmers like myself.
This DDEServerWindow object would have methods for handling Windows DDE messages (WMDDEAdvise, WMDDERequest, WMDDEAck, and so on). These methods would call virtual methods like:
  TopicAvailable(Topic: String): Boolean;
  ItemAvailable(Topic, Item: String): Boolean;
  GetItem(Topic, Item: String; var CFFormat: word;
    PValue: Pointer; var Length: integer);

For warm and hot links, the method:

  ChangeItem(Topic, Item: String; PNewValue: Pointer;
    Length: integer);

and so on. This way it would hide the complexities of atoms, global memory blocks, Advise, AckReq, DeferUpd, and God knows what. I really don't have the technical skills necessary to write such an object.
I tried to port DDEPOP from Petzold's book with no success. I had a particularly hard time with BOOL flags in DDEAdvise structs. DDE servers are to Windows what TSRs are to DOS: hard to understand and debug. You get a lot of "Unrecoverable Application Error" (UAE) messages.
This leads to my second complaint. These Windows error messages don't say much about what generated the error. Even for just a common runtime error, Pascal cannot find the error point in the source code unless you type in the address. That's a dumb thing for an environment that's supposed to integrate applications. Sometimes, you have to start the debugger just to find a simple UAE.
About the communication ports: I had the same problems you did. I solved them by writing my own interrupt-driven communication services. I used the same code my old TP 6.0 comm application did. The only change I had to make was to use the DATA segment for the circular buffer and head. My old ISR used the CODE segment for these variables, a clear protection violation. My interrupt routine is in assembly language, but I believe you can write one in Pascal as well.
Apparently, Microsoft doesn't want to enforce the use of Windows communications facilities. This fact is confirmed by the permission to access serial port registers directly (I don't know much about protected mode, but I know it can prevent such accesses) and the absence of documentation about the use of Windows comm functions in the SDK and in Petzold's book.
My communications program works fine in real, standard, and 386 enhanced modes. Windows even warns you if you try to start a non-Windows app that uses the same serial port. The only problem with this approach is that if you get a UAE (a very common occurrence while developing), your application terminates, leaving the interrupt uncovered. In this state, one byte coming in from the comm port is enough to hang Windows. (Actually, Windows aborts to DOS.) I couldn't find anything like Pascal's ExitProc in the Windows documentation.
I will try to put the communications routines in a DLL. DLLs don't terminate violently like applications and have initialization and exit procedures, which may be used to set and restore the interrupt vector.
Turbo Pascal for Windows is really an important product and I'm happy to see important magazines like yours interested in providing information about it.
Omar F. Reis
Sao Paulo, Brazil
Dear DDJ,
I just read Al Stevens's September "C Programming" column and I think that he is giving the ANSI C committee a bad rap. I think the problem is with his code.
Al seems to have been using a C compiler which made some peculiar decisions about how to interpret "preprocessor" lines in macro replacements. The 1978 Kernighan and Ritchie seems to be silent on this subject. But Harbison and Steele (C: A Reference Manual) say explicitly that Al's code should fail: "If a macro expands into something that looks like a preprocessor command, that command will not be recognized as a command by the preprocessor." I have always used Harbison and Steele as gospel when trying to write portable code for pre-ANSI compilers, since they based their book on many different dialects of C. The ANSI spec just codifies this behavior.
If Al wants to create macros which look like #define lines, it should be pretty easy using ANSI C as long as he is willing to create complicated make files. Using a file like
#define POUND #
#define defMacro(macro, replacement) \
    POUND define macro replacement
#include "whatever.h"

and sending the output of the preprocessor to another .h file should give him what he wants.
But I think that the real problem is with his style of coding. I also like to use the C preprocessor for exotic purposes. But after being bitten a number of times by incompatibilities between different C dialects, bugs in the preprocessor, and overflowing internal buffers, I have learned to avoid abusing the preprocessor. I think any C programmer who is producing supposedly portable code ought to follow this rule: "If it looks like it might fail, it probably will on some compiler. Would I rather spend my time studying the ANSI spec and experimenting with my compiler, or would I rather write my own preprocessor and know exactly how it works? (And, if I don't do it now, I will have to rip all this code out and write my own preprocessor when I port it!)"
Alan B. Harper
Oakland, California
Dear DDJ,
I am perplexed by "The Programmer's Soapbox" at the end of Al Stevens's September column. If language is declining, I ain't noticed it. (Is this oxymoronization?) A few points:
William R. Ockert
Carrington, North Dakota
Dear DDJ,
I found Kenneth Roach's article "Using the Real-Time Clock" (June 1991) very informative, but I would like to make some comments.
One issue that disturbs me is that Mr. Roach suggests replacing the system services for getting and setting the system time by directly using the hardware clock. Subverting system services is never a good idea unless there is some overwhelming reason to do so. Mr. Roach gave the reason that he needed an accurate timing mechanism. My suggestion would be to use his own Turbo Pascal Clock() function for this purpose. Otherwise, I fail to see why getting or setting the system time would be a time-critical operation; the less-than-one-millisecond overhead is simply not going to affect the value of time kept by the system, or be perceived by the user, unless done repeatedly in test loops.
My objection to subverting the operating system in this case is that MS-DOS provides the ability to override the clock device driver, so the job can be done in a device-independent way. This device driver is used by DOS for the get/set system time and date services. I have developed clock device drivers that do this both for the real-time clocks commonly found in XTs and for the AT real-time clock. Sadly, the default clock device driver in MS-DOS relies on the value updated by the timer tick interrupt. Few DOS users know that a clock device driver designed for a real-time clock lets you set the hardware clock conveniently with the DOS TIME and DATE commands.
Mr. Roach also noted that the get time service on a LAN was considerably slower than with no LAN installed. I believe this is due to the LAN using a synchronous time base for all connected machines, thus the request is handled via the network. This situation would be desirable when comparing time stamps of network files and other network related activities.
Mr. Roach's use of the AT's periodic interrupt may be a potential problem. The AT BIOS makes use of this interrupt with the event wait service (interrupt 15h, function 86h), which is intended for use by a multitasking operating system.
Robert Mashlan
Norman, Oklahoma
Dear DDJ,
The article entitled "Software Patents" was basically fear-mongering propaganda, and so seemed out of place in the usually placid technical pages of DDJ.
Principally lacking in the article is any recognition of the economic environment in which products compete. For example, patent license fees are almost always royalties. Unless a patent holder is an idiot, he or she has no desire to put a manufacturer out of business, or to force a product to be crippled in the marketplace. In fact, the pressure is on the patent holder to negotiate a reasonable fee, so that a new product can compete successfully with established products, and thus create maximum royalties. Very few patents are so vital that absolutely no marketplace alternatives are possible.
Deceptively absent from the article is the identification of those who are most advantaged by patents: independent individual inventors. Without patent protection, any new idea can be taken and used by those who have the largest staffs of programmers, the largest production, marketing, and sales organizations, and the largest advertising budgets; an individual cannot realistically hope to compete with such organizations other than in small niche markets. With patent protection, the individual has some amount of leverage to restrain or harness large organizations and thus reap the rewards of his or her own efforts. Patent protection can be obtained directly by individuals, for modest fees.
The best handbook available is Patent It Yourself, Second Edition, by David Pressman ($32.95 ppd. from Nolo Press, 950 Parker St., Berkeley, CA 94710; 800-992-6656). Self-patenting is a lot of hard work, but it is probably within the range of any technical person willing and able to put out the effort.
Although the article begins by pointing out that a patent is a grant of monopoly in return for public disclosure, it is embarrassingly silent with respect to the lack of exactly that sort of disclosure in software, and the problems thus caused. It is no accident that one of the oft-mentioned problems in software is that programmers continually "reinvent the wheel." It must be that way: Virtually all of the "good" or economically important software is available only as object code, rather than source.
Because programs are not generally protected by patent, precious source code is kept as a vital trade secret; consequently, any especially good techniques within the source generally remain unavailable to the public forever, instead of just the limited lifetime of a patent. And when the software product eventually dies, any special techniques in it die as well.
Because economically important techniques are not publicly disclosed, ordinary programmers cannot incrementally build upon previous work; most programmers will not even see that work. In contrast, large organizations can afford to disassemble competitive code; the secrets thus revealed are, again, trade secrets, and again unavailable to the general public. Trade secret software techniques are thus available to large organizations, making them even more powerful competitors. (This clearly happened during the early years of DOS.)
Another point the article overlooks is that patent protection encourages the investment in research necessary to develop new ideas. True, some developments are easy, cheap, and obvious. But others may involve extensive library research, theoretical development, and experimental trial-and-error; such work can be very expensive. If expensive results are not protected, such research will be a poor investment, one which will not be made again. In an unprotected environment it is far more efficient for large companies simply to wait for someone else to come up with an idea, then steal it. Patents restrain this, and are thus a tool to help recover research expenses (although most patents do not earn back their issue fees). Failure to recover such expenses means less research. Is that what we want?
When we are young and in school, information is provided for us, and the vehicles of such information are freely available. But as mature individuals we must understand that we are part of a capitalist society, and all of the information we have was found, accumulated, and paid for, through the direct effort of previous generations. One of the ways this was accomplished is by patent protection. Patents have thus been proven in practice to be an important tool for encouraging public disclosure in a society which respects private ownership. Moreover, patents automatically provide economic support only for worthwhile market applications of research and development, a situation which seems far better than the idea of tax-supported research grants under arbitrary and political bureaucratic control.
The natural application of patent concepts to software has the potential for improving the industry for individuals and small businesses, by allowing them to restrain the giants. Naturally, large companies may see this as a threat. Perhaps they will even support "The League for Programming Freedom" to try to keep this threat at bay.
Terry Ritter
Austin, Texas
Copyright © 1991, Dr. Dobb's Journal