LETTERS

Hidden Secrets

Dear DDJ,

The S-CODER algorithm Robert Stout presented in the January 1990 issue of DDJ seems reasonable, but the programs using it (Listings One and Four) don't give adequate protection against the Kerckhoffs superposition. This 19th-century technique is mentioned in the cryptographic hobby literature (perhaps in Secret and Urgent by Fletcher Pratt or The Codebreakers by David Kahn). As one might expect given its age, the method is not very computationally intensive.

Consider Listing One. Suppose NF files are encrypted with the same key. Since cryptext[] does not depend on the plaintext, all the files are encrypted by XORing with the same byte stream X[1], X[2], ... If P[i,j] is the ith byte in the jth plaintext file and C[i,j] is the ith byte in the jth ciphertext file, we have

C[i,j] = P[i,j] XOR X[i]

If we form the set of bytes C[i,j], j = 1, ..., NF, for some fixed i, all these bytes are encrypted by XORing with the same byte. We can readily find the value of X[i] whose decryption of this set of bytes most resembles plaintext. For example, if we assume ASCII plaintext, the space character will ordinarily be overwhelmingly the most common, and X[i] = (most common byte in the set C[i,j], j = 1, ..., NF) XOR <space> will be right most of the time. We can do this for each value of i separately, without worrying about the recursion relating the X[i]s.
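To make the attack concrete, here is a minimal sketch in C of the key-stream recovery step. The array dimensions, the names cipher[], keystream[], and recover_keystream(), and the assumption that the ASCII space character dominates the plaintext are illustrative choices of mine, not taken from Mr. Stout's listings; the ciphertext files are presumed to be loaded into memory already.

#define NF   200                 /* files encrypted with the same key (illustrative) */
#define LEN  512                 /* byte positions to attack (illustrative)          */

unsigned char cipher[NF][LEN];   /* C[i,j]: ith byte of the jth ciphertext file      */
unsigned char keystream[LEN];    /* recovered X[i]                                   */

void recover_keystream(void)
{
    int i, j, b, best;
    long count[256];

    for (i = 0; i < LEN; i++) {
        for (b = 0; b < 256; b++)
            count[b] = 0;

        /* Tally the ith ciphertext byte of every file. */
        for (j = 0; j < NF; j++)
            count[cipher[j][i]]++;

        /* The most common ciphertext byte is usually <space> XOR X[i]. */
        best = 0;
        for (b = 1; b < 256; b++)
            if (count[b] > count[best])
                best = b;

        keystream[i] = (unsigned char)(best ^ ' ');
    }
}

Each position i is attacked independently, exactly as described above; no knowledge of the recursion that generates the X[i]s is needed.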

For Listing Four, the first thing to observe is that the transposition is keyed with a sequence of random numbers whose seed is in the cryptanalyst's possession. He can thus remove the transposition at will. The other problem is that crypt_ptr is initialized to (file length) mod (key length). The file length is not concealed from the analyst, however, and the key length might well be less than 20. With several hundred files encrypted with the same key in his possession, the analyst can assume a key length and then group together sets of files whose lengths are congruent modulo that key length. He can then still use the superposition method on these sets of files (see the sketch below). If he has assumed the wrong key length, he will quickly notice that the sets of bytes C[i,j], j = 1, ..., NF have frequency distributions that are too smooth.
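The grouping step is simple enough to sketch as well. Here NFILES, file_len[], and group_of[] are hypothetical names of mine (the lengths would in practice come from the captured files), and the trial key length is supplied by the analyst.

#define NFILES 400               /* captured ciphertext files (illustrative) */

long file_len[NFILES];           /* known file lengths, filled in elsewhere  */
int  group_of[NFILES];           /* residue class = length mod trial key len */

/* Files in the same residue class start with the same crypt_ptr value,
   so superposition can be applied within each class separately.        */
void group_by_residue(int trial_key_len)
{
    int j;

    for (j = 0; j < NFILES; j++)
        group_of[j] = (int)(file_len[j] % trial_key_len);
}

If the trial key length is wrong, the byte-frequency distributions within each class come out too smooth, and the analyst simply tries the next length.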

I suggest two changes. First, the random number seed should depend on cryptext[] so the analyst will need to deal with a nontrivial transposition. Second, the initial values of the bytes in cryptext[] should be modified using the sum of many or all of the bytes in the plaintext so that the chances of many files having identical key streams X[i] will be negligible. This sum of bytes can be put openly in the file header along with the length, or some more devious method chosen.
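As a sketch only, the two changes might look something like the following. The function names, the key_len parameter, and the use of the standard library's srand() are my own illustrative choices, not part of the published listings.

#include <stdlib.h>

/* Suggestion 1: derive the transposition seed from cryptext[] itself,
   rather than from a value the analyst already knows.                  */
void seed_transposition(const unsigned char *cryptext, int key_len)
{
    unsigned seed = 0;
    int i;

    for (i = 0; i < key_len; i++)
        seed = seed * 31 + cryptext[i];
    srand(seed);
}

/* Suggestion 2: sum of all the plaintext bytes, stored openly in the
   file header along with the length (or concealed more deviously)...   */
unsigned long sum_plaintext(const unsigned char *buf, long n)
{
    unsigned long sum = 0;

    while (n-- > 0)
        sum += *buf++;
    return sum;
}

/* ...and folded into the initial key bytes so that two files encrypted
   under the same key no longer share the same key stream X[i].         */
void perturb_key(unsigned char *cryptext, int key_len, unsigned long sum)
{
    int i;

    for (i = 0; i < key_len; i++)
        cryptext[i] ^= (unsigned char)(sum >> ((i % 4) * 8));
}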

  Stewart Strait
  San Diego, Calif.

Bob responds: First of all, thank you for your comments. As suggested in the article, S-CODER isn't necessarily an end in itself, but an embeddable encryption engine. The final listing in the article begins to suggest ways to create more secure applications, although the example I used is subject to fairly straightforward cryptanalysis. This, incidentally, was deliberate, leaving open the option of embedding S-CODER in a more secure wrapper as a challenge to critics.

Aside from the principle that S-CODER should be considered more of a building block than anything else, the two primary points I'd like to stress are that:

    1. The unembellished S-CODER engine was not suggested as being particularly secure when faced with known-plaintext analysis, but rather is more suitable for short-term security or unknown-plaintext applications. In this sort of application, the only requirement would be that the key be longer than the longest item of plaintext that might be known by implication (for example, words like "the," a company name, etc.);

    2. A truism of data security is that a theoretically secure method known to everyone (e.g., DES) is often less desirable than a theoretically less secure method (e.g., an enhanced algorithm using an embedded S-CODER engine) which is not known. S-CODER buried within an unknown block transposition cipher as suggested in the article, or something equally (or even more) obscure, presents the cryptanalyst with the immediate problem of classifying the algorithm before being able to break it. Knowing only that S-CODER is buried in there somewhere is of little help, since S-CODER is a stream cipher and any enhanced application will randomize the serial dependencies of the S-CODER encryption.

Addressing your specific objections: cryptext[] is only independent of the plaintext during the first pass through it. The ability (suggestion, actually) to set crypt_ptr at some arbitrary starting point within cryptext[] makes the kind of analysis you suggest much more difficult. If we assume that the initial value of crypt_ptr is determined by some unknown feature of the plaintext, the relationship between i and j in your example will be randomized.

On your second point, you're absolutely correct that the implementation in Listing Four is flawed in that the information used to provide the additional security is openly available to the cryptanalyst. Any conceivable method used to conceal the file length would remove this objection. Once this vital piece of information is concealed, the next step is to modify the actual function within which S-CODER is embedded. Ideally it should be fast, as is the block transposition cipher shown in the listing, but the exact method should be known only to the implementor. This is what I meant above when I said that an unknown method may offer more security than a better, yet known, method.

The real value of Listing Four is in showing that such methods needn't sacrifice any significant performance over the raw S-CODER algorithm to offer significant levels of security. One of the purposes of the article was to encourage people to think about how to write practical software rather than worry so much about theory. I welcome your contribution and will consider your suggestions in future applications.

Fighting Software Patents

Dear DDJ,

Recently your magazine lightly touched on the subject of software patents. In the April 1990 issue of Technology Review (Building W59, MIT, Cambridge, MA 02139), Brian Kahin presents a frightening view on the subject. Even if part of what he envisions comes to pass, software as an entrepreneurial business could be destroyed. I recommend this article to all who work in the art and science of software.

I do not understand why we have put up with this kind of silliness. An aroused software community (users and authors) using the networks (USENET, BITNET, CompuServe), bulletin boards (both local and national, like yours), and their personal word processors can make a Very Loud Noise. These communications resources can be used for information exchange and coordination. If everyone then processes one letter a week to some politician somewhere, the very idea of software patents can be quickly forced out.

DDJ is a technical magazine. The whole idea of software patents seems to have arisen from technical naivete in legal and political circles. We need to have solid technical, intellectual, and philosophical arguments against software patents circulated amongst ourselves and in any circles that are affected. I urge DDJ readers to carry out this idea and start a word processing campaign.

Jim Iwerks

Rio Rancho, New Mexico

Pick-A-Fight Interfaces

Dear DDJ,

I am surprised that someone with as much computer experience as Bob Canup would write such a one-sided and narrow-minded article as "Pick-A-Number Interfaces," which appeared in the February 1990 issue of DDJ. Mr. Canup's entire argument seems to be based upon a poorly written mailing list program that he had had experience with. A properly written application-level interface should be intuitive enough for any new user to feel comfortable with and should also assist the experienced user in getting the job done as quickly as possible. The ultimate goal in assisting a new user is to make the application as "real world" as possible. An example of this would be to use the letter "B" as the backup option. The relationship between the letter "B" and the word "backup" is simple and intuitive and lends itself to concepts the user already understands. Bob proposes forcing the user to learn a new and unfounded relationship between the number six and "backup."

Webster defines the word "friend" as "A person who one knows well and is fond of," and this is exactly the type of response that a programmer wants to achieve with a user interface. Modern UI techniques help users feel like they are working with objects they know and understand. A piece of paper can be represented by a window; just as someone may lay another form down on their desk, they can lay down a new window on the screen. A windowed interface can also shield the user from absorbing information they do not presently need and at the same time graphically represent their "position" within the application.

Although pointing devices such as a mouse can be a hindrance to the experienced user, they are often helpful to the new user by allowing them to operate an application without having to know a predefined set of relationships between the application and the keyboard. A mouse user can simply click the pointer on the word "backup." It may take slightly more time to execute the command; however, the knowledge required to select an option is reduced to understanding the operation of the pointing device. As users become more proficient, they may begin, at their own pace, to use the keyboard correlation commands. On the other hand, the keyboard user must not only know how to operate the keyboard but must also learn the relationships between keys and options in the application before they can operate it at all.

Modern interface techniques provide a much more supportive interface for the user. Mr. Canup assumes that the use of these tools is the problem, when really it is their misuse. I may be able to saw a piece of wood in half using a power drill; although there are much more effective means, does this mean that the power drill is a useless frill? Obviously not! If used properly, a power drill can save a great deal of time in a project. I am certainly glad that Mr. Canup is not the guiding light behind innovative and forward-thinking user interface design.

David Bennett

CompuServe: 74635,1671

Standards Revisited

Dear DDJ,

Since you gave Tex Ritter, P.E., a full page, I hope you will print a few words of mine. Tex Ritter, P.E., takes a couple of thousand words to rail against the standards process ("Letters," DDJ, February 1990), but reveals that he has forgotten those portions of the scientific method he learned.

He didn't bother to learn that the "draft standard" is not an early, highly changeable document but the next-to-last step in acceptance and is nearly frozen in place with people using it as a basis for software.

During the 15-day response period in June '89 he spent many hours responding to everyone's comments, and he considered the response he received in September "appallingly condescending." Again he didn't do his research. At the stage where he did all that work, he only had the right to respond to his own previous comments. Other people merely complain that it takes too long to search for the changes that caused their own previous comments.

His greatest complaint seems to be that he wanted a lot of features that violate the Pascal philosophy of rigidity and easy teaching/grading (which is why I don't like Pascal) and he wanted strong engineering math types. In the latter case he should be using Fortran, except he likes the lazy ease-of-use of never-standard Turbo Pascal.

After stating that voting is not the way to select a standard, he decries the "back room maneuvering" that put together the standard. He is ignoring the hundreds of hours put in to create a consensus document that could be voted on. It isn't democracy, it is consensus building among people who are scattered across 3.6 million square miles. Of course, Ritter would probably object if the meetings were held anywhere but at his Austin home.

All I know about the standards process is the descriptive articles that have appeared in Dr. Dobb's, Unix Review, and C User's Journal. Obviously, I know more about the process than Mr. Ritter, who tried to participate but effectively entered the pool with a belly flop. I hope that he will research his next project (and his P.E. work) in a more professional way, so he does not waste his time.

Mike Firth

Dallas, Texas

Great Minds...

Dear DDJ,

It is most interesting that Michael Swaine has created an article on the War on Bugs as a pun on the national War on Drugs ("Swaine's Flames," DDJ February 1990). I enjoyed the sarcastic humor and appreciate the work that went into making that article fly.

About a year ago, I too noticed that bugs and drugs had many similar qualities (for example, their tenacity and their ability to break down morale), and decided to include the following in the origin line on my FidoNet BBS:

*Origin: Programmer's Oasis / 919-226-6984 -- Say NO to Bugs! (1:151/402.0)

Strangely enough, that is the only pun that was not utilized in the article! Oh well. Thanks for an entertaining column in a highly enjoyable magazine.

Chris Laforet

Graham, North Carolina

Ritter Redux

Dear DDJ,

After reading Tom Turba's response (April 1990 "Letters") to my published letter (February 1990), I have the feeling that he missed his true calling: He has a great future in politics. Although the individual facts he cites may well be true, the impression he leaves is deceptive. Essentially he says: "We have rules which protect your interests; if you did not take advantage of them, it is your own fault." But it is one thing to leave the doors open to participation, and quite another to announce where the doors are, and what is on the agenda. A casual half-hour with my old programming magazines turned up no fewer than eleven articles over the past two years on or about ANSI C standardization issues; in contrast, I found no articles on ANSI Extended Pascal. It appears that Pascal users have had no notification whatsoever of the Extended Pascal effort.

Tom says that he invites our "participation," but there is participation -- where one may have the chance to speak, only to be ignored -- and PARTICIPATION -- where one's concerns are satisfactorily addressed. I obviously did participate, and certainly did not approve of the Extended Pascal specification, and yet Tom tells us that consensus and even unanimity was reached. Such participation reminds me of the communist party -- all comrades are equal, but some are on the central committee. I guess if you don't physically walk through the doors you must not be participating, and maybe if you do, you get to be a "visitor" or "observer." Clearly, this scheme was not designed to increase participation.

Apparently Tom has now gotten a clue: A possible future project (object-oriented extensions) may be announced to the user community. Great, Tom, but the problem is the current specification, not extensions to it. As I see it, Extended Pascal was developed without significant involvement from the user community, so the process, as arduous as it may have been, was fatally flawed. The resulting "standard" is invalid. It is time to throw this one away, and start on one we can all accept.

I understand that those may be chilling words to some people. To the individuals who have participated in the many long debates which may have taken place, this might seem like the destruction of all their work. Yet if persuasive arguments exist for the various decisions, they should again prevail; all that would be lost are those political victories which were not technically warranted. To software companies who see standardization as a marketing tool, building consensus within the user community must seem like a costly and unnecessary delay, but what good is a standard if most users reject it? As important as these views are, they pale beside their effects. For example, high school students are being subjected to "Standard Pascal," because they will eventually take placement exams which use this dialect. Without getting into whether a likely pencil-and-paper exam can have any bearing on the interactive use of a language within a fast development environment, it is clear that even a marketplace-rejected standard can make its obnoxious presence known.

Tom simply failed to address the biggest problem with the Extended Pascal specification: It is not really a specification at all. Look at it this way: Suppose you are a programmer in a small company hoping to write portable programs for several different platforms. After buying or licensing a "standard" compiler for each, your "portable" programs do not compile on one or the other. Now, given the Extended Pascal specification, will you be able to decide which compiler is at fault?

Pascal is a STRUCTURED programming language. In contrast, the Extended Pascal specification is "spaghetti code" throughout. In the version I bought, terms were poorly defined, making the specification inherently ambiguous. The spec did not stand alone, but appeared to rest on previous work, perhaps upon unnamed compilers on unnamed machines. The sheer complexity of it led me to question whether The Committee had ever dealt with an actual complex Pascal program, for if they had, they would have been used to partitioning complexity into understandable chunks. The specification is a turkey, not just because it does not have the right things in it, but because it is virtually impossible to deeply understand what is specified, and therefore impossible to understand the ramifications of the various features in a final product.

But you don't have to take MY word for it: Tom invites your participation, which is great! Take him up on it! If you think my comments are mostly sour grapes, well, just get Tom to send you copies of all the comments and responses for the past two rounds, and make your own decision! If you think that the Extended Pascal specification cannot possibly be as bad as I say, then Hey! Get Tom to send you a copy and read it! If the spec looks fine to you, tell me I'm all wet. If you resent the idea that major standardization decisions have gone on without widely available printed discussions on the issues, let Tom know, let ANSI and ISO know, and also let your compiler vendor know.

Terry Ritter, P.E.

Austin, Texas

Rhealstone Suggestions

Dear DDJ,

I have just finished reading through the April 1990 issue of Dr. Dobb's and was very pleased to see that you had an article with an actual implementation of the Rhealstone benchmark. May I suggest a couple of additions to the Rhealstone benchmark:

    1. Intertask message latency between tasks that are on different nodes of a network. This metric is important to developers who work on networked real-time control system applications.

    2. Message/datagram throughput between tasks as a function of message size. This should be measured between two tasks that are on the same node and also between tasks that are on different nodes.
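For the second metric, here is a rough sketch of the shape such a measurement might take. It uses a Unix socketpair between two processes on a general-purpose system rather than tasks under a real-time kernel, and nothing in it comes from the Rhealstone definition; the message sizes, count, and API choices are purely illustrative.

/* Measure round-trip message rate as a function of message size. */
#include <stdio.h>
#include <unistd.h>
#include <sys/socket.h>
#include <sys/time.h>
#include <sys/wait.h>

#define NMSG 10000

static double now(void)
{
    struct timeval tv;
    gettimeofday(&tv, NULL);
    return tv.tv_sec + tv.tv_usec / 1e6;
}

int main(void)
{
    int sizes[] = { 16, 64, 256, 1024, 4096 };
    char buf[4096];
    int s;

    for (s = 0; s < 5; s++) {
        int size = sizes[s];
        int fd[2];

        if (socketpair(AF_UNIX, SOCK_STREAM, 0, fd) < 0) {
            perror("socketpair");
            return 1;
        }

        if (fork() == 0) {                  /* child: echo each message back */
            int i, n;
            for (i = 0; i < NMSG; i++) {
                for (n = 0; n < size; )
                    n += read(fd[1], buf + n, size - n);
                write(fd[1], buf, size);
            }
            _exit(0);
        } else {                            /* parent: time the round trips  */
            int i, n;
            double t0 = now(), t1;
            for (i = 0; i < NMSG; i++) {
                write(fd[0], buf, size);
                for (n = 0; n < size; )
                    n += read(fd[0], buf + n, size - n);
            }
            t1 = now();
            wait(NULL);
            printf("%5d bytes: %.0f messages/sec\n",
                   size, 2.0 * NMSG / (t1 - t0));
        }
        close(fd[0]);
        close(fd[1]);
    }
    return 0;
}

The same structure, with the socketpair replaced by the RTOS's own message primitives (or by a network connection between nodes), would give the curve of throughput versus message size suggested above.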

Nick Busigin

Stratford, Ontario

Canada

Animation Algorithm Update

Dear DDJ,

After reading the letter sent by Peder Jungck ("Letters," April 1990), I felt that some clarification was necessary. In his letter, Mr. Jungck stated that a faster algorithm would be to separate the pixel data into logical planes before the data is moved to the EGA's physical display planes. Mr. Jungck was apparently assuming that my sample program translated the initial one byte/pixel format on every frame. That initial format is there to maintain rough compatibility with some future VGA sprite driver, and the translation step is run only once. The second part uses the very format Peder suggested, and it is that structure which is used on every screen refresh. Although I'm sure Mr. Jungck's program is a peach, it does not use an algorithm different from the one I provided.

Rahner James

Sacramento, Calif.