Last year, Syncronys Softcorp was the darling of the fledgling Windows 95 software industry. Its SoftRAM 95 program had received glowing reviews, was selling well, and Syncronys stock was a hot buy. But all that was before it became widely known that SoftRAM 95 didn't do what Syncronys claimed.
A number of software programs use compression techniques to improve the performance of Windows' virtual-memory system. SoftRAM 95 is one that does not. Except for a few minor fudges, the software's most significant feature was a pretty user interface that provided misleading information about what SoftRAM 95 was doing.
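The principle behind the legitimate products is simple: pages of memory that would otherwise be swapped to disk are compressed and kept in a reserved area of RAM, trading a little CPU time for much slower disk I/O. A minimal sketch of the idea, using Python's zlib purely as a stand-in for the compressors such utilities used (the function names here are illustrative, not from any actual product):

```python
import zlib

PAGE_SIZE = 4096  # typical size of a virtual-memory page, in bytes

def compress_page(page: bytes) -> bytes:
    """Compress a memory page. A RAM-compression utility keeps the
    result in an in-RAM cache instead of writing the full page to
    the swap file on disk."""
    return zlib.compress(page)

def decompress_page(blob: bytes) -> bytes:
    """Restore a cached page when the program touches it again."""
    return zlib.decompress(blob)

# In-memory data is often highly redundant, so a page frequently
# shrinks to a fraction of its size -- and a disk access is avoided
# whenever the compressed copy fits in the reserved cache.
page = (b"struct widget { int id; char name[32]; };\n" * 100)[:PAGE_SIZE]
blob = compress_page(page)
assert decompress_page(blob) == page
print(f"{PAGE_SIZE} bytes compressed to {len(blob)} bytes")
```

The benefit depends entirely on the data actually shrinking and on the cache being consulted before the disk — which is precisely what SoftRAM 95's disassemblers found it never did.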
Many news reports made it clear that Syncronys was misleading its customers. What's less apparent is how an empty shell garnered good reviews from two independent testing organizations and a number of magazines. One of the independent tests was performed at the request of Syncronys. The testing company in question later "clarified" its initial report of SoftRAM's efficacy by pointing out that it was only asked to perform cursory testing under conditions carefully specified by Syncronys.
The second test was performed to qualify SoftRAM 95 for Microsoft's coveted "Windows 95" certification. Microsoft's reaction to the subsequent scandal was somewhat contradictory. On the one hand, Microsoft stated that the Windows 95 logo is not evidence that a product actually does what it claims. On the other hand, Microsoft quickly removed SoftRAM 95 from the list of certified products. Perhaps Microsoft should clarify exactly what the Windows 95 logo does and does not mean, both for software purchasers and for software developers.
The glowing magazine reviews are a symptom of other problems. Done right, reviewing software is a tough job. Within any reasonable budget and schedule, it's difficult to test software thoroughly under a wide variety of circumstances, especially in early reviews of highly technical products such as memory-management utilities. Even if a program appears to do nothing in a test, it's conceivable that the reviewer misinstalled, misused, or simply misunderstood the product. As a result, reviewers tend to give products the benefit of the doubt and overlook many problems.
Magazines that publish negative reviews often end up with big-time headaches. The popular German computer magazine c't was taken to court when it published an early article expressing doubt about SoftRAM 95. A software-distribution company obtained a temporary restraining order preventing c't from using the headline "Placebo Software?" in regard to SoftRAM 95. The judge granted the order because c't had not provided conclusive evidence of its claims.
No doubt, the early reviews of SoftRAM 95 relied primarily on the software's own control panel to determine whether the product was working. Less technical reviewers who failed to fully understand the claims being made can almost be forgiven for their oversight. Conclusive proof of the product's ineffectiveness had to wait until PC Magazine developed a torture test for RAM-compression tools and a number of independent investigators (including c't, Mark Russinovich, Andrew Schulman, and Bryce Cogswell) disassembled the code to find out exactly what SoftRAM 95 did and didn't do.
Ultimately, the blame for the affair must be assigned to Syncronys, and it has received the brunt of the criticism. It has recalled existing copies of SoftRAM 95, promised free "bug fix" updates to existing customers, and is facing class-action lawsuits and investigations by federal and state agencies. Still, many companies supported Syncronys' claims. Inadvertent or not, that support raises questions about independent testing, reviewing, and certification programs.
Tim Kientzle
technical editor