Implementing Games for Windows

Using the WinG API and the WaveMix DLL

James Finnegan

James is a developer specializing in operating-systems internals. He can be reached at P.O. Box 436, Falmouth, MA 02541 or via Internet at FINNEGANJ@delphi.com. Reprinted courtesy of Microsoft Systems Journal (C)1995 Miller Freeman.


At first glance, it seems that Windows, with its graphical nature, device independence, access to boatloads of memory, and various levels of multitasking, would be a perfect environment for games. But game developers have not been flocking to Windows. To improve Windows' video and sound performance, Microsoft released two APIs: WinG and WaveMix, both of which are available for free on CompuServe (GO WINMM LIB 10) and the Internet (ftp.microsoft.com). WinG provides high-performance, device-independent graphics capabilities in the form of DLLs and a GDI device driver. WaveMix is a DLL that lets you mix sound files or resources into a single sound output at run time and, optionally, hook sounds up to events within your application.

In this article, I'll review some traditional PC game-animation techniques and show how WinG can be used to implement them in Windows. I'll also use both WinG and WaveMix in an application called "WinGTest," which should give you enough of a foundation to start working with these APIs on your own. You'll see that together, these APIs let you develop powerful games and other multimedia apps.

Game Animation

Most game animation implemented on PCs and video games is cast-based, whereby a game player actively manipulates movable screen objects (members of the cast). This differs from frame-based animation, as in a Video for Windows AVI clip, where precomposed full-screen images are played back in sequence.

The movable screen objects in cast-based animations, commonly referred to as "sprites," are bitmapped images that are usually animated against a background image to add realism. Any developer who has tried to implement this type of animation knows that getting visually acceptable results (smooth, flicker-free sprite movement) requires quite a bit of work.

On dedicated home and coin-operated video games (and some computers), specialized hardware is used to implement sprite animation. This makes the programmer's job easier, since less knowledge of a particular animation technique is required to move an object from place to place.

On PCs, specialized sprite-animation hardware is generally not at your disposal. You therefore have to roll your own routines to continually place an object on the screen, remove it, and place it at a new location. Under DOS, direct access to the video adapter's memory and access to the adapter's controller registers gave you explicit control over various attributes and operations of the adapter (such as its palette of colors), enough control to implement your own animation routines.

With direct access to the pixels that make up the screen image, you had to build routines that manipulated these pixels to implement sprite animation. These routines were necessary to "hide" the removal and replacement of an animated sprite, which would otherwise create flicker. Typically, one of three different animation techniques was used: XORing, page flipping, and double buffering.

XOR animation involves writing a sprite directly to the video adapter's memory by XORing the source (the sprite) and the destination (the memory that makes up the screen). Removing the sprite is simply a matter of XORing the same image in the same place. Doing this successively gives you sprite animation. Since the video adapter's memory is being accessed directly, the on-screen results are virtually instantaneous. However, this method only works against a solid background, since the XORing alters, rather than hides, any background scene that the sprite is placed over.
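To make the XOR technique concrete, here is a minimal sketch of my own (not a WinG call): it XORs an 8-bit sprite into a packed-pixel frame buffer. The buffer pointer, pitch, and sprite layout are assumptions made for the example.

/* Sketch of XOR sprite animation over a packed 8-bit frame buffer.
   screen, pitch, and the sprite layout are illustrative assumptions. */
void XorSprite(unsigned char *screen, int pitch,
               const unsigned char *sprite, int w, int h, int x, int y)
{
    int row, col;
    for (row = 0; row < h; row++)
        for (col = 0; col < w; col++)
            screen[(y + row) * pitch + (x + col)] ^= sprite[row * w + col];
}
/* XORing the same sprite at the same spot a second time restores whatever
   was there, which is why this only looks right over a solid background. */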

Page flipping involves dividing the video adapter's memory into two or more "pages," where a page represents an entire screenful of graphics. While one page is being displayed, another page is being constructed. When the page is ready or when a certain amount of time has elapsed, the visible and hidden pages are swapped by altering the graphics base address of the video adapter. Since the screen is continually being refreshed, changing the memory address results in the display changing.

The third technique is double-buffering, in which an application-managed buffer is used to construct the screen image. When the buffer is complete, the image (or a portion of it) is copied to the video adapter's memory (which directly defines the screen image), updating the display accordingly. This is conceptually similar to page flipping, except the performance is somewhat different, since memory is being copied en masse to the display hardware.
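In outline, a double-buffered frame looks something like the sketch below. CopyToDisplay is a placeholder of mine for whatever moves the finished buffer to the adapter (a copy to the A000h frame under DOS, or WinGBitBlt as described later).

#define SCREEN_W 320
#define SCREEN_H 200

static unsigned char BackBuffer[SCREEN_W * SCREEN_H];

/* Placeholder: copies the finished buffer to the display hardware. */
extern void CopyToDisplay(const unsigned char *src, unsigned long bytes);

void RenderFrame(void)
{
    /* redraw the background and draw each sprite into BackBuffer here */
    CopyToDisplay(BackBuffer, (unsigned long)sizeof(BackBuffer));
}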

All of this direct hardware access trades device independence for raw performance. Although this was acceptable under DOS, it isn't under Windows, where device independence, rather than specific hardware access, yields broad compatibility.

Memory Woes

As if direct hardware access isn't bad enough, games have another hurdle to overcome. Graphics of any kind usually hog memory. Bitmapped images tend to be large, and the recent demands for increased screen resolution, combined with increased game complexity, don't help the situation much (particularly when dealing with DOS). People have traditionally addressed the problem of increased memory demands with DOS extenders.

Windows has gained acceptance as a replacement platform for DOS for business applications. Unfortunately, this has not been the case with games.

The two key concerns facing PC game developers are providing smooth, fast animation and getting access to a lot of memory. Memory management is easy in Windows. It is the graphics performance--particularly when trying to implement animation routines--that is the problem.

Even though Windows offers high-level graphics primitives through GDI, it is not well suited for high-performance animation. This is largely because of Windows' device independence. To support different hardware in a consistent manner, you need device drivers. This means two things: First, the device driver has to do something to convert a generic function into something device specific. That extra code takes time to execute. Second, your application's performance is at the mercy of those device drivers. Even though some device drivers stink, their poor performance often goes unnoticed on machines running only business applications.

For animation, particularly with games, extra code and poor performance are enemies. Combine that with Windows' lack of flexibility in drawing to display contexts (you can only use GDI functions), and Windows seems like a bad choice for games.

Even if GDI and its device drivers performed exceedingly well, the functions Windows offers are not always appropriate for all types of graphics. For instance, GDI does no 3-D transformations. Nor does it do anything specific for animation. Good or bad, the direct memory access that DOS provided to developers yielded a huge number of software-based graphics solutions.

DIBs versus DDBs

Windows 3.0 introduced the device-independent bitmap (DIB) to address bitmap portability issues. A DIB defines a bitmap's dimensions, colors, and pixels in a single structure. Since the characteristics of a DIB are self-contained, rendering it on different devices usually yields visually comparable results. In addition, you have complete access to the entire DIB, which means that you can fool with anything, including the pixels that make up its bitmap, at will.
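Because a DIB packs the header, color table, and pixel bits into one block of memory, finding the pixels is simple arithmetic. The helper below is my own sketch for 8-bit, uncompressed (BI_RGB) DIBs; the WinG sample UTILS.C provides equivalent DibXxx helpers.

#include <windows.h>

/* Return a pointer to the pixel bits of an 8-bit BI_RGB DIB, given a pointer
   to its BITMAPINFOHEADER (layout: header, then color table, then bits). */
LPBYTE MyDibBits(BITMAPINFOHEADER FAR *pbih)
{
    DWORD dwColors = pbih->biClrUsed ? pbih->biClrUsed
                                     : (DWORD)1 << pbih->biBitCount;
    return (LPBYTE)pbih + pbih->biSize + dwColors * sizeof(RGBQUAD);
}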

The GDI API, however, deals with device-dependent bitmaps (DDBs) that are represented by a device context (DC), which GDI uses to do most of its manipulations. This means anything you want to do to a DDB must be done through GDI. This is primarily because the DC is allocated and maintained by the output device's device driver. The memory used to define an image, particularly with video and printer drivers, may be physically inaccessible to your application. In addition, particularly with video drivers, the image may not be stored contiguously, or in a format that you can determine. For example, some video adapters divide up their memory into bit planes, where the bits that make up the pixels are divided into individual color values and stored separately. This is done for quicker access within the physical memory frame that the video adapter uses. Other video adapters use a packed-pixel format (where pixels are stored linearly, much like a DIB). Manipulating the DDB without the aid of GDI (really the device driver) is not possible.

In short, you have the DIB, which you can fool with any way you want, and the DDB, which Windows manipulates. There is little "glue" in between. For instance, there are GDI functions that move a DIB to a DC; however, they perform poorly and inconsistently across some device drivers. In addition, there is no way to call GDI functions, such as Ellipse or Polygon, to manipulate the DIB at a higher level.

To alleviate the latter problem, Windows 3.1 ships with DIB.DRV, a GDI device driver with no associated output device. To GDI, DIB.DRV looks just like another output device. This driver allows you to allocate a DIB and create a memory DC to go with it, so you can manipulate the DIB's bits directly while still drawing on it with GDI function calls.

Although DIB.DRV is useful, it does not help move a DIB to the display easily and quickly. You may think that since DIB.DRV allows you to associate a DC to a DIB, you could call GDI's BitBlt to move the bitmap from one DC to the other, but the actual BitBlt function is implemented in the device driver, not GDI. Thus, the video driver's BitBlt can only copy from DCs that it knows; DIB.DRV's DCs are not among them. You are then left to contend with StretchDIBits. If StretchDIBits is not implemented in the device driver, GDI will fake it by calling SetBitmapBits and StretchBlt. Needless to say, this results in inconsistent performance across different hardware.

Enter WinG

What you really need is DIB flexibility (with DIB.DRV-like functionality as a bonus) combined with the speed of a DDB BitBlt. WinG ("G" for games) delivers that and more. The WinG toolkit gives you access to a DIB, offering DOS-like double-buffering flexibility in a device-independent fashion. Using WinG, your app copies the DIB to the screen so quickly that you get DOS-like performance on most hardware.

WinG offers a number of advantages over programming in DOS. First, Windows offers all of the memory benefits of a DOS extender and more. Second, video-device independence lets you consistently access resolutions higher than those under DOS.

DOS games usually have to take a lowest-common-denominator approach to video graphics, so most DOS games rarely go beyond VGA's mode X (the "undocumented" 320x240, 256-color mode used in many games). To fill all of that new-found screen real estate, WinG offers fast Blt stretching.

WinG gives you direct-access performance by utilizing the best path to your hardware. To determine which path will be used, WinG analyzes your PC upon installation. If WinG recognizes your hardware (as a video chipset from one of about eight different manufacturers, such as Tseng Labs and Western Digital), it will obtain a pointer directly to the video graphics memory (traditionally at A000), which it will write to. WinG obtains this pointer by employing DVA.386, the VFlatD device, which creates a selector to this memory address. In the future, this interface will be replaced by the Display Control Interface (DCI), which is designed to supply a consistent method of obtaining the video memory pointer (among other things) for use by APIs like WinG.

WinGBitBlt and WinGStretchBlt exist only to get your bits from the DIB to the screen. In the absence of a known video card, the WinG profiler times the various ways to get bitmaps of different sizes to the screen. It determines whether a top-down or bottom-up DIB rendering is better. For each case, WinG will use the fastest combination of GDI functions and driver calls. On some cards this might involve using direct video access (DVA.386); on others, StretchDIBits is optimized for pipelined data transfer to the video card. WinG doesn't care; the fastest road is the right one. These results, along with the current video driver's name and version number, are stored in your Windows configuration information (WIN.INI under Windows 3.1). This performance analysis is done at installation time and is only performed again if the video driver or its version changes.

The WinG API

Fortunately, the actual run-time configuration is largely hidden in a small, device-independent API. This API is conceptually similar to DIB.DRV, but unlike DIB.DRV, it includes a high-level BitBlt routine to quickly copy DIBs to a given display DC. This optimized BitBlt function, called WinGBitBlt, is the core of WinG.

The WinG API also includes two functions for implementing a halftone palette. This type of palette selects a set of colors that will emulate 24-bit true color on an 8-bit, 256-color device. See Appendix A of "Writing HOT Games for Microsoft Windows," included on the FTP server in GAMESUM.ZIP, for a detailed description of each API call.

Despite all these features (and its name), WinG isn't a high-level gaming API. Things such as bitmap animation or collision detection are not part of WinG. This keeps WinG (much like DOS) from being tied to a particular animation method. For instance, many games are 2-D, while others (like Atari's Marble Madness) are isometric (2-D with a 3-D-like background--a form of fake 3-D), while still others (like id's DOOM) are true 3-D, with translation and scaling of objects in a 3-D scene. Since many of these types of routines have already been developed for DOS applications, porting these techniques to WinG is relatively easy.

Unfortunately, WinG lacks some routines that just about all games need. For instance, the routine to copy a sprite with a transparent color (since bitmaps are rectangular by definition, a transparent color allows you to display arbitrarily shaped sprites) is not part of WinG. DIB-oriented manipulation functions are also absent. However, many of these routines are included with the WinG sample applications, so you can just cut and paste.
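For reference, a transparent copy can be as simple as the following sketch of mine, which blts an 8-bit sprite into an off-screen DIB buffer while skipping a designated transparent palette index. Clipping and the buffer pointers are left to the caller; the WinG samples include tuned versions of the same idea.

/* Copy a w-by-h 8-bit sprite to (x,y) in the destination buffer, skipping
   pixels equal to the transparent index. No clipping; sketch only. */
void BltTransparent(unsigned char *dest, long destPitch,
                    const unsigned char *sprite, int w, int h,
                    int x, int y, unsigned char transparent)
{
    int row, col;
    for (row = 0; row < h; row++)
        for (col = 0; col < w; col++) {
            unsigned char pixel = sprite[row * w + col];
            if (pixel != transparent)
                dest[(long)(y + row) * destPitch + (x + col)] = pixel;
        }
}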

WinG implements a double-buffering scheme, so techniques such as page flipping cannot be implemented. This is not a big deal, since page flipping and buffering are very similar. Their differences under DOS have to do only with performance issues. Porting existing page-flipping code shouldn't be difficult. Something similar to page flipping will be available (in a device-independent form) in DCI, although this most likely will be hidden behind the WinG API.

Of course, you can't access low-level video registers under WinG either. You really shouldn't need to, since most access has to do with direct palette manipulation, changing memory base addresses, and so on, none of which have to be done with WinG. These limitations are only a concern if you are doing a straight port from DOS. WinG does not include any timer-oriented functions, but these are supplied by other parts of Windows. For instance, the multimedia API supplies many of the preemptive timer functions that you would need to implement games. Also, WinG has no support for sound.

32-Bit Hacking

One important note: WING.DLL and WINGDE.DLL contain highly optimized, 32-bit code. Examination of either of these two DLLs with the EXEHDR utility shows that many of their code segments are 32-bit. This is great for performance, but you may wonder how it is done; if you could do the same in your WinG code for critical functions, performance might improve significantly. The secret is hidden within CMACRO32.INC, included on the MSDN CD and in the WinG-toolkit samples. When you create an assembly function using the cProc macro, the code in Example 1(a) is placed at the beginning of your function.

Sixteen-bit Windows ignores the 32-bit segment flag that EXEHDR sees, so the code segment is loaded as a 16-bit segment. When AX (which holds 8000h) is added to itself, the result overflows 16 bits and the carry flag is set, so execution jumps to the code in Example 1(b). That code sets the segment's descriptor in the LDT to USE32, causing the prologue code in Example 1(a) to be interpreted as Example 1(c).

The LDT hacking code is not called again unless the code segment is discarded and reloaded by Windows. The cEnd macro prefixes the RETF instruction with the operand-size override (66H) byte. You must set the linker option "/NOPACKCODE" for this to work successfully. If the USE32 object file is packed with your 16-bit C or C++ functions, the LDT hacking would mess up your 16-bit code: usually, your application will GP fault after returning from the called 32-bit function.

Getting to Work

Developing a game of any type is fairly complex. Collision sensing, keeping tabs on all those screen images and their states, rotation and translation of sprites, and the like are pretty involved. To keep things simple, I'll present a sprite-animation program called "WinGTest," which demonstrates how to construct WinG DCs, associate bitmaps with those DCs, and shuttle data between DIBs, WinG DCs, and the display DC. The program (available electronically, see "Availability," page 3) allows users to drag a sprite across a background with the mouse, updating the off-screen buffer and ultimately the screen as needed. WinGTest is made up of two modules: WINGTEST.C and UTILS.C, the DIB-manipulation sample code included with the WinG SDK. Reviewing its code should give you enough clues to get started on your own animated game projects.

The first thing to do is pull together your DIBs. In my example, I load three DIBs: one for the window background and two for sprites. I use the DibOpenFile function from UTILS.C. DibOpenFile will load either a disk file or an embedded resource, returning a pointer to the loaded DIB's BITMAPINFOHEADER structure. UTILS.H defines this pointer type as a PDIB, which it uses in other API calls and macros to extract relevant information for you.
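The loads look something like the following fragment; the sprite variable and file names here are placeholders of mine, not the actual resources shipped with WinGTest.

PDIB gpdibSprite1, gpdibSprite2;                     /* illustrative sprite names */

gpdibBackgroundBitmap = DibOpenFile("BACKGND.BMP");  /* placeholder file name */
gpdibSprite1          = DibOpenFile("SPRITE1.BMP");  /* placeholder file name */
gpdibSprite2          = DibOpenFile("SPRITE2.BMP");  /* placeholder file name */

if (!gpdibBackgroundBitmap || !gpdibSprite1 || !gpdibSprite2)
    return FALSE;                                    /* bail out if a load failed */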

The next step is the actual creation of the WinG DIB and its associated DC. Here you determine the best DIB format (top down or bottom up), as well as the identity palette for your application. I have placed this code within the processing of my WM_SIZE message, so I can dynamically resize the WinG DIB accordingly. For the most part, the DIB orientation is not tremendously important; it is there if you need to know it (in case you are implementing your own bitmap-manipulation functions). The WinG API calls hide the bitmap orientation from you.

The first function to call is WinGRecommendDIBFormat; see Example 2(a). This function takes a pointer to a BITMAPINFO structure and fills in the optimal format for BitBlting DIBs to your display's DC. This information assumes that you won't be stretching or using complex clipping regions, and that you will be using an identity palette.

The only interesting bit of information returned is the DIB orientation. The biHeight member of the BITMAPINFOHEADER structure will be 1 if the DIB should be in a bottom-up format; it will be -1 to indicate a top-down format. In the future, the biBitCount member will indicate the bits per pixel for the output device. Keep in mind that, for now, this field will only be 8, since this version of WinG only supports 8-bit, 256-color output devices. For longer-term compatibility and optimal performance, you will want to check this field and deal with it accordingly.
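A typical way to act on the recommended format is sketched below: check biBitCount, note the orientation, and then scale the one-pixel recommendation up to the real buffer size (multiplying biHeight preserves its sign). The gbTopDown flag is my own illustrative addition, not part of WinGTest.

WinGRecommendDIBFormat((BITMAPINFO far *)&WinGDIBHeader.bmiHeader);

if (WinGDIBHeader.bmiHeader.biBitCount != 8)
    return FALSE;                          /* this release of WinG is 8-bit only */

gbTopDown = (WinGDIBHeader.bmiHeader.biHeight < 0);   /* illustrative global flag */

WinGDIBHeader.bmiHeader.biWidth   = giWindowX;        /* real buffer dimensions */
WinGDIBHeader.bmiHeader.biHeight *= giWindowY;        /* keeps the recommended sign */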

The next step is the creation of the identity palette. Windows reserves 20 colors within the 256-color palette for system-wide static colors. These colors include the colors for title bars, push buttons, window frames, and so on. These colors take up the first and last ten entries of the palette. They are placed on either end so each can be XORed with its complement to allow inversion. To be as friendly as possible to other applications, you should leave these 20 colors in place. That leaves 236. If you need more colors, you can get 254 of the 256 (black and white cannot be taken) by calling GDI's SetSystemPaletteUse with SYSPAL_NOSTATIC, although this can make other applications ugly. If you do this, I recommend that you make your games full-screen to hide the hideous screen colors.
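Claiming the extra colors is a one-call affair, sketched below; just remember to put the system palette back the way you found it when your game loses focus or exits.

HDC hDC = GetDC(NULL);
SetSystemPaletteUse(hDC, SYSPAL_NOSTATIC);   /* free all but black and white */
/* ...create, select, and realize your 254-color palette here... */
SetSystemPaletteUse(hDC, SYSPAL_STATIC);     /* restore the 20 static colors */
ReleaseDC(NULL, hDC);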

Once you have a suitable palette within a DIB, simply load the 236 colors into an array of PALETTEENTRY structures; see Example 2(b).

To create the identity palette, each color should be flagged as PC_NOCOLLAPSE (or PC_RESERVED, if you are going to be doing any palette animation). This will keep the palette manager from combining identical colors into one entry. The 20 system colors should also be derived at run time and saved in their appropriate places, since the display driver determines these colors (they are not fixed across all platforms); see Example 2(c). Once all 256 colors are in place, the palette can be created as in Example 2(d). This palette can then be selected and realized to the window's DC, just like any other GDI palette; see Example 2(e).

This code is called in various places to ensure that the identity palette is realized whenever the application is in the foreground.

WinG DCs

In a typical WinG application, you would create a single WinG DC, which you would use as your off-screen buffer. In my example, I will create two. One is for my background bitmap, which I will stretch into the DC using standard GDI calls. I am using this DC as a buffer so I can quickly restore my background when a sprite is moved. The second WinG DC is for my off-screen buffer, which I will use to create the image that will be Blted to the display.

My first WinG DC is created by calling WinGCreateDC(); see Example 3(a). I then create a bitmap using the WinGCreateBitmap call. The dimensions of the bitmap are contained within the DIB header structure (WinGBackgroundDIBHeader), as in Example 3(b). This function returns a DDB HBITMAP for use with GDI calls, as well as a live pointer to the bitmap's actual bits. I place this value in a huge pointer to seamlessly allow access to bitmaps larger than 64 Kbytes. This huge pointer can subsequently be used either in assembler as an FWORD (16:32) pointer or with the C run-time library calls that support it. The new bitmap is then selected into the WinG DC; see Example 3(c).

Finally, the bitmap loaded at the start of this program is stretched to fill the DC; see Example 3(d). As with all GDI devices, this standard GDI call is actually implemented by the device driver, WINGDIB.DRV. As you will see, when working within WinG, you can still rely on a few familiar GDI calls. I've used these GDI calls for expediency. You may have more stringent performance requirements, in which case you should roll your own BitBlting routine to copy source to destination DIBs. Be aware that WinG is not considered a palette device to GDI. That's why DIB_RGB_COLORS is used for the index parameter. Also note the use of the UTILS.C functions in the StretchDIBits call; the DibXxx functions are all implemented in UTILS.C. I am stretching the bitmap once here so that I can use BitBlt to quickly copy portions of it later. The same thing is done for the double buffer that will be used for constructing the screen image, as in Example 3(e).

After creating the double buffer, the entire background image is copied in; see Example 3(f). Once again, a standard GDI function is used. This is also implemented in WINGDIB.DRV. Be aware that many of these GDI functions only work between DCs of the same type. In other words, WINGDIB.DRV's BitBlt function cannot be used to BitBlt to a printer or to the screen.

Within the WM_PAINT message processing, the off-screen buffer is copied to the window through a call to WinGBitBlt; see Example 3(g).

WinGBitBlt currently copies only from WinG DCs to display DCs. The identity palette is selected and realized again, for safety's sake. If the palette is already realized, these functions have very little overhead. With the identity palette in place, WinGBitBlt copies the DIB to the window quickly--as quickly as a BitBlt using a DDB.

WinG Wrap-Up

Even though WinG is billed as an API for the future, it does not support DCI. This is not a big deal yet. If you stick to the WinG API now, you are likely to get DCI compliance without doing anything.

Even though WinG 1.0 ships with WING32.DLL, this DLL uses the Win32s Universal Thunk, which is not supported under Windows NT. This means that your 16-bit WinG applications will run under Windows NT, but your Win32 ones won't (yet).

One of the key decisions to make is whether to use GDI calls against the HBITMAP and DC or roll your own drawing routines. As the WinG documentation suggests, assume nothing! When performance is the only goal, you will almost always be able to beat GDI by rolling your own functions. My experience shows that directly manipulating the DIB and copying the results is almost always faster than using GDI calls. There are, however, cases when it simply isn't worth the time or money to roll your own code. My favorite example is TrueType and text. Yes, you could probably write a slightly faster, more-specialized version of the TrueType font-rendering code, but is it worth it? WinG's primary benefit is that it enables you to work around the limitations of GDI, all the while leveraging existing code and drivers.

Both Windows 95 and Windows NT 3.5 include a GDI API call named CreateDIBSection, which essentially gives you the functionality of WinGCreateDC and WinGCreateBitmap. CreateDIBSection will be the interface to DCI in the future and existing WinG calls will map directly to CreateDIBSection. Therefore, if you use the WinG API calls, you don't have to worry about future compatibility. Also, if you are targeting Windows 95 or Windows NT 3.5 exclusively, I suggest taking a good look at CreateDIBSection. Its use isn't as obvious as WinG's API, but the power is there.
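For Win32 code targeting those systems, the rough equivalent of WinGCreateDC plus WinGCreateBitmap looks like the sketch below; it assumes an 8-bit BITMAPINFO (bmi) has already been filled in, much like WinGDIBHeader above.

void    *pBits  = NULL;
HDC      hdcBuf = CreateCompatibleDC(NULL);
HBITMAP  hbmBuf = CreateDIBSection(hdcBuf, (BITMAPINFO *)&bmi,
                                   DIB_RGB_COLORS, &pBits, NULL, 0);
HBITMAP  hbmOld = (HBITMAP)SelectObject(hdcBuf, hbmBuf);

/* pBits now points at the DIB's pixels; touch them directly or draw through
   GDI calls on hdcBuf, then BitBlt hdcBuf to the window's DC. */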

WaveMix

If you run WinGTest on a machine with a sound card, you'll notice that it also produces sound. Every sound is built entirely from WAV files. Most sound cards support only a single wave output, so to play multiple WAV files simultaneously, WinGTest uses a new DLL.

The WaveMix DLL, which first appeared in Microsoft Arcade, is available on the Microsoft Multimedia JumpStart CD, as well as from Microsoft's Internet ftp site and CompuServe forum. It's also included under the unsupported tools section of the MSDN CD. WaveMix allows you to simultaneously play up to eight PCM-sampled sounds. These sounds can be any uncompressed PCM (pulse code modulation) wave file or resource. PCM is the method used to digitize analog sounds: the amplitude of a sound is converted into an 8- or 16-bit number and stored in a file. The amplitude sampling occurs at various rates: 11,025, 22,050, or 44,100 times per second. When played back, the samples are converted back into an analog waveform, which, depending on the sampling resolution and rate, will be a close facsimile of the original sound.

WaveMix works by capitalizing on the low-level audio services of the Windows Multimedia API. It takes a series of PCM samples (up to eight) and algebraically sums them, creating a single new PCM waveform to output to your audio device. Some compromise occurs. First, to achieve this in real time, it supports only 8-bit sample output. This type of sample tends to be noisy, but it should be fine for games. In addition, on most of today's hardware, supporting more than a 22-kHz sample rate in real time is probably not realistic. WaveMix allows you to output a 44.1-kHz waveform, but you will probably notice moments of silence between sounds (the process of mixing cannot keep up with the playing of the sound).
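To illustrate what "algebraically sums" means for 8-bit PCM (which is unsigned, with silence at 128), here is a simplified mix of two buffers with clamping. This is my own illustration of the arithmetic, not WaveMix's actual code.

/* Mix two 8-bit PCM buffers sample by sample. Each sample's deviation from
   the 128 midpoint is summed, then clamped back into the 0-255 range. */
void Mix8Bit(unsigned char *dest, const unsigned char *a,
             const unsigned char *b, int samples)
{
    int i;
    for (i = 0; i < samples; i++) {
        int sum = (int)a[i] + (int)b[i] - 128;   /* (a-128) + (b-128) + 128 */
        if (sum < 0)   sum = 0;
        if (sum > 255) sum = 255;
        dest[i] = (unsigned char)sum;
    }
}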

For more information about PCM encoding, see "The Multimedia Extensions for Windows-Enhanced Sound and Video for the PC," by Charles Petzold (MSJ, March 1991).

WaveMix does more than just add the waveforms together. To support real-time mixing, it queues output buffers in a circular fashion, so while one buffer is playing, the next one can be premixed. This premixing introduces some problems when you want to insert another sound immediately into the output wave. To facilitate this, WaveMix allows you to flush the premixed buffers and remix them with the new sound in place.

Finally, WaveMix manages the mixing of different-length wave inputs. It will appropriately stop playing on a specific channel when there is no more PCM data to mix. All of this functionality is hidden behind a reasonably simple API.

Pulling It Together

To use the WaveMix API, you first open a session with a call to WaveMixInit or WaveMixConfigureInit. WaveMixInit will configure the session using the parameters in the WAVEMIX.INI file.

The WAVEMIX.INI file is broken up into a few sections. In the [General] section, the WaveOutDevice setting specifies which multimedia output device to use, numbered starting at 0 (most people will have only one sound-output device). If this value is -1, WaveMix uses the wave mapper. The wave mapper, configured through a Control Panel applet, lets the user define the best wave-output device for particular wave formats. The wave mapper adds a layer of processing overhead that you'll most likely find unacceptable.

WaveMixInit obtains the product name of the wave-output device by calling the waveOutGetDevCaps function from MMSYSTEM, the 16-bit Windows multimedia API. It uses this string to look up the appropriate subheading in WAVEMIX.INI for a particular product. This is important to note, since the stock WAVEMIX.INI includes settings for only five sound cards. If yours isn't one of the five, you have to add it; otherwise some rather unimpressive defaults will be used. WaveMix uses these settings, filling in missing settings with defaults from the [Default] section, to configure the wave-output device. WaveMixConfigureInit permits you to override the SamplesPerSec setting with either 11, 22, or 44. In addition, WaveMixConfigureInit permits you to specify whether the output will be mono or stereo; this cannot be set in WAVEMIX.INI. In my sample app, I call WaveMixConfigureInit to set the number of channels to 2 (stereo). The rest of the settings are derived from the WAVEMIX.INI file; see Example 4(a).

Once you have a session handle from WaveMixInit or WaveMixConfigureInit, open your PCM-encoded WAV files. WaveMixOpenWave supports the loading of both disk files and embedded resources; see Example 4(b). The current version of WaveMix only supports PCM-encoded WAV files. Since sound cards supporting various types of wave compression are proliferating, I expect WaveMix to support these formats in the future.

Before you can play a loaded WAV, you must open a channel in WaveMix. You can open channels one at a time or open a series of them at once. In Example 4(c), I open all of the channels that I need up front. WaveMix then has to be activated with a call to WaveMixActivate, which allocates and releases the sound-output device, since only one application can use it at a time. The best place to put this call is in the processing of the WM_ACTIVATE message. That way, it will always be called when the application first starts, and WaveMix will allocate and free the output device as the application goes from foreground to background; see Example 4(d). After all of this, actually playing the sounds is a snap. A call to WaveMixPlay (see Example 4(e)) is all that is required.

The code in Example 4 can be called from any event, which makes hooking up sounds to various application events easy. It's important to note that WaveMix uses a hidden window with a WM_TIMER message to continually mix output buffers. This means that if your application does not relinquish control to Windows periodically and consistently, the output sound will start to skip. If you have code in your application that does not relinquish control periodically, then you must call WaveMixPump periodically (which I do in the processing of WM_CLOSE).
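If your code does sit in a tight loop, the pattern is simply to pump the mixer by hand; the loop below is a sketch of mine, with DoOneSliceOfWork standing in for whatever keeps your application busy.

while (bStillWorking)
{
    DoOneSliceOfWork();   /* placeholder for work that doesn't yield to Windows */
    WaveMixPump();        /* keep WaveMix's premixed buffers topped up */
}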

Conclusion

Between WinG and WaveMix, you have at your disposal a rich set of tools for developing Windows-based games. Best of all, they are both free. Although neither API is an exhaustive implementation of everything it could be, both show an evolutionary path toward improving game development under Windows. With an eye on the future, these APIs give you the flexibility and performance you need today, with the compatibility you'll want tomorrow.

Example 1:

(a) Function prologue generated by the cProc macro; (b) code jumped to from the prologue, which sets the CS descriptor in the LDT to USE32; (c) what the prologue in (a) effectively becomes once the segment is USE32.

(a)

xor ax,ax
mov ah,80h
add ax,ax   ;; Will overflow in 16 bits
jc  short &n&_fix_cs

(b)

mov bx,cs
 ...
mov es,ax
mov di,sp
mov ax,000Bh
int 31h ;;; DPMI: Get the CS descriptor
;;; change the following to USE32
or  byte ptr es:[di+6],40h
mov ax,000Ch
int 31h ;;; DPMI: Set the CS descriptor

(c)

xor eax,eax
mov ah,80h
add eax,eax ;; Doesn't overflow in 32 bits
jc  short &n&_fix_cs

Example 2:

(a) Function prototype for WinGRecommendDIBFormat; (b) loading 236 colors into an array of PALETTEENTRY structures; (c) saving the 20 system colors; (d) creating the palette; (e) selecting the palette.

(a)

WinGRecommendDIBFormat((BITMAPINFO far*)&WinGDIBHeader.bmiHeader);

(b)

for(iColorIndex = 10; iColorIndex < 246; iColorIndex++)
{
        WinGDIBHeader.bmiColors[iColorIndex].rgbRed        =
                 LogicalPalette.palEntries[iColorIndex].peRed   =
                 pColorTable[iColorIndex].rgbRed;
        WinGDIBHeader.bmiColors[iColorIndex].rgbGreen      =
                 LogicalPalette.palEntries[iColorIndex].peGreen =
                 pColorTable[iColorIndex].rgbGreen;
        WinGDIBHeader.bmiColors[iColorIndex].rgbBlue       =
                 LogicalPalette.palEntries[iColorIndex].peBlue  =
                 pColorTable[iColorIndex].rgbBlue;
        WinGDIBHeader.bmiColors[iColorIndex].rgbReserved   = 0;
        // This flag includes PC_NOCOLLAPSE
        LogicalPalette.palEntries[iColorIndex].peFlags     = PC_RESERVED;
}

(c)

// Get the 20 static colors
hDC = GetDC(0);
GetSystemPaletteEntries(hDC,0,10,LogicalPalette.palEntries);
GetSystemPaletteEntries(hDC,246,10,LogicalPalette.palEntries + 246);
ReleaseDC(0,hDC);

(d)

ghMSJIdentityPalette = CreatePalette((LOGPALETTE far*)&LogicalPalette);

(e)

SelectPalette(hDC,ghMSJIdentityPalette,FALSE);
RealizePalette(hDC);

Example 3:

(a) Creating a WinG DC; (b) creating a bitmap; (c) selecting the new bitmap into the WinG DC; (d) stretching the bitmap; (e) constructing the double buffer; (f) copying the background image into the DC; (g) copying the off-screen buffer to the window.

(a)

ghWinGBackgroundDC = WinGCreateDC();

(b)

hBitmap = WinGCreateBitmap(ghWinGBackgroundDC,
                          (BITMAPINFO far*)&WinGBackgroundDIBHeader,
                          (void far*)&ghpWinGBackgroundBitmap);

(c)

gbmOldBackgroundBitmap = (HBITMAP)SelectObject(ghWinGBackgroundDC,hBitmap);

(d)

StretchDIBits(ghWinGBackgroundDC,0,0,giWindowX,giWindowY,0,0,
  DibWidth(gpdibBackgroundBitmap),DibHeight(gpdibBackgroundBitmap),
  DibPtr(gpdibBackgroundBitmap),DibInfo(gpdibBackgroundBitmap),
  DIB_RGB_COLORS,SRCCOPY);

(e)

ghWinGDC = WinGCreateDC();
hBitmap = WinGCreateBitmap(ghWinGDC,(BITMAPINFO far*)&WinGDIBHeader,
                                     (void far *)&ghpWinGBitmap);
gbmOldBitmap = (HBITMAP)SelectObject(ghWinGDC,hBitmap);

(f)

BitBlt(ghWinGDC,0,0,giWindowX,giWindowY,ghWinGBackgroundDC,0,0,SRCCOPY);

(g)

hDC = BeginPaint(hWnd,&ps);
SelectPalette(hDC,ghMSJIdentityPalette,FALSE);
RealizePalette(hDC);
WinGBitBlt(hDC,0,0,giWindowX,giWindowY,ghWinGDC,0,0);
EndPaint(hWnd,&ps);

Example 4:

(a) Configuring the wave-output device; (b) opening WAV files; (c) opening four channels, one per file; (d) activating the device; (e) playing sounds by calling WaveMixPlay.

(a)  mcConfig.wSize     = sizeof(MIXCONFIG);
     mcConfig.dwFlags   = WMIX_CONFIG_CHANNELS;
     mcConfig.wChannels = 2;
     ghMixSession       = WaveMixConfigureInit(&mcConfig);  // Start up WaveMix

(b)  glpMix1=WaveMixOpenWave(ghMixSession,"1.wav",NULL,WMIX_FILE);
     glpMix2=WaveMixOpenWave(ghMixSession,"2.wav",NULL,WMIX_FILE);
     glpMix3=WaveMixOpenWave(ghMixSession,"3.wav",NULL,WMIX_FILE);
     glpMix4=WaveMixOpenWave(ghMixSession,"4.wav",NULL,WMIX_FILE);

(c)  WaveMixOpenChannel(ghMixSession,4,WMIX_OPENCOUNT);

(d)  case WM_ACTIVATE:
          // WA_INACTIVE == FALSE
          WaveMixActivate(ghMixSession, wParam);
          break;

(e)  MixPlayParams.wSize       = sizeof(MIXPLAYPARAMS);
     MixPlayParams.hMixSession = ghMixSession;
     MixPlayParams.hWndNotify  = NULL;
     MixPlayParams.dwFlags     = WMIX_HIPRIORITY;
     MixPlayParams.wLoops      = 0;
     MixPlayParams.iChannel    = 3;
     MixPlayParams.lpMixWave   = glpMix4;
     WaveMixPlay(&MixPlayParams);


Copyright © 1995, Dr. Dobb's Journal