Windows 3.x VDDVGA

While working on my Windows 3.x display driver, I ran into a vexing problem. In Windows 3.1 running in Enhanced 386 mode, I could start a DOS session and switch it to a window. But an attempt to set a mode in the DOS window (e.g. MODE CO80) would destroy the Windows desktop, preventing further drawing from happening properly. It was possible to recover by using Alt+Enter to switch the DOS window to full screen again and then returning to the desktop, but obviously that wasn’t going to cut it.

Oddly enough, this problem did not exist in Windows 3.0. And in fact it also didn’t exist in Windows 3.1 if I used the Windows 3.0 compatible VDDVGA30.386 VxD shipped with Windows 3.1 (plus the corresponding VGA30.3GR grabber).

There was clearly some difference between the VGA VDD (Virtual Display Driver) in Windows 3.0 and 3.1. The downside of the VDD is that its operation is not particularly well explained in the Windows DDK documentation. The upside is that the source code of VDDVGA.386 (plus several other VDD variants) was shipped with the Windows 3.1 DDK.

First I tried to find out what was even happening. Comparing the bad and good VGA register state, I soon enough discovered that the sequencer register contents had changed, switching from chained to planar mode. This would not matter if the driver used a linear framebuffer to access video memory, but for good reasons it uses banking and accesses video memory through the A0000h aperture.

But how could that even happen? The VDD is meant to virtualize VGA registers and not let DOS applications touch the real hardware. Something had to be very wrong.

I also suspected that the problem was likely caused by my driver doing something wrong, or perhaps not doing something necessary to correctly set up the VDD. The Video 7 sample driver that I based my code on was intended to work with its own custom VDD, not with VDDVGA; judging from the source code in the Windows 3.1 DDK, I suspect that V7VDD.386 was effectively forked from the Windows 3.0 VGAVDD and at most slightly updated for Windows 3.1. That might also explain why my driver worked with VDDVGA30.386 but not with the newer VDDVGA for Windows 3.1 (VDDVGA.386 is normally built into WIN386.EXE and does not exist as a separate file, although a standalone VDDVGA.386 can be used).

After poking through the VDDVGA source code for a while, I realized that it almost certainly wasn’t register access from a DOS session leaking through. It was the VDD itself!

And I also found that the missing link was a small section of code that was explained as “Call VDD to specify latch address” in the Windows 3.1 VGA driver. It is protected-mode service entry point 0Ch in VGAVDD, and it’s called VDDsetaddresses in the VGA display driver (VGA.ASM) but DspDrvr_Addresses in the VDD (VMDAVGA.INC).

The Windows 3.1 DDK does not appear to document the DspDrvr_Addresses function, although due to the inconsistent naming it is difficult to be entirely certain.

At the same time, I tried to approach the problem from a different angle. The Windows 3.1 DDK does document a set of INT 2Fh calls, some of them with promising descriptions, such as “Save Video Register State (Interrupt 2Fh Function 4005h)” and the corresponding “Restore Video Register State (Interrupt 2Fh Function 4006h)”.

But there I hit the opposite problem. Even though the DDK documents those functions, and the VGA display driver implements the 4005h/4006h callbacks, I could not find any code in the VDD calling those functions! And the debugger showed no sign that anyone else was calling them, either.

Note: It is possible that the save/restore registers INT 2Fh callbacks were specified for OS/2. Indeed the OS/2 2.1 DDK defines INT2F_SYSSAVEREGS (0x4005) and INT2F_SYSRESTOREREGS (0x4006) in the virtual video device driver source code… but again there is no sign of those being used in the code.

There is also “Enable VM-Assisted Save/Restore (Interrupt 2Fh Function 4000h)” and “Disable VM-Assisted Save/Restore (Interrupt 2Fh Function 4007h)”. The VGA and Video 7 display drivers call these functions and name them STOP_IO_TRAP and START_IO_TRAP. And VGAVDD.386 really implements them in its INT 2Fh intercept, VDD_Int_2F. Interestingly, STOP_IO_TRAP corresponds to “VM knows how to restore the screen” logic, and START_IO_TRAP naturally corresponds to “VM doesn’t know how to restore the screen”.

But how does that make any sense? Why would the hardware access from the Windows display driver ever be trapped?

Why Oh Why?

Although I could not find any explanation in the DDK documentation, eventually I realized what the reason had to be: Windows/386 (aka Win386).

Windows/386 was essentially an add-on for Windows 2.x, adding the ability to pre-emptively multitask DOS sessions. Only, in the Windows 2.x days, Windows itself was effectively one of those DOS sessions.

That is, Windows 2.x display drivers had (almost) no clue about Win386. That only came with Windows 3.0. Therefore the Win386 VDD had to manage Windows itself as just another DOS session, save and restore all EGA/VGA registers, and also manage video memory contents. In fact in the “normal” Windows 2.x Adaptation Guide, there is almost no mention of Win386 (there was a separate development kit for Win386 which covered virtual device drivers).

I/O trapping was especially important on EGA adapters, which did not have readable registers. As a consequence, it was impossible to read back the current EGA hardware state; the only way to know it was to shadow the EGA registers as they were written.

Windows 2.x display drivers did implement one interesting piece of functionality, switching Windows to/from the background. This was not at all intended for Win386 but rather for OS/2 (that is, OS/2 1.x, at least initially). The switching was implemented in the display driver by hooking INT 2Fh and watching for focus switch notifications.

In Windows 3.0, Enhanced 386 mode implemented the previously OS/2-only INT 2Fh callbacks that indicated switching out of and back to the Windows desktop. On the way out, the display driver could restore some kind of sane VGA state, and on the way back to the desktop it could re-establish the necessary hardware register state. In addition, the display driver could force a redraw of the entire screen, which avoided the need to save any video memory (which was good, because the video memory could be relatively big).

Unfortunately I don’t have the Windows 3.0 DDK (and no one else seems to, either) so I can’t look at the 3.0 VDDVGA source code. But it’s clear that whereas Windows 2.x display drivers knew very little about Win386, Windows 3.0 drivers typically have some level of cooperation with the VDD through the INT 2Fh interface.

Windows 3.1 VDDs

In Windows 3.1, Microsoft added a whole new level of complexity to VDDs. Namely, video memory can be paged. Microsoft article KB80901 states the following:

In Windows version 3.1, the standard virtual display device (VDD) for VGA is modified to demand page video memory. Thus, you can run graphical MS-DOS-based applications in a window or in the background on VGA systems. This VDD must track video memory usage, so it is not compatible with any of the super VGA display drivers that must access more than 256 kilobytes (K) of video memory. To run these display drivers, a user must use either the VDD provided by the display adapter manufacturer or the VDDVGA30.386, which is included with Windows version 3.1. Demand paging of video memory may break TSRs that worked with Windows version 3.0. The difference is that the VDD virtualizes access to video memory; in Windows version 3.0, the display driver had full reign over memory.

I am not entirely certain why Microsoft did that. It seems to add a lot of complexity in return for not a lot.

The Windows 3.1 VDDVGA.386 introduced a new concept of ‘CRTC VM’ and ‘MemC VM’, that is, the VM that owns the graphics card’s CRT controller (what is displayed on the screen) and the VM that owns the graphics card’s memory controller, i.e. what is read from and written to video memory.

In the typical case, the CRTC VM is also the MemC VM; that can be the Windows desktop (aka System VM) or a full-screen DOS box. Things get interesting for windowed DOS boxes. The desktop remains the CRTC owner because the desktop is what needs to be displayed. But a DOS box can temporarily become a MemC VM, directly accessing video memory.

Needless to say, this gets quite complicated. VDDVGA.386 needs to save the old MemC VM state, merge the new MemC VM state with it and update the hardware registers, let the DOS box execute, and then restore the original MemC VM state before the System VM can do any drawing to the Windows desktop.

As far as I can tell, of the drivers shipped with Windows 3.1 only VDDVGA.386 has this complexity. None of the other VDDs, including the Video 7 specific V7VDD.386, implement this logic. As mentioned above, I strongly suspect that the Video 7 VDD in the Windows 3.1 DDK (source code in VDDV7VGA directory) is actually very close to the Windows 3.0 VDDVGA.386, and thus to the Windows 3.1 VDDVGA30.386.

It’s a Tie

Needless to say, the register saving/restoring logic in VDDVGA.386 is quite fiddly and difficult to debug. In the end I have not been able to find out why register changes “leak through” to the System VM (i.e. Windows desktop). I found out where in the code that happens, but not why, or how to prevent it.

What I did find is that the DspDrvr_Addresses function does not at all do what the comments suggest. The function is supposedly used “to specify latch address” in video memory. Closer examination of the Windows 3.1 VGA display driver showed that while it does define a byte for the latches, and sends its address to the VDD, the display driver does nothing with that byte.

But even more interesting is that VDDVGA.386 does not use the latch byte either. Instead, VDDVGA.386 assumes that the latch byte lives somewhere very close to the end of the video memory used by the display driver, and expects that any following pages can be used by the VDD. (That logic likely comes from the Windows 2.x EGA/VGA drivers.)

A corollary is that passing 0FFFFh as the latch byte address to the VDD (something that SVGA256.DRV does) tells VDDVGA.386 that there is no video memory to share. In that situation, VDDVGA.386 does not try any hair-raising schemes to modify the VGA register state behind the display driver’s back.

The result is not perfect, either. The system does survive MODE CO80 in a windowed DOS box without trouble, but starting (in a window) a DOS application which uses multiple pages of video memory triggers an interesting warning:

A disturbing but seemingly harmless warning

The warning appears to be harmless. Once it’s dismissed, the application works fine. The warning also only pops up the first time the application is started (in the same windowed DOS box). It’s not ideal, but it’s something I can live with.

I consider this fighting VDDVGA.386 to a draw. I am not impressed with the Windows 3.1 DDK documentation—it omits certain things while documenting other things that appear to be fictional. That said, the actual DDK source code saves the day, at least in the video area, because it is possible to see more or less all of the code involved.

And the Windows 3.0 DDK would be really nice to have.


56 Responses to Windows 3.x VDDVGA

  1. Roy says:

    there is VDDVGA source in Win3.1 (PR2) DDK

  2. Michal Necasek says:

    Yes, but that doesn’t really help much, since there is VDDVGA source in the final 3.1 DDK as well. The PR2 source code is a little different but not a lot.

    On the other hand the VDDV7VGA source code is more or less identical between the PR2 and final DDK release. The Video 7 display driver is slightly different but only slightly.

  3. Sammy Fox says:

    Can we have binaries yet? ;-; I’ve got this old panasonic toughbook that has two working batteries and it’d be really neat to run windows in SXGA+ (or even just 1280×1024) on it

  4. Michal Necasek says:

    I could give you binaries… but I can promise that they won’t work on any real laptop, and I have no interest in working on that.

  5. Roy says:

    this just showed up on betawiki
    password is `infected’

  6. MiaM says:

    Off topic, re comments, RSS and whatnot:

    For some weird reason it seemed like the bug where URLs for comments was fixed. It started working for the blog post “The future that newer was”, i.e. the latest comments has an url in the RSS feed to …/comment-page-2/…

    However this post has RSS feed for comments where the URL points to …/comment-page-1/… even though the actual comments are on page 2.

    Maybe what actually displays comments and what generate the RSS URLs are set to a different amount of comments per page? I haven’t looked into each comment URL but that would be a plausible explanation of the bug.
