Windows 3.x VDDVGA

While working on my Windows 3.x display driver, I ran into a vexing problem. In Windows 3.1 running in 386 Enhanced mode, I could start a DOS session and switch it to a window. But an attempt to set a video mode in the DOS window (e.g. MODE CO80) would destroy the Windows desktop, preventing further drawing from working properly. It was possible to recover by using Alt+Enter to switch the DOS window to full screen again and then returning to the desktop, but obviously that wasn’t going to cut it.

Oddly enough, this problem did not exist in Windows 3.0. And in fact it also didn’t exist in Windows 3.1 if I used the Windows 3.0 compatible VDDVGA30.386 VxD shipped with Windows 3.1 (plus the corresponding VGA30.3GR grabber).

There was clearly some difference between the VGA VDD (Virtual Display Driver) in Windows 3.0 and 3.1. The downside of the VDD is that its operation is not particularly well explained in the Windows DDK documentation. The upside is that the source code of VDDVGA.386 (plus several other VDD variants) was shipped with the Windows 3.1 DDK.

First I tried to find out what was even happening. Comparing the bad and good VGA register state, I soon discovered that the sequencer register contents had changed, switching from chained to planar mode. This would not matter if the driver used the linear framebuffer to access video memory, but for good reasons it uses banking and accesses video memory through the A0000h aperture.
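
Checking that state is a matter of two port accesses. Below is a minimal sketch (written for a 16-bit DOS compiler such as Open Watcom, using inp()/outp() from <conio.h>) of reading the VGA Sequencer Memory Mode register to tell chained from planar mode; it is an illustration, not the actual driver code, and inside a DOS window it of course reads whatever state the VDD chooses to present.

    #include <conio.h>   /* inp()/outp() in Open Watcom and similar DOS compilers */
    #include <stdio.h>

    #define SEQ_INDEX    0x3C4   /* VGA sequencer index port */
    #define SEQ_DATA     0x3C5   /* VGA sequencer data port  */
    #define SEQ_MEM_MODE 0x04    /* Memory Mode register     */

    int main( void )
    {
        unsigned char mem_mode;

        outp( SEQ_INDEX, SEQ_MEM_MODE );        /* select the Memory Mode register */
        mem_mode = (unsigned char)inp( SEQ_DATA );

        /* Bit 3 is Chain 4: set in chained (mode 13h style) configurations,
         * clear in planar ones.
         */
        printf( "Sequencer Memory Mode = %02Xh (%s)\n", mem_mode,
                ( mem_mode & 0x08 ) ? "chained" : "planar" );
        return 0;
    }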

But how could that even happen? The VDD is meant to virtualize VGA registers and not let DOS applications touch the real hardware. Something had to be very wrong.

Continue reading
Posted in 386, Development, Documentation, Graphics, PC history, Windows | 57 Comments

Learn Something Old Every Day, Part VII: 8087 Intricacies

The other day I investigated a report that a C runtime library modification causes programs to hang on a classic IBM 5150 PC with no math coprocessor. The runtime originally contained two separate routines, one to detect the presence of an FPU and the other to detect the FPU type.

Someone noticed that the code in the two routines looked really similar and decided to merge them. The reworked code runs just fine on 386 and later processors, with or without FPU (I’m unsure of its status on 286 machines). But it does not work on an FPU-less 8088; it causes the system to hang.

The old code looked like this:

    push  BP                 ; save BP
    mov   BP,SP              ; get access to stack
    sub   AX,AX              ; start with a preset value
    push  AX                 ; allocate space for ctrl word
    fninit                   ; initialize math coprocessor
    fnstcw word ptr -2H[bp]  ; store cntrl word in memory
    pop   AX                 ; get control word
    mov   AL,AH              ; get upper byte
    pop   BP                 ; restore BP

If the routine returned the value 3, a math coprocessor was found; otherwise there wasn’t one.
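
In C, the same kind of probe might be sketched as follows; this is a hypothetical illustration for a 16-bit DOS compiler with an _asm inline assembler (e.g. Open Watcom), not the actual runtime code. The non-waiting FNINIT/FNSTCW forms are the ones Intel’s own documentation uses for probing, precisely because a coprocessor may not be present.

    #include <stdio.h>

    static int x87_present( void )
    {
        unsigned short ctrl = 0;        /* preset to a value no x87 would produce */

        _asm {
            fninit                      /* non-waiting initialize            */
            mov    word ptr ctrl, 0     /* keep the preset, acts as a delay  */
            fnstcw word ptr ctrl        /* non-waiting store of control word */
        }
        /* After FNINIT a real x87 control word reads 037Fh, upper byte 3. */
        return ( ctrl >> 8 ) == 3;
    }

    int main( void )
    {
        printf( "x87 coprocessor %s\n", x87_present() ? "present" : "not detected" );
        return 0;
    }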

The new code looks like this:

        push    BP                  ; save BP
        mov     BP,SP               ; get access to stack
        sub     AX,AX
        push    AX                  ; allocate space for status word
        finit                       ; use default infinity mode
        fstcw   word ptr [BP-2]     ; save control word
        fwait
        pop     AX
        mov     AL,0
        cmp     AH,3
        jnz     nox87
        ...

It’s almost the same, but hangs on an 8088 without an 8087. Why does that happen?

Continue reading
Posted in 8086/8088, Development, IBM, Intel, PC history, x87 | 10 Comments

A House of Cards

As one step in the development of the Windows 3.x/2.x display driver, I needed to replace a BIOS INT 10h call to set the video mode with “native” mode-setting code going directly to the (virtual) hardware registers. One big reason is that the (VBE 2.0) BIOS is limited to a predefined set of resolutions, whereas native mode-setting code can set more or less any resolution, enabling widescreen resolutions and such.
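
For context, the BIOS path being replaced boils down to a single VBE call. A minimal sketch using Open Watcom’s int86() might look like this; the function numbers are standard VBE, but the wrapper itself is just an illustration:

    #include <i86.h>    /* union REGS, int86() -- Open Watcom */

    /* Ask the VESA BIOS to set a video mode via INT 10h, function 4F02h.
     * VBE reports success by returning AX=004Fh.
     */
    static int vbe_set_mode( unsigned short mode )
    {
        union REGS r;

        r.w.ax = 0x4F02;        /* VBE: set SuperVGA mode                  */
        r.w.bx = mode;          /* e.g. 0x101 = 640x480x256; bit 14 = LFB  */
        int86( 0x10, &r, &r );
        return r.w.ax == 0x004F;
    }

The native path instead programs the sequencer, CRTC, graphics controller, and attribute controller registers (plus whatever extended registers the SuperVGA hardware needs) directly, which is what makes arbitrary resolutions possible.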

Replacing the code was not hard (I already had working and tested mode-setting code), and it worked in Windows 3.1 and 3.0 straight away. When I got around to testing Windows 2.11, I noticed that although Windows looked fine and the mouse worked, the keyboard didn’t seem to be working. Windows was just completely ignoring all keyboard input.

No keyboard input for you!

Curiously, the letters I fruitlessly typed in Windows popped up on the DOS command prompt as soon as I quit Windows (which was not hard to do with just the mouse). This indicated that the keyboard input was not exactly lost; it just wasn’t ending up in the right place.

After double and triple checking, I assured myself that yes, using native display mode setting code instead of the BIOS broke the keyboard in Windows 2.11 (but not in Windows 3.x). That was, to put it mildly, not an anticipated side effect. How is that even possible?!

Continue reading
Posted in Bugs, Development, Microsoft, Windows | 49 Comments

Win16 Retro Development

Several months ago I had a go at producing a high resolution 256-color driver for Windows 3.1. The effort was successful but is not yet complete. Along the way I re-learned many things I had forgotten, and learned several new ones. This blog entry is based on notes I made during development.

Windows 3.1 running Word in a usable resolution

Source Code and Development Environment

I took the Video 7 (V7) 256-color SuperVGA sample driver from the Windows 3.1 DDK as the starting point. The driver is written entirely in assembler (yay!), consisting of dozens of source files, over 1.5 MB in total size. This driver was not an ideal starting point, but it was probably the best one available.

The first order of business was establishing a development environment. While I could have done everything in a VM, I really wanted to avoid that. Developing a display driver obviously requires many restarts of Windows and inevitably also reboots, so at least two VMs would have been needed for a sane setup.

Instead I decided to set everything up on my host system running 64-bit Windows 10. Running the original 16-bit development tools was out, but that was only a minor hurdle. The critical piece was MASM 5.NT.02, a 32-bit version of MASM 5.1 rescued from an old Windows NT SDK. The Windows 3.1 DDK source code is very heavily geared towards MASM 5.1 and converting to another assembler would have been a major effort, likely resulting in many bugs.

Fortunately MASM 5.NT.02 works just fine and assembles the source code without trouble. For the rest, I used Open Watcom 1.9 tools: wmake, wlink, and wrc (make utility, linker, and resource compiler). I used a floppy image to get the driver binary from the host system to a VM, a simpler and faster method than any sort of networking.
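
To give an idea of how the pieces fit together, a build rule might look roughly like the following wmake fragment; the file names are made up for illustration and the real DDK makefiles are considerably more involved.

    # Hypothetical wmake fragment; MASM 5.NT.02 assembles the DDK sources,
    # wlink links the driver and wrc attaches its resources.

    vga.obj : vga.asm
        masm /Mx vga.asm, vga.obj ;

    vga.drv : vga.obj vga.lnk vga.rc
        wlink @vga.lnk
        wrc -bt=windows vga.rc vga.drv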

With everything building, the real fun started: Modifying the Video 7 driver to actually work on different “hardware”.

Continue reading
Posted in Debugging, Development, Microsoft, Windows | 25 Comments

Undefined Isn’t Unpredictable

The other day I discovered that 32-bit FreeBSD 11.2 has strange trouble running in an emulated environment. Utilities like ping or top would just hang when trying to print floating-point numbers through printf(). The dtoa() library routine was getting stuck in an endless loop (FreeBSD has excellent support for debugging the binaries shipped with the OS, so finding out where things were going wrong was unexpectedly easy).

Closer inspection identified the following instruction sequence:

    fldz                    ; push +0.0
    fxch   st(1)            ; swap: value under test back in st(0)
    fucom  st(1)            ; compare st(0) with 0.0, set C0/C2/C3
    fstp   st(1)            ; discard the 0.0, keep the value
    fnstsw ax               ; condition codes -> AX
    sahf                    ; AH -> flags (C3->ZF, C2->PF, C0->CF)
    jne ...                 ; taken if not equal
    jnp ...                 ; taken if ordered (no NaN)

This code relies on “undefined” behavior. The FUCOM instruction compares two floating-point values and sets the FPU condition code bits. The FNSTSW instruction stores those bits into the AX register, where they can either be tested directly, or first copied into the flags register with SAHF so that conditional jump instructions can test them conveniently.
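
As a refresher, the mapping works out so that C0 lands in CF, C2 in PF, and C3 in ZF once SAHF has executed. Here is a small plain-C sketch decoding a status word the same way; the bit positions are per the Intel/AMD manuals, the helper itself is just an illustration.

    #include <stdio.h>

    /* Decode x87 condition codes from a status word, mirroring what a
     * FNSTSW AX + SAHF sequence lets conditional jumps see:
     * C0 (bit 8) -> CF, C2 (bit 10) -> PF, C3 (bit 14) -> ZF.
     */
    static void decode_fucom_result( unsigned short sw )
    {
        unsigned c0 = ( sw >> 8 )  & 1;
        unsigned c2 = ( sw >> 10 ) & 1;
        unsigned c3 = ( sw >> 14 ) & 1;

        if ( c2 )
            puts( "unordered (a NaN was involved)" );
        else if ( c3 )
            puts( "st(0) == source" );
        else if ( c0 )
            puts( "st(0) < source" );
        else
            puts( "st(0) > source" );
    }

    int main( void )
    {
        decode_fucom_result( 0x4000 );  /* C3 set: compared equal     */
        decode_fucom_result( 0x0100 );  /* C0 set: st(0) less than    */
        decode_fucom_result( 0x4500 );  /* C0,C2,C3 set: unordered    */
        return 0;
    }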

The problem is the FSTP instruction in between. According to Intel and AMD documentation, the FSTP instruction leaves the FPU condition codes in undefined state. So the FreeBSD library is testing undefined bits… but it just happens to work on all commonly available CPUs, in a very predictable and completely deterministic manner, because the FSTP instruction in reality leaves the condition bits alone. What is going on?

Continue reading
Posted in AMD, Development, Documentation, Intel | 22 Comments

Does (E)IP Wrap Around in 16-bit Segments?

The 8086/8088 is a 16-bit processor and offsets within a 64K segment always wrap around. If a one-byte instruction at offset FFFFh is executed on an 8086, execution will continue at offset 0. This is simply a consequence of the Instruction Pointer (IP) being a 16-bit register.

Funny things happen when an access crosses a segment boundary. On an 8086, it will also wrap around; accessing a word at offset FFFFh will access one byte at offset FFFFh and one byte at offset 0 of the same segment. Again, that is a consequence of 16-bit address calculations.

The 80286 got a lot smarter about this. Segment protection prevents accesses that wrap around the end of a segment, for both data and instructions. The 80386 continued using the same logic.

The 286 and 386 support one special case, stack wraparound. When the 16-bit Stack Pointer (SP) is zero, pushing (say) a word on the stack will wrap around and the new SP will be FFFEh. This feature was required for 8086 compatibility, because a full size 64K stack needs to start with SP=0 (the pushes and pops must be aligned for the wraparound to occur; unaligned accesses will cause protection faults).
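
The arithmetic itself is plain 16-bit modular arithmetic, as this trivial C sketch illustrates:

    #include <stdio.h>

    int main( void )
    {
        unsigned short sp = 0;      /* a full-size 64K stack starts at SP=0 */

        sp -= 2;                    /* effect of pushing one word           */
        printf( "new SP = %04Xh\n", sp );   /* prints FFFEh */
        return 0;
    }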

Does the instruction pointer also wrap around in a way similar to the stack segment?

Continue reading
Posted in 386, 8086/8088, Intel, x86 | 9 Comments

PC Disk Sector Sizes and Booting

Everyone knows that the IBM PC established 512-byte sectors on floppies and hard disks as the standard, which survived for several decades until the advent of “native” 4K-sector drives.

Of course what “everyone knows” is not necessarily the whole story.

PC Floppy Sector Sizes

The original 1981 IBM PC Technical Reference says: “The [IBM PC floppy] drives are soft sectored, single sided, with 40 tracks. They are Modified Frequency Modulation (MFM) coded in 512 byte sectors, giving a formatted capacity of 163,840 bytes per drive.”

But that was never really true; while PC DOS 1.0 indeed used 8 sectors per track, resulting in 163,840 bytes (512 × 8 × 40 bytes) on a floppy disk, PC DOS 2.0 supported 9 sectors per track on the same hardware, increasing the capacity to 184,320 bytes per disk. This was possible in large part because the BIOS in the IBM PC was fairly flexible when it came to floppy disk formats.

Continue reading
Posted in BIOS, DOS, IBM, PC history, Storage | 15 Comments

IBM AIX for IA64 (Itanium) aka Project Monterey Runs Again!

(This is a guest post by Antoni Sawicki aka Tenox)

Project Monterey was an attempt to unify the fragmented Unix market of the 90s into a single, cross-vendor Unix OS that would run on the upcoming Intel Itanium (and other) CPUs. The main collaborators were IBM, which brought its AIX; SCO, which brought UnixWare; HP, which was supposed to contribute parts of HP-UX; and Sequent, which brought DYNIX/ptx. Ironically, the project shared the fate of the Itanium CPU: it failed completely. In the end, Linux took the spot of the “single Unix OS”. IBM donated AIX pieces to Linux instead, and the main legacy of Project Monterey was the famous SCO vs. IBM lawsuit.

IBM did, however, produce an AIX version for the Itanium architecture! According to Wikipedia, some 30+ licenses were sold in 2001-2002. For years, a dedicated group of individuals was trying to locate a copy of the legendary OS. It seemed that the OS was lost forever…

Continue reading
Posted in IBM, Intel, SCO, UNIX | 11 Comments

Slovenian OS/2 Warp 4

This is a guest post written by Marko Štamcar from the Slovenian Computer Museum in Ljubljana. Additional context and commentary from the OS/2 Museum can be found at the end of the article.

Slovenia is a tiny country with a population of just 2 million, so IBM OS/2 Warp 4 was one of the few non-Microsoft operating systems to be localized to Slovenian in the mid-90s, and it was a big deal for the local IT community back then. But nearly three decades later, when OS/2 disappeared from the last ATMs in the country, the even rarer Slovenian version was as good as completely gone. Or was it?

The Slovenian Computer Museum

Cue the Slovenian Computer Museum and our software heritage and conservancy activities. I have been part of the museum for the last five years; I am the head of the laboratory (responsible for getting old machines working so that they can be shown off) and vice president of our non-profit organization. Our museum was founded in 2004 as part of the local hackerspace Kiberpipa/Cyberpipe, but it has since outgrown its humble beginnings and is now located in a dedicated space with 700 square meters of useful room on three levels, housing museum storage, exhibition and event space, and two classrooms. (More info on the museum’s website.)

Continue reading
Posted in IBM, OS/2, PC history | 14 Comments

Antique Display Driving

Here’s a preview of something I’ve been slowly working on, bit by bit:

Windows 1.04 Reversi

That screenshot surely looks a little funny. That’s because it is Windows 1.04 running with a heavily modified 256-color Windows 3.x display driver, using resources from a Windows 2.0 VGA driver.

This is mostly the same driver, running real-mode Windows 3.0:

High-res Real Mode Windows 3.0
Continue reading
Posted in Development, Microsoft, Windows | 36 Comments