The OS/2 Display Driver Zoo

I have recently explored (again) the possibility of writing a high-res display driver for virtualized OS/2. But I ran (again) into a dizzying array of possible solutions, each with its own advantages and a fair number of drawbacks.

OS/2 display drivers underwent something of a rapid evolution in the 1992-1996 timeframe. The OS/2 Warp 4 DDK comes with no fewer than four significantly different display driver code bases, which reflect this evolution.

Warning: This article is long! It contains notes from research into the evolution of OS/2 display drivers, DDK sample code, and accompanying documentation. Much of the article is something of a signpost, showing where to find what.

OS/2 1.x

In the days of OS/2 1.x, things were simple. Display drivers were 16-bit, written in assembler (out of necessity more than anything else), and there was only one driver model. The drivers for OS/2 Presentation Manager (PM, code name Winthorn) were cloned circa 1987 from Windows 2.x display drivers, since the inner workings (bitmaps, patterns, brushes, fonts, ROPs, and all that jazz) were very similar. Like Windows drivers, OS/2 display drivers worked by “compiling” (dynamically constructing) code to implement various drawing functions.

800×600, big enough for OS/2 1.1 command prompt window

OS/2 1.1 (1988), the first Presentation Manager release, supported device driver interface (DDI) version 1.0. The interface resembled Windows 2.x and was just as ugly.

In OS/2 1.x, the Presentation Manager shell (PMSHELL.EXE) directly linked against the display driver, which had to be named DISPLAY.DLL, much like the display driver in Windows 3.x and earlier had to be called DISPLAY.DRV.

The 1989 release of OS/2 1.2 looked much easier on the eyes than 1.1 and visually resembled the not-yet-released Windows 3.0. The DDI was upgraded to version 1.2. The newer DDI supported color icons and cursors (OS/2 1.1 was limited to monochrome icons), and the driver needed to supply lots more icon and bitmap resources. However, the driver model was essentially unchanged and the driver code was correspondingly largely identical.

The Presentation Driver (this encompasses both display and printer drivers) documentation from the OS/2 1.x era is available, with some overlap between Microsoft and IBM documentation. Sadly, neither Microsoft nor IBM bothered documenting the display driver interface changes between OS/2 1.1 and 1.2. If one has the source code to an OS/2 1.1 display driver, it will not work well on OS/2 1.2 or 1.3, but finding out what exactly needs to be changed is… challenging.

Microsoft published a DDK with display driver sample code for OS/2 1.1. However, it is unclear if such a DDK existed for OS/2 1.2 or 1.3; if it did (which seems likely), it hasn’t been found. Since IBM only supported IBM hardware at the time, there was no need for IBM to provide a driver development kit; supporting OEMs was Microsoft’s job back then.

OS/2 2.0

The initial release of OS/2 2.0 was in some ways an odd duck, a not-quite-finished hybrid. The Graphics Engine (GRE) was still 16-bit, and display drivers were too. Although they were architecturally unchanged from OS/2 1.x drivers, the OS/2 2.0 display drivers had a number of additions and modifications.

This primarily included support for VDDs (Virtual Device Drivers) needed to support DOS sessions, especially full-screen and background ones (windowed DOS boxes were standard applications from the PM display driver’s perspective).

There were also other changes; for example, in OS/2 1.x the display driver was responsible for managing a code segment for dynamically generated code, and its corresponding read-write alias. In OS/2 2.0, the PMDD.SYS driver took over much of the work and could allocate a “magic” segment with the right attributes.

The OS/2 2.0 Presentation Driver reference from March 1992 describes 32-bit display drivers, but it also comes with a rather interesting disclaimer saying that it is “for planning purposes only”:

32-bit drivers not quite ready yet

As far as I’ve been able to establish, there was no actual support for 32-bit display drivers in OS/2 2.0. Out-of-the-box support was provided for IBM adapters such as the EGA, VGA, and 8514/A, as well as the XGA; all these drivers were 16-bit, effectively upgraded OS/2 1.3 drivers.

OS/2 2.00.1 and Service Pak XR06055

Just a few months after the April 1992 release of OS/2 2.0, circa August/September 1992, IBM shipped Service Pak XR06055 and started preloading OS/2 2.00.1 on some systems. These updated versions came with a new 32-bit Graphics Engine (GRE), something that IBM clearly wasn’t able to get done in time for OS/2 2.0.

The 32-bit GRE continued to support existing 16-bit drivers, but naturally also worked with new 32-bit drivers. The updated GRE also implemented a few additions that had nothing to do with the 32-bit rewrite.

For 256-color drivers, the new GRE supported the Palette Manager, which gave applications more control over the hardware color palette. Additionally, OS/2 drivers could now support seamless Win-OS/2 operation. To implement seamless Win-OS/2 (that is, Windows 3.x applications running on the OS/2 desktop), IBM chose to use slightly modified Windows drivers running in the Win-OS/2 session. That meant a Windows driver and an OS/2 driver had to draw on the screen at more or less the same time, cooperating with each other. While this approach was perhaps technically questionable, it did work, though it required some extra effort on the OS/2 driver side.

Correction: Seamless Win-OS/2 support was not new in the 32-bit GRE. It existed previously in OS/2 2.0, but the IBM VGA driver was the only driver with seamless support, at least from IBM. SVGA, XGA, and 8514/A seamless Win-OS/2 was only supported in the 32-bit IBM drivers.

To support the 32-bit GRE, IBM wrote not just one but two new drivers, or rather driver sets.

One was IBMVGA32. This was a driver supporting VGA and SVGA-style hardware, written entirely in 32-bit assembler. The driver was split into two parts: hardware-independent (IBMVGA32) and hardware-dependent. IBM supplied two hardware-dependent implementations: IBMDEV32, supporting 16-color VGA, and SVGA256, supporting a number of SVGA chips running in 256 colors.

The SVGA256 driver was shipped in three variants for different resolutions (all in 256 colors): 640×480 (SV480256.DLL), 800×600 (SV600256.DLL), and 1024×768 (SV768256.DLL).

The 256-color driver supported several then-common SVGA chips from ATI, Video 7, WD, IBM (“Speedway”), Tseng, Trident, and Cirrus Logic.

The other 32-bit driver was a merged XGA and 8514/A driver. This driver was written in a mix of C and assembler. IBM soon adapted the 8514/A driver to support S3 chips, which were themselves 8514/A derivatives (but not strictly compatible with the 8514/A). This driver became the basis for many OEM accelerated drivers.

Device Driver Kit

Around May 1993, IBM published the first iteration (version 1.0) of the OS/2 2.x DDK. This included sample code for the above described 32-bit drivers, as well as the older 16-bit VGA and 8514/A drivers that evolved from the original OS/2 1.1 drivers.

The DDK CD-ROM came with on-line documentation in the OS/2 INF format. This included the same Presentation Drivers Reference (S10G6267.INF) labeled First Edition (March 1992), presumably publication S10G-6267-00.

But there was also a new document titled Display Device Support for OS/2 (S71G1896.INF), which was labeled as Second Edition (March 1993), likely publication S71G-1896-00. This document described the sample code shipped on the DDK and explained how to adapt it for OEM hardware. It is not clear if and how the first edition of the document was published; it was possibly part of the pre-release presentation driver development program.

In February 1994, IBM published the OS/2 DDK version 1.2 (there was also a version 1.1, but its whereabouts are unknown). This included a renamed Presentation Driver Reference (now PDRREF.INF), unfortunately with no information as to which edition it is. In any case it is an updated version, slightly reorganized and with a new chapter about Software Motion Video Support in display drivers.

The Display Device Driver Reference (now DISPLAY.INF) is not obviously different but unfortunately also contains no edition information.

To make matters more interesting, IBM also published the driver documentation in BookManager format. Some editions were preserved on IBM Softcopy Library CD-ROMs.

The December 1994 OS/2 Library includes the OS/2 2.1 Presentation Driver Reference book (EJ5A8A01.BOO), document number S10G-6267-01. This is labeled Third Edition (March 1994). There is also the corresponding OS/2 2.1 Display Driver Reference (EJ5A5A00.BOO), document number S71G-1896-01, labeled Second Edition (March 1994).

OS/2 Warp 3

In OS/2 Warp (1994), IBM did what they (and Microsoft) arguably should have done from the very beginning: The Graphics Engine now supported SOFTDRAW (capitalized in IBM documentation), a software rasterizer.

Previously, every display driver had to implement code to draw lines, rectangles, and text, and to perform bit blits. This code was used not only for screen output but also for drawing to memory bitmaps, something completely unrelated to the hardware. Perhaps Microsoft did this to make the interface more abstract and give drivers more control… but users actually wanted the opposite: consistent behavior across devices.

Here’s how IBM put it in the Warp DDK documentation: “All presentation display drivers have two major but only marginally related functions: drawing to the display, and drawing to memory bit maps.” This dual-mode drawing architecture was resolved by having the bit-map drawing code emulate the XGA hardware. Note that the XGA was about the only graphics chip which was capable of drawing both to the screen and to system memory bitmaps.

It is notable that printer drivers also needed to implement all the drawing functions, with the caveat that since the beginning, printer drivers had the option to call into the display driver to do the hard work. Which of course could cause “interesting” interactions with particular combinations of printer and display drivers.

IBM’s SOFTDRAW allowed drivers to let OS/2 handle all the complexity of drawing to memory bitmaps. For hardware that provided a linear framebuffer, SOFTDRAW could draw on the screen as well, since the screen was just another bitmap.

The overall structure of a display driver was still the same as before, but SOFTDRAW greatly reduced the difficulty of implementing a display driver. SOFTDRAW made it much easier to accelerate certain operations and fall back on the software rasterizer for anything complex.

For example, a driver might decide to not deal with text output and just let the GRE turn text into bitmaps. But the driver could then still use accelerated bit blits when SOFTDRAW went on to draw the bitmap on the screen.
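The division of labor can be sketched roughly like this. All names below are invented for illustration; the real OS/2 presentation driver dispatch table and entry points differ, but the principle is the same: start with an all-software function table and override only what the hardware accelerates.

```c
/* Hypothetical sketch of SOFTDRAW-era partial acceleration: the driver
 * fills a dispatch table, pointing most entries at the engine's software
 * rasterizer and hooking only operations its hardware does faster.
 * Function and structure names are invented for this illustration. */
typedef int (*draw_fn)(void *ctx);

/* Software fallbacks, standing in for SOFTDRAW (return 0 = slow path). */
static int sw_bitblt(void *ctx)   { (void)ctx; return 0; }
static int sw_textout(void *ctx)  { (void)ctx; return 0; }
static int sw_polyline(void *ctx) { (void)ctx; return 0; }

/* Hardware-accelerated bit blit (return 1 = fast path). */
static int hw_bitblt(void *ctx)   { (void)ctx; return 1; }

struct dispatch {
    draw_fn bitblt;
    draw_fn textout;
    draw_fn polyline;
};

/* Driver initialization: default everything to software, then hook
 * only what the hardware accelerates. Text stays in software, so the
 * engine renders glyphs into bitmaps; those bitmaps are then blitted
 * to the screen through the accelerated bitblt entry. */
void init_dispatch(struct dispatch *d, int have_blitter)
{
    d->bitblt   = sw_bitblt;
    d->textout  = sw_textout;
    d->polyline = sw_polyline;
    if (have_blitter)
        d->bitblt = hw_bitblt;
}
```

This is exactly the text-output scenario described above: the driver never touches glyph rasterization, yet still accelerates the final blit.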

Warp DDK

In April 1995, IBM published the OS/2 DDK version 2.0. The Presentation Device Driver Reference for OS/2 (PDRREF.INF) now called itself Fifth Edition (April 1995) and includes a summary of changes since the Fourth Edition (March 1994).

The same DDK also includes an updated Display Driver Reference for OS/2 (DISPLAY.INF) which calls itself Fourth Edition (April 1995) and includes a summary of changes since the Third Edition (March 1995)—published only one month earlier.

The Fourth Edition adds primarily information about DCAF (Distributed Console Access Facility, IBM’s remote desktop implementation) support. The missing Third Edition added much information about the S3 accelerated display driver.

Just to make things confusing, the June 1995 edition of the IBM OS/2 Softcopy Library includes Presentation Device Driver Reference for OS/2, Volume I (EJ5A8A02.BOO), document number S10G-6267-02. This document calls itself Fourth Edition (June 1995), but includes a summary of changes since Fourth Edition (March 1994). Based on the contents, this is what the PDRREF.INF from 2.0 DDK refers to as Third Edition (March 1995). The document number (S10G-6267-02) would also imply third edition (the last two digits being -00, -01, and -02 for the first, second, and third editions, respectively).

I have no idea what to believe. In any case, the BookManager document (and likely the printed hardcopy) was split into two volumes, with Presentation Device Driver Reference for OS/2, Volume II (EJ5G1A00.BOO) being document number S30H-2367-00.

The same June 1995 Softcopy Library also comes with Display Driver Reference for OS/2 (EJ5A5A01.BOO), document number S71G-1896-02. This book calls itself Fourth Edition (June 1995), and includes a summary of changes since Second Edition (March 1994). Again, based on the DISPLAY.INF in the 2.0 DDK, this should really be the Third Edition (March 1995). Which, again, would match the -02 document number.

Clearly, IBM’s documentation versioning was a bit of a mess.

OS/2 for the PowerPC

When IBM started porting OS/2 to the ill-fated PowerPC, none of the existing drivers were a good fit, due to their high complexity and their significant (in some cases complete) dependence on x86 assembler.

IBM decided to develop a new, simplified, and far more modern driver model. The new model was called GRADD (Graphics Adapter Device Driver). The actual device-specific driver was quite simple and all the complexity of GRE driver implementation was centralized in an IBM-provided SOFTDRAW library (described above).

The GRADD model was quite different from, and even simpler than, SOFTDRAW-based drivers. In the classic presentation driver model, display drivers had to implement a large number of mandatory functions. SOFTDRAW allowed drivers to point to an implementation inside SOFTDRAW rather than in the driver proper, but the functions still needed to be implemented.

GRADD drivers worked the other way around: a basic driver did almost nothing except provide a dumb framebuffer, indirectly letting SOFTDRAW do all the work. An accelerated driver could hook certain operations that the hardware could do much faster than software, such as screen-to-screen copies, hardware cursors, or bit blits with color conversion. Anything the driver didn’t explicitly ask to handle was done by SOFTDRAW.
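The inverted default is the distinctive part: the engine calls the driver with a command, and the driver either services it or reports it as unsupported, whereupon SOFTDRAW takes over. A rough sketch of that pattern (command names and return codes are invented here; the real GRADD interface uses its own command set and conventions):

```c
/* Hypothetical sketch of GRADD-style dispatch: service only the hooked
 * commands, report everything else unsupported so the engine falls back
 * to SOFTDRAW. Names and values are invented for illustration. */
enum cmd { CMD_BITBLT, CMD_LINE, CMD_TEXT, CMD_MOVE_CURSOR };

#define RC_HANDLED       0
#define RC_UNSUPPORTED (-1)   /* engine falls back to SOFTDRAW */

static int hw_screen_copy(void) { return RC_HANDLED; }
static int hw_move_cursor(void) { return RC_HANDLED; }

/* Single driver entry point the engine calls with a command code. */
int gradd_entry(enum cmd c)
{
    switch (c) {
    case CMD_BITBLT:      return hw_screen_copy();  /* accelerated blit */
    case CMD_MOVE_CURSOR: return hw_move_cursor();  /* hardware cursor  */
    default:              return RC_UNSUPPORTED;    /* SOFTDRAW does it */
    }
}
```

A driver for a pure dumb framebuffer would simply return the “unsupported” code for every drawing command, which is why a minimal GRADD driver could be so small.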

A big plus of GRADD was that IBM provided a generic Win-OS/2 driver, which meant that OEMs were no longer required to ship their own Win-OS/2 driver at all.

Warp 4

On the Intel platform, the GRADD driver model was shipped in OS/2 Warp 4 (1996), and also eventually backported to Warp 3 in FixPacks. The initial GRADD support in Warp 4 was somewhat buggy, but stabilized over time.

Some colors were off in Warp 4 GA GENGRADD (24bpp)

Since about 1998, more or less all new OS/2 drivers used the GRADD model. This simplified everyone’s life because there was only one set of bugs to deal with (in SOFTDRAW), rather than different drivers from different vendors all having their own idiosyncrasies and quirks.

Until about 2003, IBM kept publishing an updated GRADD driver package as a separate download, installable on Warp 3 (with FP 35 or later), Warp 4 (with FP 5 or later), WSeB, and Convenience Packages. The package included both a generic unaccelerated driver (GENGRADD) as well as accelerated drivers for a number of then-current graphics chips.

Warp 4 DDK

IBM finalized the OS/2 Warp 4 DDK in September 1996. The DDK now included sample GRADD drivers (GRADD was mentioned in the Warp 3 DDK, but no sample code was provided).

The Warp 4 DDK shipped with four significantly different sample display drivers:

  • The old 16-bit VGA assembler driver for 16-color VGA and 8514/A
  • 32-bit assembler driver for 16-color VGA and 256-color SVGA
  • 32-bit C/assembly driver for XGA, 8514/A, and S3 accelerators
  • 32-bit C generic and S3 accelerated GRADD drivers

To give a sense of the complexity of the drivers, the 16-bit VGA driver was over 5 MB of assembler code, heavily macro-ized. The 32-bit VGA driver was over 6 MB of assembler, again using lots of macros. The 32-bit accelerated driver was about 1.5 MB of assembler and 3.6 MB of C code.

In contrast, the accelerated S3 GRADD driver was a little over 200 KB of C code, and the generic unaccelerated GENGRADD driver was only 30 KB of C code!

An updated edition of the Presentation Device Driver Reference for OS/2 (PDRREF.INF) was included in the DDK. There is no longer any clear edition information, only a note that certain “updates were made for Version 4 of the DDK”.

The Display Driver Reference for OS/2 (DISPLAY.INF) says that “there were no major changes to this release”. However, there is a new Graphics Adapter Device Driver Reference (GRADD.INF) book which describes the GRADD model on the OS/2 Intel platform. This reflects the fact that IBM effectively switched to GRADD for new development.

Post-Warp 4 DDKs

IBM kept releasing online updates to the OS/2 DDK until 2004. However, there were no longer any formal versioned releases, and individual components were updated on an ad-hoc basis. It appears that DISPLAY.INF was no longer updated after the Warp 4 DDK release; PDRREF.INF was last updated in 1997 and GRADD.INF in 1999. The GRADD sample code kept being updated until 2003, and similarly the SVGA base support was maintained to keep up with hardware supported by IBM.

Which Way To Go

For supporting OS/2 1.x or 2.0, there’s no real choice. The original 16-bit driver model is the only game in town. Unfortunately, there is no SVGA sample code available, for any bit depth.

Although that isn’t entirely true: the OS/2 1.1 DDK includes a 16-color driver for certain Video 7 (Microsoft’s favorite at the time) models running at 800×600 resolution. Unfortunately, the driver can’t be easily modified to support higher resolutions because it doesn’t do any bank switching. That is also the reason why Windows 3.1 and Windows NT came with generic 800×600 16-color display drivers: although there is no standard VGA 800×600 mode, once the mode is set (using INT 10h mode 6Ah), drawing can be accomplished using only standard VGA registers.
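The reason standard VGA registers suffice is that the 800×600 16-color mode uses the ordinary VGA planar layout: four bit planes, one bit per pixel per plane, so 800/8 = 100 bytes per scan line per plane, which fits in the 64 KB window at A000h without bank switching. A small illustrative sketch of the addressing arithmetic such a generic driver relies on:

```c
/* Addressing arithmetic for a 4-plane, 1bpp-per-plane 800x600 mode
 * (the semi-standard mode 6Ah layout). Purely illustrative; a real
 * driver would fold this into its inner loops. */
#include <stdint.h>

#define SCANLINE_BYTES (800 / 8)   /* 100 bytes per plane per line */

/* Byte offset into a plane for pixel (x, y). Note the whole frame is
 * 600 * 100 = 60000 bytes per plane, under the 64 KB VGA window. */
uint32_t planar_offset(unsigned x, unsigned y)
{
    return y * SCANLINE_BYTES + x / 8;
}

/* Bit mask selecting pixel x within that byte (MSB = leftmost pixel). */
uint8_t planar_mask(unsigned x)
{
    return (uint8_t)(0x80u >> (x & 7));
}
```

Writing a pixel then means selecting the plane(s) via the standard VGA sequencer/graphics controller registers and touching the single byte at that offset, all without any chip-specific banking.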

For OS/2 2.1 (really OS/2 2.00.1, or OS/2 2.0 with Service Pak XR06055 and later), there is the option of using a 32-bit display driver. The DDK offers sample code for a 256-color SVGA driver which is not difficult to adapt for other graphics hardware. While this driver should also be reasonably easy to adapt to resolutions higher than 1024×768, it is much harder to support color depths other than 8 bits.

IBM’s OS/2 Warp DDK documentation recommends taking the S3 driver as a basis for developing new drivers. The S3 driver was derived from the 32-bit XGA and 8514/A drivers shipped with OS/2 2.1 (as noted earlier, the original S3 hardware, while not fully compatible with the 8514/A, was architecturally very similar). A major advantage of the S3 driver over the 32-bit SVGA driver is that the S3 driver handles multiple resolutions and multiple color depths in a single binary (whereas the SVGA driver needs a different DLL for each resolution and only supports 256 colors).

The evolution of the S3 driver is documented in the OS/2 Display Device Driver Reference, and further insight can be gleaned from the source code. The driver was originally written for the IBM XGA, in a mix of C and assembler, for OS/2 1.x. The code was then converted to 32-bit for OS/2 2.0. The driver was subsequently cloned and adapted to support the 8514/A (IBM already had an older 16-bit 8514/A driver); to a significant extent, the 8514/A hardware acceleration is a subset of the XGA capabilities. The hardware access code in the driver was split out, which made it easier to deal with the XGA and 8514/A differences. In turn, this rework made it easier to support the S3 accelerators, and the driver was further adapted to ease porting to different hardware dissimilar from the 8514/A or XGA. The S3 driver was also enhanced to support 24bpp modes (the XGA only supported 8 and 16 bpp).

On closer look, the DDK sample code for the XGA / 8514/A / S3 seems to have been designed for maximum confusion. In the 1993 (version 1.0) DDK, there were two drivers, XGA32 and XGA8514. Most likely the XGA32 driver was cloned to XGA8514 and adapted for the 8514/A. In the 1994 (version 1.2) DDK, the XGA and 8514/A drivers were merged again, and support for S3 accelerators was added. The merged driver was under DDK\SRC\PMVIDEO\32BIT and additionally supported 24bpp video modes.

The 1995 OS/2 Warp DDK (version 2.0) added a DDK\SRC\PMVIDEO\S3TIGER sample driver, which was largely identical to the merged ’32BIT’ driver. The difference was that the S3TIGER driver supported EnDIVE, IBM’s software video offloading framework. This framework (circa 1995) did not support any true video decoding, but it did support color conversion and stretching, a feature available in the better graphics chips at the time (such as the S3 Vision 868 and 968).

Developers could be misled into thinking that the S3TIGER driver was the right one to use as a basis for porting. But no! Although IBM kept shipping the S3TIGER sample driver, it was not maintained. The 32BIT driver, on the other hand, kept getting minor fixes, and sometime in 1998 or 1999 it also gained support for 32bpp modes. Although 32bpp modes looked no different from 24bpp and needed more video memory, most graphics chips did not support accelerated 24bpp drawing, whereas 32bpp acceleration became standard.

IBM also merged DBCS support into the 32BIT driver in late 1999 or early 2000. This coincided with Warp Server for e-Business which was capable of supporting both SBCS and DBCS environments, unlike older OS/2 versions which required separate, modified DBCS drivers.

It is clear that the ‘32BIT’ driver, and not the ‘S3TIGER’ sample driver, was the right basis for porting, a fact that is not at all obvious from IBM’s documentation.

For Warp 4 and later, GRADD drivers are by far the easiest to develop, and completely avoid any hassle with Win-OS/2 drivers. At the same time, the GENGRADD driver which ships with OS/2 usually offers reasonable resolutions and color depths, and performs well in emulated environments; therefore the need to create a new GRADD driver is quite limited.

Matrox Mystery

While going through the combined XGA / 8514/A / S3 driver source code, curious Matrox references popped up (along with a strange WOLVES.H header file). When the macro ‘MATROX’ is defined at compile time, the XGA driver adds several strategically placed tests where it checks whether it is RunningOnMatrox(). This is done by checking whether the current (MCA) adapter ID equals 80EEh.
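The check itself is trivial; a sketch of its logic follows. The function name mirrors the one in the source, but the signature is invented here, and reading the actual Micro Channel POS ID registers is omitted:

```c
/* Sketch of the Matrox detection described above: compare the current
 * Micro Channel adapter ID against 80EEh. The signature is hypothetical;
 * the real driver reads the ID from the adapter's POS registers. */
#include <stdbool.h>
#include <stdint.h>

#define MATROX_ADAPTER_ID 0x80EEu  /* MCA ID of the Matrox board */

bool RunningOnMatrox(uint16_t mca_adapter_id)
{
    return mca_adapter_id == MATROX_ADAPTER_ID;
}
```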

The Matrox support is even mentioned in the OS/2 Display Device Driver Reference manual and used as an example of implementing certain techniques.

The sample XGA driver has included the Matrox code since the first OS/2 2.x DDK in 1993, and it survived until the last DDK update in 2004. The only problem? The Matrox-specific code is never built, because the ‘MATROX’ macro is never defined and the MATROX.C support module is never compiled. It is also incomplete.

The adapter in question is (based on the 80EEh ID) a rather obscure Matrox Illuminator-16/MC. The card clearly supported 16bpp modes, using a pixel format different from the XGA. I could not find out if the Illuminator-16 has accelerated drawing. In any case, the Matrox-enabled XGA driver does not use it—it appears to use purely software drawing.

I could not find any actual IBM XGA OS/2 driver that would include this Matrox support. But the Illuminator-16 was reportedly supported under OS/2 and used for video production. Whether Matrox shipped the modified IBM XGA driver or something altogether different, or if the combined XGA/Matrox driver was ever shipped at all, is entirely unknown.

Summary

For supporting OS/2 2.00.1 and later, the 32-bit merged S3 driver should be by far the best starting point for developing a Presentation Manager display driver. It is a full-featured driver written largely in C and meant to be relatively easily adaptable to different graphics hardware.

For any earlier OS/2 versions (including OS/2 2.0 GA), display drivers must be 16-bit. The only available samples are written entirely in assembler and do not support SVGAs; adding such support would necessitate major surgery.


29 Responses to The OS/2 Display Driver Zoo

  1. It might be a long one but I read it. Twice. And I will return to it. Keep up the good work!

    And as for the drivers needing to draw into memory contexts. At least in Windows, there were DDBs (Device Dependent Bitmaps) and DIBs (Device Independent Bitmaps, I think starting from Win 3.0). DDBs were 1-to-1 representations of the video memory. They could be planar, have strange pitch, and generally “stupid” memory layout (think of CGA and Hercules). And it made perfect sense for the device driver to “draw” into memory (DDB!) bitmaps, because it was done by almost the same code that was used to draw to the actual video memory. And once a DDB had been “drawn”, it could be simply copied 1-to-1 to VRAM (used for double-buffering complex images).

    So, the model had been created with the anticipation that the then-future graphics adapters would use even stranger memory layouts. But then came SVGA and all video modes became flat (non-planar) and linear, with almost the same bitmap format across all adapters and modes. And the “baroque”, overcomplicated Windows and OS/2 display driver model became absolutely irrelevant and gave no performance benefits at all (and I guess performance was the primary driver back in the times of 8088/80286 and CGA/Hercules/EGA). This led to generic frame-buffer handling libraries of Windows 95, Windows NT and SOFTDRAW, and the simplification of display driver models.

    At least this is how I understand the topic and the history 🙂

  2. LightElf says:

    Everything related to OS/2’s interaction with the display is so overcomplicated that it’s not even funny. PDD, VDD, BVH, PMI, GRADD, GREEXT, VIDEOCFG… Too flexible for stable operation.

  3. Michal Necasek says:

    You know… I think you’re right about the DDBs, it makes sense that the driver would need to keep them in a custom format. And for EGA style adapters it probably wasn’t crazy to do it that way, since the planar memory doesn’t map very well to regular system memory.

    Except I don’t think OS/2 actually ever had DDBs… as far as I know, only “standard” bitmap formats were used for memory bitmaps (basically DIBs only). Have to do more digging.

  4. And what about GPI’s Memory Contexts? I think you can create a screen-compatible memory context and I would imagine the underlying in-memory bitmap would effectively be a DDB. Maybe this is the case?

  5. One interesting aspect to discover would be the bitmap format itself. Windows 3.0 adopted OS/2’s bitmap format for DIBs, but forced DIBs to be flat (non-planar). But OS/2’s format supports a wide variety of memory layouts, including planar bitmaps. It might well be that even though there is no clear distinction between DDBs and DIBs in OS/2, the same bitmap format may sometimes represent a “DDB” (configured to exactly match the video memory’s layout), and sometimes represent an abstract “DIB” (flat, non-planar bitmap with a certain color depth and compression).

  6. Fernando says:

    There were video cards based on the TI TMS34010 (1986), TI TMS34020, Intel i860 (1989), Rendition Vérité V1000 (1996), and probably other microprocessors. I wonder if part of the complexity of the video drivers was because they were expected to offload part of the functions to these chips. But in the end accelerators like the XGA and S3 won.

  7. Michal Necasek says:

    So… yes, OS/2 theoretically supports planar bitmaps, although I guess these are not quite DDBs because they do have a system-defined format. But I can’t find clear evidence that any of the OS/2 drivers actually support any format with a number of planes other than 1 (that is, anything other than packed-pixel format).

    I have not examined all the CGA/VGA drivers in detail but I’m not convinced they use any kind of DDBs. Many drivers can deal with off-screen bitmaps but I’m not sure this is exposed to users, since off-screen memory is not guaranteed to exist. So off-screen bitmaps are usually used only as temporary work areas or for caching.

    The Display Device Driver Reference contains the following statement: “The BITBLT routine for the memory bit map in the sample code does not have any dependencies on the hardware. If the appropriate base code is selected, then it does not have to be modified for differences in the hardware. The BITBLT routine for the physical display screen must be modified depending on the display adapter card.” Which sounds like memory bitmaps were device independent in practice, if not in theory.

  8. Michal Necasek says:

    That is quite possible. It should be kept in mind that the Windows GDI was designed circa 1985, before any such accelerators actually existed. So it wasn’t crazy to design it in a very flexible way because at the time they simply could not know which way hardware would go.

    It probably wasn’t until 1991 or even later when it became clear that 8514/A (including S3 and ATi Mach) and XGA style accelerators were effectively “it”.

  9. John Elliott says:

    I suspect the design of the API, with drivers expected to implement operations like ‘draw line’ and ‘fill area’ for themselves, goes back to the early-1980s graphical terminal model where the driver would just translate the operation into a handful of escape codes and send them to something like a Tektronix 4010 or an HP 7470 plotter. Even on early raster displays, you’d probably get a substantial performance boost from having the code written for the display’s specific memory model rather than something more generic.

    Having a requirement for Windows drivers to be able to render onto a mono bitmap as well as the native format is probably another one of those decisions that made more sense when lots of displays were mono and even colour displays were planar. I can’t help thinking it’d be a right pain as soon as the packed pixel memory layout took over.

  10. Michal Necasek says:

    I actually found a source file which describes some of the Windows GDI evolution. The design started in 1983, well before there was anything resembling accelerators. So I’m sure it was influenced by the design of hardware which was around at the time.

    The BITBLT functionality evolved over three versions to support “ternary ROPs” involving logical operations between source, destination, and pattern (aka brush) pixels. However, that was before color support was added at all! Only then (in the fourth version) they added color support, and that was when everything got a whole lot more complicated.

    OS/2 initially defined support for 1/4/8/24bpp bitmaps in packed-pixel format. As far as I can tell, a driver could keep memory bitmaps in any format it wanted, but the way an application got the pixels into or out of a bitmap was GpiSetBitmapBits/GpiGetBitmapBits, and those worked with the standard device-independent formats.

    I have honestly a hard time figuring out from the gobs of assembler in what format the VGA driver actually kept 4bpp memory bitmaps. It was free to use whatever it liked, that much is clear. I think it used a planar format but how useful that was in practice I don’t know.

    The part where the original GDI design kind of fell apart IMO was that the driver needed to work with the screen in whatever hardware format there was, but it had to handle memory bitmaps in monochrome plus all supported screen color formats. So for example the 8514/A driver can work in either 4bpp or 8bpp, therefore the driver needs code to handle memory bitmaps in 1/4/8bpp. And from what I can tell, the 8514/A driver keeps memory bitmaps in packed-pixel format, so it just ends up implementing completely generic drawing functionality.

  11. @Michal Necasek You’re right – yesterday I took a look at OS/2’s BITMAPINFO2 structure and it can support planes, but not any non-standard (CGA/Hercules) pixel access patterns. Theoretically, one could set up a “DDB-like” bitmap on EGA/VGA (4 monochrome planes, valid pitch), but not on CGA or Hercules. I guess with OS/2, Microsoft and IBM were less concerned with performance than Microsoft were during Windows 1.x/2.x development, and went with more abstract memory bitmaps, which can then be blitted into video memory with the correct transformation.
    Overall, having memory bitmap drawing handled in the video driver may make perfect sense in Windows (where DDBs are concerned), but makes much less sense in OS/2 (unless they imagined video hardware accelerating memory bitmap creation in some way, say with XGA and off-screen bitmaps). This would explain why SOFTDRAW was such a natural step forward – moving common overhead to the operating system itself.

  12. Michal Necasek says:

    Yes, and Microsoft had equivalents in NT and in Windows 95, but not in Windows 3.1. Arguably, not putting SOFTDRAW into the 32-bit GRE was a missed opportunity. IBM had almost all of the code in the XGA driver already (because they had to!) but every driver vendor still had to supply their own.

    And then for example ATi had good hardware but their drivers were not entirely stable (just like their Windows drivers back then). Matrox was expensive but their drivers were very solid.

    ETA: A software 2D engine is far harder to do with no LFB. That was somewhat rare in ISA days, and unavailable to 16-bit software. NT was designed to work with non-PC platforms where dumb framebuffers were much more common, and it was 32-bit from the beginning, so it was in a much better starting position. VflatD was a clever solution to create a virtual LFB for banked adapters.

  13. Fernando says:

    I was rereading this post/article and have another question. Is the driver source code (especially the 32-bit part) specific to the Microsoft assemblers/compilers, or could it be compiled with Watcom/IBM/Borland/Microsoft assemblers/compilers?
    I would expect that the original code was for Microsoft tools, and that the original GRADD code was for MetaWare High C.

  14. Michal Necasek says:

    The assembler part uses MASM 5.1, and it uses Microsoft’s cmacros.inc. I doubt it builds even with MASM 6.0, though I haven’t tried.

    The Intel OS/2 DDK never used the MetaWare compiler for anything. MetaWare was used in the initial phase (well, there wasn’t much beyond that) of OS/2 for PowerPC development, before IBM had their own VisualAge C++ ported to PowerPC. I think there are some hints that the GRADD code was initially built with MetaWare on PowerPC.

    The C part of the PM driver uses the old Microsoft cl386 compiler, although I had no real trouble building it with Open Watcom instead. The 16-bit C code in the DDK pretty much all uses Microsoft C 6.0, while 32-bit code uses either cl386 or IBM’s CSet/2 or VAC++. I am unclear on why they kept cl386 — but it was shipped with the DDK so they probably didn’t feel switching to a different compiler would solve any actual problem (in fact developers would then have to buy IBM’s compiler). Actually I think some of the newer DDK drivers use Watcom C/C++ as well.

    I don’t think IBM ever used Borland compilers for much at all. For the most part they stuck to IBM and Microsoft tools.

  15. Fernando says:

    Thanks for the response, so that adds to the zoo.
    I was investigating a little more about this and found that Steven Mastrianni developed some device drivers for OS/2, wrote a few articles like:
    Byte 1990 11 Tales From the Trenches
    Byte 1993 07 Confessions of a DDK Developer
    Byte 1993 11 OS/2 Gets Device Driver Support
    Wrote at least 2 books:
    (bitsavers-IBM-PC-PS2-OS2 2.x)Mastriani Writing OS2 2.1 Drivers in C
    (archive.org) Writing OS/2 Warp Device Drivers in C
    And made a software package, “OS/2 2.x Device Driver Toolkit and Library”, that appears to be lost to history.

  16. Michal Necasek says:

    Yes, I think IBM even distributed an INF version of Mastrianni’s book on the DevCon CD-ROMs. The books came with a disk of sample code, but the library it used was sold separately (though the library was just a C-callable wrapper around DevHlp functions, I believe).

    One thing Mastrianni didn’t really cover at all was presentation drivers. Which to be fair are a distinct thing.

  17. Nathan Anderson says:

    “IBM shipped Service Pak XR06055 and started preloading OS/2 2.00.1 on some systems. […] The updated GRE also implemented a few additions that had nothing to do with the 32-bit rewrite. […] Additionally, OS/2 drivers now could support seamless Win-OS/2 operation.”

    Maybe I am misunderstanding/misreading this (& if so my apologies), but I’ve seen others (most notably in my recent memory, a YouTube video or two) also make a similar claim that seamless WIN-OS/2 didn’t exist until either 2.00.1 or 2.1. However, although the LA release absolutely didn’t have it, the unpatched retail/GA of 2.0 from March 1992 *does* support it…in (I believe) exactly one driver: the VGA driver.

    I suspect this myth keeps circulating because the feature is pretty well hidden: in addition to only being available to the users of the VGA PM display driver, the OS/2 installer does not create a WIN-OS/2 Window program object under Command Prompts (although this might also not have existed under 2.1 / not been added until Warp 3; I’d have to go back and check), only a WIN-OS/2 Full Screen one, nor is there a WIN-OS/2 Setup icon under System Setup like there is in later versions where you can choose a global default WIN-OS/2 session type. But if you create a new program object and point it at a Windows executable, “WIN-OS/2 window” is absolutely available under the Session tab, and you can also go into the Settings for a Windows .exe and set it to “WIN-OS/2 window” as well. (I don’t know what happens if you have a non-VGA driver installed…maybe these options remain grayed out?)

    The lack of the WIN-OS/2 Setup icon + only a single shipping driver supporting it points to this being a rush job to get it shipped in the final release. It’s also not clear if the code that shipped with 2.0 GA to implement this feature is the same code in essence that shipped later, or if it is completely different…I could believe that it was SO rushed and the initial implementation SO kludgey under the hood (which also might explain why they only got around to modifying one display driver to support it before shipping) that this is in effect a completely different implementation of the same concept, and they scrapped it and started over for the new GRE in 2.00.1 / ServicePak 1.

  18. Michal Necasek says:

    Actually you’re correct… and I guess the confusion comes down to the meaning of “supported”. I looked at the OS/2 2.0 announcement and it is pretty clear, if you read it closely. It says:

    “OS/2 2.0 supports a wide variety of DOS, Windows and OS/2 applications running side-by-side in windowed sessions when the primary display adapter (including adapters that support SVGA modes) is configured for VGA mode. […] Broader support for high-resolution modes (higher than 640x480x16) will be provided over time. In addition, IBM is working with the manufacturers of popular boards to assist them in making their Windows and Presentation Manager drivers available.”

    So yes it’s there… if you don’t mind crippling your fancy XGA or SVGA and forcing it down to the most basic VGA mode. Why they made it so well hidden even on the VGA is another question, and I don’t know the answer. Possibly because they didn’t want to install a windowed Win-OS/2 icon only to tell most users “nope, not for you”.

    This was the result of the approach IBM (I think it was done after the MS/IBM split) took for seamless support. It required additional functionality in the PM display driver and it required a customized seamless Windows 3.x driver. Whereas full-screen Win-OS/2 generally just needed the regular OEM Windows 3.x driver and the OS/2 display hardware virtualization took care of the rest. The PM display driver had to very carefully coordinate hardware access with the seamless Win-OS/2 driver which is why both needed to be modified. The modifications were not extensive in terms of code changes, but they affected every entry to the driver.

    This is in stark contrast with the Win-OS/2 support IBM developed for GRADD drivers. In that architecture, IBM supplied all the Windows drivers (seamless and fullscreen) and everything just worked.

    On closer look I don’t think the 32-bit GRE had much to do with seamless support, it’s just that the VGA driver was the only 16-bit driver (from IBM anyway) with seamless support; not even the 16-bit 8514/A driver had it.

  19. Josh Rodd says:

    Yep, OS/2 2.0 had seamless support for VGA; for various reasons, we ran OS/2 2.0 (GA) in 1992, and the Windows 3.0 seamless sessions worked fine. In 1992, lots of computers had just a VGA. An 8514/A or an XGA would have been a major upgrade (or an SVGA, and those didn’t really have usable OS/2 drivers outside of a very short list of cards in 1992).

    One thing I’ve always wished to know more about is just what the BVH series of drivers/DLLs do. Specifically, some unique ones existed for the Image Adapter/A, and there was a separate BVH for VGA, SVGA, and XGA. The entire OS/2 display driver stack is one of the most byzantine things I’ve run across, and I say that as someone who’s spent lots of time working with Kubernetes. For a completely greenfield, non-VGA compatible driver, in 1992 you would have been stuck needing to write:

    – Enough of a BIOS to make the thing boot properly
    – BVH driver
    – 16-bit PM driver
    – 16-bit Windows driver
    – 16-bit Win-OS/2 seamless driver (optional)
    – 32-bit virtual device driver if you want DOS sessions to be able to use the hardware.

    If you wanted to support Windows 3.x, you’d also need to write a .386 virtual device driver for Windows, too.

    Of course, the monolithic nature of Windows/386 was even stranger: they shipped different kernels for CGA, EGA, (CT)VGA, Hercules, and 8514/A, essentially doing the same thing as VDD.SYS (I think, or was it VVGA.SYS?) on OS/2 2.0.

    I agree with the poster above who said the architecture back then of both Windows GDI and OS/2’s GPI was one that planned ahead for a future where you’d just send graphics primitives to some offboard accelerator card. It is interesting that everyone designed this, even as the first driver they went to write was a CGA driver and an EGA/VGA driver, which is about as not-offboard-accelerated as it gets.

  20. Michal Necasek says:

    BVH = Base Video Handler. It’s what gets used when you don’t have Presentation Manager. And many PM drivers use BVH (specifically VioSetMode) to establish the initial display mode. Roughly speaking, BVH is a protected-mode equivalent of the video BIOS.

    Delivering an OS/2 display driver was less work if VGA for DOS sessions was sufficient. And a 16-bit Windows driver was something OEMs generally already had, but the seamless Windows driver was still separate and a bit different. For hardware that wasn’t VGA compatible… yeah, that needed a lot of work. Both the OS/2 PM and Windows 3.x display drivers were big and complex pieces of code, too.

    I’m sure the high level of complexity was why getting display drivers for OS/2 was hit and miss. Some hardware had them, some not at all, some had drivers but not of very high quality. IBM did try to make it easier by enabling the use of the real-mode video BIOS for setting modes, even on OS/2 setups with DOS box support disabled. And the later GRADD model was vastly easier (a much simpler PM driver, no messing with Win-OS/2 at all).

    OS/2 and Windows 3.x/9x had the most complex display drivers. Pretty much everything else, including NT, was massively simpler.

  21. Josh Rodd says:

    One obvious consequence of the towering complexity of 3.x/9x drivers was that OEMs made sure to build VGA compatible cards. To find a non-VGA compatible card, you have to look at pretty old stuff. IBM’s own cards like the 8514, Image Adapter/A, and the PS/55 stuff tended to function as a secondary display adapter.

    What release did being able to call the real mode BIOS (including VESA calls) come in? I don’t remember it being there in 2.0, but it was definitely there in the 3.0 era, maybe via a FixPak or a specific update to the SVGA driver? Did it actually run in a virtual 8086 machine or just via emulation?

    On Linux, calling out to the display BIOS was the main use case for virtual-8086 mode, until it was eventually replaced by an emulator.

  22. Michal Necasek says:

    I think it was both the OS and applications — an awful lot of stuff assumed VGA hardware, both apps and games. But yes, there were lots of incentives for building VGA compatible hardware.

    As far as I know 8514/A was not intended to be a standalone adapter, always a VGA add-on… although technically I don’t think it depended on a VGA and could be used alone. I must admit that I know very little about the Image Adapter/A, I have never seen one. But certainly non-VGA-compatible graphics adapters did not become a thing again until not too long ago.

    Warp 3 definitely had support for running BIOS services in a VDM. Shipped with VPRPMI.SYS which was part of it. I don’t believe this was part of OS/2 2.x, but the video subsystem could be upgraded. There was a special hidden VDM which ran VIDEOPMI.BAT instead of AUTOEXEC.BAT and therefore could load an OEM’s VESA driver.

    OS/2 2.1 (but not 2.00.1) ships with IBM “Speedway” drivers that install \OS2\MDOS\VESA.EXE. Its purpose is unclear to me, but it may have been needed to run SVGA.EXE properly (and generate SVGADATA.PMI).

    My current understanding is that for OS/2 2.x, IBM’s preferred solution was for OEMs to use PMI files. This was a relatively high-level human-readable description of which registers needed to be set to establish a given video mode. The SVGA.EXE utility helped with creating PMI files by capturing the register state set by the video BIOS. Of course adapters didn’t have to use PMI, and for example XGA did not.
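    Conceptually, a PMI mode-set entry boils down to a captured sequence of register writes that gets replayed. A toy sketch of the replay side, with port I/O simulated by an array so it is self-contained; real PMI files are human-readable text, and all names here are made up:

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* Toy model of replaying a captured mode-set script, PMI-style:
 * a list of (port, value) register writes applied in order.
 * Port I/O is simulated with an array; on real hardware each entry
 * would be an OUT instruction executed with I/O privilege. */
struct reg_write { uint16_t port; uint8_t value; };

static uint8_t io_space[0x10000];       /* simulated I/O port space */

static void replay_mode_script(const struct reg_write *script, size_t n)
{
    for (size_t i = 0; i < n; i++)
        io_space[script[i].port] = script[i].value;
}
```

The static nature of such a script is exactly the weakness described below: it can only reproduce the register state that was captured, for that one adapter and monitor.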

    By the time of OS/2 Warp, the PMI interface proved inadequate because it was too static. The PMI file potentially needed to be regenerated for each adapter + monitor combination, and it was entirely unable to deal with things like laptop screen switching. So IBM added a way to call the BIOS directly, which was more flexible.

    NT had an x86 emulator to run BIOSes early on, because they had all the non-x86 machines to worry about. The same emulator was also built into the ARC bootloader. I’ve seen it used for VGA on PowerPC and also for SCSI on Alpha.

  23. @Michal Necasek Thank you for your explanation of the BVH. Previously, I imagined PM to always implement 100% of the graphics chip setup and programming, including mode setting. And it seemed a bit stupid to me, as the OS would have much more work keeping track of the current video mode and state, more like orchestrating DOS applications in a multi-tasking environment. If at least some PM drivers used the OS-provided (and BVH-augmented) mode setting capabilities, this makes much more sense.

    Windows NT’s model, where the OS only has the very basic text-mode capabilities for showing the boot screen and kernel-mode applications output, and the actual GPU programming is done exclusively in a dedicated device driver, makes much more sense. It was made possible, though, only by the fact that there can be no other video subsystem user in Windows NT than the GDI.

    Actually, I think X Window used to need its drivers to implement 100% of the code, with no OS support. Only with Kernel Mode Setting on Linux did they at least move the video mode setting part to the OS, where it belongs.

  24. Michal Necasek says:

    The PM driver can implement its own mode setting entirely, because it gets told when the user switches into and out of Presentation Manager. But most drivers don’t.

    The BVH component is architecturally a lot like the NT video miniport. The SVGA handler (BVHSVGA) is also chained to the VGA handler (BVHVGA) and provides only SVGA capabilities beyond standard VGA. So a SVGA handler for example does not need to do anything with text modes, unless it wants to provide 132-column text or something.

    How things are done inside the BVH is also open. For example XGA does the actual mode setting inside XGA.SYS which is a Ring 0 driver. VGA and SVGA BVHs program the hardware directly.

    NT had a much more “complete” design and the OS is in control of mode selection. Windows 3.x and earlier has zero control and everything is up to the driver. OS/2 2.x is a hybrid where the PM driver and OS can cooperate to control mode selection but the PM driver doesn’t have to do that (and e.g. the IBM SVGA drivers don’t).

    And yes, X11 (certainly XFree86/X.org) was in the worst situation because it could not expect any help from the OS at all. The drivers had to be super careful to restore the hardware to the state they found it in because no one else would.

  25. r34jinkai says:

    @Radosław Sokół “Windows NT’s model, where the OS only has the very basic text-mode capabilities for showing the boot screen and kernel-mode applications output”

    Actually no. You can have full framebuffer support under the NT kernel alone. In fact, on non-x86 platforms you have to supply a framebuffer driver along with the HAL on an OEM F6 floppy to allow the platform to display anything in “BSoD” mode. VGA.sys is a framebuffer driver itself. There were also NT kernel framebuffer drivers for S3, Cirrus, Matrox, the SGI stuff in the Intel ARC workstation machines, the Microsoft ARC Jazz chip, and the IBM WD and G300/G800 chips used in PowerPC machines.

    You can check for example the experiments running NT on the Macintosh NewWorld PPCs or the Nintendo GameCube/Wii; these come with framebuffer drivers to allow NT4 to operate the “BSoD mode” screen. The reason this is hidden is that MS themselves have not supplied code and headers to write such drivers since the Windows 2000 days, and there aren’t many uses for such a feature besides early boot/bugcheck display. Also, most modern cards, even on non-x86, implement modes which can work with the default framebuffer driver supplied by MS, and Windows 8 departed from this mode of operation.

  26. Josh Rodd says:

    It’s worth mentioning that X11R6 for OS/2 actually existed, in the form of XFree86… which also implemented its own graphics drivers, complete with hooking the session switch to and session switch from handlers to set and restore the graphics mode. For the most part, these were straight ports from the XFree86 drivers for Linux/Unix.

    OS/2 did not accommodate IOPL from a 32-bit code segment, so XFree86/2 implemented a driver, XF86SUP.SYS, that would diddle with the LDT/GDT for a 32-bit code segment so that it could perform direct port I/O. I’m unclear on how OS/2 did the same thing for native 32-bit PM drivers, which presumably also would do port I/O. (The VGA driver in particular has to do port I/O as part of almost every drawing operation.)

    You could have an OS/2 machine with this cornucopia of drivers installed (at one point, I did):

    – 16-bit BVHVGA
    – 16-bit Windows 3.1 VGA full screen driver
    – 16-bit Windows 3.1 VGA seamless/windowed driver
    – 32-bit PM VGA driver
    – 32-bit Virtual VGA driver for DOS sessions
    – 32-bit XF86_VGA16.EXE (16 = 16 colour) XFree86 server

    It was entirely legal and supported, ever since OS/2 1.0, for a full-screen OS/2 application to take control of the video hardware, and some of the Vio calls even catered to this, with the caveat that it was only officially supported for 16-bit applications. OS/2 1.1’s PM was essentially implemented as such an application. As far as I know, XFree86 is the only other significant OS/2 application that basically does this: creates a graphical environment in a different session than the main PM session.

    Whilst we’re on the topic, there were some display adapters in the OS/2 1.x and early 2.x era that were built with the architecture most people thought was the future… the 8514/A is the clearest example of this, being built with a coprocessor mindset and not even offering direct access to its framebuffer, with the assumption that the main CPU would simply push drawing primitives to it. The Image Adapter/A carried this forward except with a lot more processing power and a much bigger framebuffer (and was one of the first display adapters to have a true direct colour 16-bit mode). Of course, hardly anyone used it in this mode and I suspect nearly all of them just got used as a Windows accelerator board. Neither OS/2 nor Windows had usable support for multiple displays.

    The ActionMedia and ActionMedia II cards took this even further, with an independent toolkit (the Audio Visual Kit, or AVK) for driving them, oriented around playback of digital video. It was capable of being a generic display adapter, but never had PM or Windows drivers, instead being a vessel for “overlaying” video output on top of the main graphics card (either VGA or XGA, etc). It becomes rather obvious when looking at the design of Windows GDI and OS/2 GPI that the “future” was going to be these kind of coprocessor graphics cards, and hence why the drivers were expected to contain the logic to do all kinds of graphics primitives – even ones done offline into a buffer. After all, the graphics card probably could have done even invisible graphics operations faster than the software graphics engine, right?

    This didn’t become a reality until much later in the GPU era.

    I had a project for a while to try to make a capable X11 client that worked as a PM display driver, translating all kinds of GPI primitives into X primitives that could then be sent to the X server to be rendered. It ended up being buggy and unstable since timing became a serious issue (users of the Gpi calls expect instant rendering without any networking delays).

  27. Michal Necasek says:

    PM display drivers run hardware access code in Ring 2 (IOPL enabled), which is the standard OS/2 mechanism. I suppose that was too hard to do with the existing XF86 drivers. A PM driver can tell the GRE which functions need IOPL, and the GRE ensures that they run in Ring 2.

    Also worth mentioning that IBM had PMX, a PM-based X11 server. Except I think it was X11R5 at best.

    Re hardware, the early 1990s accelerators were designed to work well with GDI/PM, showing how software influences hardware design.

  28. Mark Micire says:

    In the late ’90s, one of the main reasons to use Matrox video cards was the ability to disable their onboard BIOS and remap their base addresses to alternate locations on x86 hardware. At that time, true graphical multi-head configurations were usually limited to advanced systems like those from Sun, SGI, and other specialized hardware manufacturers. Also, little was “plug and play” in those days; we were still setting I/O and IRQ addresses manually, even on PCI systems. Those of us fortunate enough to acquire even the low-end Matrox cards, and who knew how to perform the trick, would quickly toggle the card’s switches and load proprietary X servers like Metro-X, which supported these BIOS-less offset configurations. https://www.linuxjournal.com/article/2299

    As I recall, the second and third video cards would use the BIOS from the first in a three-card configuration. The fact that your driver is looking for a specific base address (first card) and that the Illuminator card has alternate addressing schemes in the spec suggests that this was at least possible on MicroChannel, like on the later PCI cards we used.

    Even though the Metro-X server driver supported the displays under X, the window manager did not recognize them as connected frame buffers. In Metro-X, these displays were identified as three separate screens, labeled DISPLAY=0.0, 0.1, and 0.2. This required considerable trickery and customization to enable windows to move between monitors. I suspect that OS/2 might have needed similar adjustments, even if the base driver was nearly ready for deployment; the build would be delayed until the GUI could accommodate it.

    I have a fantastic photo of my 486 from 2000, featuring three 14″ fixed-frequency monitors displaying 1024×768, running a custom version of WindowMaker – a poor man’s alternative to the NeXT workstation. 😉

  29. Michal Necasek says:

    Yes, most older PCI graphics cards could not coexist. The VGA resources were fixed, and most SVGA designs used extended VGA registers so it was very hard for them to be used without VGA registers.

    One of the oldest cards that was designed to coexist in multiple instances was IBM’s XGA. It did not require VGA and all the native XGA registers were relocatable I/O or MMIO.

    Software of course was another issue. Before Windows 98/2000 I don’t think there was anything off-the-shelf really.
