UNIX Alphabet Soup

When reading historical UNIX documentation or source code, one is likely to come across various terms and acronyms that are now more or less completely forgotten, even among current developers of UNIX or UNIX-like systems. These acronyms might be the names of tools, directories, organizations, products, or documents.

Following is a random assortment of terms, with brief explanations where appropriate. Note that most acronyms still in current use are not listed.

Posted in UNIX | 3 Comments

Preserving Floppies

For many years, software was delivered predominantly on floppies. This was especially true in the world of PCs, where by definition (almost) every system contained at least one floppy drive and where, prior to the mass arrival of CD-ROMs in the mid-1990s, there was no other standardized distribution medium (compare e.g. with UNIX workstations, where software was typically delivered on tapes).

Since floppies, like any storage medium, do not have an unlimited lifespan, there is a need to preserve the contents of floppy disks. As time marches on, this is becoming a more pressing issue, especially with 5¼” disks; ironically, finding functioning drives is becoming harder than locating error-free floppies. But even with error-free disks and working drives, the next question is how exactly to preserve the data. There are numerous options, each with advantages and drawbacks.

Posted in PC history, Virtualization | 25 Comments

ATI mach8/mach32/early mach64 Documentation?

It’s a long shot, but I’m looking for programming documentation for ATI’s mach8/mach32 and early mach64 chips (prior to 1996 or so). The earlier documents may have only existed in paper form. These used to be available from ATI but that was a long time ago. I’d be obviously happy to pay for shipping and other costs if physical items were involved.

I’m especially interested in mach8 and mach32 register references and programming guides, plus any sample code ATI may have had (I have the mach64 SDK and sample code disks, but nothing for mach8/mach32). I do have several documents describing the latter-day mach64 derivatives (Rage XL etc.) but am interested in the earlier chips, mach8/mach32 and the earlier mach64 GX/CX chips.

Posted in ATi, Documentation | 37 Comments

BSD Buglets

Last week I ran into two wholly unrelated problems while researching the history of BSD-derived Unix systems on PCs. Both are classics in their category and merit a closer look.

Y2K Strikes Again

The first issue is a very typical Y2K bug found in 386BSD 0.0 and 0.1. When the system comes up (if it does—it’s not easy to bring up 386BSD 0.x on anything remotely modern!), it shows the system date as January 1, 1970, i.e. the beginning of the UNIX epoch. This is not merely a cosmetic issue.

For example, when rebuilding the 386BSD kernel, or indeed any software which uses the make utility, the source files will be timestamped 1992 or later, but the object files will be timestamped 1970. As a consequence, the object files will always be out of date and make will be forced to rebuild them. It gets much worse if the system is networked. It is possible to correct the date manually, but it will be reset to 1970 every time the system boots, which is rather unsatisfactory.

Posted in BSD, UNIX, x86 | 11 Comments

PC DOS Retro

There’s a new DOS history and reference information website called simply “PC DOS Retro Page”. The site includes several reference pages (DOS commands, drivers, functions, internal structures) as well as a very extensive timeline of DOS releases.

Vernon Brooks, the site’s creator, worked as PC DOS lead developer at IBM. That ought to make the site rather more authoritative than most!

Posted in DOS | 21 Comments

If you ENTER, you might not LEAVE

I’ve recently spent some time debugging curious hangs/aborts in two more or less exotic operating systems, Plan 9 and QNX 4.25. Both turned out to be caused by the same innocuous-looking BIOS change, even though the circumstances were somewhat different and the symptoms initially didn’t look similar at all.

With Plan 9, the system simply hung during boot. With QNX 4.25, the graphical installer aborted (without any obvious hint as to what might be going wrong) soon after beginning to detect devices. QNX itself continued to work.

Investigating the Plan 9 hangs was somewhat easier because the system died soon after the error occurred. The proximate cause turned out to be a corrupted stack pointer, never a good thing.

Posted in Intel, x86 | 9 Comments

Why I Don’t Want a Laptop with a Glued-In Battery

Here’s why:

A Sick Battery

Not much to add really… in this case, the laptop wasn’t damaged because the battery simply forced its way out of the shell. If it had been glued in, it would have destroyed the case and quite possibly also the system board.

Posted in Apple | 6 Comments

DOS Goodies at bitsavers.org

The excellent bitsavers.org last week uploaded scans of several IBM Personal Computer DOS manuals. Included are the manuals for DOS 1.0 (1982), 1.1, and 2.0, a preliminary technical reference for DOS 3.1, the DOS 3.1 user’s reference manual, and DOS 3.3 technical reference.

Note that the old DOS manual also contained programming reference information; over time, much of the content moved over to the Technical Reference, as did some of the tools (such as the linker).

Posted in DOS, IBM | 1 Comment

ISA bus 8514/A?

During the development of the 8514/A, IBM clearly had ISA-based adapters. A proof of this may be found in the source code for the Windows 2.x setup program (part of the Binary Adaptation Kit, or BAK), which among other things detected the graphics hardware so that the appropriate graphics driver could be selected. The comment in the source code is quite clear:

Posted in Graphics, IBM, PC hardware, Windows | 7 Comments

The XGA Graphics Chip

After covering the 8514/A and its clones, it’s only appropriate to write a few words about the XGA (eXtended Graphics Array), IBM’s final attempt at establishing a PC graphics hardware standard.

The XGA was introduced on October 30, 1990, around the same time several other companies were just starting to sell their own 8514/A clones. The XGA was a combination and superset of the VGA and 8514/A: a VGA-compatible, high-resolution, accelerated graphics chip. Initially, an XGA chip was built into the new PS/2 Model 90 and 90 XP, and it was also available as a stand-alone upgrade for existing PS/2 systems in the form of the “IBM PS/2 XGA Display Adapter/A” (a typical IBM product name). The initial price was $1,095 for an XGA with 512KB VRAM, plus an additional $350 for a memory upgrade to 1MB VRAM.

IBM XGA

OS/2 1.3, which was announced on the same day as XGA, shipped with built-in XGA drivers. IBM also supplied drivers for Windows 2.1 and 3.0, OS/2 1.2, and several popular software packages such as AutoCAD. The XGA also shipped with an implementation of the AI (Adapter Interface). Existing applications written to the AI and supported on the 8514/A continued to work on the XGA.

In mid-1992, IBM released an updated version called XGA-2 or XGA-NI (Non-Interlaced), with significantly more flexible display support and several other enhancements.

XGA vs. 8514/A

Perhaps the biggest architectural change compared to the 8514/A was that the XGA integrated a VGA subsystem. In a way this was an admission of defeat: IBM’s earlier strategy of providing an on-board VGA chip with an additional high-resolution accelerator such as the 8514/A clearly hadn’t worked out.

In terms of capabilities, an XGA was very much like a VGA + 8514/A. In addition to standard VGA functionality, there was a new 132-column text mode. With 1MB VRAM, there was also a new high-color mode with 640×480 resolution and 65,536 colors (16 bits per pixel).

The mode support was otherwise the same as the 8514/A’s: 640×480, and 1024×768 in interlaced form only. IBM did not support the increasingly popular 800×600 resolution. The likely reason was that at the time, IBM didn’t sell multi-frequency monitors; as a consequence, the original XGA only supported four pixel clock frequencies (four separate oscillator chips are clearly visible near the center of the board).

The XGA draw engine was very similar to the 8514/A in terms of capabilities (hence the AI level compatibility), but was not compatible on the register level. A significant change was that IBM fully documented the XGA register interface, something that was never officially available for the 8514/A.

XGA Hardware

The VGA compatibility circuitry on the XGA was nothing new, but the accelerator engine and the overall architecture were significantly different from the 8514/A.

A major new feature (if not a mainstream one) was the fact that up to 8 XGA boards could coexist in a system, with a single one providing VGA compatibility. On system start-up, each XGA would be assigned an instance number which determined the I/O addresses used by the board.

Unlike the 8514/A, the entire XGA framebuffer could be directly accessed by the host CPU. There were three options: a 4MB aperture could be mapped anywhere in the 4GB address space, ideal for 32-bit operating systems and any number of XGAs; a 1MB aperture could be mapped below 16MB for 16-bit protected-mode operating systems; and a 64KB window was optionally available for a single XGA at the VGA A000h or B000h segment (real-mode accessible).

It should be noted that some XGA drivers did not require any memory aperture to be enabled, resulting in a framebuffer not addressable by the host CPU at all, similar to the 8514/A. Due to the bus-mastering capabilities of the XGA, the inability to access the framebuffer directly did not necessarily cause any performance loss, whereas on the 8514/A transfers to/from video memory required programmed I/O and hence CPU attention.

The look-up table (LUT) on the XGA was still essentially the same as VGA, with the addition of a direct-mapped 65,536 color mode.

A new feature was a 64×64 hardware sprite, almost exclusively used as a mouse cursor. On previous devices, including the EGA, VGA, and 8514/A, a mouse cursor had to be managed entirely in software, a non-trivial implementation which incurred a sizable performance hit when the cursor was moving quickly. With the 8514/A, the implementation was less costly as offscreen memory could be used to save the underlying image as well as provide a cursor image to be drawn with the BitBLT engine, but it still hardly counted as simple.

With the XGA, the mouse cursor could be implemented as a completely independent sprite displayed on top of the framebuffer contents. Moving the cursor was simply a matter of updating the X/Y position registers, but more importantly, the cursor no longer needed to be hidden and re-drawn every time the current drawing operation and the cursor intersected.

To support saving and restoring of the XGA state, all registers were readable (while many 8514/A registers were write-only). The XGA could even save and later continue the operation currently in progress, a potentially useful feature as blitting a large rectangle could take a relatively long time.

The Draw Engine

The XGA draw engine was very similar to the 8514/A but enhanced in several respects. A new (for IBM) concept was that of maps. An XGA map provided a translation between linear memory and a 2D bitmap. The XGA supported four maps, each with a given starting address, pixel depth, width, and height. An unusual feature of the XGA was that maps could reside both in video memory and in system memory. The XGA could not only copy between system and video memory but also draw into system RAM using bus mastering. In a multi-XGA system, one XGA could potentially even access another XGA’s memory without host CPU intervention.

A similar implementation of maps was found in the earlier Intel 82786 graphics coprocessor (1986), one of Intel’s several unsuccessful attempts to enter the graphics hardware market.

With the 8514/A, there was essentially a single fixed map covering the entire framebuffer. The XGA enabled much more flexible use of offscreen memory because linear (one-dimensional) memory management was possible. The 8514/A forced rectangle-based 2D memory management which is much more complex to implement and less efficient.

The pixel pipeline and drawing operations in the XGA were similar to the 8514/A, with mixes, color comparisons, plane mask, etc., although the hardware interface was different and used 32-bit memory-mapped registers—very typical for 1990s 2D accelerators.

The XGA supported Bresenham lines, short stroke vectors, rectangle fills, and BitBLTs just like the 8514/A. One new feature was the mask map which allowed an arbitrary mask bit-map to be applied to operations, allowing the user to draw arbitrarily-shaped objects. The mask map could be empty/undefined and only its dimensions used, in which case it worked much like the 8514/A scissors (clip rectangle).

Another new feature was the support for patterns (“brushes” in Windows GDI parlance, “tiles” and “stipples” in X11 terminology), both monochromatic and color. The patterns were rather naturally specified through the use of XGA maps. On the 8514/A, the same effect could be achieved but required more effort.

XGA Virtual Memory

The most exotic feature of the XGA was probably its support for virtual memory. The XGA essentially replicated the 386 memory-management unit (MMU). Just like the 386, it used physical addresses by default, but virtual memory (i.e. paging) could be enabled. The XGA had its own PDBR (Page Directory Base Register) corresponding to the CR3 register in the x86 architecture.

With virtual memory enabled, the XGA used the host’s page tables and was able to process page faults. Protection violations and non-present pages encountered while the XGA accessed system memory were reported via an interrupt. The equivalent of the CR2 register also existed on the XGA in order to report the fault address.

The same basic idea was used by Intel several years later in the form of the AGP GART, except the XGA had gone much further. Both solved the same underlying problem, namely taking discontiguous physical memory pages and transforming them into contiguous regions that the graphics adapter could access. The XGA also made it possible to support task switching and easy access from user mode applications.

The XGA virtual memory feature was used by IBM’s OS/2 2.x VDM (Virtual DOS Machine) subsystem; it was utilized to provide support for DOS software which used XGA bus mastering. It is unclear whether any other drivers used the virtual memory features of the XGA. Since no hardware besides the XGA supported such technology, it was not useful to general-purpose software which also needed to run on non-IBM hardware.

The XGA-2

On September 21, 1992, IBM announced the XGA-2, an improved XGA with support for non-interlaced 1024×768 resolution and 1MB VRAM standard. The XGA-2 sported a programmable PLL circuit and handled pixel clocks up to 90MHz; this enabled support for up to 75Hz refresh rate at 1024×768 resolution. Finally, the 800×600 resolution was also supported, at up to 16bpp.

The XGA-2 had an improved DAC with 8 bits per channel, rather than 6 bits like the original XGA (as well as the VGA and 8514/A). The draw engine was enhanced to support 16bpp maps, and performance was generally increased by using faster VRAM and several minor optimizations. At introduction, the “IBM PS/2 XGA-2 Display Adapter/A” cost a mere $360.

XGA Clones

For the XGA, IBM chose a novel strategy: instead of keeping the hardware specifications secret, the register interface was fully documented; in addition, IBM licensed the XGA chip design to SGS-Thomson (inmos) and Intel. Worth noting is that Radius manufactured ISA-based XGA-2 adapters built around chips from inmos (as usual, IBM didn’t bother with non-MCA adapters).

Radius ISA XGA-2

There were no clones to speak of in the true sense of the word, i.e. non-licensed chip designs based on reverse-engineering the IBM originals. IBM’s strategy was not particularly successful—while the XGA-2 was a decent chip, there were very capable alternatives already available (from S3, Tseng, and others), and implementing the bus-mastering design in an ISA environment was troublesome.

The XGA Legacy

The XGA architecture was a very modern design, with a linear framebuffer aperture, highly flexible bus-mastering draw engine, and hardware cursor. It was released at a time when most PC graphics cards were dumb framebuffer SuperVGAs limited to banked memory access; it took the rest of the PC graphics hardware industry several years to catch up with XGA’s capabilities. In many ways, the XGA was a classic 1990s design even if it never reached its full potential (it could have easily supported up to 4MB VRAM as well as 24/32bpp True Color pixel formats).

However, measured against IBM’s own goals, the XGA was a failure. Not only was the XGA unsuccessful in establishing a hardware standard, the experience persuaded IBM to quit the PC graphics market altogether and rely on graphics chips from companies like Cirrus Logic or S3 for its own systems.

There is little doubt that Microsoft Windows and IBM’s own OS/2 were a major cause of this failure. When a graphics chip vendor only needed to supply one or two drivers for widely used GUI environments (rather than a dozen or more, with custom driver development often needed), there simply wasn’t enough value in register-level compatibility. The hardware was evolving too quickly for that.

As a consequence of XGA’s failure to establish itself, VGA has remained the only hardware standard to date, a sad fact more than 25 years after its introduction.

References

Power Programming the IBM XGA by Jake Richter, MIS Press, 1992, ISBN 1-55828-127-4
IBM PS/2 Hardware Interface Technical Reference—Video Subsystems, S42G-2193

Code Names Addendum

The IBM code name for the XGA was Expressway. OS/2 driver source code often refers to “Expressway” or “Xway” rather than XGA.

Around 1991, IBM also produced onboard graphics and MCA adapters referred to as “VGA 256c” or “SVGA/A”, respectively. This chip was used, among others, on the ThinkPad 700/700C and 720/720C laptops, as well as several PS/2 and PS/1 models. The code name of the chip was Speedway, which again can be found in OS/2 driver source code. The Speedway appears to have been an XGA minus the accelerator, using an I/O register interface extremely similar, if not identical, to the XGA’s.

Posted in Graphics, IBM | 5 Comments