The IBM PC, 41 Years Ago

No, the OS/2 Museum does not have either a time machine or difficulty doing basic math. As of this writing, it is August 2021 and the IBM PC was announced in August 1981, 40 years ago.

IBM Personal Computer (IBM promotional photo)

But in August 1980, one year earlier, IBM started putting together the design of the IBM PC. If one year sounds like an awfully short development cycle for a product like the PC, that’s because it was, especially at a company like IBM, where the typical product development cycle was closer to five years at the time. The tight schedule dictated the PC design: no custom or not-yet-available chips, no major software development, and a preference for proven, familiar technologies.

With these constraints in mind, the design of the PC was sketched out on a stack of papers on August 10, 1980. Some of those drawings—not previously published—are presented here; see the explainer at the end of this article for a glossary.

IBM PC Design, August 1980

A closer look at the drawings reveals that even though there were quite a few changes between August 1980 and August 1981, the core was all there from the beginning: A box with two floppy drives, five I/O expansion slots, and a detachable keyboard; an Intel 8088 CPU with optional 8087 FPU, 8259A interrupt controller, 8237-5 DMA controller; RAM on motherboard, with an option to install additional RAM expansion cards; a display adapter with separate monitor or TV.

Choosing these particular chips was no coincidence, and there was one very clear reason for it: The IBM System/23 Datamaster.

The Datamaster Legacy

The IBM PC hardware was very strongly influenced by the design team’s prior experience with the Datamaster. That may seem odd given that the Datamaster was announced in July 1981, just weeks before the PC. But in reality, the Datamaster development started in 1978 and the hardware was finished in Summer 1980. The product release was significantly delayed by difficulties with implementing IBM’s chosen BASIC variant.

IBM System/23 Datamaster (Model 5322)

The Datamaster used an 8-bit Intel 8085 CPU, 8259 interrupt controller, 8237 DMA controller, and 8253 programmable timer. It also utilized an expansion bus remarkably similar to the PC’s.

CPU Choice

The Datamaster team learned that the 64K address space of an 8-bit CPU wasn’t quite big enough for the tasks IBM had in mind for it. The Datamaster used paging to expand the addressing capabilities, but that complicated things significantly.

With that in mind, IBM wanted a CPU with a much larger address space for the PC. The Intel 8086 or 8088 was a logical choice; the slightly better performance of an 8086 was not considered worth the increased complexity and cost. The 8088 provided a good compromise between supporting 16-bit software with a huge (for the time) 1 MB address space while utilizing cheaper and familiar 8-bit infrastructure.
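
How a 16-bit CPU reaches a 1 MB address space is worth a quick illustration. The 8086/8088 form a 20-bit physical address from a 16-bit segment and a 16-bit offset as segment × 16 + offset; the minimal C sketch below just demonstrates the arithmetic (the example addresses are conventional ones, not taken from the design drawings).

    #include <stdio.h>
    #include <stdint.h>

    /* The 8086/8088 compute a 20-bit physical address from two
       16-bit values: physical = segment * 16 + offset. */
    static uint32_t phys(uint16_t seg, uint16_t off)
    {
        return ((uint32_t)seg << 4) + off;
    }

    int main(void)
    {
        /* F000:FFF0 is the well-known reset address near the top of ROM. */
        printf("F000:FFF0 -> %05X\n", (unsigned)phys(0xF000, 0xFFF0));

        /* The highest addressable byte, FFFF:000F, is the last byte
           of the 1 MB (100000h) address space. */
        printf("FFFF:000F -> %05X (top of 1 MB)\n",
               (unsigned)phys(0xFFFF, 0x000F));
        return 0;
    }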

There was already an existing 8086 BASIC (Microsoft’s) and other tools, and porting software from the 8085 was not difficult. As an added bonus, IBM already had Intel MDS development systems that supported both 8085 and 8086 development.

With the above in mind, it’s easy to answer questions such as “why did IBM not use the Motorola 68000?” The CPU was barely available in 1980, there was no BASIC for it yet and no software, and IBM had no experience with it. Choosing the 68000 would have delayed the PC release well beyond what IBM was willing to accept; that alone was sufficient reason not to pick it.

Internal Layout

The initial design called for a power supply taking up the left side of the PC’s enclosure, and adapter cards competing for space with the internal floppy drives. The actual design moved the power supply behind the floppy drives, leaving more room for long adapter cards.

IBM took advantage of the extra space and the adapter cards released with the IBM PC were on the larger side.

Expansion Bus

The Datamaster didn’t only influence the choice of the PC’s CPU and support chips. It also strongly influenced the PC’s 62-pin I/O expansion bus, later known as the ISA bus. How significant was the Datamaster influence? The following two diagrams should answer that:

IBM Datamaster I/O expansion (December 1980)

The above is a diagram from an IBM Datamaster service manual dated December 1980. Below is a diagram of the expansion connector from the IBM PC Technical Reference dated August 1981.

IBM PC I/O expansion (August 1981)

The diagram is mirrored (A pins on the left vs. A pins on the right), but the two are clearly extremely similar. What used to be page select bits 0-3 neatly turned into address bits 16-19. Interrupt and DMA levels that had specific purposes on the Datamaster are generic on the PC. There are real differences: pin B20, for example, used to be DMA request 0 on the Datamaster but turned into a system clock signal on the PC; since the PC dedicated DMA channel 0 to DRAM refresh, DMA request 0 no longer made sense. Pin B04, unused on the Datamaster, turned into interrupt request 2.

This similarity meant that existing adapter cards for the Datamaster could be used in the IBM PC with very minimal changes, if any. This no doubt greatly sped up the initial stage of development because there was no need to design a brand new floppy controller or serial adapter, for instance.

Memory

The first IBM PC system board supported 16 to 64 KB RAM, with 64 KB being about the maximum of what a “personal computer” of the era would support. Also available were 32/64 KB cards which plugged into the PC’s expansion slots. With three such cards (a practical maximum; two of the five expansion slots would be taken by a display adapter and a floppy controller), the PC could be expanded to 256 KB RAM.

The prototype board shown below has Mostek MK4332 memory chips (32K×1), with room for 18 chips total. That allows installing up to 64 KB RAM (two banks of 32 KB) with parity.

IBM PC prototype board (BYTE Magazine)

Using parity was not at all typical for low-end computers at the time, but IBM felt that the added ability to detect errors was worth the small additional expense.
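
To make the parity scheme concrete, here is a minimal sketch of the idea in C. The real PC did this entirely in hardware—one extra DRAM chip per bank storing a check bit per byte, with a mismatch signaled via NMI; the polarity (even vs. odd parity) is treated here as an assumption for illustration.

    #include <stdio.h>
    #include <stdint.h>

    /* One check bit per byte: store the parity of the 8 data bits,
       recompute it on every read, and compare. A single flipped bit
       is detected (though not corrected). Even parity is assumed
       here purely for illustration. */
    static uint8_t parity_bit(uint8_t b)
    {
        uint8_t ones = 0;
        for (int i = 0; i < 8; i++)
            ones += (b >> i) & 1;
        return ones & 1;
    }

    int main(void)
    {
        uint8_t value    = 0x5A;           /* byte written to RAM */
        uint8_t check    = parity_bit(value);
        uint8_t readback = value ^ 0x04;   /* simulate one flipped bit */

        if (parity_bit(readback) != check)
            puts("Parity error detected (the PC raised an NMI here).");
        return 0;
    }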

From the beginning, the 1 MB address space was carved into several regions. At the top, 128 KB was reserved for system firmware, 128 KB for other memory, and 128 KB for display memory. That left 640 KB available for system RAM, but that was purely theoretical—the original PC supported only up to 256 KB RAM. Even the PC/AT only supported 256 or 512 KB on the system board.
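
The resulting carve-up is easy to write down. The constants below follow the layout as shipped (display memory at A0000h, adapter ROM space above it, system firmware at the very top), which matched the original three 128 KB reservations; the names are illustrative, not IBM’s.

    /* The 1 MB real-mode address map of the IBM PC, as described
       above: 640 KB for RAM, then three 128 KB reserved regions. */
    #define CONV_RAM_BASE  0x00000UL  /* system RAM, up to 640 KB     */
    #define DISPLAY_BASE   0xA0000UL  /* 128 KB display memory        */
    #define ADAPTER_BASE   0xC0000UL  /* 128 KB "other" (adapter ROM) */
    #define FIRMWARE_BASE  0xE0000UL  /* 128 KB system firmware       */
    #define TOP_OF_MEMORY  0x100000UL /* 1 MB */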

The PC’s memory map was quite reasonable and the infamous 640 KB memory limit didn’t come into play until the late 1980s, well after the expected design life of the PC. At that point, the problem wasn’t hardware (286 and 386 processors) but rather software (DOS) holding things back.

Storage

As the drawings show, the initial PC design anticipated external 8″ drives for the PC. That ended up not happening. But the rest of the PC storage subsystem turned out more or less exactly as initially designed.

While the Datamaster used 8″ drives, the PC used 5¼″ drives. The media are much more convenient to work with (if you’ve ever seen 8″ floppies, they’re huge), and the PC form factor would not have been possible with 8″ drives.

The Datamaster used the NEC μPD765 floppy controller. It did the job, IBM engineers knew how to use it, and they saw no reason to pick a different controller for the PC.

The PC used floppies with a 512-byte sector size, something that later became a ubiquitous default—so much so that any other size was extremely exotic. Yet in 1980/81, typical sector sizes were 128, 256, or even 1,024 bytes. Once again, the Datamaster also used 512-byte sectors, except that it used 128-byte sectors on the first track of a floppy, as was then common. The PC fortunately simplified things and used 512-byte sectors throughout.

The PC could probably have maximized floppy storage capacity by using 1,024-byte sectors, but that was perhaps not even considered. There are interesting tradeoffs: each sector requires a certain amount of management overhead and additional slack on the floppy, but at the same time each file tends to waste some unused space in its last sector. 512-byte sectors strike a good compromise between keeping the on-disk sector overhead low and not wasting too much allocation space per file.
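
The tradeoff can be made concrete with a back-of-the-envelope model. The per-sector overhead and raw track capacity figures below are assumed round numbers for illustration, not measured values from any particular drive.

    #include <stdio.h>

    /* Toy model of the sector-size tradeoff: every sector pays a
       fixed formatting overhead (ID field, gaps, CRC), while every
       file wastes about half a sector of slack at its end. */
    int main(void)
    {
        const double track_bytes = 6250.0; /* assumed raw bytes per track */
        const int    overhead    = 100;    /* assumed overhead per sector */
        const int    sizes[]     = { 128, 256, 512, 1024 };

        for (int i = 0; i < 4; i++) {
            int sectors = (int)(track_bytes / (sizes[i] + overhead));
            printf("%4d-byte sectors: %2d per track, %5d data bytes, "
                   "~%4d bytes slack per file\n",
                   sizes[i], sectors, sectors * sizes[i], sizes[i] / 2);
        }
        return 0;
    }

With numbers like these, small sectors lose track capacity to formatting overhead while large sectors waste more space per file; 512 bytes sits comfortably in the middle.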

The PC storage subsystem was not required to be compatible with existing systems and set its own standards.

Display Hardware

A closer look at the PC design drawings shows that one area where the final PC noticeably differed from the initial design is display hardware. Not coincidentally, that’s also where the PC significantly differed from the Datamaster.

The Datamaster display was text only and used the Intel 8275 CRT controller (CRTC). To be able to display graphics, the PC needed a different CRTC; the Motorola 6845 was chosen. The PC also needed to support TVs as well as dedicated monitors, and the characteristics of NTSC television determined what IBM’s CGA would do.

The PC in the home (IBM promotional photo)

The initial PC design notably called for 80×24 text resolution, the same resolution the Datamaster used, and also the standard resolution of terminals. The PC instead ended up using the 80×25 resolution, which is extremely common today but was far from typical in 1981.

The initial design also called for a 280×192 graphics resolution (same as Apple II). The released PC (CGA) instead used 320×200 graphics. The most likely answer as to why is ‘because that was possible within the constraints of NTSC TV and 16 KB display RAM’.

The CGA had 16 KB RAM, or 16,384 bytes. A 4-color (2 bits per pixel) 320×200 resolution or a 2-color 640×200 resolution uses exactly 16,000 bytes. There would still be enough memory for a few more lines of graphics, but not for another line of text using 8×8 character cells (about the smallest cell size that’s still legible).

The text-only MDA similarly had 4 KB (4,096 bytes) of RAM. With 160 bytes required to store a line of text (80 characters each with an attribute byte), there was room for 25 lines of text—utilizing 4,000 bytes of memory—but not more.
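
The arithmetic behind both limits is easy to verify; the quick check below uses only the numbers from the text above.

    #include <stdio.h>

    int main(void)
    {
        /* CGA: both graphics modes need exactly 16,000 of 16,384 bytes. */
        printf("320x200 at 2 bpp: %d bytes\n", 320 * 200 * 2 / 8);
        printf("640x200 at 1 bpp: %d bytes\n", 640 * 200 * 1 / 8);

        /* MDA: 80 columns x 2 bytes (character + attribute) per row.
           25 rows fit in 4,096 bytes; a 26th row would not. */
        printf("80x25 text: %d bytes\n", 80 * 2 * 25);  /* 4000 */
        printf("80x26 text: %d bytes\n", 80 * 2 * 26);  /* 4160 */
        return 0;
    }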

Although the 320×200 resolution appears to have been new with the IBM PC, the Commodore 64 (introduced a few months later) used it as well. On the PC, the supported text and graphics resolutions were about the maximum possible given the display memory size and attached display hardware.

Keyboard

The PC’s physical keyboard layout was identical to the Datamaster (Model F keyboard), but the key labels were different and the keyboard was not built into the system unit. While the Datamaster used a parallel connection to the keyboard within the system housing, the PC used a serial link through a long coiled cable.

The original IBM PC keyboard layout

The original 83-key PC keyboard now seems alien, since the enhanced 101/102-key layout took over in the late 1980s.

Note that the Ctrl-Alt-Del sequence (deliberately) required two hands on the original PC keyboard, since Ctrl and Alt were only on the left side of the keyboard and Del was on the right.

Technical Reference

A key ingredient of the PC’s success was the Technical Reference. While the Tech Ref wasn’t exactly a tutorial and didn’t have a whole lot of explanatory text, it included complete schematics of the IBM PC, as well as a full listing of the PC BIOS, quite thoroughly commented. Any halfway competent engineer could locate datasheets of the components IBM used and see exactly how they were connected in the PC. Or build a clone of the PC.

The IBM Technical Reference was, again, a direct result of the tight development schedule. There was no time to engage a team of technical writers and develop documentation in the style then typical for IBM products. Publishing existing schematics and BIOS source code was, on the other hand, very quick and easy—and much appreciated by engineers developing hardware and software for the IBM PC.

ASCII Island in a Sea of EBCDIC

In the 1980s, IBM systems (including the Datamaster) almost exclusively used the EBCDIC character set, different from ASCII. The PC, as the design drawings show, was meant to use ASCII from the very beginning. Perhaps surprisingly, this did not cause any significant problems during development.

The PC development team was deliberately separated from the rest of IBM. Microsoft’s BASIC used ASCII, DOS used ASCII. The Intel MDS machines also used ASCII, and that was where the PC BIOS was developed.

The ASCII/EBCDIC divide only caused a minor inconvenience for the PC development team when the BIOS listings (ASCII) were transferred to IBM’s mainframe systems before publication in the Technical Reference.

BIOS Interrupts

The Datamaster used interrupts as firmware entry points and this approach was continued on the IBM PC. While direct calls to known addresses were typical for similar machines at the time, the Datamaster could not easily use that approach because of paging.

While the software interrupt approach may seem unnatural, it turned out to be extremely flexible because it is easy to “hook” existing interrupts, add new functionality, and “chain” to the existing interrupt handler; the same concept as inheritance in object-oriented programming. IBM notably used this approach with the PC/XT hard disk controller (adding BIOS INT 13h for hard disks in a new ROM and chaining to the old BIOS service for floppy access) and the EGA (using add-on ROM to drive the EGA but falling back to the system BIOS for CGA/MDA support).
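
The hook-and-chain pattern is simple enough to sketch. The following conceptual C sketch uses plain function pointers standing in for real-mode interrupt vectors (actually hooking INT 13h involves rewriting the vector table, which is omitted here); all names are invented for illustration.

    #include <stdio.h>

    /* Conceptual sketch of BIOS interrupt chaining: the new handler
       services the requests it understands (hard disks, drive 80h
       and up) and passes everything else to the saved old handler,
       the same split the PC/XT hard disk ROM made for INT 13h. */

    typedef void (*handler_t)(int drive);

    static handler_t old_vector;             /* saved previous handler */

    static void floppy_service(int drive)    /* original BIOS handler  */
    {
        printf("system BIOS: floppy request, drive %02Xh\n", drive);
    }

    static void hard_disk_service(int drive) /* hooked-in new handler  */
    {
        if (drive >= 0x80)
            printf("hard disk ROM: request for drive %02Xh\n", drive);
        else
            old_vector(drive);               /* chain to the old handler */
    }

    int main(void)
    {
        handler_t int13 = floppy_service;    /* the "vector" before hooking */

        old_vector = int13;                  /* save the old vector...      */
        int13 = hard_disk_service;           /* ...and install the new one  */

        int13(0x00);                         /* drive A: chains to floppy   */
        int13(0x80);                         /* first hard disk, new code   */
        return 0;
    }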

As it happens, DOS independently chose the same software interrupt approach to provide system services, with the same resulting flexibility and extensibility (and sometimes chaos).

PC Software

IBM’s original plan was to ship the PC with ROM BASIC and the CP/M operating system, both standard for personal computers at the time. Microsoft was the preeminent supplier of BASIC to OEMs and already had 8086 BASIC, so all IBM had to do was negotiate a contract for BASIC and implement the OEM interface required by Microsoft.

The PC shipped with a 32 KB ROM BASIC which was called ‘Cassette BASIC’, alluding to the fact that the BASIC ROM could only use an attached cassette tape for storage. As part of DOS, IBM also shipped Disk BASIC and Advanced BASIC (BASIC.COM and BASICA.COM) which both utilized the ROM BASIC and among other things provided additional “device drivers” allowing the ROM BASIC to use floppy disks for storage.

On the operating system side, things did not go so smoothly. There are many legends and conflicting stories, and not so many facts.

The biggest unanswered question is why the IBM PC did not ship with CP/M. There are many rumors about how Gary Kildall, the CP/M inventor and Digital Research (DRI) boss, refused to meet with IBM executives and went flying instead, or how he flew in too late for a meeting. None of which makes much sense, since meetings can be rescheduled.

Other rumors say that Gary Kildall was not involved at all since it was his then-wife, Dorothy McEwen, who was in charge of negotiating OEM contracts. Another version of the story goes that she wouldn’t meet with IBM execs because she was already in a different meeting with HP and then went on vacation. Again, I do not find this credible. Meetings can be rescheduled.

I believe the real story is much more prosaic and straightforward: DRI did not have a product to sell. CP/M-86 simply didn’t exist in 1980, or really 1981 for that matter.

IBM was not going to wait, and it was time for Plan B. Microsoft, already contracted to provide language tools for the IBM PC (assembler, Pascal, FORTRAN, etc.), needed an OS, and knew where to find one.

The DOS story is reasonably well documented. Back in 1979, Seattle Computer Products (SCP) started building 8086-based systems and needed an operating system. CP/M for the 8086 wasn’t available, and it was unclear when it might be. Tim Paterson stepped in and threw together QDOS, Quick and Dirty OS, soon renamed to 86-DOS: A bare-bones CP/M workalike that was good enough to manage files on a floppy and launch programs. The major advantage of 86-DOS was that it enabled relatively simple porting of existing 8085 CP/M applications to the 8086, largely accomplished through machine translation.

Microsoft bought 86-DOS on the cheap, licensed it to IBM, and thus acquired what was, for much of the 1980s and early 1990s, a license to print money.

The IBM PC, Maker of Empires

It is fairly obvious that the IBM PC laid the foundations for two business empires, neither of them IBM.

Thanks to the IBM PC, DOS became the standard PC software and Microsoft was happy to license it to any OEM. For many years, Microsoft raked in cash from licensing DOS without needing to put much effort into improving the product. Digital Research briefly threatened Microsoft’s cash cow (and it’s clear that Bill Gates was very worried), but Microsoft managed to replace DOS with Windows before DRI could really eat into Microsoft’s bottom line.

Intel, by all appearances, stumbled into their x86 empire entirely by accident. The 8086 was considered a stopgap product. The 80286 was seen as a minor update and the 80386 started out as a sort of dead-end project, before turning into a matter of major strategic importance for Intel after the iAPX 432 abysmally failed. The PC effectively forced Intel to go the x86 route.

The importance of being in the right place at the right time cannot possibly be overstated.

Summary

The IBM PC development cycle was very short, only one year from the start of the design phase to a finished and announced product. The design was jumpstarted by leaning heavily on the development team’s experience with the System/23 Datamaster. The core PC architecture was more Datamaster than not, with the notable exception of a CPU upgrade (Intel 8088 instead of 8085). The PC’s I/O subsystem, on the other hand, had only some (storage, communications) or barely any (display) relation to the Datamaster, and set new standards.

The tight schedule determined almost everything about the design of the PC, from the hardware (significant reuse of Datamaster design) to the software (using existing 3rd party software, no waiting for CP/M-86). The IBM PC was the right product at the right time, and its success and durability outstripped anyone’s wildest expectations.

Explainer

The design drawings use shorthand that may require explanation. Here’s an attempt to decode some of the acronyms and IBMese.

  • CD: Card (feature card or I/O card)
  • CH: Channel
  • DEC: Decoding or decoder
  • DRV: Drive
  • 1LPC: 1 Line Per Channel—how many wires fit in the 0.1 inch space between pins on the components; higher LPC implies more expensive to manufacture
  • MPU: MicroProcessor Unit aka CPU
  • PCK: Parity Check
  • Planar: System board
  • ROS: Read-Only Storage aka ROM
  • RQ/GT: Request/Grant


60 Responses to The IBM PC, 41 Years Ago

  1. Richard Wells says:

    The System/23 has two 8255s on the planar – marked as 4178628. The one to the upper left of the board is close to the 16-pin connector that should be for the keyboard. I can’t find traces going from the keyboard to the 8255 but the pictures I have access to only show the upper surface.

    Apologies for the double post. Took a while to check the IBM part number reference.

  2. Michal Necasek says:

    The 8255 in the PC didn’t just handle the keyboard. It also provided the cassette interface, could control the speaker, allowed software to read the DIP switch settings, plus had a few more minor functions. The choice almost certainly wasn’t 8251 vs. 8255, the choice was 8251 vs. a few latches around the 8255.

    And yes, parallel interfaces are basically gone from computers. Even the bus (PCI Express) is serial. RAM is about the last thing that’s not.

  3. Michal Necasek says:

    My impression is that the PC design team cared little about what IBM mainframe execs thought. The cost and complexity alone would have been a good reason not to choose the 8089.

  4. zeurkous says:

    @Richard Wells:

    So that’s another System/23 legacy. That explains things.

    Makes the 5150 seem even more like a machine of the past and not of the future.

  5. Michal Necasek says:

    All too often it happens that technology designed for the future very quickly becomes the past, while technology designed for the present becomes the future.

  6. zeurkous says:

    That’s because many futurists effectively behave as if they can predict, with reasonable accuracy, what the future will be like.

    “Designing for the future” ought to imply that one has not only learned from past mistakes, but also that one doesn’t make too many assumptions about what things will be like.

    That makes it hard. But it can be done. Most of all it requires humility.

  7. MiaM says:

    A few comments:

    “The text-only MDA similarly had 4 KB (4,096 bytes) of RAM. With 160 bytes required to store a line of text (80 characters each with an attribute byte), there was room for 25 lines of text—utilizing 4,000 bytes of memory—but not more.”

    This is wrong as the 96 unused bytes could of course contain another line of 80 characters. The real reason for the 80×25 screen size is a limitation in the 6845 CRTC chip. Unfortunate, as it would have been really nice if the MDA card had had two different font sizes and a mode with more rows, similar to 80×43 on EGA and 80×50 on VGA. I don’t know if there ever was a more or less drop-in compatible successor to the CRTC that allowed more than 25 rows. 25 is kind of arbitrary as it isn’t an even binary number. Maybe it’s an artifact of how the CRTC generates the vertical timing, i.e. the row counter might have extra meanings during the blanking, retrace and sync signal periods.

    Re 25 vs 24 rows: Although 25 rows were unusual, they weren’t unheard of. For example, all PET models used 25 rows (and the PET 80xx, afaik released in May 1980, used 80 columns, so 80×25 like the PC. The earlier PETs used 40×25).

    Re the possibility of the MDA card having been a color card intended for the Datamaster: Although the Datamaster had a built-in monochrome CRT, nothing would have stopped IBM from releasing a version with a built-in color CRT. A problem at the time (both of the Datamaster and the PC) was that CRTs weren’t that good. Sure, the text is rather readable on the CGA monitor in 80 column mode, but it’s not exactly crisp. Having three different monitor options would likely have confused the market. By three I’m thinking of the MDA and CGA monitors that actually were produced, and also an MDA-frequency monitor with color. Sure, IBM could have fitted the MDA card with two different clocks and sets of fonts and made it able to produce 15 kHz video too, but that could possibly have confused the market.

    Re a possible version using the display card’s memory as system memory: That would have required that version to use either the MDA card or a CGA card able to halt the processor to avoid “snow”. Otherwise it would have been as slow as the slow mode on a Sinclair ZX81, kind of. At the time when “0k” was suggested, the designers were probably not aware of whether the CGA card would have the “snow” problem or not. As the MDA card uses SRAM, expanding it from say 4k to 8k would likely have had a cost similar to having 16k of regular system memory. The only possibility for “0k” to have made sense with the MDA card would have been a home computer style system, built into the keyboard with a PSU having only a 5V rail. (Two additional rails were required by the 16 kbit DRAM chips – otherwise only the RS232 port and disk drives required anything other than 5V.)

    Re possibly using some “standard” bus cards: The S100 bus not only uses (usually linear and thus heat-producing) voltage regulators on each card, and hadn’t been updated for >64k at the time as already mentioned; it also uses 50% wider pin spacing, which just takes up unnecessary space.

    Most importantly, which S100 cards would have been attractive for a user to add to a PC? Sure, 64k memory cards could have been attractive. But otherwise? The PC parallel and serial ports were so trivial that there was no real reason to use any other similar but incompatible cards. For disk drives and especially for display cards, anything else would have required changing the system board ROMs.

    There for sure must have been some rare use case S100 boards that would actually have been attractive to put in a PC for a few users, but those few users could likely just have continued using S100 computers until those card types became available for the PC. (I’m thinking about things like IEEE-488/GPIB cards, general purpose I/O cards, cards to interface to non-IBM minicomputers and mainframes and whatnot.) I’m not familiar with the other bus type mentioned in the comments, but I doubt that there were any cards that would have been that attractive to install in a PC.

    Re Intel having the interrupt vectors at the start of memory space: I actually think this is one of the greatest improvements in x86 as compared to most other processors. Every computer running CP/M-80 needs to have some hardware to either switch in ROM instead of RAM at the start of memory to start, or hardware to switch in a NOP instruction instead of RAM. The 68000 also suffers from this – for example on the Amiga, an I/O pin is kind of wasted on solving this by allowing switching in/out ROMs at the start of memory (where RAM usually resides, except at power up).

    This shows that Intel actually intended the x86 to be used in computers loading an operating system from disk, or at least soft-loading some drivers and whatnot, as opposed to having everything in ROM like embedded systems and “home computers” did.

    Re possible piggyback DRAM chips: IIRC Mostek made pre-piggy-backed chips using two 16 kbit DRAM chips on top of each other, with whatever logic was required for it to not need bodge wires. The intended market was afaik minicomputers and mainframes.

    Re loading the firmware from disk: There are systems that are only able to read and execute a boot block but not even able to print text on the screen using their built-in ROM. I think that’s a terrible design / money saving, as there is no way to tell the user to insert a bootable disk or that there was some read error. I think The 8-Bit Guy on Youtube did a video on a system that does that (IIRC a DOS based early 80’s computer with way better graphics than the PC).

    Also, afaik ROMs either have a diode or lack a diode for each bit of information. Compare with DRAMs that have a transistor for each bit, and SRAMs that have more than one transistor for each bit. At the time, mask programmed ROMs would likely have been cheaper (per size) than any RAM. So it would actually have been a cost saving to have the BIOS in ROM.

    Btw, re development of the PC before the disk hardware and DOS were available: This is just a guess, but I would assume that the earliest development ROMs hooked up to some kind of host system via some interface (likely a serial port), and to test out various software they would just have transferred the software via that interface from the host system. Making it possible to push a memory dump through for example RS232 (using an actual UART) is easier than making a cassette interface work (even if it only has to work with direct signals and not take into account wow/flutter, incorrect speed and phase shifting of magnetic tapes).

    Re having a spare ROM socket: It’s already been mentioned that the Apple II did this. Other computers did that too, like for example the Commodore PET, and there were several add-on ROMs. A curious example is that some software actually used a ROM solely as a dongle, containing firmware for a printer (which didn’t even use the same CPU as the PET, but was mass produced by Commodore for their printer OEM, so the software manufacturer would have gotten a good price for those chips).

    Re the keyboard and its interface: It’s ironic that it was very flexible for various national variations, yet DOS silently dropped full support for non-extended keyboards at a point in time when XT class machines were still in use. Before extended keyboards, you just held ALT and got the US keymap on a computer where you had loaded a national keymap, and this allowed you to type things like [ ] \ and so on. Not sure which version of DOS dropped this, but at least on 6.x you have to type ALT+92 to get a backslash when using the Swedish keyboard layout on a non-extended keyboard. (Also at some point in time DOS started forcing users to use the annoying MODE CON PREPARE and whatnot to be able to select a keyboard map, even on a computer where there was nothing to “prepare” (i.e. using MDA/CGA).)

  8. Michal Necasek says:

    I’m confused… how do you store the 160 bytes needed for another line of 80 characters + attributes into the remaining 96 bytes?

    Yes, 25-line terminals were not unheard of, just not typical. The IBM PC made them typical. Commodore certainly used 25-line screens all along.

    IBM Boca had Intel MDS systems with an ICE and everything. They definitely had the equipment to “bootstrap” the development of an 8086-based system. They had at least one MDS system with a hard disk where the BIOS development was done.

    As for the DOS keyboard support: What exactly are you talking about? Are you talking about the behavior of some specific DOS keyboard driver? You’d probably have to complain about Microsoft Sweden specifically, because the people in Redmond most likely had no clue whatsoever.

  9. MiaM says:

    Ooops, forgot about the attribute bytes :O

    My impression is that DOS behaves this way with every keyboard layout variation, and the solution was probably either “live with it” or replace the keyboard. It only affected systems using a much older keyboard than DOS version, i.e. upgraded systems and systems built from used parts and whatnot. Maybe it’s also a lack of documentation – in more modern times I’ve gotten the impression that AltGr sends Ctrl+Alt. If that had been known back in the day, it would likely have been easy to just press Ctrl+Alt manually.

  10. Michal Necasek says:

    AltGr is a distinct key but with many or most keyboard layouts it does act like Ctrl+Alt. Whether that was the case back then I don’t know. It’s really something the keyboard driver would define.

    On the BIOS level, Right Alt is Right Alt, not some combination of keys.
