When attempting to play the classic racing game Need for Speed SE in a virtualized environment (VirtualBox), I was dismayed to discover that all the in-game videos were completely garbled. Curiously, the introductory video played when the game is first launched looked fine, even though there was no reason to think it should be any different from the in-game videos.
At first glance, it looked like the video mode used by the game didn’t match the data. Indeed the game was using a 320×200 mode at 16bpp, which was rather suspicious. One would expect an 8bpp mode, perhaps the standard VGA mode 13h, which is what the intro video uses.
A more detailed investigation revealed that the garbled video is the result of a subtle bug in NFS SE, rather than any deficiency in the virtualized environment. The game uses a routine called initgraphics, which establishes a video mode. The routine predictably takes the requested resolution and color depth as arguments. However, the color depth is optional, and if supplied as zero, the routine will choose a suitable mode. And that’s where the problem lies.
The initgraphics routine will use a 16bpp mode if found, otherwise it will fall back to an 8bpp mode. For some reason, or more likely, for no specific reason at all, the game uses the default color depth (zero) when playing in-game videos. However, the video playback in fact assumes an 8bpp mode and the output will be garbled (as shown above) if a different color depth is used.
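The fallback logic can be sketched in a few lines. Everything below is a hypothetical reconstruction, not the actual NFS code: the function and variable names are mine, and the real routine queries the VESA BIOS for its mode list. Mode 13h is standard VGA; 10Eh is the standard VBE number for 320×200 at 16bpp.

```python
# Hypothetical sketch of the mode-selection fallback in initgraphics.
# Mode tables stand in for what the VESA BIOS would report:
# (width, height, bpp) -> mode number
PERIOD_BIOS = {(320, 200, 8): 0x13}                          # standard VGA only
MODERN_BIOS = {(320, 200, 8): 0x13, (320, 200, 16): 0x10E}   # adds 16bpp 320x200

def pick_mode(bios_modes, width, height, bpp=0):
    """Return (mode, depth); when bpp is 0, prefer 16bpp and fall back to 8bpp."""
    depths = (16, 8) if bpp == 0 else (bpp,)
    for depth in depths:
        mode = bios_modes.get((width, height, depth))
        if mode is not None:
            return mode, depth
    return None

# The video player always writes 8bpp data, yet requests bpp=0:
pick_mode(PERIOD_BIOS, 320, 200)   # no 16bpp mode -> 8bpp, videos look fine
pick_mode(MODERN_BIOS, 320, 200)   # 16bpp mode exists -> garbled output
```

Passing bpp=8 explicitly sidesteps the fallback entirely, which is exactly what the patch below does at the binary level.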
So how come this bug was not found when Need for Speed SE was first released? That’s easy—at the time, the 320×200 mode at 16bpp was very unusual and not supported by the VESA BIOS built into the graphics cards. In that situation, the initgraphics routine falls back to 8bpp and all is well. But in a more modern environment, the 16bpp mode will be available and used, causing the video to be garbled.
How to fix this? One possibility is to make the 320×200, 16bpp mode unavailable, but that is not so easy and is only a workaround rather than a real fix. The better solution is to patch the game executable (NFS.EXE) to avoid the whole automatic color depth selection nonsense, which is not only unnecessary but wrong. All it takes is changing one byte where the arguments are passed to initgraphics in the video playback setup code, and using 8 instead of 0 for the color depth.
In my copy of NFS SE, that is the byte at offset 607E6h in the NFS.EXE file (dated 6-17-96, size 1,254,591 bytes). After changing it from zero to 8, the in-game videos appear like this:
That looks so much better! The game now chooses the correct mode for video playback regardless of whether the 320×200 mode with 16bpp is available or not.
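Any hex editor will do for the one-byte change, but it can also be scripted. Here is a minimal sketch; the offset and file size apply only to the exact NFS.EXE build described above (dated 6-17-96, 1,254,591 bytes), so the script refuses to touch anything else. The patch function name is my own, not anything from the game.

```python
import os

OFFSET = 0x607E6              # color-depth argument byte in the video setup code
OLD, NEW = b"\x00", b"\x08"   # 0 = "choose for me" -> 8 = force 8bpp
EXPECTED_SIZE = 1254591       # only the known NFS.EXE build (6-17-96)

def patch(path):
    """Flip the color-depth byte from 0 to 8, with sanity checks first."""
    if os.path.getsize(path) != EXPECTED_SIZE:
        raise SystemExit("unexpected file size; this is a different NFS.EXE build")
    with open(path, "r+b") as f:
        f.seek(OFFSET)
        if f.read(1) != OLD:
            raise SystemExit("byte at 607E6h is not zero; already patched?")
        f.seek(OFFSET)
        f.write(NEW)
```

Make a backup copy of NFS.EXE before running anything like this, of course.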
It’s like the whole OS/2 floppy ‘detection’ thing… Relying on assumptions like that will fail in the long run; it’s a ‘bad thing’….
Exactly…clearly the behavior of NFS SE isn’t intentional, but it very nicely illustrates why assuming more than 1+1=2 is just not very smart. Actually reminds me of the bug with the loader in old DOS versions–who knew that hard disks with more than 17 sectors per track might show up one day… (See Hang with early DOS boot sector.)
Wow, that messed-up screenshot was a blast from the past. And explains a /lot/.
I used to play /tons/ of Need For Speed during my childhood, and had that problem with /all/ the videos.
And guess what? I forked out six months of pocket money for a very flash (at the time) Tseng Labs ET4000/W32p… which, you guessed it, had a 16-bit 320×200 mode. 😛
Hah, that ET4000 was badass though, had a 256 byte buffer built in the chip itself that sped up bursts of data transfer.
This actually works!
I used the HxD editor to edit the exe and now all videos work fine.
You sir, deserve a medal!
I am playing NFS 2 Special Edition and the same problem happens for me, but I cannot find the offset you are mentioning. Please, someone help me, and please tell me which app you used to patch your exe file.
I ran into this issue back in the day, but it only occurred on some systems, not all. I never really understood what the problem was. And unlike most such issues, UniVBE didn’t fix this (in fact, it probably breaks this on certain working cards, if it adds a 16-bit 320×200 mode that the standard VGA BIOS doesn’t expose).
This explains it perfectly: some machines I ran it on had the 16-bit mode, others did not.