DOOM Black Book – This is Brilliant!

Book cover

I heard about the DOOM Game Engine Black Book by Fabien Sanglard on the Hanselminutes podcast episode 666, and immediately ordered the book. It was a riveting read – at least for someone who likes technology and computer history like I do. The book walks through how the id Software classic DOOM from 1993 works and the tricks and techniques used to get sufficient performance out of the hardware of 1993. As background to how the software was written, the book contains a great description of the hardware design of IBM-compatible PCs, gaming consoles, and NeXT machines circa 1992-1994. It covers software design, game design, marketing, and how id Software worked.

Never Played DOOM

Up front, I must admit to never having played the original DOOM when it came out – back in those days, I was an Apple Mac user, and my primary computer was the brilliant Mac SE/30… great for doing reports and writing, not exactly the most ported-to gaming machine. The processor was definitely a tad on the slow side compared to PCs of the era, and the screen was just 512×384 black-and-white. I really have no relationship to the game itself, but the book is great even without any personal nostalgia.

DOOM Hardware Requirements

When DOOM came out, hardware was not quite as fast as it is today… to put it mildly. To set the scene for the code and the story told in the book, this is worth remembering.

DOOM ran at a resolution of (up to) 320×200 pixels – but with the slightly elongated pixels used by the PC hardware of the day, it was effectively displayed at 4:3 (corresponding to 320×240). The book notes that many ports of DOOM to other hardware missed this detail, making the graphics appear squashed – sprites designed for elongated pixels were displayed on machines with actual square pixels. Here is how the book explains this – it is a great example of the fantastic writing in the book. Clear, concise, and richly illustrated:

Photo from my copy of the book, showing the effect of aspect ratios on the graphics assets.
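The arithmetic behind the stretch is simple: a 320×200 buffer fills a 4:3 screen, so the art must be scaled vertically by 240/200 = 1.2 when shown on square pixels. A minimal sketch of the correction (the constants come from the resolutions above; the function name is mine, not the book's or the engine's):

```c
#include <assert.h>

/* DOOM renders into a 320x200 buffer that VGA displays on a 4:3
 * screen, stretching every pixel vertically by 240/200 = 1.2.
 * A port to square-pixel hardware must apply the same stretch,
 * or the art appears squashed. Illustrative sketch. */
#define SRC_H     200   /* rows in the render buffer           */
#define DISPLAY_H 240   /* 4:3 rows matching a 320-pixel width */

/* Height in square pixels that an asset of h buffer rows should occupy. */
static int corrected_height(int h)
{
    return h * DISPLAY_H / SRC_H;   /* integer multiply by 1.2 */
}
```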

In terms of processor requirements, the best PC processor available was the Intel 486DX2, running at 66MHz with a 33MHz memory bus. Combined with the new VESA local bus, DOOM could reach something like 24 frames per second at full 320×200 resolution (according to the book). With slower processors, or machines with the older ISA bus, this could easily drop to single-digit framerates.

To get playable framerates, it was possible to use “low resolution” mode, where only half as many pixels were rendered horizontally and then simply pixel-doubled to fill the screen. In addition, the active screen area could be shrunk so that fewer pixels had to be generated – down to a 96×48 pixel area, less than 10% of the pixels of the full-size screen.
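The low-detail trick can be sketched as rendering a half-width scanline and duplicating each column, which alone halves the per-column rendering work (a sketch with illustrative names, not the engine's actual code):

```c
#define FULL_W 320

/* In low-detail mode the engine computes only every other column and
 * pixel-doubles horizontally to fill the screen, halving the number
 * of columns that must be rendered. Illustrative sketch. */
static void double_scanline(const unsigned char *half, /* FULL_W/2 pixels */
                            unsigned char *full)       /* FULL_W pixels   */
{
    for (int x = 0; x < FULL_W / 2; x++) {
        full[2 * x]     = half[x];
        full[2 * x + 1] = half[x];
    }
}
```

Combined with the shrunken view window, the saving is dramatic: 96×48 = 4,608 pixels versus the full 320×200 = 64,000, roughly 7%.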

Personal Computer History

The book starts out with a long chapter about the state of the “PC” in the early 1990s. It was by then the dominant computer platform in the world, even though it did not really work particularly well. I do remember just how unreliable PC systems seemed compared to my (rather more expensive) Mac, and the book confirms this. Windows was unreliable and crashed all the time, and you had to fiddle with drivers to get new hardware to work. PCs would often crash for no real reason at all, and programmers and other users learned to save their work to disk constantly to avoid data loss. It was bad enough that the DOOM programmers bought super-expensive NeXT computers in order to have a stable system to work on that would not accidentally lose their work all the time.

There are a lot of detailed diagrams explaining just how the hardware was built. Maybe you do not really need to know that… but it is great to see it! It would have been fun to see similar diagrams for today’s hardware – but now it tends to be a single chip, maybe a few.

Some beautiful photos and diagrams from the section on PC audio.

I found the story of PC audio particularly interesting. The book offers a great example of what happens when hardware gets “good enough”. Unlike graphics, where we are still seeing improvements after more than 20 years of progress, PC audio became essentially a solved problem when the Sound Blaster 16 came out, offering “CD quality” audio. Audio soon became a commodity, migrating from add-on boards onto motherboards, and then from separate chips into the chipset. While there are still specialty audio solutions today, the AC’97 and later Intel HD Audio standards basically mean that built-in PC audio is ubiquitous and a given, not a market of any great import.

DOOM Software Tricks

DOOM was a huge step forward in 3D graphics for its time. It built on the previous game from id, Wolfenstein 3D, and pushed the envelope on what was possible in terms of 3D environments. Wolfenstein 3D essentially used a grid as its basis, and all walls were thus restricted to 90-degree angles. This simplified the rendering problem enough to make it feasible given the hardware available when it was designed (only a year or two before DOOM).

In DOOM, the walls are at arbitrary angles, and levels can be designed in a much freer way. It was also possible to have floors at multiple different heights, and walls with holes in them allowing windows into adjacent rooms. This was a huge step forward in terms of level design and visual appeal. However, all walls are still strictly vertical, which saves a ton of time for the rendering engine. There are many such clever tricks in the DOOM engine, where smart observations and shortcuts provide necessary performance without being immediately obvious to the human observer.

Another performance optimization was the precomputation of a lot of useful rendering and game engine information for a level – instead of figuring out lines of sight and especially lines of hearing for adversaries, all of that was precomputed at build time and saved with the level. This made the level editing tools essentially into compilers for level data rather than a “simple” editor.
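One concrete instance of this precomputation in the released DOOM source is the REJECT table: a bitmap with one bit per pair of sectors, set when no line of sight can possibly exist between them. At run time, deciding whether a monster might see the player starts with a single table lookup instead of a geometric computation. A simplified sketch of that lookup (the function name and layout details here are mine):

```c
/* The level-building tools precompute a REJECT bitmap with one bit
 * per (from, to) pair of sectors; a set bit means sight between the
 * two sectors is impossible, so the expensive geometric check can be
 * skipped entirely. Simplified sketch of the run-time lookup. */
static int reject_says_blocked(const unsigned char *reject,
                               int numsectors, int from, int to)
{
    int bit = from * numsectors + to;           /* row-major bit index */
    return (reject[bit >> 3] >> (bit & 7)) & 1; /* test a single bit   */
}
```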

The book explains all of this in loving detail, including snippets of the DOOM code to really show how things were done.
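As a taste of the style, here is the 16.16 fixed-point multiply from the released DOOM source (m_fixed.c) – the engine does essentially all of its arithmetic in fixed point, since floating point was far too slow on the target CPUs. Reconstructed here with a typedef and macros added to make it self-contained:

```c
#include <stdint.h>

/* DOOM works in 16.16 fixed point: the high 16 bits are the integer
 * part, the low 16 bits the fraction, so 1.0 == 1 << 16. This mirrors
 * FixedMul from the released source's m_fixed.c. */
#define FRACBITS 16
#define FRACUNIT (1 << FRACBITS)   /* 1.0 in 16.16 fixed point */
typedef int32_t fixed_t;

static fixed_t FixedMul(fixed_t a, fixed_t b)
{
    /* widen to 64 bits, multiply, then drop the extra fraction bits */
    return (fixed_t)(((int64_t) a * (int64_t) b) >> FRACBITS);
}
```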

Today, games programming and graphics are both more straightforward and more complex. With current graphics hardware and APIs, as well as ready-to-use game engines (Unity, Unreal, et al), great graphics in a game are almost a given. Graphics today are expressed at a much higher level, without requiring the kinds of simplifications and tricks used in DOOM. There are other trade-offs, but they are not at all as stark as with the DOOM engine.

Things are also more complex. Back in 1992, it was possible to achieve something never seen before with a good idea and some clever coding. I recall, a few years prior to that, the amazing demos some of my high-school friends created on the Amiga. Today, doing “amazing” seems to require an army of coders, artists, designers, and probably some PhDs in computer graphics. We can do a lot more, but it has also become a lot more difficult to do something truly different due to the much thicker layers of software.

The Watcom C Compiler

Today, it is easy to find a high-quality C compiler for free. Both GCC and Clang/LLVM provide very competitive performance in the generated code, and if you want some additional performance you can pay a little money for something like the Intel C Compiler. Programming in C is kind of synonymous with “hard-core performance optimization”.

That was not the case back in 1992… The accepted approach to wringing the best performance out of a PC was to program in assembly language. However, the advent of the Watcom C Compiler changed this – DOOM was compiled with that compiler, and it provided sufficient performance that basically the whole game could be written in C! This is one reason why the source code for DOOM survives in a form that is useful to this day – an assembly-heavy program would be very hard to compile today, and very difficult to make work on anything other than an old PC.

Working on NeXT

A most surprising aspect of the development of DOOM was that it was all carried out on NeXT machines. The NeXT machines cost ludicrous amounts of money, but they offered so many benefits that id Software used them as the main development platform – even while cheap mass-market PCs were where the money was made. The NeXT machines were stable – you did not have to save your work all the time in order not to lose too much of it to the next system crash. The performance of the hardware and the sophistication of the software environment let id Software create the tools that built the game.

The NeXT machines also had some seriously high-resolution displays for the day – 1120×832 pixels, allowing the programmers to see much more code than on an 80×25-character DOS text display. The usefulness of big displays to programmers remains to this day…

It was essentially cross-development, just like you do today for typical embedded systems. The code would be edited, compiled, and linked on the NeXT, and then copied to the PC that each developer also had by their desk for testing and assessing the actual performance. This made for a much more efficient workflow than working directly on the PCs of the day.

This way of working also made the DOOM source code very portable – to facilitate testing on NeXT as well as on PCs, the code is cleanly layered and has specific APIs hiding the particulars of a host system. This is why the game has been ported to so many different systems since – it was intentionally architected and written to be portable from the very start.

It should be noted that this portability even covers the most insidious of issues: endianness. The 68000-series processors used in the NeXT machines were big-endian, while the PC's x86 processors were (and still are) little-endian. Thus, values read from game files have to be handled correctly to avoid issues. Since DOOM was primarily meant to run on the PC, game files use little-endian data, and big-endian systems have to swap bytes around during loads.
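A portable way to handle this – roughly what DOOM's byte-swap macros accomplish on big-endian hosts – is to assemble each value from individual bytes, which gives the right answer regardless of the host's endianness. A sketch (function names are mine, not DOOM's):

```c
#include <stdint.h>

/* Game files store multi-byte values little-endian. Assembling a
 * value byte by byte works on any host, big- or little-endian alike,
 * with no per-platform #ifdefs. Illustrative sketch. */
static int16_t read_le16(const uint8_t *p)
{
    return (int16_t)((uint16_t)p[0] | ((uint16_t)p[1] << 8));
}

static int32_t read_le32(const uint8_t *p)
{
    return (int32_t)((uint32_t)p[0]         | ((uint32_t)p[1] << 8)
                   | ((uint32_t)p[2] << 16) | ((uint32_t)p[3] << 24));
}
```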

Console Computer Architecture

The last part of the book covers the many gaming console ports that were made of DOOM – and the often radical compromises that had to be made in order to fit DOOM into the limited storage, memory, and processing power of the consoles.

It offers a great overview of the many funky computer architecture variants used in the consoles of the era to provide maximum (theoretical) performance from a very limited hardware budget. The typical cost was that programming became more complex, with software having to be carefully optimized to the particulars of each machine. Where the PC at the time was essentially a single processor with memory and a simple display system, the consoles usually had multiple processors of different types that had to coordinate to get the job done. This posed a challenge for porting the DOOM code, which was written for the simpler PC architecture.

The console hardware also often took shortcuts in their graphics systems – one example that the book explains very nicely in chapter 5.12 is how most consoles failed to do perspective-correct rendering of textures on walls. This meant that the graphics looked really bad when using hardware texturing, bad enough that John Carmack vetoed ports using the non-perspective texturing.

Buying the Book

It is a bit annoying that you could only buy the physical book from Amazon – I would have preferred to give the money to some other bookseller, to be honest. But Amazon does have a well-oiled print-on-demand system that takes a PDF from the author and gives me a printed book to read – while taking the lion’s share of the profits (Fabien explains it all on his website).

All I can say is – buy the book and read it! In whatever format you like. I am very happy to have the book in paper format, since that makes for the best reading experience for something like this. Reading on a screen just isn’t the same thing as a proper book.
