I'm James Hague, a recovering programmer who has been designing video games since the 1980s. Programming Without Being Obsessed With Programming and Organizational Skills Beat Algorithmic Wizardry are good starting points. For the older stuff, try the 2012 Retrospective.


Why Do Dedicated Game Consoles Exist?

The announcement of the Nintendo 2DS has reopened an old question: "Should Nintendo give up on designing their own hardware and write games for existing platforms like the iPhone?" A more fundamental question is "Why do dedicated game consoles exist in the first place?"

Rewind to the release of the first major game system with interchangeable cartridges, the Atari VCS (a.k.a. Atari 2600) in 1977. Now instead of buying that game system, imagine you wanted a general purpose PC that could create displays of the same color and resolution as the Atari. What would the capabilities of that mythical 1977 PC need to be?

For starters, you'd need a 160x192 pixel display with a byte per pixel. Well, technically you'd need only 7 bits, as the 2600 can display just 128 colors, but a byte per pixel is simpler to deal with. That works out to 30,720 bytes for the display. Sounds simple enough, but there's a major roadblock: 4K of RAM in 1977 cost roughly $125. Enough memory for our 2600-equivalent display alone, ignoring everything else, would have been over $900.
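To make the arithmetic concrete, here's the back-of-the-envelope version (a sketch; the $125-per-4K figure is the 1977 estimate above):

```python
# Framebuffer size for a 2600-equivalent display.
width, height = 160, 192
bytes_per_pixel = 1                 # a full byte per pixel, for simplicity
framebuffer = width * height * bytes_per_pixel
print(framebuffer)                  # 30720 bytes -- the "30K" above

# 1977 RAM pricing: roughly $125 per 4K.
cost_per_4k = 125
cost = framebuffer / 4096 * cost_per_4k
print(cost)                         # 937.5 -- over $900, for the display alone
```

Which puts the memory for the display, by itself, at more than four times the retail price of the whole console.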

For comparison, the retail price of the Atari 2600 was $200.

How did Atari's engineers do it? By cheating. Well, cheating is too strong a word. Instead of building a financially unrealistic 30K frame buffer, they created an elaborate, specialized illusion. They built a video system--a monochrome background and two single-color sprites--that was only large enough for a single horizontal line. To get more complex displays, game code wrote and rewrote that data for each line on the TV screen. That let the Atari 2600 ship with 128 bytes of RAM instead of the 30K of our fantasy system.
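The shape of that illusion is easy to sketch. Here's a toy version (with made-up dimensions, nothing like the real TIA registers): only one line's worth of display state exists at a time, and the "frame" is the result of recomputing that state once per scanline.

```python
SCANLINES = 8      # toy screen; the real 2600 drew roughly 192 visible lines
LINE_WIDTH = 16    # toy width; the real horizontal resolution was 160

def line_data(line):
    # Stand-in for a game's per-line "kernel": decide what this one
    # line looks like just before the beam draws it.
    background = "." * LINE_WIDTH
    sprite_x = line % LINE_WIDTH    # slide a one-pixel "sprite" diagonally
    return background[:sprite_x] + "#" + background[sprite_x + 1:]

# The whole frame is an illusion built by rewriting single-line state
# SCANLINES times, racing the TV's electron beam down the screen.
frame = [line_data(line) for line in range(SCANLINES)]
for row in frame:
    print(row)
```

The trade, of course, is that the CPU spends most of its time feeding the display instead of running game logic.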

Fast-forward fourteen years to 1991 and the introduction of the Super Nintendo Entertainment System. Getting an early 90s PC to equal the color and resolution of the SNES is easy. The 320x200 256-color VGA mode is a good match for most games. The problem is no longer display quality. It's motion.

The VGA card's memory was sitting on the other side of a strained 8-bit bus. Updating 64,000 pixels at the common Super Nintendo frame rate of 60fps wasn't possible, yet the SNES was throwing around multiple parallaxing backgrounds and large animated objects.
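The throughput demand is easy to put a number on (a rough sketch, not a benchmark of any particular machine):

```python
# Bytes that must cross the bus to fully repaint a 320x200,
# byte-per-pixel VGA display at SNES frame rates.
pixels = 320 * 200              # 64,000 pixels, one byte each
fps = 60
bytes_per_second = pixels * fps
print(bytes_per_second)         # 3,840,000 -- nearly 4 MB/s of sustained
                                # writes, on top of the CPU also having to
                                # run the actual game
```

That's sustained, every frame, forever--well beyond what game code pushing pixels across that bus could manage.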

Again, it was a clever focus that made the console so impressive. The display didn't exist as a perfect grid of pixels, but was diced into tiles and tilemaps and sprites and palettes which were all composited together with no involvement from the underpowered 16-bit CPU. There was no argument that a $2000 PC was a more powerful general-purpose machine, likely by several orders of magnitude, but that didn't stop the little $200 game system from providing an experience that the PC couldn't.
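A quick illustration of why tiles win (illustrative numbers, not the SNES's exact tilemap formats): moving a background by rewriting pixels touches every byte of the frame, while a tile-based display only needs the much smaller map of tile indices--or, for plain scrolling, a single register write.

```python
# Bytes touched to change a full-screen background.
framebuffer_update = 320 * 200             # repaint every pixel: 64,000 bytes

# With an 8x8 tile grid, the screen is described by one index per cell.
tilemap_update = (320 // 8) * (200 // 8)   # 40 x 25 = 1,000 entries
print(framebuffer_update, tilemap_update)  # 64000 vs 1000
```

The compositing of tiles, sprites, and palettes then happens in the video hardware, with the CPU never touching a pixel.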

The core of both of these examples--graphics on a 2D screen--is a solved problem. Even free-with-contract iPhones have beautiful LCDs overflowing with resolution and triangle-drawing ability, so it's hard to justify a hand-held system that largely hinges on having a similar or worse display. There are other potential points of differentiation, of course. Tactile screens. Head-mounted displays. 3D holographic projection. But eventually it all comes down to this: Is the custom hardware so fundamentally critical to the experience that you couldn't provide it otherwise? Or is the real goal to design great games and have people play them, regardless of which popular system they run on?

(If you liked this, you might enjoy Nothing Like a Little Bit of Magic.)
