programming in the twenty-first century

It's not about technology for its own sake. It's about being able to implement your ideas.

How Much Processing Power Does It Take to Be Fast?

First, watch this.

It's Defender, an arcade game released thirty years ago. I went out of my way to find footage running on the original hardware, not emulated on a modern computer. (There's clearer video from an emulator if you prefer.)

Here's the first point of note: Defender is running on a 1MHz 8-bit processor. That's right: ONE megahertz. This was before the days of pipelined, superscalar architectures, so if an instruction took 5 cycles to execute, it always took 5 cycles.
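To put that in perspective, here's a rough back-of-the-envelope budget. The 60Hz refresh rate and the four-cycles-per-instruction average are my assumptions for illustration, not figures from the hardware documentation:

    /* Rough cycle budget for a 1MHz CPU redrawing the screen every frame.
       The refresh rate and cycles-per-instruction average are assumed. */
    #include <stdio.h>

    int main(void)
    {
        const long cycles_per_second      = 1000000; /* 1MHz clock */
        const long frames_per_second      = 60;      /* assumed display refresh */
        const long cycles_per_instruction = 4;       /* assumed average */

        long cycles_per_frame = cycles_per_second / frames_per_second;

        printf("cycles per frame: ~%ld\n", cycles_per_frame);      /* ~16,666 */
        printf("instructions per frame: ~%ld\n",
               cycles_per_frame / cycles_per_instruction);         /* ~4,166 */
        return 0;
    }

That works out to only a few thousand instructions per frame for game logic and drawing combined.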

Here's the second: Unlike a lot of games from the early 1980s, there's no hardware-assisted graphics. No honest-to-goodness sprites where the video processor does all the work. No hardware to move blocks of memory around. The screen is just a big bitmap, and all the drawing of the enemies, the score, the scrolling mountains, and the special effects is handled by the same processor that's running the rest of the code.

To be fair, the screen is only 320x256 with four bits per pixel. But remember, this was 1980, and home computers released up until mid-1985 didn't have that combination of resolution and color.
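For a sense of what "just a big bitmap" means in practice, here's a minimal sketch of a 320x256, four-bit-per-pixel framebuffer using a simple packed layout, two pixels per byte. The packing and the plot routine are illustrative assumptions, not a description of the actual Williams hardware:

    /* A 320x256 screen at four bits per pixel, packed two pixels per byte.
       The layout is assumed for illustration; the real hardware may differ. */
    #include <stdint.h>

    #define WIDTH    320
    #define HEIGHT   256
    #define FB_BYTES (WIDTH * HEIGHT / 2)   /* 40,960 bytes for the whole screen */

    static uint8_t framebuffer[FB_BYTES];

    /* Set one pixel to a 4-bit color. Two pixels share each byte, so the CPU
       reads, masks, and rewrites a byte for every pixel it touches. */
    static void plot(int x, int y, uint8_t color)
    {
        uint8_t *p = &framebuffer[(y * WIDTH + x) / 2];
        if (x & 1)
            *p = (uint8_t)((*p & 0xF0) | (color & 0x0F));         /* odd x: low nibble  */
        else
            *p = (uint8_t)((*p & 0x0F) | ((color & 0x0F) << 4));  /* even x: high nibble */
    }

    int main(void)
    {
        /* Draw a horizontal stripe the way the CPU would: one pixel at a time. */
        for (int x = 0; x < WIDTH; x++)
            plot(x, 100, 7);
        return 0;
    }

Every ship, mountain, and score digit ultimately comes down to loops of reads and writes like that, with about 40KB of screen memory to keep up to date and no graphics hardware to help.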

Now it's 2010, and there's much amazement at the responsiveness of the iPad. And why shouldn't it be responsive? There's a 32-bit, gigahertz CPU in there that can run multiple instructions at the same time. Images are moved around by a separate processor dedicated entirely to graphics. When you flick your finger across the screen and some images slide around, there's very little computation involved. The CPU is tracking some input and sending some commands to the GPU. The GPU is happy to render what you want, and a couple of 2D images are far below the tens of thousands of texture-mapped polygons it was designed to handle.

Okay, JPEG decompression takes some effort. Ditto for drawing curve-based, anti-aliased fonts. And of course there's overhead involved in the application framework, where messages get passed around to delegates and so on. But none of this justifies the assumption that it takes amazing computing power to provide a responsive user experience. We're so used to clunky, static interfaces, to programs that take a long time to load, and to unsettling pauses when highlighting certain menu items, that we've come to expect them.

All the fawning over the speed of the iPad is a good reminder that it doesn't have to be this way.

(If you liked this, you might like Slow Languages Battle Across Time.)

April 24, 2010

