I'm James Hague, a recovering programmer who has been designing video games since the 1980s. This is Why You Spent All that Time Learning to Program and The Pure Tech Side is the Dark Side are good places to start.
Stumbling Into the Cold Expanse of Real Programming
This is going to look like I'm wallowing in nostalgia, but that's not my intent. Or maybe it is. I started writing this without a final destination in mind. It begins with a question:
How did fast action games exist at all on 8-bit systems?
Those were the days of processors living below the 2 MHz threshold, with each instruction run to completion before even considering the next. No floating point math. Barely any integer math, come to think of it: no multiplication or division instructions, and sums of more than 255 required chaining two additions through the carry flag.
But that kind of lively statistic slinging doesn't tell the whole story, or else there wouldn't have been so many animated games running--usually at sixty frames per second--on such seemingly incapable hardware. I can't speak to all the systems available at the time, but I can talk about the Atari 800 I learned to program on.
Most games didn't use memory-intensive bitmaps, but a gridded character mode. The graphics processor converted each byte to a character glyph as the display was scanned out. By default these glyphs looked like ASCII characters, but you could change them to whatever you wanted, so the display could be mazes or platforms or a landscape, and with multiple colors per character, too. Modify one of the character definitions and all the references to it would be drawn differently next frame, no CPU work involved.
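The economy of that scheme is easy to show. Here's a rough Python sketch of the idea--my own toy model, not the actual chip logic or data formats: the screen is just a grid of glyph indices, so redefining one glyph changes every cell that references it, with no per-cell work.

```python
# A toy model of a character-mapped display: screen memory holds glyph
# indices, and the "video chip" expands them through a glyph table at
# scan-out time.

GLYPHS = {
    0: [0b00000000] * 8,   # blank
    1: [0b11111111] * 8,   # solid wall block
}

# A tiny one-row "level": wall, blank, blank, wall.
screen = [1, 0, 0, 1]

def scan_out(screen, glyphs):
    """Expand glyph indices into pixel rows, as the hardware would."""
    return [[glyphs[c][row] for c in screen] for row in range(8)]

before = scan_out(screen, GLYPHS)

# Redefine glyph 1 once; every cell using it draws differently next frame.
GLYPHS[1] = [0b10101010] * 8
after = scan_out(screen, GLYPHS)

assert before[0] == [0b11111111, 0, 0, 0b11111111]
assert after[0] == [0b10101010, 0, 0, 0b10101010]
```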
Each row of characters could be pixel-shifted horizontally or vertically via two memory-mapped hardware registers, so you could smoothly scroll through levels without moving any data.
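The trick behind that smooth scrolling can be sketched in a few lines--again a model of the technique, not real register addresses: a fine-scroll value shifts the picture zero to seven pixels, and when it wraps you bump a coarse pointer one character further into the level map instead of copying any screen data.

```python
# Smooth horizontal scrolling, hardware-style: a fine-scroll offset
# shifts the view within a character, and the coarse pointer moves by
# whole characters. No screen data is ever copied.

def step_scroll(coarse, fine):
    """Advance the view one pixel to the right."""
    fine += 1
    if fine == 8:      # a full character width has scrolled past
        fine = 0
        coarse += 1    # point one character further into the map
    return coarse, fine

coarse, fine = 0, 6
coarse, fine = step_scroll(coarse, fine)   # within the same character
coarse, fine = step_scroll(coarse, fine)   # wraps to the next one
assert (coarse, fine) == (1, 0)
```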
Sprites, which were admittedly only a single color each, were merged with the tiled background as the video chip scanned out the frame. Nothing was ever drawn to a buffer, so nothing needed to be erased. The compositing happened as the image was sent to the monitor. A sprite could be moved by poking values in position registers.
The on-the-fly compositing also checked for overlap between sprites and background pixels, setting bits to indicate collisions. There was no need for even simple rectangle intersection tests in code, given pixel-perfect collision detection at the video processing level.
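In software terms, that pixel-perfect test boils down to ANDing sprite bits against background bits. A small Python sketch of the idea--my own model, not the chip's actual circuitry:

```python
# Pixel-perfect collision as a bitwise test: a sprite overlaps the
# background wherever both have a pixel bit set in the same position.
# The video chip did this for free during scan-out.

def collided(sprite_rows, bg_rows):
    """True if any sprite pixel lands on a background pixel."""
    return any(s & b for s, b in zip(sprite_rows, bg_rows))

background = [0b11110000, 0b11110000, 0b00000000]
near_miss  = [0b00001111, 0b00001111, 0b00000000]
overlap    = [0b00011000, 0b00000000, 0b00000000]

assert not collided(near_miss, background)   # touching edges, no overlap
assert collided(overlap, background)         # one shared pixel is enough
```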
What I never realized when working with all of these wonderful capabilities was that to a large extent I was merely scripting the hardware. The one sound and two video processors were doing the heavy lifting: flashing colors, drawing characters, positioning sprites, and reporting collisions. It was more than visuals and audio; I didn't even think about where random numbers came from. Well, that's not true: I knew they came from reading memory location 53770 (a hardware pseudo-random number generator that updated every cycle).
When I moved to newer systems I found I wasn't nearly the hotshot game coder I thought I was. I had taken for granted all the work that the dedicated hardware handled, allowing me to experiment with game design ideas.
On a pre-Windows PC of the early 1990s, I had to write my own sprite-drawing routines. Real ones, involving actual drawing and erasing. Clipping at the screen edges? There's something I never thought about. The Atari hardware silently took care of that. But before I could draw anything, I had to figure out what data format to use and how to preprocess source images into that format. I couldn't start a tone playing with two register settings; I had to write arcane sound mixing routines.
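For anyone who has only ever had sprites handed to them by hardware or a library, here's the flavor of what suddenly had to be written by hand. A simplified sketch (byte-per-pixel framebuffer, 0 meaning transparent; my own invention, not anyone's actual routine) of a blit that clips at the screen edges:

```python
# A software sprite blit with edge clipping--the kind of routine the
# Atari's hardware sprites made unnecessary. Framebuffer is a 2D list,
# one value per pixel, 0 = transparent.

W, H = 8, 6
screen = [[0] * W for _ in range(H)]

def blit(screen, sprite, x, y):
    """Draw sprite at (x, y), silently skipping off-screen pixels."""
    for sy, row in enumerate(sprite):
        for sx, px in enumerate(row):
            dx, dy = x + sx, y + sy
            if px and 0 <= dx < W and 0 <= dy < H:
                screen[dy][dx] = px

sprite = [[1, 1],
          [1, 1]]
blit(screen, sprite, -1, 0)   # half off the left edge: clipped, no crash
assert screen[0][0] == 1 and screen[0][1] == 0
```

Erasing meant either redrawing the background under the sprite's old position or restoring a saved patch of pixels--one more chore the hardware compositing had quietly absorbed.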
I had wandered out of the comfortable realm where I could design games in my head and make them play out on a TV at my parents' house and stumbled into the cold expanse of real programming.
(If you liked this, you might enjoy A Personal History of Compilation Speed.)