It's not about technology for its own sake. It's about being able to implement your ideas.
Back before I completely lost interest in debates about programming topics, I remember reading an online discussion that went like this:
Raving Zealot: Garbage collection is FASTER than manual memory management!
Experienced Programmer: You mean that garbage collection is faster than using free to manage a heap. You can use pools and static allocation, and they'll be faster and more predictable than garbage collection.
Raving Zealot: You need to get over your attitude that programming is a MACHO and RECKLESS endeavor! If you use a garbage collected language, NOTHING can go wrong. You're PROTECTED from error, and not reliant on your MACHONESS.
What struck me about this argument, besides that people actually argue about such things, is how many other respected activities don't have anywhere near the same level of paranoia about protection from mistakes. On the guitar--or any musical instrument--you can play any note at any time, even if it's out of key or, more fundamentally, not played correctly (wrong finger placement or pressure, or accidentally muting the string). And people play instruments live, in concert, in front of thousands of people this way, knowing that the solo is improvised in Dorian E, and there's no physical barrier preventing a finger from hitting notes that aren't in that mode. The same goes for sculpting, or painting, or carpentry...almost anything that requires skill.
(And building chickadee houses isn't universally considered a MACHO hobby, even though it involves the use of POWER TOOLS which can LOP OFF FINGERS.)
In these activities, mistakes are usually obvious and immediate: you played the wrong note, you cut a board to the wrong length, there's blood everywhere. In macho programming, a mistake can be silent, only coming to light when there's a crash in another part of the code--even days later--or when the database gets corrupted. Stupidly trivial code can cause this, like:
array[index] = true;
when index is -1. And yet, with this incredible potential for error, people still build operating systems and giant applications and massively multiplayer games in C and C++. Clearly there's a lot of machoness out there, or it's simply that time and debugging and testing--and the acceptance that there will be bugs--can overcome what appear to be technical impossibilities. It's hand-rolling matrix multiplication code for a custom digital signal processor vs. "my professor told me that assembly language is impossible for humans to use."
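To make the silence of that bug concrete without wading into C's undefined behavior, here's the same one-liner sketched in Python, where a -1 index doesn't corrupt memory but still goes quietly wrong--it writes the wrong slot without a peep:

```python
flags = [False] * 4
index = -1             # the same stupidly trivial bug

flags[index] = True    # no crash: Python's negative indexing silently
                       # writes the LAST slot instead of the intended one

print(flags)           # [False, False, False, True]
```

In C the store would scribble over whatever sits just before the array; here the program stays "safe" but is just as wrong, and just as quiet about it.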
Would I prefer to ditch all high-level improvements in exchange for programming being the technical equivalent of rock climbing? NO! You can romanticize it all you want, but when I wrote 8-bit games I clearly remember thinking how much more pleasant it was to tinker in BASIC than to spend hours coding up some crazy 6502 code that would lock up the entire computer time after time (the bug would be that changing a loop index from 120 to 130 made it start out negative, so the loop would end after one iteration, or some other obscurity).
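That parenthetical bug is plain 8-bit signed wraparound: 130 doesn't fit in a signed byte. A few lines of Python (the helper name is mine) show what the 6502's branch-on-minus would see:

```python
def as_signed_byte(n):
    """Interpret the low 8 bits of n as a two's-complement signed byte."""
    b = n & 0xFF
    return b - 256 if b >= 0x80 else b

print(as_signed_byte(120))   # 120: the loop counts down as intended
print(as_signed_byte(130))   # -126: the "still positive?" test fails almost immediately
```

120 is below 0x80, so it reads as itself; 130 is 0x82, which the sign bit turns into -126.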
What both this retro example and the C one-liner have in common is that the core difficulty stems less from the language itself than from the fact that code is turned loose directly on hardware, so crashes are really crashes, and the whole illusion that your source code is actually the program being executed disappears. Problems are debugged at the hardware level, with data breakpoints and trapped CPU exceptions and protected memory pages (this is how debuggers work).
Writing an interpreter for your favorite low-level language is a project suitable for a single-semester undergraduate class. Write it in Scheme or Erlang or Scala. Use symbolic addresses, not a big array of integers, to represent memory. Keep track of address offsets instead of doing the actual math. Have functions return lists of the memory addresses that have been read from or modified. Keep everything super simple and clean. The goal is to be able to enter expressions or functions and see how they behave, which is a whole lot nicer than tripping address exceptions.
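A minimal sketch of the shape such an interpreter might take--in Python rather than Scheme, with a made-up three-instruction machine and string-keyed symbolic memory, all my own invention for illustration:

```python
def run(program, memory):
    """Execute a list of (op, address) pairs against a symbolic memory dict.

    Returns the final accumulator plus lists of every address read
    from and written to, so a mistake shows up as a wrong note in the
    trace instead of a hardware exception.
    """
    reads, writes = [], []
    acc = 0
    for op, addr in program:
        if op == "load":
            acc = memory.get(addr, 0)
            reads.append(addr)
        elif op == "add":
            acc += memory.get(addr, 0)
            reads.append(addr)
        elif op == "store":
            memory[addr] = acc
            writes.append(addr)
        else:
            raise ValueError(f"unknown op: {op}")
    return acc, reads, writes

mem = {"x": 2, "y": 3}
acc, reads, writes = run([("load", "x"), ("add", "y"), ("store", "sum")], mem)
print(acc, reads, writes)   # 5 ['x', 'y'] ['sum']
print(mem)                  # {'x': 2, 'y': 3, 'sum': 5}
```

Addresses are names, not integers, so a stray store lands on a visibly wrong symbol rather than silently clobbering a neighbor.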
All of a sudden, even hardcore machine code isn't nearly so scary. Write a dangerous function, get back a symbolic representation of what it did. Mistakes are now simply wrong notes, provided you keep your functions small. It's still not easy, but macho has become safe.
(If you liked this, you might enjoy Sending Modern Languages Back to 1980s Game Programmers.)
permalink August 30, 2008
I'm James Hague, a recovering programmer who has been designing video games since the 1980s. Programming Without Being Obsessed With Programming and Organizational Skills Beat Algorithmic Wizardry are good starting points. For the older stuff, try the 2012 Retrospective.