"As new code was compiled, older code (and other memory used by the compiler) was orphaned, eventually causing the PC to run low on free memory. A slow garbage collection process would automatically occur when available memory became sufficiently low, and the compiler would be unresponsive until the process had completed, sometimes taking as long as 15 minutes."
—Naughty Dog's Jak and Daxter post-mortem
I know the title will bait people who won't actually read any of this article, so I'll say it right up front to make them feel the error of their reactionary ways: I am pro garbage collection. It has nothing to do with manual memory management supposedly being too hard (good grief no). What I like is that it stops me from thinking about trivial usage of memory at all. If it would be more convenient to briefly have a data structure in a different format, I just create a new version transformed the way I want it. Manually allocating and freeing these insignificant bits of memory is just busywork.
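To make that concrete, here's the sort of throwaway structure I mean, sketched in Java. (The User type and scoreFor are hypothetical, invented just for this illustration.) A temporary index gets built, used once, and abandoned, and the collector deals with the aftermath:

    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    class ScratchIndex {
        // Hypothetical record type for the example.
        static class User {
            final String name;
            final int score;
            User(String name, int score) { this.name = name; this.score = score; }
        }

        // Build a throwaway index, use it once, and walk away.
        // No matching free() calls, no ownership bookkeeping.
        static int scoreFor(List<User> users, String name) {
            Map<String, Integer> byName = new HashMap<String, Integer>();
            for (User u : users) {
                byName.put(u.name, u.score);
            }
            Integer score = byName.get(name);
            return score == null ? 0 : score;
        }
    }

Whether that temporary map is wasteful is exactly the kind of question I no longer want to spend time on.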
That's hardly a bold opinion in 2008. There are more programming languages in popular use with garbage collection than there are without. Most of the past paranoia about garbage collection slowness and pauses has been set aside in favor of increased productivity. Computers have gotten much faster. Garbage collectors have gotten better. But those old fears are still valid, those hitches and pauses still lurking, and not merely in the vague way that some people still assume integer division is dog slow, even on a 3GHz processor. In fact, they apply to every garbage collected language implementation in existence. Or, more formally:
In any garbage collector, there exists some pathological case where the responsiveness of your program will be compromised.
"Responsiveness" only matters for interactive applications or any program that's vaguely real-time. In a rocket engine monitoring system, responsiveness may mean "on the order of a few microseconds." In a robotic probe used for surgery, it might be "on the order of four milliseconds." For a desktop application, it might be in the realm of one to two seconds; beyond that, users will be shaking the mouse in frustration.
Now about the "pathological case." This is easy to prove. In a garbage collector, performance is always directly proportional to something. It might be total number of memory allocations. It might be the amount of live data. It might be something else. For the sake of discussion let's assume it's the amount of live data. Collection times might be acceptable for 10MB of live data, maybe even 100MB, but you can always come up with larger numbers: 250MB...or 2GB. Or in a couple of years, 20GB. No matter what you do, at some point the garbage collector is going to end up churning through those 250MB or 2GB or 20GB of data, and you're going to feel it.
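You can get a feel for this with a rough experiment. Here's my own sketch, in Java, and it's nothing like a rigorous benchmark: pin an ever-larger pile of live data and time a forced collection at each step. System.gc() is only a hint to the JVM, so the numbers are suggestive at best, but the trend comes through. Run it with a large heap (say, -Xmx4g) so the live set fits:

    import java.util.ArrayList;
    import java.util.List;

    // Rough sketch: keep an ever-larger set of live objects reachable
    // and time a forced full collection at each step. System.gc() is
    // only a request, so treat the timings as suggestive, not exact.
    class PauseScaling {
        public static void main(String[] args) {
            List<byte[]> live = new ArrayList<byte[]>();
            for (int step = 1; step <= 8; step++) {
                // Grow the live set by roughly 128MB per step.
                for (int i = 0; i < 128; i++) {
                    live.add(new byte[1024 * 1024]);
                }
                long start = System.nanoTime();
                System.gc();
                long ms = (System.nanoTime() - start) / 1000000;
                System.out.println(live.size() + "MB live, full GC took ~" + ms + "ms");
            }
        }
    }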
Ah, but what about generational collectors? They're based on the observation that most objects are short lived, so memory is divided into a nursery for new allocations and a separate larger pool for older data (or even a third pool for grandfatherly data). When the nursery is full, live data is promoted to the larger pool. These fairly cheap nursery collections keep happening, and that big, secondary pool fills up a little more each time. And then, somewhere, sometime, the old generation fills up, all 200MB of it. This scheme has simply delayed the inevitable. The monster, full-memory collection is still there, waiting for when it will strike.
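The arithmetic is easy to model. This is a toy simulation with numbers I made up, not the behavior of any real collector, but it shows how a trickle of promotions adds up:

    // Toy model of a two-generation collector. Minor collections are
    // cheap, but each one promotes survivors into the old generation,
    // which eventually forces a major collection over the whole thing.
    class GenerationalModel {
        public static void main(String[] args) {
            final int nurserySize = 4;        // MB reclaimed per minor collection
            final double survivalRate = 0.1;  // fraction of the nursery promoted
            final int oldGenLimit = 200;      // MB before a major collection hits

            double oldGen = 0;
            int minorCollections = 0;
            while (oldGen < oldGenLimit) {
                oldGen += nurserySize * survivalRate;  // promote the survivors
                minorCollections++;
            }
            System.out.println(minorCollections + " cheap minor collections, then");
            System.out.println("one major collection over ~" + (int) oldGen + "MB of old data");
        }
    }

With a 4MB nursery and ten percent of it surviving each time, that's 500 painless minor collections before the 200MB major collection comes due.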
What about real-time garbage collection? More and more, I'm starting to see this as a twist on the myth of the Sufficiently Smart Compiler. If you view "real time" as "well engineered and fast," then it applies to most collectors in use, and they each still have some point, somewhere down the road, at which the pretense of being real time falls apart. The other interpretation of real time is some form of incremental collection, where a little bit of GC happens here, a little bit there, and there's never a big, painful pause.
An interesting question is this: What language systems in existence are using a true incremental or concurrent garbage collector? I know of three: Java, Objective-C 2.0 (which just shipped with OS X Leopard), and the .NET runtime. Not Haskell. Not Erlang. Not Objective Caml [EDIT: The OCaml collector for the second generation is incremental]. Not any version of Lisp or Scheme. Not Smalltalk. Not Ruby. That raises a lot of questions. Clearly incremental and concurrent collection aren't magic bullets, or they'd be a standard part of language implementations. Is it that the additional overhead of concurrent collection is only worthwhile in imperative languages with lots of frequently modified, cross-linked data? I don't know.
Incremental collection is a trickier problem than it sounds. You can't just look at an individual object and decide whether to copy or free it: to know whether an object is live, you've got to scan the rest of the world. The incremental collectors I'm familiar with don't escape this. They involve a full, non-incremental marking phase, and then the copying and compaction are spread out over time. The expense of such a collector is still proportional to the amount of data that must be scanned during the marking phase, so it has a lurking pathological case of its own.
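Here's a skeletal version of what I'm describing, under my own simplifying assumptions; it's not pseudocode lifted from any real runtime. The mark phase still touches every live object in one shot, and only the sweep is doled out in bounded slices:

    import java.util.ArrayDeque;
    import java.util.ArrayList;
    import java.util.Deque;
    import java.util.List;

    // Sketch of a "mostly incremental" collector: a full, stop-the-world
    // mark, then a sweep spread across frames. The mark's cost still
    // grows with the live set, and that's the lurking pathological case.
    class IncrementalishGC {
        static class Obj {
            boolean marked;
            List<Obj> refs = new ArrayList<Obj>();
        }

        // Non-incremental: trace everything reachable from the roots.
        static void markAll(List<Obj> roots) {
            Deque<Obj> stack = new ArrayDeque<Obj>(roots);
            while (!stack.isEmpty()) {
                Obj o = stack.pop();
                if (o.marked) continue;
                o.marked = true;
                stack.addAll(o.refs);
            }
        }

        // Incremental: process a bounded slice of the heap per call, so
        // the pause per "frame" stays small. The caller resumes from the
        // returned index on the next frame.
        static int sweepSlice(List<Obj> heap, int start, int budget) {
            int end = Math.min(start + budget, heap.size());
            for (int i = start; i < end; i++) {
                Obj o = heap.get(i);
                if (!o.marked) {
                    // o is garbage; a real collector would reclaim it here.
                } else {
                    o.marked = false;  // reset for the next cycle
                }
            }
            return end;
        }
    }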
Does knowing that garbage collectors break down at some point mean we should live in fear of them and go back to manual heap management? Of course not. But it does mean that some careful thought is still required when it comes to dealing with very large data sets in garbage collected languages.
Next time: A look at how garbage collection works in Erlang. The lurking monster is still there, but there are some interesting ways of delaying his attack.
January 5, 2008