programming in the
twenty-first century

It's not about technology for its own sake. It's about being able to implement your ideas.

On Being Sufficiently Smart

I'm proud to have created the wiki page for the phrase "sufficiently smart compiler" back in 2003 or 2004. Not because it's a particularly good page, mind you; it has been endlessly rewritten in standard wiki fashion. It's one of the few cases where I recognized a meme and documented it. I'd been seeing the term over and over in various discussions, and it started to dawn on me that it was more than a term: it was a myth, a fictional entity used to support arguments.

If you're not familiar, here's a classic context for using "sufficiently smart compiler." Language X is much slower than C, but that's because floating point values are boxed and there's a garbage collection system. But...and here it comes...given a sufficiently smart compiler those values could be kept in registers and memory allocation patterns could be analyzed and reduced to static allocation. Of course that's quite a loaded phrase, right up there with "left as an exercise for the reader."

One of the key problems with having a sufficiently smart compiler is that not only does it have to be sufficiently smart, it also has to be perfect.

Back in the mid-1990s my wife and I started an indie game development company, and we needed some labels printed. At the time, all we had was a middling inkjet printer, so the camera-ready image we gave to the print shop was, to put it mildly, not of the necessary quality. Then we got the printed labels back and they looked fantastic. All the pixelated edges were smooth; we theorized that it was because of how the ink flowed during the printing process, but it didn't really matter. We had our labels and we were happy.

A few months later we needed to print some different labels, so we went through the same process, and the results were terrible. Every little flaw, every rough edge, every misplaced pixel, all perfectly reproduced on a roll of 1000 product labels. Apparently what had happened with the previous batch was that someone at the print shop took pity on our low-grade image and did a quick graphic-arts job, re-laying out the text in the same font and printing from that. The second time this didn't happen; the inkjet original was used directly. The problem wasn't that someone silently helped out, but that there was no indication of what was going on, and that the help wouldn't be there every time.

Let's say that a compiler can detect O(N^2) algorithms and replace them with O(N) equivalents. This is a classic example of being sufficiently smart. You can write code knowing that the compiler will transform and fix it for you. But what if the compiler isn't perfect (and it clearly won't be, as there aren't O(N) versions of all algorithms)? It will fix some parts of your code and leave others as-is. Now you run your program, and it's slow, but why? You need insight into what's going on behind the scenes to figure that out, and if you find the problem then you'll have to manually recode that section to use a linear approach. Wouldn't it be more transparent to simply use linear algorithms where possible in the first place, rather than having to second-guess the system?
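
To make the gap concrete, here's a sketch in Haskell of the kind of rewrite that fictional compiler would have to perform. The function names are mine, and no compiler I know of actually does this for you; the set-based version is O(N log N) rather than strictly O(N), but the contrast with the quadratic one is the point.

    import qualified Data.Set as Set

    -- Quadratic: every element is compared against everything after it.
    hasDuplicateSlow :: Ord a => [a] -> Bool
    hasDuplicateSlow []       = False
    hasDuplicateSlow (x : xs) = x `elem` xs || hasDuplicateSlow xs

    -- One pass, remembering what has already been seen in a set.
    hasDuplicateFast :: Ord a => [a] -> Bool
    hasDuplicateFast = go Set.empty
      where
        go _    []       = False
        go seen (x : xs)
          | x `Set.member` seen = True
          | otherwise           = go (Set.insert x seen) xs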

There's another option, and that's to have the compiler give concrete information about behind-the-scenes transformations. I have a good mental picture of how Erlang works, in terms of the compiler and run-time; it's usually straightforward to understand what kind of BEAM code will be generated from a particular piece of source. That was true, at least, until fancy optimizations of binary operations were introduced in 2008. The documentation uses low-level concepts like "match context" and discusses when segmented binaries are copied and so on. It's all abstract and difficult to grasp, and that's why there's a new compiler switch, "bin_opt_info," to provide a window into what kind of code is being generated. Going back to my early programming days, the manual for Turbo Pascal 4 listed exactly which optimizations were performed by the compiler.

The Glasgow Haskell Compiler (GHC) is the closest I've seen to a sufficiently smart compiler, with the advantages and drawbacks that come with such a designation.

I can write code that looks like it generates all kinds of intermediate lists (and indeed that would be the case with similar code in Erlang), and yet the compiler is usually sufficiently smart to remove all of that. Even in the cases where it isn't, it's not a make-or-break issue. In the worst case the Haskell code works like the Erlang version.
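
As a sketch of what I mean (the function name is mine, and whether fusion fires depends on the optimization level), the pipeline below looks as if it builds the range, a filtered list, and a mapped list, yet GHC will typically collapse it into a single loop:

    -- Three apparent list traversals; with -O, GHC's fusion rules
    -- usually reduce this to one loop with no intermediate lists.
    sumOfDoubledEvens :: Int -> Int
    sumOfDoubledEvens n = sum (map (* 2) (filter even [1 .. n]))

Turn optimization off, though, and the intermediate lists come back; the mental model shifts with the compiler flags.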

But then there's laziness. Laziness is such an intriguing idea: an operation can "complete" immediately, because the actual result isn't computed until there's specific demand for it, which might come very soon or in some other computation that happens much later. Now suppose you've got two very memory-intensive algorithms in your code, and each independently pushes the limits of available RAM. The question is, can you guarantee that the first algorithm won't be lazily delayed until it's forced to run right in the middle of the second, completely blowing the memory limit?
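
A small illustration of the effect, with names of my own choosing: both folds "return" right away in some sense, but the lazy one hands back a chain of suspended additions, and the memory cost lands later, wherever the result is finally demanded, unless the optimizer steps in (more on that below).

    import Data.List (foldl')

    -- Lazy left fold: unless the compiler intervenes, this builds one
    -- suspended (+) per element and defers the real work until
    -- something inspects the result.
    lazySum :: [Int] -> Int
    lazySum = foldl (+) 0

    -- Strict left fold: forces the running total at each step, so the
    -- work and the memory use happen here, not somewhere downstream.
    strictSum :: [Int] -> Int
    strictSum = foldl' (+) 0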

The GHC developers know that laziness can be expensive (or at least unnecessary in many cases), so strictness analysis is done to try to convert lazy code to non-lazy code. If and when that's successful, wonderful! Maybe some programs that would previously have blown up now won't. But this only works in some cases, so as a Haskell coder you've still got to worry about the cases where it doesn't. As much as I admire the Haskell language and the GHC implementation, I find it difficult to form a solid mental model of how Haskell code is executed, partially because that model can change drastically depending on what the compiler does. And that's the price of being sufficiently smart.
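
Here's the sort of gamble I mean, sketched with names of my own choosing. The first version leaves its space behavior to whatever the strictness analyzer can prove about a lazy pair; the second states the strictness outright, so there's nothing to second-guess.

    {-# LANGUAGE BangPatterns #-}

    -- Leaves it to the compiler: the pair accumulator is lazy, and
    -- whether this runs in constant space depends on what strictness
    -- analysis proves at the optimization level in use.
    meanAuto :: [Double] -> Double
    meanAuto xs = total / fromIntegral count
      where
        (total, count) =
          foldl (\(s, n) x -> (s + x, n + 1)) (0, 0 :: Int) xs

    -- States it directly: bang patterns force both accumulators at
    -- each step, so the space behavior no longer hinges on the analysis.
    meanStrict :: [Double] -> Double
    meanStrict = go 0 0
      where
        go :: Double -> Int -> [Double] -> Double
        go !s !n []       = s / fromIntegral n
        go !s !n (x : xs) = go (s + x) (n + 1) xs

Whether meanAuto leaks is exactly the kind of question you end up asking the compiler rather than the code.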

(Also see Digging Deeper into Sufficiently Smartness.)

permalink April 18, 2009

