
Digging Out from Years of Homogeneous Computing

When I first started looking into functional programming languages, one phrase that I kept seeing in discussions was As Fast as C. A popular theory was that functional programming was failing to catch on primarily because of performance issues. If only implementations of Haskell, ML, and Erlang could be made As Fast as C, then programmers would flock to these languages.

Since then, all functional languages have gotten impressively fast. The top-end PC in 1998 was a 350MHz Pentium II; the passage of time alone has solved all non-algorithmic speed issues. But at the same time there was a push for native code generation, for better compilers, for more optimization. That focus was a mistake, and it would take a decade for the full effect of that decision to come to light.

In the early 2000s, PCs were the computing world. I'm not using "personal computer" in the generic sense; I'm talking about x86 architecture boxes with window-oriented GUIs and roughly the same peripherals. People in the demo-coding scene would shorten the term "x86 assembly language" to "asm" as if no other processor families existed. The Linux and Windows folks with nothing better to do argued back and forth, but they were largely talking about different shades of the same thing. One of the biggest points of contention in the Linux community was how to get a standard, "better than Windows but roughly the same" GUI for Linux, and several were in development.

Then in 2007 the iPhone arrived and everything changed.

This has nothing to do with Apple fanboyism. It's that a new computer design which disregarded all the familiar tenets of personal computing unexpectedly became a major platform. The mouse was replaced with a touchscreen. The decades-old metaphor of overlapping windows shuffled around like papers on a table was replaced by apps that owned all the pixels of the device. All those years of learning the intricacies of the Win32 API no longer mattered; this was something else entirely. And most significantly for our purposes: the CPU was no longer an x86.

Compiler writers had been working hard, and showing great progress, at turning Haskell and Objective Caml into fast x86 machine code. Then, through no fault of their own, they had to deal with an ARM CPU and a new operating system to interface with, not to mention that Objective-C was clearly the path of least resistance on hardware developed, and rapidly iterated on, by a company that promoted Objective-C.

That a functional language compiler on a desktop PC was getting within a reasonable factor of the execution time of C no longer mattered if you were a mobile developer. The entire emphasis put on native code compilation seemed questionable. With the benefit of hindsight, it would have been better to focus on ease of use and beautiful coding environments, on smallness and embeddability. I think that would have been a tough sell fifteen years ago, to developers blinded by the holy grail of becoming As Fast as C.

To be fair, ARM did become a target for the Glasgow Haskell Compiler, though it's still not a reasonable option for iOS developers, and I doubt that's the intent. But there is one little language that was around fifteen years ago, one based around a vanilla interpreter, one that's dozens of times slower than Haskell in the general case. That language is Lua, and it gets a lot of use on the iPhone, because it was designed from the start to be embeddable in C programs.
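That embeddability is worth making concrete. The sketch below is mine, not from the post; the file name and build flags are illustrative. It shows roughly what hosting Lua in a C program looks like: the whole interpreter is just a library behind a small C API, so a handful of calls gets you a working language runtime inside your app.

    /* host.c -- a minimal sketch of embedding Lua in a C program.
       Build with something like: cc host.c -llua -lm
       (exact libraries and flags vary by platform and Lua version). */
    #include <stdio.h>
    #include <lua.h>
    #include <lauxlib.h>
    #include <lualib.h>

    int main(void) {
        lua_State *L = luaL_newstate();  /* create a fresh interpreter */
        luaL_openlibs(L);                /* load the standard libraries */

        /* Run a chunk of Lua source held in a C string; a game would
           typically load its scripts from files or an asset system. */
        if (luaL_dostring(L, "print('hello from embedded Lua')")) {
            fprintf(stderr, "lua error: %s\n", lua_tostring(L, -1));
        }

        lua_close(L);                    /* tear the interpreter down */
        return 0;
    }

Nothing here cares what CPU it runs on; the host C compiler takes care of that, which is exactly why the approach survived the jump to ARM.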

(If you liked this, you might enjoy Caught-Up with 20 Years of UI Criticism.)

September 27, 2012

