programming in the twenty-first century

It's not about technology for its own sake. It's about being able to implement your ideas.

Instant-On

"Mobile" is the popular term used to describe devices like the iPhone and iPad. I prefer "instant-on." Sure, they are mobile, but what makes them useful is that you can just turn them on and start working. All the usual baggage associated with starting-up a computer--multiple boot sequences that add up to a minute or more of time, followed by a general sluggishness while things settle down--are gone.

What's especially interesting to me is that instant-on is not new, not by any means, but it was set aside as a goal, even dismissed as the impossible stuff of fantasy.

Turn on any 1970s-era calculator. It's on and usable immediately.

Turn on any 1970s or 1980s game console. It's on and usable immediately.

Turn on any 8-bit home computer. Give it a second or two, and there's the BASIC prompt. You can start typing code or use it as a fancy calculator (a favorite example of Jef Raskin's). To be fair, it wasn't quite so quick once you started loading extensions to the operating system from a floppy disk (such as Atari DOS).

That it became unremarkable for a PC to take ninety seconds to two minutes to fully boot up shows just how far things had strayed from the simple, pleasing goal of instant-on. Yes, operating systems were bigger and did more. Yes, a computer from 2000 was so much more powerful than one from 1985. But those long boot times kept them firmly rooted in the traditional computer world. They reveled in being big iron, with slow self-test sequences and disjointed flickering between different screens of cryptic boot messages.

And now, thankfully, instant-on is back. Maybe not truly instant; there's still a perceived start-up time on an iPad. But it's short enough that it doesn't get in the way, that by the time you've gotten comfortable and shifted into the mindset for your new task, the hardware is ready to use. That small shift from ninety seconds to less than ten makes all the difference.

(If you liked this, you might like How Much Processing Power Does it Take to be Fast?)

December 19, 2010
