It's not about technology for its own sake. It's about being able to implement your ideas.
Interaction designers have leveled some harsh criticisms at the GUI status quo over the last 20+ years. The mouse is an inefficient input device. The desktop metaphor is awkward and misguided. Users shouldn't be exposed to low-level details like the raw file system or the need to save their work.
And they were right.
But instead of better human/computer interaction, we got faster processors and hotter processors and multiple processors and entire processors devoted to 3D graphics. None of which are bad, mind you, but it was always odd to see such tremendous advances in hardware while the researchers promoting more pleasant user experiences wrote books that were eagerly read by people who enjoyed smirking at the wrong-headedness of an entire industry--yet who weren't motivated enough to do anything about it. Or so it seemed.
It's miraculous that in 2011, the biggest-selling computers are mouse-free, run programs that take over the entire screen without the noise of a faux desktop, and the entire concept of "saving" has been rendered obsolete.
Clearly, someone listened.
(If you liked this, you might enjoy Free Your Technical Aesthetic from the 1970s.)
April 11, 2011
I'm James Hague, a recovering programmer who has been designing video games since the 1980s. Programming Without Being Obsessed With Programming and Organizational Skills Beat Algorithmic Wizardry are good starting points. For the older stuff, try the 2012 Retrospective.