It's not about technology for its own sake. It's about being able to implement your ideas.
The iPhone has obsoleted a number of physical gadgets. A little four-track recorder that I used as a notebook for song ideas. A stopwatch. A graphing calculator. Those ten-dollar LCD games from Toys 'R Us. And it works because an iPhone app takes over the device, giving the impression that it's a custom piece of hardware designed for that specific purpose.
But it's only an illusion. I can be in the middle of recording a track, and I get a call. That puts the recorder to sleep and switches over to the phone interface. Or I can be playing a game and the "Battery is below 20%" alert pops up at an inopportune moment. These are interesting edge cases, where the reality that the iPhone is a more complex system--and not a dedicated game player or recorder--bleeds into the user experience. These intrusions are driven by things outside of my control. I didn't ask to be called at that moment; it just happened. I understand that. I get it.
What if there were something I could do within an app that broke the illusion? Suppose that tapping the upper-left corner of the screen ten times in a row caused an app to quit (it doesn't; this is just an example). Now the rule that an app can do whatever it wants, interface-wise, has been violated. You could argue that tapping the corner of the screen ten times is so unlikely that it doesn't matter, but that's a blind assumption. Think of a game based around tapping, for example. Or a drum machine.
As it turns out, two such violations were introduced in iOS 5.
On the iPad, there are a number of system-wide gestures, such as swiping left or right with four fingers to switch between apps. Four-finger swipes? That's convoluted, but imagine a virtual mixing console with horizontal sliders. Quickly move four of them at once...and you switch apps. Application designers have to work around these, making sure that legitimate input methods don't mimic the system-level gestures.
The worst offender is this: swipe down from the top of the screen to reveal the Notification Center (a window containing calendar appointments, the weather, etc.). A single-finger vertical motion is hardly unusual, and many apps expect such input. The games Flight Control and Fruit Ninja are two prime examples. Unintentionally pulling down the Notification Center during normal gameplay is common. A centered vertical swipe is natural in any paint program, too. Do app designers need to design around such controls? Apparently, yes.
There's an easy operating system-level solution to the Notification Center problem: require the gesture to start on the status bar at the top of the screen, where the network status and battery indicator are displayed. Allowing the status bar in an app is already an intrusion, but one opted into by the developer. Some apps turn off the status bar, including many games, and that's fine. It's an indication that the Notification Center isn't available.
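This article predates any fix, but later iOS releases added hooks in roughly this spirit. What follows is a minimal sketch, not anything from the original post: assuming a full-screen UIKit game running on iOS 11 or later, hiding the status bar and asking the system to defer the top-edge gesture means a stray vertical swipe stays inside the app, and the Notification Center only opens after a deliberate second swipe. The class name is made up; the two overrides are standard UIViewController hooks.

import UIKit

// Hypothetical full-screen game controller (iOS 11+).
class GameViewController: UIViewController {

    // Hide the status bar. In the scheme described above, this is the
    // in-app signal that the Notification Center isn't available.
    override var prefersStatusBarHidden: Bool {
        return true
    }

    // Defer the top-edge system gesture: the first swipe from the top
    // shows an indicator instead of immediately pulling down notifications.
    override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge {
        return .top
    }
}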
(If you liked this, you might enjoy Caught-Up with 20 Years of UI Criticism.)
December 30, 2011