It's not about technology for its own sake. It's about being able to implement your ideas.
When I was in college, I took an upper-level course called "Operating Systems." It was decidedly hardcore: preemptive multitasking and synchronization, task scheduling, resource management, deadlock avoidance, and so on. These were the dark, difficult secrets that few people had experience with. Writing one's own operating system was the pinnacle of geeky computer science aspirations.
The most interesting thing about that course, in retrospect, is what wasn't taught: anything about how someone would actually use an operating system. I don't mean flighty topics like flat vs. skeuomorphic design; I mean dropping way down to something as fundamental as how to start an application, or even how you'd know which applications are available to choose from. Those were below the radar of the computer science definition of "operating system." And not just for the course, either. Soft, user-experience topics were nowhere to be found in the entire curriculum.
At this point, I expect there are some reactions to the previous two paragraphs brewing:
"You're confusing computer science and human-computer interaction! They're two different subjects!"
"Of course you wouldn't talk about those things in an operating systems course! It's about the lowest-level building blocks of an OS, not about user interfaces."
"I don't care about that non-technical stuff! Some designer-type can do that. I'm doing the engineering work."
There's some truth in each of these--and the third is simply a personal choice--but all it takes is reading a review of OS X or Windows where hundreds of words are devoted to incremental adjustments to the Start menu and dock to realize those fluffy details aren't so fluffy after all. They matter. If you want to build great software, you have to accept that people will dismiss your application because of an awkward UI or font readability issues, possibly switching to a more pleasing alternative that was put together by someone with much less coding skill than you.
So how do you nudge yourself in that direction without having to earn a second degree in a softer, designery field?
Learn basic graphic design. Not so much how to draw things or create your own artistic images (I'm hopeless in that regard), but how to use whitespace, how fonts work together, what a good color scheme looks like. Find web pages and book covers that you like and deconstruct them. Take the scary step of starting with a blank page and arranging colors and fonts and text boxes on it. Hands-on experimentation is the only way to get better at this.
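If a truly blank page is too intimidating, you can even run the experiment in code. Here's a throwaway sketch--assuming Python with the Pillow imaging library, which is my choice for illustration, not a requirement--that puts a restrained color scheme and a couple of text blocks on an empty canvas:

    # A throwaway layout experiment, assuming Pillow (pip install pillow).
    from PIL import Image, ImageDraw

    W, H = 800, 600
    page = Image.new("RGB", (W, H), "#f5f1e8")  # warm off-white background
    draw = ImageDraw.Draw(page)

    # A restrained scheme: one accent color, one text color.
    ACCENT, TEXT = "#c0392b", "#2d2d2d"

    # One thin accent rule, a wide left margin, plenty of whitespace.
    draw.rectangle([60, 60, 740, 66], fill=ACCENT)
    draw.text((60, 100), "A Title Set Off By Itself", fill=TEXT)
    draw.text((60, 140), "Body text hangs off the same left margin.", fill=TEXT)

    page.save("layout_experiment.png")

Swap in different colors and spacings and see what changes. The point isn't the output; it's forcing yourself to make and evaluate visual decisions.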
Read up on data visualization. Anything by Edward Tufte is a good place to start.
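One possible exercise to make that concrete--a sketch assuming Python and matplotlib, with invented numbers--is to take a default line chart and strip out everything that isn't data, in the spirit of Tufte's data-ink ratio:

    # A minimal, Tufte-style chart: remove non-data ink.
    # Assumes matplotlib; the data below is made up for illustration.
    import matplotlib.pyplot as plt

    years = [2009, 2010, 2011, 2012, 2013]
    users = [12, 18, 31, 40, 52]  # thousands

    fig, ax = plt.subplots()
    ax.plot(years, users, color="black", linewidth=1)

    # Drop the top and right frame lines so only the axes that
    # carry information remain.
    for side in ("top", "right"):
        ax.spines[side].set_visible(False)

    ax.set_xlabel("Year")
    ax.set_ylabel("Users (thousands)")
    fig.savefig("minimal_chart.png", dpi=150)

Compare the result against the version with all the defaults left on, and notice how little you lose.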
Foster a minimalist aesthetic. If you're striving for minimalism, then you're giving just as much thought to what to leave out as what to include, and you need to make hard choices. That level of thought and focus is only going to make your application better. You can go too far with minimalism, but a quick glance around the modern software world shows that this isn't a major worry.
Don't build something a certain way simply because "that's how it's always been done." There's a strong programmer impulse to clone, to implement what you've already seen. That can result in long eras of misguidedly stagnant IDEs or calculator apps, because developers have lost sight of the original problem and are simply rehashing what they're familiar with.
Optimize for things that directly affect users. Speed and memory are abstract in most cases. Would you even notice if an iPhone app used 10 megabytes instead of 20? Documentation size and tutorial length are more concrete, as are the number of steps it takes to complete common tasks.
June 5, 2013