I'm James Hague, a recovering programmer who has been designing video games since the 1980s. This is Why You Spent All that Time Learning to Program and The Pure Tech Side is the Dark Side are good places to start.
Programming Modern Systems Like It Was 1984
Imagine you were a professional programmer in 1984, then you went to sleep and woke up 30 years later. How would your development habits be changed by the ubiquitous, consumer-level supercomputers of 2014?
Before getting to the answer, realize that 1984 wasn't all about the cycle-counting micro-optimizations that you might expect. Well, actually it was, at least in home hobbyist circles, but out of necessity and not because it was the most pleasant option. BASIC's interactivity was more fun than sequencing seven instructions to perform the astounding task of summing two 16-bit numbers. Scheme, ML, and Prolog were all developed in the previous decade. Hughes's Why Functional Programming Matters was written in 1984, and it's easy to forget how far, far from practical reality those recommendations must have seemed at the time. It was another year of two parallel universes, one of towering computer science ideas and the other of popular hardware incapable of implementing them.
The possible answers to the original question--and mind you, they're only possible answers--are colored by the sudden shift of a developer jumping from 1984 to 2014 all in one go, without experiencing thirty years of evolution.
It's time to be using all of those high-level languages that are so fun and expressive, but were set aside because they pushed the limits of expensive minicomputers with four megabytes of memory and weren't even on the table for the gold rush of 8-bit computer games.
Highly optimizing compilers aren't worth the risk. Everything is thousands, tens of thousands, of times faster than it used to be. Chasing an additional 2-4x through complex and sensitive manipulations isn't worth it. You'll regret your decision when, for no clear reason, your app starts breaking up at high optimization settings, maybe only on some platforms. How can anyone have confidence in a tool like that?
Something is wrong if most programs don't run instantaneously. Why does this little command line program take two seconds to load and print the version number? It would take serious effort to make it that slow. Why does a minor update to a simple app require re-downloading all 50MB of it? Why are there 20,000 lines of code in this small utility? Why is no one questioning any of this?
Design applications as small executables that communicate. Everything is set up for this style of development: multi-core processors, lots of memory, native support for pipes and sockets. This gives you multi-core support without dealing with threads. It's also the most bulletproof way of isolating components, instead of the false confidence of marking class members "private."
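As a minimal sketch of that style, here's one process feeding another over a pipe, the same way a shell wires up "producer | consumer." The two programs (seq and wc here, chosen only for illustration) run as separate OS processes, so they can occupy separate cores, and the only coupling between them is the byte stream.

```python
import subprocess

# Connect two small executables with a pipe: seq generates a stream
# of numbers, wc -l counts the lines it receives. Each runs as its
# own process, isolated except for the pipe between them.
producer = subprocess.Popen(
    ["seq", "1", "1000000"], stdout=subprocess.PIPE
)
consumer = subprocess.Popen(
    ["wc", "-l"], stdin=producer.stdout, stdout=subprocess.PIPE
)
producer.stdout.close()  # so the consumer sees EOF when the producer exits
out, _ = consumer.communicate()
print(out.decode().strip())  # the line count: 1000000
```

The same wiring works with sockets when the components live on different machines; the design doesn't change.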
Don't write temporary files to disk, ever. There's so much RAM you can have nightmares about getting lost in it. On the most fundamental level, why isn't it possible to create and execute a script without saving to a file first? Why does every tweak to a learning-the-language test program result in a two megabyte executable that shortly gets overwritten?
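It is in fact possible to skip the file step with most interpreters: feed the source over a pipe instead of saving it first. A sketch, using Python's own interpreter as the target (the throwaway one-liner is just for illustration):

```python
import subprocess
import sys

# Run a throwaway script without ever writing it to disk: pass the
# source to the interpreter on stdin ("-" means read from stdin).
source = "print(sum(n * n for n in range(10)))"
result = subprocess.run(
    [sys.executable, "-"],
    input=source.encode(),
    capture_output=True,
)
print(result.stdout.decode().strip())  # → 285
```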
Everything is so complex that you need to isolate yourself from as many libraries and APIs as possible. With thousands of pages of documentation for any system, it's all too easy to become entangled in endless specific details. Build applications to be self-contained and have well-defined paths for interfacing with the operating system, even if those paths involve communicating with an external, system-specific server program of sorts.
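One way to sketch that isolation, with hypothetical names: funnel every OS-specific call through a single narrow class, so the rest of the application never touches platform APIs directly. That same surface could later proxy to an external, system-specific server process without the application noticing.

```python
import os
import platform

class SystemInterface:
    """The one place the application talks to the operating system.

    Everything behind these methods is allowed to be platform-specific;
    nothing in front of them is.
    """

    def home_directory(self) -> str:
        return os.path.expanduser("~")

    def platform_name(self) -> str:
        return platform.system()  # e.g. "Linux", "Darwin", "Windows"

# Application code depends only on this narrow surface.
sys_iface = SystemInterface()
print(sys_iface.platform_name())
```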
C still doesn't have a module system? Seriously? And people are still using it, despite all the alternatives?
(If you liked this, you might enjoy Remembering a Revolution That Never Happened.)