Jon Evans is a novelist, journalist, and software engineer. His novels have been published around the world, translated into several languages, and praised by The Times, The Economist, and the Washington Post. His journalism has appeared in Wired, Reader's Digest, The Guardian, The Globe & Mail, and The Times of India, and he writes a weekly column for TechCrunch.
hundreds of thousands to its online programming tutorials. "Those jumping on board say they are preparing for a future in which the Internet is the foundation for entertainment, education and nearly everything else ...
went on to write about how "many professors of computer science say college graduates in every major should understand software fundamentals."

At parties these days, people are more impressed when I say I write apps than when I say I've had a few novels published. All of which should seem unreservedly great to those of us who started programming when we were ten and haven't much stopped since. So why does this sudden surge of enthusiasm make me feel so uneasy?

Partly, I suppose, because something like this happened once before, and it didn't end well. Remember how hackers were hot in the late '90s, and would-be dot-commers flooded computer-science classes everywhere? Demand for programmers was sky-high back then too -- sound familiar? Half of every team I worked on in those days was composed of people who couldn't be trusted with anything beyond basic programming grunt work, if that. It's no coincidence that the best technical team I ever worked with was in 2002, right after the dot-bust weeded out all of the chaff.

But mostly, I think, I'm uneasy because it seems like the wrong people are taking up coding, for the wrong reasons. It's disconcerting that everyone quoted in the articles above says they want to be "literate" or "fluent", to "understand" or to teach "computational thinking." Learning how to program for its own sake is like learning French purely on the off chance that you one day find yourself in Paris. People who do that generally become people who think they know some French, only to discover, once in France, that they can't actually communicate worth a damn. Shouldn't people who want to take up programming have some kind of project in mind first?

Non-coders tend to think of different programming languages as, well, different languages. I've long maintained that while programming itself -- "computational thinking", as the professor put it -- is indeed very like a language, "programming languages" are mere dialects.
Like other languages, though, or like music, it's best learned by the young. I am skeptical of the notion that many people who start learning to code in their 30s or even 20s will ever really grok the fundamental abstract notions of software architecture and design. Stross quotes Michael Littman of Rutgers: "Computational thinking should have been covered in middle school, and it isn't, so we in the CS department must offer the equivalent of a remedial course."
So let's focus on how we teach programming to the next generation. But tackling a few online tutorials in your 20s or later, when you have no existing basis in the field, or learning a few remedial dumbed-down concepts in college? I fear that for the vast majority of people, that's going to be much too little, far too late. Joseph Conrad didn't speak a word of English until his 20s, and he became one of the language's great stylists. But most of us need to learn other languages when we're young. I'm sorry to say that I think the same is true for programming.