I was out eating lunch today, as I normally do, and oddly enough, the following question came up: "Why are there no old programmers?"
This was based on the following observation, which was, admittedly, rather limited. At our company, we have maybe 6-7 programmers around 30, perhaps 3-4 above 30 but within their early 40s, and probably another 5-6 aged 25 or under. In any case, there aren't any programmers (well, maybe one) who are 45 or above. It's a relatively young company, and I'm on the older end.
Why is that? Why don't we find programmers above 45 or so?
It's not that they don't exist. I think one thing about "good" companies is that they tend to hire young, because while older programmers do have experience, the field now demands continuous learning, and that's a new skill being expected in computing that didn't exist before.
Think about this. If you look at programming in the early 80s, it didn't require you to understand databases (mostly), use version control, read incomplete API documentation on the web, know about things like RSS or CSS or various protocols, learn a sophisticated IDE, and then learn new stuff all over again a few years later.
Indeed, many jobs require you to do roughly the same thing over and over again, using the same skills over and over. For a programmer at the time, that would mean learning a single language, learning the techniques of the day, learning to debug in a certain way. All that time spent getting good at those things, and then they need to learn something new again.
To give an analogy, think about MS Word. Microsoft just came out with Word 2007, which is supposed to be an overhaul of the system. All the time you spent learning how to do things in Word may have to be partially relearned. At the very least, you have the benefit that it's Microsoft releasing the product, so they won't do something completely different. But it is different enough. You do need to do a fair bit of work to adapt skills you didn't think had to be adapted!
This happens to programmers all the time, and worse. There are more resources available to programmers, just as there are more resources available to everyone--from the Web! You now have to find libraries and documentation, and you have to know where to look. Scouring the Web has become a necessary skill. Before, you had a limited number of choices; if what to do wasn't obvious from the software you had, that was it. With the Web, there's always some search that might yield results, which is both boon and bane.
Older programmers learned to program, but in a certain language, in a certain way. They didn't expect that they had to keep up with programming like some people keep up with Hollywood gossip.
The generation of programmers that learned during the 90s and beyond is starting to realize that the skill of continuous learning is needed. To give an example, someone in our company recently found a nifty tool and suggested that people give it a try.
So what causes one person to think, "Yeah, that's cool, I'll try it out," and another to think, "Well, I'm not required to use it, and I don't see a need for it, so why bother?" The modern programmer needs to look for new tools, try them out, and see whether they can use them. It's like the person searching for that perfect diet, except that where that search is typically futile, the search for good tools often pays off.
But why don't people learn new tools? First, you have to find the tool. Then, you have to install it. That may not be trivial. And imagine it takes a day of tinkering to get right. At what point is someone likely to say, "Forget it. Unless I'm required to do it, I'm not going to waste my time, not even 20 minutes, trying to get this to work"?
Then, you have to use it and get some value out of it. This may force you to work in a way you're not used to. Think of version control. Version control offers a ton of benefits, but it also takes work. You have to think about version control: what commands are available, whether to branch or not, and how to fix things up when the repository breaks. Now imagine what happens when you don't use version control. You don't have to deal with any of this.
And the downside? If you lose stuff, you lose stuff. And some people are content with that, because it means less to worry about right now.
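To make that overhead concrete, here's a minimal sketch of the basic cycle, driven from Python so it stands alone. I'm using git purely as a stand-in (the point applies to any version control system), and the directory, file, and identity below are throwaway examples I made up.

```python
import pathlib
import subprocess
import tempfile

# A minimal sketch of the basic version-control cycle, with git as a
# stand-in for version control in general. Assumes git is installed.
repo = pathlib.Path(tempfile.mkdtemp())
subprocess.run(["git", "init"], cwd=repo, check=True)

# Create a file, stage it, and commit it (identity supplied inline
# so the sketch runs without prior git configuration).
(repo / "notes.txt").write_text("first draft\n")
subprocess.run(["git", "add", "notes.txt"], cwd=repo, check=True)
subprocess.run(
    ["git", "-c", "user.name=Example", "-c", "user.email=ex@example.com",
     "commit", "-m", "first draft"],
    cwd=repo, check=True)
# ...and that's before branching, merging, or untangling a broken tree.
```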
And installation? Not all software installs easily. Some require hunting for libraries and such. Think about every computer science class you've taken. Most of them prepare all the software you need. That way, you don't have to deal with those headaches yourself. The good news is, over time, installation has become a lot easier. In the old days, you were left to figure out all sorts of things on your own, including, if you were on UNIX, what kind of UNIX you were on.
If you have the attitude that you want to use a new tool even if it takes time to master, then you're likely to improve, because you're willing to "waste" ten hours to save ten seconds. You're willing to force yourself to think in a new way because it offers benefits to you.
I'll give you another example. Since I went through academia, I became aware of TeX and LaTeX: the typesetting system created by Knuth, and the macro package Leslie Lamport built on top of it (both modified by a bunch of other folks since). Knuth is not only a computer scientist, he's an aesthete. He cares about beauty.
He spent ten years of his life trying to preserve high-quality typography. He cared that in well-set text, two f's placed together are replaced by a single combined glyph, a ligature. That in the word "Vast", the "a" tucks partly under the "V", which doesn't happen with most word processors because each character sits in a bounding box that never overlaps its neighbors.
I know, for example, that there is a difference between left double quotes and right double quotes. Indeed, one of our data sources uses a TeX-style representation for left double quotes (two backticks in a row count as a left double quote). I recognize this because I've used TeX.
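As a concrete illustration, here's a minimal sketch of that conversion in Python. The function name and the sample string are mine; I'm assuming the data uses the two-backtick and two-apostrophe conventions the way TeX does.

```python
# Minimal sketch: convert TeX-style quote markers to Unicode quotes.
def tex_quotes_to_unicode(text: str) -> str:
    return (text.replace("``", "\u201c")    # left double quotation mark
                .replace("''", "\u201d"))   # right double quotation mark

print(tex_quotes_to_unicode("``Hello,'' she said."))
# prints: "Hello," she said.  (with curly quotes)
```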
But the average computer programmer graduating from college is not likely to have seen this. Indeed, they may have only learned about ASCII, heard about Unicode, and not realized that Unicode is far more than a 16-bit extension of ASCII (its first 128 code points do match ASCII, but it now extends well beyond 16 bits), and that this huge increase in characters allows a lot more punctuation.
Thus, the average computer science graduate might not have to worry about fonts, but someone out there does, or has to worry about some standard, or has to worry about Unicode. This often means spending a good deal of time learning about stuff they didn't teach you.
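To see how much more than ASCII is in there, here's a small sketch using only the Python standard library; the three characters are examples I picked to match the discussion above.

```python
import unicodedata

# A few characters that don't exist in ASCII, with their code points
# and official Unicode names: a curly quote, the ff ligature Knuth
# cared about, and a code point beyond the 16-bit range.
for ch in ["\u201c", "\ufb00", "\U0001d538"]:
    print(f"U+{ord(ch):04X}  {ch}  {unicodedata.name(ch)}")

# U+201C  "  LEFT DOUBLE QUOTATION MARK
# U+FB00  ff  LATIN SMALL LIGATURE FF
# U+1D538  A  MATHEMATICAL DOUBLE-STRUCK CAPITAL A
```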
So this fella, who's actually pretty young, had never heard of this, didn't do much research on the topic--didn't even know there was more to be researched (even as I pointed out an article to him)--and probably finds all this rather tedious. Why study this when there's more interesting, more straightforward stuff to learn? Read about the world in a book, and it reads like a story. Let someone digest it for you so you can regurgitate it back.
To give you another analogy. At the end of every year, there's some sophisticated formula for deciding who needs to win and lose to get into the NFL (American football) playoffs. Most announcers have no idea how this is done, because they simply don't sit down and learn it. They figure they were bad at math, and so there's no reason to learn it, and they stop.
Or how the salary cap works. That's usually beyond most people, but once you study it and figure it out, it's not that hard.
So I posited that, unless something completely different comes along, the programmers of today might actually become old programmers. They're willing to spend the time figuring this out or figuring that out, even stuff that seems a complete waste of time to learn. It's this desire to figure new stuff out that will let today's programmers age gracefully.
That's not to say that every programmer will survive. After all, there's still a fair bit of programming that requires, say, debugging, and some people simply don't like debugging code, especially code they didn't write.
But these are skills that programmers of the past, even very bright people with Ph.D.s, didn't have. In math, you learn a notation and a system of proof, and that's it. Nobody changes the notation on you just because it's trendy. It means that if you sent a mathematician forward through time, they would have some chance of following a proof today, because the language stayed the same; but someone who wrote FORTRAN would find today's C++ code nearly indecipherable. The ideas involved and the sophistication used--and this is just day-to-day programming, not genius code--would be hard to figure out.
We're still in the early stages of computer programming's development, maybe comparable to baseball in the days of Babe Ruth. There's likely to be a lot more thought given to the craft of programming. In the meanwhile, we're stuck in the quagmire of today, where learning to program is still very hard, and learning to program well is harder still, requiring a philosophy, requiring us to care about how the code is written.
The question is whether we want to grow old doing it.