There are a few jobs out there where you learn skills over time, and then you're set. I'm over-simplifying, I'm sure. There are the folks at McDonald's selling you food. The postman (or woman) delivering mail. The inspirational speaker cheering you on to success.
How many people would think that college teaches you everything? If you think of college as teaching you a set of skills, rather than a set of meta-skills, you might still get a job as a programmer, but would you be any good?
For example, you may learn how to set up a database in MySQL, but could you transfer those skills to Postgres? You may have learned Java, but can you handle C#? For some, these differences are trivial. It's easy to pick up one or the other. You don't need a class to teach you this stuff.
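To make the triviality concrete, here's a minimal sketch (an invented example, not drawn from any course): the same trivial loop in Java, with the C# near-equivalent in a comment. The structure is identical; mostly the library names change.

```java
// A tiny Java program; the C# version differs mostly in surface syntax.
public class Greeter {
    public static void main(String[] args) {
        // C# near-equivalent: for (int i = 0; i < 3; i++) Console.WriteLine("Hello #" + i);
        for (int i = 0; i < 3; i++) {
            System.out.println("Hello #" + i);
        }
    }
}
```

The jump is small enough that most people make it on their own.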
Which raises the question: why not? It's assumed that, having learned one thing, you will have the ability to learn something else. And yet, if you pay attention to the classes, they almost never teach these learning skills. Indeed, a programming class in Java covers Java.
True, many programming skills you pick up aren't particularly Java-specific: loops, recursion, object-oriented programming. But there are plenty that are, such as JARs, classpaths, WARs, and so forth, and these hardly translate to other languages. Indeed, many programming courses like to skip the things about a language that make it specifically that language, on the grounds that they won't carry over, and thus deprive the student of what they actually need: the ability to learn something new, something specific to whatever they are working with.
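As an illustration of the Java-specific plumbing a course might skip (a minimal sketch; the file and JAR names here are invented), consider what it takes to package and run even a trivial program:

```java
// Hello.java -- the hard-won knowledge isn't in the code, it's in the packaging.
public class Hello {
    public static void main(String[] args) {
        System.out.println("Hello from a JAR");
    }
}
// Build and run with the standard JDK tools:
//   javac Hello.java                   # compile to Hello.class
//   jar cfe app.jar Hello Hello.class  # package, with Hello as the entry point
//   java -jar app.jar                  # run via the manifest's entry point
//   java -cp app.jar Hello             # or run via an explicit classpath
```

None of that carries over to, say, C#, where the equivalents are csc, assemblies, and MSBuild. The transferable part is knowing how to go find that out.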
If you learned UNIX, you may complain that you know nothing about Windows. And indeed, there's nothing very specific you can do. No one book covers the useful things you need to know in Windows; you pick it up over time. To be fair, I think UNIX is better, and once you get past the usual ls, cd, and so forth and hit some of the advanced topics (environment variables, shell programming, ttys, sockets), there are books on those topics.
Of course, the web has now become the preferred way to pick this information up. Java was nice in this regard because java.sun.com was a solid resource. But teachers who taught programming weren't used to thinking of the web as a resource, and indeed, I think many books on Java don't even discuss using the web to find information.
The way to find out why good programmers are good is to follow in their footsteps for a while and ask questions, and, alas, to know what those answers mean.
The rare skill is finding information and knowing where to look, then trying something out, seeing if it works, and trying something else if it doesn't. Indeed, much of this is like playing a video game, which is why, I suspect, good programmers and video game playing tend to co-exist. That's not to say a person who hates video games can't be a genius at coding, but video game playing exercises, in a very general way, the skills you need to solve problems: persistence, and just trying stuff out.
In the end, some level of brute force is needed.
And this bothers some people. They wonder why this information isn't simply taught. Why is brute force needed? And it's not pure brute force, either: you need to know where to look, or several places to look, then interpret the results, and use them to guide you to new things.
And then it helps to remember the answer you found. Otherwise, you end up retracing your steps and figuring it out all over again.
I know, there's the part that comes from being "smart": understanding deep concepts such as concurrency, or algorithmic analysis, or some such. But there's also simply diving deep.
Hard as that may be to believe.