I once read that computer science brings together people who think in the same way. That is, they think algorithmically. How best to describe this kind of thinking? Basically, it's problem solving using tools. The tools shouldn't be underestimated, because they're essential to the kind of solution you get.
For example, suppose the problem is eating spaghetti. I give you a fork and possibly a spoon. How do you eat it? One way is to get the fork in the spaghetti, use the spoon as a surface to push the fork against, and swirl. The spaghetti gets wrapped around the fork. When no strands are left dangling, you eat.
But, now I ask you to eat spaghetti with chopsticks. The swirling trick doesn't work nearly as well. Instead, you have to lift the spaghetti up in the air, put your mouth underneath, and eat. Or you bring the bowl to your mouth, and slurp.
In any case, how you eat depends very much on what you use to eat. It depends on the tools available to you.
In a way, programming is much the same. Now, most people would tell you that the programming language is the tool. Say, Java or C++ or C#. However, that's no longer the case. The kind of software people want to write, say, server software, would require so much infrastructure code that you'd drive yourself nuts developing it all yourself. Other people have therefore created frameworks to make it easier to write these applications.
That's supposed to help you program. But, then people have developed all sorts of things that are peripherally related to programming. For example, if you have multiple developers, you want some way to prevent one person from overwriting the work of another person. Software firms use some kind of version control system. There are several to choose from: CVS, Subversion, ClearCase. The problem? Suppose you get to know CVS reasonably well. Now, the company wants to switch to ClearCase. Now you have to learn more stuff.
Is this reasonable? Technology is both boon and bane to programmers. In order to program in Java, I use Eclipse. Eclipse is not emacs. It has lots of features, in fact, far too many for me to keep track of. I'm sure there's a bunch of features I could be using but don't. Whose fault is it that I don't use them? I'd say it's Eclipse's fault.
You see, at one point, I learned how to program. Those basic skills are still the fundamentals of programming. Yet, the entire environment has changed. A good programmer these days must learn how to use a testing system, a version control tool, a build language like Ant, possibly XML and processing XML, possibly SQL of some sort, some server technology. Now, tell me, is that really necessary?
To give you an analogy, suppose I tell you to learn a musical instrument, say, the clarinet. You invest time learning the clarinet. Do you expect the clarinet to change over the years you use it? What if they move the keys around? Add keys. Remove keys. Add bizarre features? A double reed. An electronic sound modifier? What if the music notation changed on you? You have to learn to read a new form of music. It offers features (I can't imagine what) that you've never seen. You can see it's beneficial, but it's work.
Next week, you're supposed to be playing a trombone. Or now, you have a piano, and have to deal with chords. At one point in your life, you played the clarinet, and while that music is still fundamental to what you do, there's more crap to deal with. How much of this is algorithmic thinking?
My feeling is that, yes, the tools are changing, and changing, and changing some more. They address problems that programmers deal with, but there's no agreed upon solution. Thus, variation upon variation upon variation. You know why Mac lovers hate PCs and why PC lovers hate Macs? Because using the other computer makes them feel like idiots. They know how to use one computer well, and now they're told everything they learned is different, and behaves in some way they're not used to.
Is that worth it? Some people don't care. They love the new. They'll sit down and figure out whatever someone else has come up with, and deal with it.
I remember there was a skateboarder named Mark "Gator" Rogowski. He was a star in the skateboarding community, before drinking and self-abuse eventually led him to jail. At one point, when skateboarding in parks lost its allure, a new kind of street skateboarding emerged. Mark tried to keep up with this new style, but found he couldn't learn the new way of skateboarding. He tried using his old skills, but people didn't want to see it.
Did Mark suddenly forget how to skateboard? He got good at what he did, and then things changed on him. He wasn't able to learn the new stuff, or wasn't willing to.
To me, that's what it feels like programming has become. Most people just deal with it, but it's not like mathematics, where mathematicians have basically agreed on what makes a valid proof. Once they have that, they just think and think. They don't have to learn weird stuff to do what they've always done. Sure, new techniques do become available, and at times, there have been mathematicians who couldn't wrap their heads around different math techniques, but for the most part, the way they do math lasts.
Anyway, like many other programmers, I do what I can to deal.