Thursday, April 23, 2009

What is Computer Science?

I was having a debate with a friend, though it wasn't much of a debate, because I mostly just listened.

He felt that a computer science program should cover the fundamentals, which, to him, meant math. You should know things like algorithms, linear algebra, and statistics. He said things like IDEs and version control should not be taught. You have limited resources, and those things are fads that change. And since industry uses these tools, let industry teach them, or let people pick them up on the job.

He argued that no one is going to pick up linear algebra if they aren't taught it in classes, but people can always learn IDEs and version control.

At the time, while I disagreed, I didn't present any counterarguments.

So it made me think. Would a typical person actually pick up those things at work? There are fairly smart people who don't want to learn IDEs or version control because they see them as fads. Could they learn them on their own? Perhaps, but there are so many mundane details involved, much of them not obviously important, that people often go in the other direction: they learn just enough to get by without fully understanding it.

To take this to an extreme, there are smart people who know math who don't even bother to learn to program properly. If you say IDEs and version control are unimportant because of their faddish nature, then you might as well say learning to program in any popular programming language is also faddish. It can take at least as long to really learn a language well as it does to learn the basics of stats or linear algebra (say, 3-4 months of focus), and it takes a certain personality to learn a language as well as possible.

The problem, in my mind, is that computer science, as it leads to a programming job, is both faddish and fundamental. If you focus only on the fundamentals, then you are telling graduating majors that everything else is a fad and is "easy" to pick up. These faddish things are in fact not easy to pick up, and they are difficult in completely unsatisfying ways.

At least, you can argue that if you learn linear algebra or stats and it's tough to figure out, then when you finally do figure it out, you can feel you've understood something deep.

This isn't really the case with IDEs and version control. You learn a lot of mundane details and wonder why certain things even exist and what purpose they serve. A typical industrial-strength IDE has hundreds of options. There are ways to extend the IDE (say, for an open-source IDE like Eclipse), ways to integrate it with a build system, with a bug-tracking system, etc.

There are so many details, it takes a long time to master them, and they aren't that uniform from one product to another.
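To make the "mundane details" point concrete, here is a sketch of the kind of bare-minimum version-control session people learn just enough to get by with. Git is used purely as an illustration (the post doesn't name a tool), and the directory name, file name, and identity below are all made up for the example:

```shell
# A "just enough to get by" version-control session, using Git as an example.
# The directory, file, and identity here are hypothetical.
mkdir -p /tmp/vcs-demo
cd /tmp/vcs-demo
git init -q
git config user.email "demo@example.com"
git config user.name "Demo User"

echo "first draft" > notes.txt
git add notes.txt                  # stage the file (why a staging area? most never ask)
git commit -q -m "Add notes"       # record a snapshot

echo "second draft" > notes.txt
git commit -q -am "Revise notes"   # -a stages tracked changes automatically

git log --oneline                  # two commits, and dozens of flags left unexplored
```

Even this five-minute workflow quietly skips over the staging area, the object model, branching, and merging, which is exactly the "get by without fully understanding it" mode described above.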

My thinking is that we need to teach how to deal with this, because it is a survival skill in the industry, however ephemeral the details may be.

In other words, despite the fact that people may not choose to learn linear algebra once they go into industry, they may, for quite different reasons, not choose to learn an IDE particularly well when they arrive there either. To assume that they will is a bit of folly.
