Joel Spolsky has been complaining about the dumbing-down of computer science education. He's not the only one. Suddenly, folks are coming out of the woodwork to agree with him. Is this problem specific to computer science? Do all the other disciplines have great teachers, and computer science awful ones?
There is a plethora of problems with computer science education, and I'll hit some of them myself, but the solutions are, frankly, very hard. Some of the issues are institutional, mired in the way academia views computer science education. Some of it is merely the mission of the university, seeking to educate as many students as possible, and the resulting mediocrity that's sure to come from that. Some of it is due to the incredible faddishness of the industry, which pulls everyone in a million directions, with each camp declaring that its one obscure area of expertise is what every student should learn (a recent article proclaimed everyone should learn compilers--pretty soon, you hear everyone should learn algorithms or type theory or AI or network security or linguistics).
Let's begin with computer science education itself, and why it's causing problems. Perhaps the first problem is that programming itself has gotten quite complex. Object-oriented programming seemed particularly cool when it caught fire in industry in the late 80s, and universities struggled to keep up.
Object-oriented programming is tough compared to the simplicity of C or Pascal (and C isn't that easy either). But we continue to teach programming as if it were still C or Pascal, because academia doesn't want to admit that programming got difficult, and that two courses aren't enough.
Indeed, traditional CS had two courses for programming: CS1 and CS2. CS1 covered the basics, which, honestly, meant control flow (if statements, loops), arrays, and functions. No classes, and maybe a touch of pointers. CS2 was data structures: stacks, heaps, trees, and pointers. That was all you needed.
CS courses back then often taught assembly, which has since disappeared (and to be honest, if it returned as-is, it would not give as much insight into low-level programming as some imagine).
One could easily argue that object-oriented programming requires at least a third course to fully grasp. A third semester alone would be useful for a horrid language like C++, where templates, virtual functions, and memory management make reading and debugging a nightmare.
And we haven't even talked about threads!
The other part academia hasn't particularly cared for is that software engineering has become a discipline. True, many a department now has software engineering professors, but academic software engineering is a strange beast, often divorced a bit from the reality of real software engineering, and further lacking respect from the more mathematical disciplines that have been around longer.
Indeed, many other computer science research areas look at programming as a mere tool, useful as a means to an end, and not an end in itself.
Software developers have to deal with a lot of issues these days. Let's hit a few of the basics. At the very least, you now need to know version control. There used to be RCS and SCCS, which both sucked. Then came CVS and Subversion, and now a whole spate of distributed version control systems. It takes a while just to understand how version control works, especially the nastiness of branching and merging.
Academics who haven't dealt with version control (fortunately, fewer and fewer) find this subject painful. What does this have to do with programming? And, in a very real sense, they are right. It has very little to do with programming, and everything to do with software development. And because it's tool-based, and because we still haven't fully gotten it right, people are going to come up with one system after another, and as soon as you master CVS, you waste time learning git and other bits of arcana just to get by.
But then, there's the new trend (and it is a trend now) towards agile programming. That means unit testing. That means test-driven development. That means behavior-driven development. And, oh, the plethora of tools. Programming has become so convenient to the masses that the best of them produce wonderful tools. And now, people have to pay attention to their existence! If you were doing Rails development, you might play with RSpec or autotest, tools that are only about a year old, and you'd have to keep your ear pretty close to the ground to keep up. That's tough when an academic wants to do research rather than keep track of the tabloids.
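To make the testing point concrete, here's a minimal sketch of what test-first code looks like. The paragraph above names RSpec and autotest, which are Ruby tools; this sketch uses Python's built-in unittest instead, and the cart_total function and its rules are made up purely for illustration.

    # A minimal sketch of test-first style, using Python's unittest as a
    # stand-in for tools like RSpec. The cart_total function and its rules
    # are hypothetical, invented for illustration.
    import unittest

    def cart_total(prices, discount=0.0):
        """Sum item prices, then apply a fractional discount."""
        if not 0.0 <= discount < 1.0:
            raise ValueError("discount must be in [0, 1)")
        return round(sum(prices) * (1.0 - discount), 2)

    class CartTotalTest(unittest.TestCase):
        def test_empty_cart_is_zero(self):
            self.assertEqual(cart_total([]), 0.0)

        def test_discount_is_applied(self):
            self.assertEqual(cart_total([10.0, 5.0], discount=0.1), 13.5)

        def test_bogus_discount_is_rejected(self):
            with self.assertRaises(ValueError):
                cart_total([10.0], discount=1.5)

    if __name__ == "__main__":
        unittest.main()

The tests are short, but the mentality is the point: you write them first, you run them constantly, and the tooling around that habit changes every year.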
Software development has led to people using terms like "requirements gathering" and "test document" and "test plan". Documentation has gotten big, even if people routinely do a bad job of it.
Let's briefly talk testing. This used to be something a programmer does. Indeed, it's still something a programmer does. But now, there are separate folks who handle testing, so much so that it's been given the name quality assurance, and there's a whole spate of terminology and tools surrounding it! And the mentality of testing is quite different from coding. What was considered something a conscientious programmer would do has now become its own discipline, almost worthy of a major.
Speaking of tools, what about usability? The web did a marvelous job exposing the need to write usable software. The average person doesn't understand software all that well, and can quickly leave one webpage for another. A webpage has to be visually appealing and easy to use, preferably both. Once upon a time, people figured the only people using programs were other programmers. Thus, beauty, comprehensibility, and all those things people now care about were a complete afterthought.
Did we mention how faddish the industry is? Right now, dynamic languages like Python and Ruby have caught everyone's fancy. And while those languages make great strides towards wide acceptability, people are already looking for the next great panacea of a language. Whispers of Erlang, Haskell, O'Caml, and Scala abound. Even if we stick to Python and Ruby, both have enough magic in them that you can write a lot of non-obvious code.
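Here's a tiny, made-up taste of that magic in Python: attributes that appear nowhere in the source get synthesized at runtime, which is wonderful right up until you have to debug it. The Record class below is hypothetical, not taken from any real framework.

    # One flavor of dynamic-language "magic": methods that don't exist
    # anywhere in the source are fabricated at runtime.
    # Record is a hypothetical class, not from any framework.
    class Record:
        def __init__(self, **fields):
            self._fields = fields

        def __getattr__(self, name):
            # Called only when normal attribute lookup fails; fabricate
            # "get_<field>" helpers on the fly, in the style of ORM finders.
            if name.startswith("get_"):
                key = name[len("get_"):]
                if key in self._fields:
                    return lambda: self._fields[key]
            raise AttributeError(name)

    r = Record(title="Dynamic Languages", year=2008)
    print(r.get_title())   # works, though get_title is defined nowhere
    print(r.get_year())    # likewise

Handy, concise, and completely invisible to anyone grepping the code for where get_title is defined.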
And, one of Java's downsides is that it's so verbose that people need a good IDE to write code in the language. You simply didn't need a decent IDE for languages like Pascal. Eclipse itself is so complex, you need hundreds of pages to scratch the surface of what it can do. It's integrated with tools to test, use version control, refactor(!). Things no one much cared about 20 years ago, so that people could focus on, you know, programming (I know--it's all programming, isn't it?). Thus, a good programmer now has to master a complex IDE, and one that's not likely to be around 20 years from now.
Once upon a time, most programs didn't play well with each other. But now, people extend languages all the time, writing tons of libraries for Python and Ruby. You have to worry about what libraries exist, and how to use them. People now routinely link in other people's code. A good programmer has to locate all sorts of software, evaluate it, and decide whether or not to use it. In the good old days, you'd simply write the code yourself (badly) or make do with a bad solution (your own).
Oh, what about open source? Want to explain the gazillion variations of open source licensing and what it means to the average programmer?
It's a big world out there. Want to explain internationalization, and how it affects your code? Is your code ready for the world market?
How about handling all those timezones and dates? That also falls under internationalization. As does, of course, Unicode (and it's not just one encoding, but a family of encodings).
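A small sketch of both headaches, using nothing but Python's standard library (zoneinfo assumes Python 3.9 or later); the meeting time and the strings are invented for illustration.

    # Two i18n headaches in a few lines, standard library only.
    from datetime import datetime
    from zoneinfo import ZoneInfo

    # The same instant reads very differently depending on the timezone.
    meeting = datetime(2024, 3, 15, 21, 0, tzinfo=ZoneInfo("America/New_York"))
    print(meeting.astimezone(ZoneInfo("Asia/Tokyo")))   # already the next morning in Tokyo
    print(meeting.astimezone(ZoneInfo("UTC")))

    # Unicode is a family of encodings, not one: the same text produces
    # different bytes (and byte counts) in UTF-8 vs. UTF-16.
    text = "café 渋谷"
    print(text.encode("utf-8"))
    print(text.encode("utf-16"))
    print(len(text.encode("utf-8")), len(text.encode("utf-16")))

None of this is hard, exactly, but every piece of it is something the working programmer is simply expected to know.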
How about databases? You can't talk about all those web frameworks without databases. And the web frameworks themselves? And XML? These are now part of the day-to-day toolset a programmer needs to know.
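To give a flavor of the database side, here's a minimal sketch using Python's built-in sqlite3; the users table and its columns are hypothetical.

    # A minimal sketch of the database plumbing a web developer is expected
    # to know, using Python's built-in sqlite3. The "users" table is made up.
    import sqlite3

    conn = sqlite3.connect(":memory:")   # throwaway in-memory database
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")

    # Parameterized queries, not string concatenation: the kind of detail
    # (SQL injection) that never came up in CS1 or CS2.
    conn.execute("INSERT INTO users (name, email) VALUES (?, ?)",
                 ("Ada", "ada@example.com"))
    conn.commit()

    for row in conn.execute("SELECT id, name, email FROM users WHERE name = ?", ("Ada",)):
        print(row)

    conn.close()
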
And that's outside of all the usual stuff academics generally care about, like algorithms, compilers, computation theory, AI, bio-computing, numerical analysis, and so forth, most of which, the average developer knows little about.
All of these topics could fill courses and courses and courses that a typical computer science department doesn't even want to tackle. Why? Because five years from now, another new trend will sweep in, and people will have to learn it all over again. And will those changes be an improvement? More than likely, not enough to offset the headaches of learning them.
Now, here's what I'd love all the critics to do. Teach an intro course. Decide that everyone is an idiot, and tell it to their faces. Then, be told that you still have to get them to learn something, and feel what it's like, what it's really like, to have to teach people who don't want to learn. If it makes the visualization easier, imagine it's your own child, refusing to learn, wondering why it's so hard, and why there's so much crap, rather than that superstar you just hired who can't get enough of this new stuff, and can take anything you throw at him or her, and turn out magic.
Spolsky complains about the dumbing down of the curriculum, but it's only because Java doesn't do it for him. He knows that to get the speed he wants, he has to deal with languages that will give it to him. Even he's not crazy enough to believe that the gains from coding in assembly would offset the productivity losses of coding in something that horrid. Don't you think that if Java ran ten times faster than C++, he'd be happy to give up all the crap associated with C++? But because he needs stuff that runs better, runs faster, he realizes that his coders have to know these grungy details.
Compare math, where irrelevant details are left out and people learn deep concepts, to computer science, where dealing with complexity has led us to the fads of the day: as good as anything we have now, but likely to be replaced with something new, with more and more and more code out there that we have to deal with.
Now, let's take a step back. Breathe.
We can teach as much of this as we want, but learning isn't simply a bunch of concepts that you teach. It's a worldview. When you are given a problem, what do you do? Suppose someone tells you to port a device driver. Do you even know what a device driver is? Or what it means to port? And yet, some people can take something that vague, and get code to work, and someone else will say that they were never taught that in college, and how are they supposed to deal with this?
And the fact of the matter is that, as much as the industry complains, unless they are prepared to head into academia (itself very territorial, with its own ideas about what students need), academia can basically ignore what is being said. First, academia is so distributed that most professors couldn't even tell you what courses are required for their own students to graduate. They barely care about their own class, and don't even think about how their class fits into the overall plan. To get them to work together and make such changes, especially changes that are likely to come every five years, goes against their conviction that knowledge shouldn't be a total fad.
And it's contrary to the mission of universities, which is to graduate students. Most software pundits would have 90% of computer science majors jettisoned, despite the fact that mediocre programmers are often needed to do a lot of work. They'd jettison the other courses too, because there's no time to worry about all those humanities and such. If every major took that attitude, most students wouldn't even be in college. Since most universities are in the business of graduating students, each major has to worry about how to get students who don't understand pointers very well to do well enough to get out.
Imagine it's your job to educate all the students who want to be computer science majors. The mediocre ones and the brilliant ones. Then, your view of what they should learn changes, when you realize that it's hard to even get the basic programming down, beyond all this other crap you have to learn to be decent in the field.