The tech world, perhaps like many other niches, has its share of superstars. Of course, "superstar" is a bit of an exaggeration: even within the programming community, not everyone has heard of "Matz" (Yukihiro Matsumoto, the creator of Ruby), and not everyone has heard of David Heinemeier Hansson. I was able to meet up with Matz and ask him a pretty silly question. Although he has attended many a Ruby conference, and although he has a pretty good vocabulary, American accents and fast talking seem to confuse him.
I asked him whether he had thought about writing another language, possibly a crazy one, and he said that such languages have a very short lifetime, while languages like Ruby, being more general, have a longer one.
I asked him this question because I recall that Niklaus Wirth, who wrote Pascal, went on to write at least two or three other languages (Modula-2 and Oberon among them). But perhaps Wirth wrote languages at a time when a language's continued growth wasn't expected. He probably figured he'd write it, and he'd be done with it. I'd guess he would have grown weary of thinking about what feature to add, what had to be supported, and so forth. I'm completely speculating, of course.
On the flip side, two Duke students asked Matz about using Ruby as an intro language. He said it has been used that way in Japan, though when someone (Amy Hoy, to be precise) asked whether Ruby would make a good introductory language, he said that some parents avoid giving their kids sharp knives, while others hand them right over. He wrote the language for himself, a working programmer, and he was unsure how well it would serve beginners.
I was curious why these Duke students were interested in Ruby in the first place. I talked to Justin Wickett, who apparently runs a small company, and Michael Ansel; they study computer science and computer engineering, respectively. They mentioned that their coursework had too much math and theory, and while that would make them well-rounded, there was this huge world of web development that they were missing out on.
There is a thought, prevalent in college, that students should be taught something, and taught something useful. But the ultimate lesson of college should be that you can teach yourself. Even so, it's funny how most colleges, despite having this as their underlying theme, don't address the point directly. Why don't they tell students how to go about learning on their own? Instead, it often happens by accident: the teaching is poor enough that students are forced to pick up a language or a skill or a concept on their own.
The computer industry is weird in that way. The people who succeed best need a combination of curiosity, initiative, and smarts, and it helps to have all three. The smarts give you the ability to understand difficult things. The initiative (and/or patience and persistence) compels you to work through times when things seem idiotic, or particularly challenging. And you have to be curious to find new things, because the industry is so faddish. People try this, or they try that. And as a conscientious developer, you need to pay attention to all sorts of things.
And that can be completely infuriating to people who aren't wired that way.
But back to Justin and Michael. They reflect a belief common among students that computer science departments fail to keep up with the real world: there's cool stuff being done out there, the departments don't seem to be doing it, and students feel bad that they lack these skills. In particular, they feel the professors should teach it to them (for the tuition they're paying, they wonder why they have to make all the mistakes themselves). But if the professors lack experience in these fields, then they lack experience, and the students would be better served finding other resources to accomplish what they want to achieve.
If that sounds like an indictment of the way we teach computer science, perhaps it is, but it also reflects a reality of practical software development that flies in the face of the kind of knowledge universities are used to imparting. Computer science professors are likely to say the only useful information is that which stands the test of time: compiler theory, automata theory, algorithms, and data structures all seem to qualify, despite the fact that most of these subjects are less than 50 years old. In the computer industry, 50 years is a lifetime.
The problem with real-world development is that it's not as clean and sanitized as it is in computer science courses. Often, many of the real-world difficulties are hidden away to give students one less thing to worry about. I recall talking to Jaime, who said he ended up doing some of the work on a student's project because it was hairy and the student wouldn't have been able to figure it out. And yet, that's exactly the kind of tedium real programming often involves.
At some point, people have to see it.
So that raises the question: is Ruby a good programming language for beginners? Is there some suitable subset that can be taught that, while it doesn't fully show the power of the language, still covers enough of the important topics?
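For illustration, here's a rough sketch of what such a subset might look like. The split between "beginner" Ruby and "full" Ruby below is my own guess, not anything Matz proposed:

    # A hypothetical beginner subset: variables, arithmetic, conditionals,
    # simple loops, and method definitions -- no blocks, modules, or
    # metaprogramming yet.
    def average(numbers)
      total = 0
      for n in numbers        # `for` reads naturally to a novice, though
        total = total + n     # idiomatic Ruby would use numbers.each
      end
      total / numbers.length.to_f
    end

    grades = [88, 92, 75]
    puts "The average grade is #{average(grades)}"

    # Full Ruby can say the same thing in one dense line -- the sharp
    # knife a beginner could easily cut themselves on:
    puts grades.inject(0) { |sum, n| sum + n } / grades.length.to_f

The nice part is that the beginner version is still real Ruby; nothing has to be unlearned later.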
Many students struggle with programming, and they have other courses competing for their time, which is why they're so often handed a sanitized version.
1 comment:
Hey there, this is Amy. I found your post googling on RubyConf and thought I'd be obnoxious and write a reply :)
To address the first point about university education: the pains you hear are kids coming to terms with the concept that they're not paying for anything, really. They expect to be taught things that will be useful, but they're not; they're expecting to be led, and they're not. They are having their time "wasted" on lots of things, though, precluding them from spending time on more relevant pursuits. They don't want to pay for that realization (with time OR money).
That's why I stopped going to university. :) Most don't come to that conclusion, though.
As for Ruby as a first programming language, I do wonder. I don't think the full Ruby experience is appropriate. I think people need a more intimate, or simpler, introduction... BASIC was good; LOGO was good, too. Ruby lets you do too much when you're still just getting the hang of how to structure an instruction for the computer. First you have to learn to think in an orderly, computer-like fashion, I believe. And there's a danger in winning too much at the beginning.
I'm really intrigued by what Why the Lucky Stiff is doing with Hackety Hack and Shoes, but I'm still wondering where the next Hypercard is... something that is ubiquitous, that lets you do stuff without programming, and eases you into code by making it so tempting and easy to add functionality to a button... and before you know it, you're programming.