Sunday, May 07, 2006

There's No I in Team

Have you ever thought about a genie who would respond to your commands if you would simply utter his name? You don't think that's possible? Ah, but I think it is.

I shall utter the genie's name and he shall soon appear.

Jared Richardson! Jared Richardson! Jared Richardson!

There, that should get his attention. Jared's on vacation I think, and so he probably won't read this until he gets back unless he's a weenie that has to check his RSS feed even when he's on vacation (but, but, I might be missing stuff!).

I just read Jared's latest blog entry (augh, how I hate to have to clip in links) about team building.

One question I asked during a question-and-answer session for the expert panel was "What job do you think the software industry will have ten years from now that it doesn't have now?". Crystal ball questions are tough to ask because many people don't look that far ahead. They care about next year at best. Ask them what will happen twenty years from now, and they'll joke about something like maybe living on a tropical island.

One group of people who, in principle, make it their job to look ahead are science fiction writers. This is why it's interesting to get a Bruce Sterling to speak to a tech audience, especially when he's asked to prognosticate only forty or fifty years ahead. But I think it's a useful exercise to think about.

For example, I can think of several jobs that might exist depending on how code is developed. If you develop code rather cavalierly to gain speed, there will soon be plenty of undocumented code, especially if you are like the typical developer who hates to spend any time documenting stuff. Some organizations, like Microsoft, now have nearly two testers for every coder. Might there be someone whose job, then, is to document and refactor code? You develop the code. It goes through testing. Meanwhile, I come up with a way of documenting it in parallel, knowing that changes to the code may force me to keep up, just as changes in code force testers to keep up.

It seems like a silly job, cleaning and documenting code. But at one point, testing seemed like a silly job too. Even if you believed it was important (and it is), how do you get testers? Even now, it's a problem. Most computer science departments haven't realized just how important testing is as a discipline, and so they haven't worked it into their curricula. If we're going to get better testers, part of the answer is getting computer science departments to realize they aren't just teaching the syntax of languages; they are teaching software development lifecycles.

Related to cleaning and documenting code is something higher level, and that's the tech writer. The tech writer can present the code in a readable format so that even people outside the team have some idea of what's going on. Hopefully, in clarifying issues, people will realize whether their designs make sense. Admittedly, some groups do their design up front and spend lots of time in a rather formal process. It takes time, but one hopes it leads to code that's documented and well thought out. It may not be that agile.

However, there's one job I suspect might become more relevant, and that's teaching. There are schools that teach nursing, say, and presumably their instructors know enough content to teach nurses adequately. However, it would be daunting to produce the software equivalent of that. Consider the frameworks, the languages, the databases, and so forth. How many people know even a fraction of all that? And how long is that information good for?

Even so, as folks involved in software, we have to learn to deal with the ever-changing landscape of technologies and tools. This means keeping up with the tools and techniques that are out there, deciding whether to learn them, and then learning them. Every software developer could stand to be better at what they do.

Jared points out, in his article, that he's in favor of the daily meeting, which tries to deal with problems as they come up. I've heard of this. I read a book on Scrum, which is considered part of agile methodology but which, honestly, could apply to disciplines beyond software development. Scrum favors stand-up meetings where the team deals with what has been done and what needs to be done.

Eventually, the answer may be that the person needs more training in something. For example, we could all stand to be experts in CVS, but most of us do just enough to check things in and out and write commit comments. Forget about trying to branch and merge. Let the configuration management guy take care of that. Forget about running a command-line diff or patch, UNIX style. That's black magic.
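For the record, the black magic is only a handful of commands. Here's a rough sketch (the branch and file names are made up) of what branching, merging, and the diff/patch dance look like with CVS and the usual UNIX tools:

    # cut a branch and switch your working copy onto it
    cvs tag -b bugfix-branch
    cvs update -r bugfix-branch

    # later, merge the branch back into a trunk working copy
    cvs update -A
    cvs update -j bugfix-branch
    cvs commit -m "merge bugfix-branch into trunk"

    # the UNIX-style diff and patch dance
    diff -u Parser.java.orig Parser.java > parser.patch
    patch -p0 < parser.patch

Not so scary once someone walks you through it, which is exactly the kind of thing a corporate teacher could do.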

At what point should we decide to evaluate the skills of the people we have and teach them to get better at what they do? The company invests in the skills of its employees. This tends to run counter to what companies like to do, especially ones like Microsoft or Google, who figure they have talented people who need little direction. Yet even some direction may produce a much better employee.

Ultimately, the goal would be for the person to get skills beyond the basics. For example, I could teach you, say, Eclipse. But then, could you take that and learn all you could about, say, IntelliJ? And how would you go about doing this? And how do you go about teaching people how to learn? Because ultimately, this is what makes people better.

Dave Thomas has pointed out that too many people in software are novices or just beyond that. They aren't very good at what they do. He says that while software may have advanced (programming languages do more, computers are faster), the people aren't that much better.

I argue that it's harder to keep up in software development than in other disciplines, though I'm sure engineers in any field say the same about theirs (we're hard too!). We need the ability to evaluate what's going on. Fortunately, other people are in the same boat, so we can often let someone who has thought about it more than we have do that decision making for us. Software architects, I've learned, must do this. Their job is to evaluate and pick technologies for developers to use, basing their decisions, one hopes, on reasonable and accurate criteria.

Now it's easy to say "but these guys don't want to learn". I'd argue there are at least three categories of folks. There are those who learn on their own, with varying degrees of success. Some dig deep to try to understand stuff like an expert. Most learn it just well enough to do what they need to do, even if it's not optimal or "correct". Others don't really want to learn much more, especially if they have to go through a period of being an idiot.

One thing that's changing is the idea that you can master some tool and use it for the rest of your life. These days, you master it long enough, then throw it away and pick up something new. Usually, there's enough overlap that you don't have to completely toss out everything you know; otherwise, life would be too miserable.

But there are those in between who are capable of learning, just not as fast as others, and perhaps not with the same vigor. In a controlled environment, though, where they are taught material and quizzed on it, they would excel. I've had students who complain that they would do better if they were forced to learn the material. I think: forced? Why must they be forced?

Because as much as we want everyone to love, love, love programming, many people treat it as simply a skill, something they do. How many garbage men want to excel at what they do? How many sales clerks? People program because it's the one thing they got good at, but they don't have to be passionate about it. A bit of professionalism can help matters.

However, it takes a special person to set up such a thing, especially in a company that's pressed just to get stuff built. The more you allow side projects like this, the more people might prefer doing them over their real work, because they're small and throwaway. Management might not care for that.

Still, there are places that have done interesting things. In order to keep morale up, Yahoo decided to have a "programming day". It was a day off. During this day, you could work on any project you wanted. Of course, if you wanted to do something interesting, you needed to recruit others to help out on your project. People had one day to get something to work, and prizes would be awarded at the end of the day.

Even some of the failures turned out to be good ideas in the long run, because they were often a bit too ambitious for a single day's worth of work.

Now suppose your company is a shop that does mostly Java development. You agree that three of you will learn a little Ruby and post tutorials on the Wiki. Then, on the special day, you'll create a project using Ruby on Rails and throw in some Ajax for good measure. At the end of the day, you list all the stuff you thought was missing from the experience (say, Ruby was harder to pick up than expected). Out of this, you might decide to work on other projects on your own.
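To make that concrete, here's a rough sketch of how the special-day project might get off the ground, assuming the Rails 1.x command-line tools of the day (the application and controller names are made up):

    # generate a skeleton Rails application and a first controller
    rails standup_board
    cd standup_board
    ruby script/generate controller Status index

    # start the built-in WEBrick server on port 3000 and start hacking
    ruby script/server

Even that little bit is enough to show a Java shop a different way of working: no compile step, just edit and reload.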

Google is famous for its 20% rule, which allows employees up to a day a week to work on some other project, should they choose.

The question is how we encourage people to learn and get better. Having a corporate teacher who runs something like a class and can help people identify weaknesses may prove beneficial in the long run.

So that's my new idea, Jared. What do you think?
