Monday, May 24, 2010

We need an issue tracking system that is more project-based

It’s hard to imagine that a company as big as ours has no issue tracking for software projects. We do have Microsoft Excel and OneNote, but we all know they are not project-centered, and there is no place to keep the discussion history.

For ages, we have had an internal tool that tracks issues found while testing formal release builds. Super slow performance aside, the system serves its purpose of identifying a specific build and describing the issue. An issue moves through stages toward resolution, and managers rely on those stages to monitor whether a specific build is free of known issues.

Note I said ‘free of known issues’ rather than ‘tested according to the test plan’ or even ‘completed according to the development plan’. The system simply has no idea what is in your development or test plan. Would you feel safe driving a car shipped with just a few known issues, when no one is properly tracking what was supposed to be designed or tested?

As a development team, we need a tracking system that starts with a software project, lists all its features, and tracks each one through development and then test to completion. From time to time we still need to report and track issues against target builds, since no one yet knows what caused that crash. But most of the time we should be working under a project: implementing a feature, testing it according to some plan, or fixing bugs for that feature.

Now you can see the duality here. We need a system that can track the status of plain issues, and also provide structure for planned features and milestones. It would be even better if it linked to internal documentation and code changes, and there are systems like that. However, that’s a bit too much for a weekend project.

For this purpose, I chose Mantis, a popular issue tracking system I used back in my software engineering program. At my previous job we used FogBugz (yes, interesting name), ClearQuest, and a heavily customized Bugzilla. All famous and powerful, but not something I can set up in a weekend.
I linked it to our internal SMTP server so it sends email on issue status changes, and even hooked it up to our LDAP system so it authenticates with the same credentials, though I later dropped that because of security concerns.
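For reference, that wiring lives in Mantis’s config_inc.php. The sketch below is from memory of the version I used; the option names may differ in other releases, and the host names and DNs are placeholders, not our actual servers.

```php
<?php
# Sketch of config_inc.php entries for SMTP notification and LDAP
# authentication in MantisBT. Option names recalled from memory and
# may vary across versions; hosts/DNs below are hypothetical.

# Send notification mail through the internal SMTP server
$g_phpMailer_method = PHPMAILER_METHOD_SMTP;
$g_smtp_host        = 'smtp.internal.example.com';

# Authenticate against the corporate LDAP directory
# (later disabled over security concerns)
$g_login_method   = LDAP;
$g_ldap_server    = 'ldap://ldap.internal.example.com/';
$g_ldap_root_dn   = 'ou=people,dc=example,dc=com';
$g_ldap_uid_field = 'uid';
```

With $g_login_method set back to MD5 (the default), Mantis falls back to its own user table, which is what I ended up keeping.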

I asked around and learnt that our hardware department just started using ClearCase and ClearQuest; IC design is actually closer to software design than most people would think these days. It is odd that the software teams are on Perforce while the hardware teams switched from CVS to ClearCase. Maintaining yet another huge system takes time and money, especially now that ClearCase is showing its age. I know someone who worked for Rational, the company that created ClearCase. Guess what his team used internally for revision control…

Monday, May 17, 2010

A planned approach to interviewing developers

The way our team interviews new developers is very democratic. The resume and schedule are sent to several interviewers a week before the interview. On the day, the candidate is led to each individual interviewer for a 45-minute session; the poor candidate usually goes through four to six sessions in one day. After a week or two, a short meeting is called and everyone reports what he or she thinks of the candidate. What we actually ask during the interview is entirely up to each interviewer, and usually no one bothers to communicate beforehand, or after.

I won’t say it doesn’t work, but we would have a better chance of improving over time if we planned ahead, executed, reviewed, and adjusted the plan for the next round. I feel the current approach has two obvious drawbacks.

First, there is simply not much time to actually understand the candidate. Without planning and communication, every one of us may have to start over: introducing each other, explaining what we do in this department and what the work environment is like, and walking through the resume. A new grad who did a project in X ends up explaining the same project several times. An experienced developer has to explain to every interviewer why he or she left the previous employer. This is not an efficient use of our time.

Second, we do not probe different aspects of the candidate. Since no one declares which aspect he or she is going to test, everyone ends up testing every aspect. The result is that we probe only a little on the aspects we care about most, with no real depth and no attempt at better overall coverage.

For example, a good candidate for us needs to understand the C language, so every interviewer asks simple C questions. However, more and more of our work is done in C++ and Java, and no one addresses that.

Another example is how a senior candidate documents requirements, designs a system, plans a project, or improves quality. I am not sure how much of that you can probe within a 45-minute session.

I believe we should have templates for interviews. For new grads, the first session would be simple tests, an introduction to the company/department/position, and a walk through the resume for general understanding. I don’t know whether psychological tests are legal in the States, but based on past experience I believe they work to some extent.

After that would come a programming language or capability test, which the candidate can take alone for an hour and a half. Over the interview lunch, the others can review the answer sheet and decide whether to send the candidate home or push really hard in the afternoon to see how he or she communicates under pressure.

In the afternoon, we should work on two or three aspects of the ideal candidate we want, such as OS-specific knowledge, fixed-point arithmetic, DSP, debugging others’ code, real-time multithreading, and coding practice.

If we have a plan, execute according to it, and review the results, we should have a better chance to improve over time.