> I've got an interview with a company that claims to score a 12 on the Joel Test. [...] What are some ways of determining if they really implement all 12 points? Are there any particular questions I can ask?

It's reasonable to say, "show me." Ask them for examples and concrete details of how they practice each Joel Test point. Since they claim they score all 12 points, they are obviously proud of it. People tend to like to show off, so they'll probably be eager to share more details.
If you ask more specific questions, it'll become apparent from their descriptions whether they really have those good practices.
There are many specific follow-up questions we can ask beyond the basic twelve. The Joel Test questions are in bold below, and my follow-ups, er, follow:
- **Do you use source control?** What source control system do you use? Why did you pick that one? What is your branch/release policy? What are your tag naming conventions? Do you organize your tree by code vs. tests at the top with all modules under each directory, or by module at the top with code and tests under each module directory?
- **Can you make a build in one step?** What tools do you use to make builds? How long does it take to go from a clean checkout to an installation image? What would it take to modify the build? Is it integrated into your testing harness? What would it take to duplicate a build environment? Are the build scripts and tools also under source control?
- **Do you make daily builds?** What software testing tools do you use for daily builds? Do you use a Continuous Integration tool? If so, which one? How do you identify who "broke the build"? What is your test coverage?
- **Do you have a bug database?** What bug tracker software do you use? Why did you pick that one? What customizations did you apply to it? Can you show me trends in the rate of bugs logged and bugs fixed per month? How does a change in source control get associated with the relevant bug?
- **Do you fix bugs before writing new code?** What is your bug triage process? Who is involved in prioritizing bugs? How many bugs did you fix in the last release of your product? Do you do bug hunts with bounties for finding critical bugs?
- **Do you have an up-to-date schedule?** Can I see it? How far ahead of or behind schedule are you right now? How do you do estimating? How accurate has that method turned out to be?
- **Do you have a spec?** Can I read one? Do you have a spec template? Can I see that? Who writes the specs? Who reviews and approves them?
- **Do programmers have quiet working conditions?** Can I see the cubicle or work area for the position I'm interviewing for (or an equivalent work area)?
- **Do you use the best tools money can buy?** What tools do you use? Are you up to date on versions? What tools do you want that you don't have yet? Why not?
- **Do you have testers?** How many? Can I meet one? Do testers do black-box or white-box testing?
- **Do new candidates write code during their interview?** What code would you like me to write? What are you looking for by seeing my code?
- **Do you do hallway usability testing?** How frequently? Can I see a report documenting one of your usability testing sessions? Can you give me an example of something you changed in the product as a result of usability testing?
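On the "build in one step" point, it helps to know what a good answer looks like. Here's a minimal sketch of the idea, with purely hypothetical paths and a stand-in for the real checkout and test steps: one script that goes from an empty workspace, through the test suite, to a single installable artifact, stopping at the first failure.

```shell
#!/bin/sh
# Sketch of a one-step build: clean workspace -> sources -> tests -> artifact.
# All names and paths here are illustrative, not from any real project.
set -e                              # abort on the first failing step

BUILD_DIR=$(mktemp -d)              # fresh, isolated workspace every run
echo "building in $BUILD_DIR"

# 1. Fetch sources (stand-in for 'git clone <repo>' or an export from tags)
mkdir -p "$BUILD_DIR/src"
printf 'echo hello\n' > "$BUILD_DIR/src/app.sh"

# 2. Run the test suite before packaging; a failure here stops the build
sh "$BUILD_DIR/src/app.sh" > "$BUILD_DIR/test.log"
grep -q hello "$BUILD_DIR/test.log"

# 3. Produce a single installable artifact
tar -czf "$BUILD_DIR/app.tar.gz" -C "$BUILD_DIR" src
echo "build complete: $BUILD_DIR/app.tar.gz"
```

A team that really scores this point can point to a script like this (or a single `make` / CI target) kept under source control, and can tell you how long it takes end to end.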
Beware if their answers to the specific follow-up questions are evasive, like, "um, yeah, we are committed to doing more best practices and we'll be looking to you to help us effect changes toward that goal." If they're so committed to it, why don't they have anything to show for it yet? Probably because, like at many companies, when the schedule is in jeopardy, following "best practices" goes out the window.
I'm posting to my blog the answers I've written on StackOverflow that earned the "Good Answer" badge. This was my answer to "Administering the Joel Test."