As an independent contractor, I frequently need to estimate software development tasks large and small. I'm working on a task that is overdue because of several factors, some of which are under my control, some are not. This task has had me considering (yet again) how to A) do a better job estimating, and B) set the expectations of my client so that they are prepared if there are complications, without scaring them unnecessarily.
I've had occasions in the past where it seemed reasonable to simply add a 25% contingency figure to schedules or budgets. That never worked for me, nor have I heard that it works for anyone in the software industry. (Feel free to share your positive experiences, but I'll be hard to persuade.) I've finally figured out why that's not useful for making software estimates more accurate: it assumes that the estimator is simply overly optimistic by an arbitrary percentage of the project as a whole. While it's true that sometimes I'm off because I'm overly optimistic, that's resolved by being more realistic, not by jacking up the estimate randomly. No, the times I'm off that concern me now are the times when I didn't know something that I came to know later.
Cathy Pountney presented the topic "Best Practices for Project Management" at the 2006 Great Lakes Great Database Workshop (GLGDW). When the subject of estimating software development arose, one of the audience members said that his company had replaced the word "estimate" with "forecast". It's safe to say the room visibly brightened with the light bulbs going off over many audience members' heads, mine included.
Why does "forecast" resonate where "estimate" doesn't? Comparing forecasting weather with estimating the level of effort and cost of software is brilliant. As we all know, weather is notoriously changeable. There are so many variables, and forecasting relies not only on good algorithms, but on having enough weather stations in strategic locations that the forecaster has enough data to reasonably anticipate those changes. Let's consider the differences.
When I take a car for repairs, the mechanic has a manual for my auto with every bolt, nut, wire, and doorknob specified. Every single piece of the system is known and documented. That's not to say that auto problems aren't complicated. They can be as complicated and as mysterious as any software bug. Nevertheless, the mechanic has an extensive knowledge base, if you will, available to diagnose the likely cause of a problem. He knows, with a high degree of accuracy, how much time it will take to take apart, say, the clutch assembly (is there such a thing!?). Indeed, in many engineering disciplines there are tables of how much time Task A will take. If, once the clutch is disassembled, the mechanic finds a colony of badgers living inside (just call it Magical Realism, okay, and bear with me), he'll call and ask me if I want to have him remove the badgers, which will, of course, cost me more. Much, much more.
Software, on the other hand, is almost always new to a greater or lesser degree. At least, it is in my work. That said, my estimates are good for either very small changes in familiar code, or implementations that I've done many times before and that use tools I've used enough times to be proficient with. The first case is fairly common since I work quite a lot with legacy code. However, the change is frequently needed in code living in the dank basement of a legacy code base, and as such, I don't really know before going in whether I'll have to move boxes of 5-year-old magazines first, or fumigate black widow nests. Still, I know with a high degree of certainty what the task will take if there are no complications, and, more importantly, I know what complications are likely, and so can clearly state those to the client.
Most often, I need to write code that implements complex business (domain) logic that will affect existing systems in unknown ways. In the project that is currently threatening to outlive me, I am also implementing a third-party tool that I've never used before to do a task I'm not familiar with in this context. In other words, I didn't have enough Doppler stations on the coast of this software project to have as firm a number as "estimate" implies. As such, my client, who is absolutely the best I've ever worked with for understanding uncertainty in software projects, is disappointed, and rightly so.
All of this is to set the scene for my current thinking about estimating, which is that I don't estimate, I forecast (that helps me remember how uncertain I am) and I need to start est…er…forecasting work by including, even starting with, an uncertainty element. At this time I see these pieces that have some degree of uncertainty:
- The domain. How much of the task has been clearly defined?
- The tools. Will I be using a new tool to implement the task? How critical is the piece that depends upon it?
- My proficiency.
  - How familiar am I with the domain?
  - How familiar am I with the tools?
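To make this concrete for myself, here's a back-of-the-envelope sketch of how those uncertainty elements might widen a forecast into a range rather than a single number. The factor names, scores, and the "each element can double its share of the work" rule are all made-up assumptions for illustration, not a proven model:

```python
# Hypothetical sketch: widen a base forecast by per-element uncertainty.
# The weighting rule below is an arbitrary assumption, not a validated method.

def forecast_range(base_hours, uncertainties):
    """Return (low, high) hours given 0.0-1.0 uncertainty scores per element."""
    # Assumption: a fully uncertain element (score 1.0) can double its
    # equal share of the work; a fully certain element (0.0) adds nothing.
    widening = sum(uncertainties.values()) / len(uncertainties)
    low = base_hours                     # the no-complications case
    high = base_hours * (1 + widening)   # complications in every element
    return low, high

low, high = forecast_range(
    40,
    {
        "domain_definition": 0.2,   # task mostly spelled out
        "new_tools": 0.8,           # third-party tool I've never used
        "domain_proficiency": 0.3,
        "tool_proficiency": 0.7,
    },
)
print(f"forecast: {low:.0f}-{high:.0f} hours")  # forecast: 40-60 hours
```

The point of the exercise isn't the arithmetic; it's that the forecast is forced to be a range, and each element of uncertainty is named where the client can see it.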
For now I've omitted any client-related uncertainty. I'm lucky to have clients who are always available when needed, and free (but not too free) with information.
Now, I expect that you are probably screaming for me to read Steve McConnell's excellent book Software Project Survival Guide, and of course, that's good advice for just about any software management question. However, I'm looking for a slightly different focus on software estimating. McConnell's cone of uncertainty applies to the stages of a project, whereas I would like a way to pinpoint the uncertain elements within an esti…damn…forecast.
I'll continue to write about my experience thrashing out a process. My first goal is to accurately identify the uncertain elements, and then to track those elements with more than just time spent, but with a detailed diary of my experience implementing them. The last is because it occurs to me that how a programmer operates during a phase that's highly uncertain will vary wildly. The overall project effect may not be much different, but the details may be.
As always I welcome the community's experience and knowledge.