Estimating without historic data is like operating without long-term memory. Find out how to avoid killer jobs and increase accuracy in the process
Accuracy of estimation in construction can be viewed from two different angles — one from an accounting or “exactness accuracy” standpoint and the other from a “job performance-based accuracy” perspective. While exactness accuracy is basically just a matter of taking the time to review the drawings and come up with a correct count of material, job performance-based accuracy is an area where an estimator's skills can shine. In fact, even the smallest improvement in job performance-based estimation accuracy (JPEA) can lead to significant improvements in profitability. How? By recognizing the importance of JPEA and using available data from estimation, accounting, and field reports, you can create a useful tool for making decisions based on the historic performance of your firm.
One of the negative outcomes of a bad estimate is “killer jobs.” You know the ones — those jobs that can single-handedly make the difference in a year-end profit or loss, result in tarnished relationships with employees, customers, and vendors, and sometimes even bring a company to its knees.
Everyone in the construction industry has either experienced a killer job personally, knows a colleague who has, or at least recognizes one could be looming on the horizon. What causes killer jobs? The labor component is usually the first to blame, followed by a poor estimate, poor management on the general contractor's part, or interference from other trades. However, it's the mismanagement of all of these variables that truly creates a killer job, especially when there's a lack of appropriate job performance-based feedback in the estimation process. The good news is, if you find yourself going down this road, you can turn things around.
The familiar analogy of football — a sport where catastrophic losses do not go unnoticed or unstudied — can show us a lot. Although one player may have missed a critical pass or another committed an unnecessary penalty, without going back and reviewing the whole game tape, the coach runs the risk of the same outcome in the next game. With the correct use of statistics or game tapes, the coach can make predictions based on the team's historic performance. However, without taking all of the “live” factors of the game into account, he cannot reliably predict what will happen on the field. The same is true in estimating: Without reliable historic data capturing the JPEA, the celebration on bid day could evolve into a catastrophe on the job site in just a few short months.
Figure 1 shows a representative contractor's percentage of gross profits by job. The spread between a job's outcomes could be a 40% profit or a 15% loss. This difference between potential profit and loss is far too big to allow for reliable and accurate job performance-based estimates. In this example, the average gross profit is about 14.7% and the standard deviation is 13.4%, indicating that the next job can typically be expected to earn anywhere from 28.1% down to 1.3% gross profit. On a $100,000 job, this is the difference between earning a profit of $28,100 and earning $1,300. A $1 million job would have an expected profit of anywhere from $280,000 down to $13,000 — a $267,000 difference!
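The arithmetic behind that spread is simple to reproduce. The sketch below uses a hypothetical list of per-job gross-profit percentages (the actual Fig. 1 data is not given) to show how the mean and standard deviation define the "expected band" for the next job:

```python
import statistics

# Hypothetical gross-profit percentages by job; the contractor's real
# Fig. 1 data is not published, so these are illustrative only.
gross_profit_pct = [40, 28, 22, 18, 15, 12, 8, 3, -5, -15, 25, 20, 14, 6, 30]

mean = statistics.mean(gross_profit_pct)
stdev = statistics.pstdev(gross_profit_pct)  # population standard deviation

# Roughly two-thirds of outcomes fall within one standard deviation
# of the mean, so the typical band for the next job is:
low, high = mean - stdev, mean + stdev
print(f"mean = {mean:.1f}%, std dev = {stdev:.1f}%")
print(f"typical band: {low:.1f}% to {high:.1f}% gross profit")

# On a $1,000,000 job, that band translates to a dollar swing of:
swing = (high - low) / 100 * 1_000_000
print(f"profit swing on a $1M job: ${swing:,.0f}")
```

The width of that band, not the average alone, is what tells you how much risk your historic performance carries into the next bid.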
Getting back to the football analogy, painful losses usually occur when a team and its coaches are not well prepared, both strategically and tactically. The pre-game studies did not foreshadow what showed up on game day. The same is true in construction: If historic job-based data is not used during the initial estimate, there is more risk of the job becoming one of those dreaded killer jobs.
Comparing the estimated versus actual achieved profits from past jobs is the first step of data mining. Figure 2 shows the estimated profits for a set of jobs, with each estimate predicting a gross profit return between 12% and 24%. The reality, however, was somewhat different. At the end of these jobs, the actual profits ranged anywhere from a 41% gain to a 12% loss. Why such a difference? The variation was primarily caused by different labor productivity results in various cost codes. To see this picture, accounting data, estimation data, and field productivity information must all be evaluated collectively.
Estimators typically work with an average performance of the cost codes over time. However, using averages of data without considering the range and statistical outliers will skew the results. To use the available data correctly, factors such as geography, cost code mix, allowance for pre-assembly, and type of construction need to be considered. This allows you to make confident decisions based on more than just averages.
For example, if an estimator looks at how a job turned out, he or she may see something like Fig. 3 — the job was completed in fewer hours than were estimated. However, a closer look at the individual cost codes in Fig. 4 shows that the job's "average" hours do not tell the entire story. Some cost codes were underestimated, while others were overestimated. If your next estimate has a majority of hours in cost code 302, it could turn out to be very profitable. However, if your next job up for bid is one with a lot of hours in cost code 402, you could be facing a killer job.
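The point is easy to demonstrate with a few lines of code. In the sketch below, the job-level total looks comfortably under budget, yet one cost code is badly underestimated — the cost codes echo the article's 302/402 example, but the hour figures are invented for illustration:

```python
# Hypothetical estimated vs. actual labor hours by cost code.
# Codes 302 and 402 mirror the article's example; the hour counts
# are assumptions, not the contractor's real data.
hours = {
    "302": {"estimated": 400, "actual": 280},  # overestimated: looks easy
    "305": {"estimated": 250, "actual": 255},
    "402": {"estimated": 350, "actual": 420},  # underestimated: killer risk
}

total_est = sum(c["estimated"] for c in hours.values())
total_act = sum(c["actual"] for c in hours.values())
print(f"Job total: {total_act}/{total_est} hours "
      f"({total_act / total_est:.0%} of estimate)")

# The job-level average hides the risk; the code-level ratios are
# what the estimator needs before loading up the next bid with 402.
for code, c in hours.items():
    ratio = c["actual"] / c["estimated"]
    flag = "OVER budget" if ratio > 1 else "under budget"
    print(f"  cost code {code}: {ratio:.0%} of estimated hours ({flag})")
```

At the job level this sample shows 96% of estimated hours used, yet cost code 402 ran 20% over — exactly the kind of detail an average erases.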
Beyond cost codes, jobs can also be compared with certain characteristics, such as job location, type of work, customer type, and contract type, to name a few. For example, Fig. 5 shows the same cost code comparisons across several jobs, revealing significant variation in labor productivity, depending on the location and type of work. Using data that accurately represents the work environment adds more detail to the estimation process, allowing for more selective bidding.
Going back to the previous example, let's say this contractor performs poorly on fixture installation projects (average of 30% over budget on hours used versus estimated hours), but is now interested in taking a risk and bidding his next job tight. If the estimator does not pay attention to past history, his 30% labor overage could turn into a loss of 170% in the case of a renovation-type project, putting him in a no-win situation.
As you can see, averages of prior jobs do not provide a clear picture of what happened between the estimate stage and closeout stage. The first step for improving estimation accuracy in situations like these is to compile and analyze historic data by cost code and specific job characteristics. Once this type of data analysis is established, your estimators will be on the fast track to more accurate estimates.
Moore and Nimmo are associate implementers with MCA, Inc., Flint, Mich.
Typically, accounting measurements of production (spent versus budgeted hours or dollars) are mistakenly used as measurements of productivity, which is not unlike driving with only the rear-view mirror. By the time the data from operations in the field has been collected, scored, and reported, weeks (and sometimes months) have passed, leaving no time for the field to react and adjust.
Instead, measure job progress from a labor productivity standpoint using government-accepted measurement practices in conjunction with your accounting data. By measuring productivity versus a set construction budget goal, set from the perspective of the field according to how your foreman sees the work, you'll improve your estimation accuracy.
Break your budget down so that productivity can be measured and monitored as a weekly trend, both on the overall job and by individual cost codes. This allows your foreman and project manager to note the special and common causes of deviation from the original budget. This information is priceless to estimators, who can use it to understand how their bids can get closer to what happens on the job site.
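A weekly trend like that can be tracked with very little machinery. The sketch below is one minimal way to do it, assuming productivity is defined as budgeted hours earned per hour actually spent; the week numbers, cost codes, and hours are all illustrative assumptions, not a standard report format:

```python
from collections import defaultdict

budget_goal = 1.0  # goal: earn one budgeted hour for every hour worked

# Hypothetical weekly field reports: (week, cost_code, hours_spent,
# budgeted_hours_earned). Real reports would come from the foreman.
weekly_reports = [
    (1, "302", 40, 44),
    (1, "402", 60, 48),
    (2, "302", 38, 42),
    (2, "402", 62, 45),
]

# Build a per-cost-code productivity trend, week by week.
trend = defaultdict(list)
for week, code, spent, earned in weekly_reports:
    trend[code].append((week, earned / spent))

for code, points in sorted(trend.items()):
    for week, rate in points:
        status = "on/above budget" if rate >= budget_goal else "BELOW budget"
        print(f"week {week}, code {code}: {rate:.2f} ({status})")
```

In this sample, cost code 302 trends above budget both weeks while 402 trends below — the kind of deviation a foreman and project manager can act on immediately, and an estimator can feed back into the next bid.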