At work we use Earned Value (EV) to track our projects. It's not the most straightforward way to do it, but it's one that seems to add a lot of value for the work involved.
I like EV, but it does become a bit of an acronym hell, and this is in a business world already overpopulated by PowerPoints bearing total gibberish.
EV has a number of issues in actual use. First, it tends to be a little complex for the one-slide leaders who like simple stoplight charts. There is no chance of me having the time, or my boss having the attention span, to talk through 10 or 12 different EV metrics.
So if I have to choose one or two EV metrics to talk about when reporting out to those with the attention span of a Golden Lab at a Frisbee factory, what am I going to use?
A bigger, and perhaps the biggest, problem is that the project has to be fully understood and defined. In a world where bigger projects are full of go/no-go gates and smaller projects are a little more ad hoc, this is becoming difficult to do.
Scope creep destroys the whole basis of EV and makes it meaningless. The baseline must be complete, understood, and adhered to. Deviation, creep, or rescoping means re-baselining, which renders what I see as probably the most important EV metric, the Cost Performance Index (CPI), utterly meaningless.
This brings me to today’s discussion in cubeville that was started by a college project I’m working on.
First we pulled out our PMBOKs and spent a little time reviewing all the EV metrics. I think we all agreed that the Cost Performance Index (CPI) is the first one we should follow.
CPI is straightforward: it compares the work actually done to the actual cost of getting that work done. What have we completed, compared to what we have spent?
When CPI = 100%, for every hour we've spent on the project we've earned one hour of value.
It's a snapshot of where we are today. If the actuals are correct (hours, cash, widgets delivered) and the project baseline is accurate, then CPI gives a good idea of how close to budget or projections you are. It's simple and the results are clear.
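To put a number on that, here is a minimal sketch in Python. The function name and the hours are my own invention, purely for illustration, not anything out of the PMBOK:

```python
def cpi(earned_value, actual_cost):
    """Cost Performance Index: value earned per unit of cost actually spent."""
    return earned_value / actual_cost

# Hypothetical example: the baseline says the work completed so far is worth
# 400 hours, but it actually took 500 hours of effort to get it done.
print(cpi(400, 500))  # 0.8 -> for every hour spent we earned 0.8 hours of value
# A CPI of 1.0 (100%) would mean one hour of value earned per hour spent.
```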
Last week at the BTEC a paper was presented stating that once you reach 20% project completion, performance to date becomes an accurate indicator of future performance.
It was based on an analysis of a number of Department of Defense contracts, and the data showed that once you were 20% of the way through your project, you could accurately predict the final results to within 10%.
If you divide the total budget (Budget At Completion, or BAC) by the CPI at 20% project completion, you should be within 10% of the final cost.
So if you were on time and on budget at 20% completion, you would finish close to budget at project completion.
However, if you had spent 25% of the budget at only 20% completion, you would end up somewhere around 125% of budget (±10%) at project completion.
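A minimal sketch of that forecast, assuming made-up numbers: a hypothetical $1,000,000 project that has spent 25% of its budget at the 20% mark. The function name is illustrative only; the ±10% band is the paper's claim, not something the code checks:

```python
def estimate_at_completion(bac, current_cpi):
    """Forecast of final cost: total budget (BAC) divided by the CPI to date."""
    return bac / current_cpi

bac = 1_000_000
cpi_at_20_pct = 200_000 / 250_000   # earned $200k of value, spent $250k -> CPI = 0.8
print(estimate_at_completion(bac, cpi_at_20_pct))  # 1250000.0, i.e. ~125% of budget
```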
This is potentially a huge "early warning" that a project is in trouble. But the money has already been spent, and CPI only shows what we got for that spend. What do we have to do to get back on track?
That's where my second EV metric comes in: the To-Complete Performance Index (TCPI). TCPI shows how efficient we have to be, from where we are today, to finish the project on time and on budget.
TCPI is another simple one: it's work remaining divided by funds remaining.
If the project is behind or over budget and TCPI needs to be brought back to 100%, there are two ways to do it: rescope and reduce the amount of work, or increase the budget to compensate.
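Here is a hedged sketch of TCPI and those two levers, again with made-up numbers carried over from the example above; the helper name and figures are illustrative, not anything official:

```python
def tcpi(work_remaining, funds_remaining):
    """To-Complete Performance Index: how efficiently the remaining work must be
    done to finish exactly on budget (1.0 = planned efficiency)."""
    return work_remaining / funds_remaining

# Hypothetical project: $1,000,000 total budget (BAC), 20% complete.
bac, ev, ac = 1_000_000, 200_000, 250_000   # earned $200k of value, spent $250k

work_remaining = bac - ev    # $800,000 of baselined work still to earn
funds_remaining = bac - ac   # $750,000 of budget still unspent
print(tcpi(work_remaining, funds_remaining))           # ~1.07: must beat plan by ~7%

# Lever 1: rescope -- drop $50k of remaining work while keeping the same funding.
print(tcpi(work_remaining - 50_000, funds_remaining))  # 1.0

# Lever 2: add budget -- ask for $50k more to cover the overrun to date.
print(tcpi(work_remaining, funds_remaining + 50_000))  # 1.0
```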
It is not uncommon for projects large and small to go over budget, and when making a new forecast it's easy to assume that everything will suddenly go right, that the problems are behind us, and that from here on it's all going to be OK.
Once TCPI shows that the budget is no longer realistic for the work remaining, the first question from leadership should be how much it will cost to complete the project. The risk at this point is that another unrealistic budget is produced, and every month it's revisited and adjusted to match actual performance without ever understanding what was wrong with the initial estimate.
This constant revising defeats the whole point of tracking EV, since the metric is no longer being tracked against the original baseline.