
Month: January 2012

With this post, we’re pretty much at the finish line. Now that we have the macro, the baseline mechanics, and the query all figured out, it’s time to flex those Excel muscles and see if we can’t generate a report that looks the way we want it to.

First off, we’ll start with the Pivot Table generated from the query in the last post. That table should look something like this….

Changing the cost fields to a sum, and the status date to a maximum value, we end up with something like this….

Convert the cost fields to a running total, and we get this….
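The field changes above can also be scripted. This is a sketch only – it assumes a Pivot Table named "PivotTable1" and a date field named "StatusDate" sourced from a "TimeByDay" column; adjust the names to match your own workbook.

```vba
Sub ConfigureEvmPivot()
    ' Sketch: summarize cost fields as running-total sums, status date as a max
    Dim pt As PivotTable
    Dim pf As PivotField

    Set pt = ActiveSheet.PivotTables("PivotTable1")

    For Each pf In pt.DataFields
        If pf.SourceName = "StatusDate" Then
            ' The status date rolls up as a maximum value
            pf.Function = xlMax
        Else
            ' Cost fields are summed...
            pf.Function = xlSum
            ' ...then displayed as a running total across the time field
            pf.Calculation = xlRunningTotal
            pf.BaseField = "TimeByDay"
        End If
    Next pf
End Sub
```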

From here, it should be a relatively easy exercise to add a Pivot Chart (shown here with the Status Date removed)….

That would be easy. But remember, in the last post I mentioned that I am taking the Gilyaks of Murakami’s 1Q84 as my inspiration. They would never take the easy route. No, they would look at the easy route with incomprehension, then continue the painful slog along the slow route right next to it….for no apparent reason. And that, I think, is an excellent description of this series of posts.

So my issue with the chart above is that it doesn’t display the status date. I want the ACWP and BCWP lines to truncate at the status date. That being said, I don’t feel quite up to the challenge of figuring that bit of functionality out.

So for now, I’ll settle for just adding a vertical line where the status date should be. As it turns out, that gets a bit complicated, because to do so, we need to add a second chart type, which is not allowed when creating a chart based on a Pivot Table.

What to do when stumped in Excel? Why, turn to Jon Peltier’s excellent blog on Excel tips and tricks. In this case, this post on building a regular (non-Pivot) chart from Pivot Table data did the trick.

After creating the Pivot Table, I added the series one by one back onto a blank line chart. When it came time to add the Status Date, I added it as a series, then converted that series to an XY scatter chart where the X coordinate is the Status Date and the Y coordinate is the BCWS value.
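That last step can be sketched in VBA as follows. The named ranges ("StatusDate", "BCWSAtStatus") are purely illustrative placeholders for wherever those two values live in your workbook.

```vba
Sub AddStatusDateSeries()
    ' Sketch: add a one-point "Status Date" series to the active line chart,
    ' then convert just that series to an XY scatter
    Dim s As Series
    Set s = ActiveChart.SeriesCollection.NewSeries
    s.Name = "Status Date"
    s.XValues = Range("StatusDate")    ' X = the status date
    s.Values = Range("BCWSAtStatus")   ' Y = the BCWS value at that date
    s.ChartType = xlXYScatter          ' only this series changes chart type
End Sub
```

To stretch that single point into a full vertical line, the usual trick (also covered on Peltier’s blog) is to add a 100% negative Y error bar to the scatter series.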

The end result is something like this….

In a real implementation, I might recommend leaving it as a Pivot Chart though – as that’s a lot more dynamic and easy to change up for different projects. On the other hand, once the data is available, it’s not too hard to generate a chart like this.

And there we are. From 0 to enterprise timescaled data in four easy posts….

Credits

Finally the credits….thanks to a whole slew of folks for providing the support to figure this out and allowing me to bounce ideas off of them. This effort truly was an international one, with fellow MVPs Barbara Henhapl, Vadim Gerya, Vladimir Ivanov and Rod Gill weighing in from Austria, Ukraine, Russia, and New Zealand respectively – with kind contributions from the fine folks in Redmond as well. As usual, anything that works is credited to them…and anything that doesn’t work is solely my responsibility.

So the pieces are starting to come together. In the last post, I showed you how to write a macro to copy the BCWP data into the Baseline10 Cost timescaled fields. In the post before that, I discussed how we could actually get Project Pro to save the data to Project Server.

So to make a long story short, we now have BCWP stored in the Baseline10 Cost timescaled data. The next step is to pull it back out in some sort of usable format.

Before we get into that, a quick review: to avoid performance issues, the macro is storing the BCWP for the project in the Project Summary Task Baseline10 Cost field. We’re not storing assignment level data. If we were, the reporting would be a lot easier, as we could just leverage the OLAP cubes to achieve pretty much the same thing.

…but if life were easy, there’d be no market for consultants. Hence, like Chekhov’s Gilyaks, I metaphorically find myself trudging through the Sakhalinese mud, moving in parallel to a newly paved tsarist road. (Yes, someone might have been reading Murakami over the holiday break.)

Now where were we? Oh right….once we work out the issues with simply getting the data to copy and save correctly, we need to start looking at how we plan to consume the data. As I saw it, I had three options:

OLAP Cubes (Assignments)

Reporting Database (Assignments)

Reporting Database (Tasks)

What I quickly realized is that while assignment level data gave me the most granularity and allowed me to piggyback off of the existing OLAP cubes, it also represented a significant performance issue. Copying data for each timescaled cell for each assignment in a single project could take upwards of a couple of minutes…plus saving timescaled data also increases the time required to save to Project Server. Saving the data for each task could also present the same difficulty.

After generating assignment level data and seeing the performance issues, I went back to the drawing board and decided to try again at the Task level – specifically, at the project summary task level. This meant I would only be copying data for a single task to generate the data required.

The BCWP data would then be stored in the TaskByDay table. Unfortunately, ACWP is stored in the AssignmentByDay table. So I had to write a query to pull the data from the appropriate table and assemble it in a form that would be suitable for reporting.
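The shape of that query looks something like the sketch below. Caveat: the view and column names here (MSP_EpmTaskBaselineByDay, MSP_EpmAssignmentByDay, and friends) are written from memory of the 2010 reporting schema and should be verified against your own reporting database before use.

```sql
-- Illustrative sketch: BCWP (our Baseline10 snapshot) from the task
-- baseline-by-day data, ACWP from the assignment-by-day data, by date
SELECT COALESCE(b.TimeByDay, a.TimeByDay) AS TimeByDay,
       b.BCWP,
       a.ACWP
FROM
  (SELECT tb.TimeByDay, SUM(tb.TaskBaselineCost) AS BCWP
   FROM dbo.MSP_EpmTaskBaselineByDay tb
   INNER JOIN dbo.MSP_EpmTask t ON t.TaskUID = tb.TaskUID
   WHERE tb.BaselineNumber = 10
     AND t.ProjectUID = @ProjectUID
   GROUP BY tb.TimeByDay) b
FULL OUTER JOIN
  (SELECT ab.TimeByDay, SUM(ab.AssignmentActualCost) AS ACWP
   FROM dbo.MSP_EpmAssignmentByDay ab
   INNER JOIN dbo.MSP_EpmAssignment asn
           ON asn.AssignmentUID = ab.AssignmentUID
   WHERE asn.ProjectUID = @ProjectUID
   GROUP BY ab.TimeByDay) a
  ON b.TimeByDay = a.TimeByDay
ORDER BY 1
```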

First off, let me preface this post with the comment that all customizations should be tested – especially a solution which edits timescaled data. Solutions such as these are often prone to memory leaks, and as such, it behooves you to follow Lenin’s (paraphrased) advice: “Test, test, and, again, test.” In fact, my DEV environment started exhibiting all sorts of memory related issues as I worked through these posts.

From a performance perspective, when testing the solution below, I found that it seemed to add 10-20 seconds to the publish time in my DEV environment. That’s hardly a scientific test, but it’s definitely the sort of performance hit that should give you pause if you were considering, say, running this automatically as part of the publish routine. Personally, I could see a macro that would be triggered manually at each status reporting interval, which would capture all sorts of timescaled reporting data: BCWP over time, snapshots of major milestone variance, etc.

So anyway, here’s the macro. The usual hat tip goes to Rod Gill for his excellent book on VBA for Microsoft Project.

Copies the BCWP values at the Project Summary task level into Baseline10 Cost (at a granularity of 1 week – as opposed to daily, to avoid potential performance issues).

Drops the Project BCWP into the scalar Baseline10 Cost field to trigger the baseline save event. See the last post for more information on that.

Note that instead of simply copying the BCWP value to the Baseline10 Cost line, I am actually subtracting the last period’s BCWP from the current period to generate the delta from one period to another. That makes the reporting a bit easier, as it matches how AC and BCWS are both stored within the reporting database.
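The steps above come together into something like the sketch below. This is not the definitive implementation – it’s a minimal version written against the Project 2010 VBA object model, it assumes a status date has been set on the project, and the timescaled constant names (pjTaskTimescaledBCWP, pjTaskTimescaledBaseline10Cost) should be verified against the PjTaskTimescaledData enumeration.

```vba
Sub CaptureBCWPSnapshot()
    Dim t As Task
    Dim tsvBCWP As TimeScaleValues
    Dim tsvBL10 As TimeScaleValues
    Dim i As Long
    Dim prior As Double
    Dim current As Double

    Set t = ActiveProject.ProjectSummaryTask

    ' Weekly granularity, to keep the cell count (and the save time) down
    Set tsvBCWP = t.TimeScaleData(t.Start, ActiveProject.StatusDate, _
        pjTaskTimescaledBCWP, pjTimescaleWeeks)
    Set tsvBL10 = t.TimeScaleData(t.Start, ActiveProject.StatusDate, _
        pjTaskTimescaledBaseline10Cost, pjTimescaleWeeks)

    prior = 0
    For i = 1 To tsvBCWP.Count
        ' Empty cells carry the prior cumulative value forward
        If tsvBCWP(i).Value <> "" Then
            current = CDbl(tsvBCWP(i).Value)
        End If
        ' Store the period-over-period delta, so the data matches how
        ' ACWP and BCWS are stored in the reporting database
        tsvBL10(i).Value = current - prior
        prior = current
    Next i

    ' Touch the scalar field to trigger the baseline save on publish
    t.Baseline10Cost = t.BCWP
End Sub
```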

For what it’s worth, Rev1 of this macro operated at the assignment level. On a fair-sized project, it took almost a minute to run. Hence, I decided to keep it at the Project Summary task level. That makes reporting a bit of a hassle, as you’ll see in the next post, but it’s still doable. On the other hand, throwing the data in at the assignment level would make it available in the OLAP cubes – which is also pretty slick – but maybe not worth the performance hit, depending on how forgiving your end users are.

Another hat tip to Ukrainian MVP, Vadim Gerya, for graciously sharing his solution on how to convert scalar cost values to timescaled data by allocating it across a defined time period… The list of beers and/or vodka shots I owe to folks at the Project Conference gets longer by the day.
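In the spirit of that technique, here’s a rough sketch of the scalar-to-timescaled idea: take a single scalar cost and allocate it evenly across the weekly buckets of a date range. The field choice and even-allocation rule are illustrative assumptions, not Vadim’s actual solution.

```vba
Sub SpreadScalarCost(t As Task, total As Double)
    ' Sketch: divide a scalar cost evenly across weekly timescaled buckets
    Dim tsv As TimeScaleValues
    Dim i As Long

    Set tsv = t.TimeScaleData(t.Start, t.Finish, _
        pjTaskTimescaledBaseline10Cost, pjTimescaleWeeks)

    For i = 1 To tsv.Count
        tsv(i).Value = total / tsv.Count   ' even allocation per period
    Next i
End Sub
```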

Well, it’s that time of year again, the down time between the solar and lunar new year festivities when we bid farewell to 2011 and welcome in the year of the water dragon. This year, as I attempt to forestall getting back into “serious” work, I figured that I’d kick off the blog with a return to the basics and a look at reporting.

Seeing as I’ll be presenting on Earned Value at the upcoming Project Conference, this next series of posts combines EVMS and reporting into the same concept. The general gist is how to capture timescaled data within the Project Pro client using a spare baseline cost field. This is a technique that I had heard about from other MVPs, but had never tried myself. When a client request popped up, I figured I’d take some time to see if I could figure out how it works.

The Challenge

So first off, the challenge: we’re implementing Earned Value techniques in conjunction with Microsoft Project Server at this client, and I’d gotten to the point of developing a couple reports in Excel Services. When it came time to generate a timescaled EVM curve, I realized that Project Server doesn’t seem to actually maintain timescaled EVM data anywhere. Essentially that means it’s almost impossible to generate the report below from the data stored in the enterprise databases.

Just to throw a quick primer on EVM terminology out there, note that Microsoft Project uses the DOD terminology for EV calculations, hence…

EV = BCWP

PV = BCWS

AC = ACWP

…which I’ll be using for the rest of these blog posts…

From an enterprise data perspective, we can generate a similar graph using BCWS = Baseline0Cost, and ACWP = Actual Cost – both of which are stored in the Project Server databases and readily available for reporting purposes.

The challenge here is to copy the timescaled BCWP data from the BCWP row in Project Pro to the Baseline10Cost row – thereby making it available for enterprise reporting purposes.

The same concept could be applied to any number of other challenges, for instance capturing snapshots of milestone finish variance, or snapshots of EAC calculations on each status date. The possibilities are endless.

Enterprise Baseline Mechanics

So I sat down and wrote a macro to copy BCWP to the Baseline10Cost field. At that point, I ran into an unexpected and significant snag….it turns out that I didn’t quite understand how enterprise baseline data is saved. Hence, I had to take a quick detour to figure that out before coming back to the issue at hand.

The key to understanding baselines in Project Server is that baselines are stored in two different data sets. The first data set consists of the scalar data. That is the data on the left of the Task Usage view. (I should point out that I really liked the definition of “scalar” that I found on the Web….”Having value, but no direction.” Definitely sounds like some people I know.)

Editing scalar data is pretty straightforward. Simply update the field and publish the project to Project Server.

Run a query against the reporting database within SQL to display the edit.
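A minimal query for that check might look like the sketch below. As before, the view and column names (MSP_EpmTask_UserView, TaskBaseline10Cost) are from memory of the 2010 reporting schema and should be verified against your own database.

```sql
-- Illustrative check: did the scalar Baseline10 Cost edit reach
-- the reporting database after publish?
SELECT t.TaskName,
       t.TaskBaseline10Cost
FROM dbo.MSP_EpmTask_UserView t
WHERE t.ProjectUID = @ProjectUID
ORDER BY t.TaskName
```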

Timescaled data is a bit more complicated. That’s stored in any of the tables with the designation “ByDay” – i.e. AssignmentByDay, TaskByDay, etc. The trick here (and thanks to Brian and Adrian for pointing it out) is that simply editing the timescaled cost values will not work…

When I make the manual edits above, and then publish the project, the SQL query against the TaskByDay table results in the following:

Worked just fine, right? Well, unfortunately, things are not as simple as they appear. It turns out that simply entering timescaled data doesn’t trigger Project Server to capture the data in the enterprise databases. And if Project Server doesn’t capture the data, we can’t generate reports against it.

After doing some testing, it seems that Task and Assignment timescaled data are treated differently.

For Task timescaled data to be saved, at least one of the scalar values must be edited. In the example below, I have edited the Baseline10 Cost field in addition to the two timescaled values.

This triggers Project Server to recognize the timescaled data and to store it on publish.

Assignment timescaled data appears to be a bit more complicated. As far as I can tell, Assignment timescaled data requires a value to be entered in both the scalar and the corresponding Baseline Work field.

In this example, I enter data in the timescaled Baseline10 Cost and Baseline10 Work fields, as well as in the scalar Baseline10 Cost field.

Note that any non-null value in the Baseline10 Work field will trigger the save.
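The two triggers can be sketched in VBA as follows. Treat this as an assumption-laden sketch: the constant name pjAssignmentTimescaledBaseline10Work should be checked against the PjAssignmentTimescaledData enumeration, and the values written here are placeholders.

```vba
Sub TriggerBaselineSave(tsk As Task, asgn As Assignment)
    ' Task-level: editing a scalar baseline value is enough
    tsk.Baseline10Cost = tsk.BCWP

    ' Assignment-level: also needs a non-null timescaled
    ' Baseline10 Work value
    Dim tsw As TimeScaleValues
    Set tsw = asgn.TimeScaleData(asgn.Start, asgn.Finish, _
        pjAssignmentTimescaledBaseline10Work, pjTimescaleWeeks)
    tsw(1).Value = 0.01                 ' any non-null value will do
    asgn.Baseline10Cost = asgn.Cost     ' plus the scalar edit
End Sub
```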

So, in short, if you wish to edit Baseline timescaled data, you’ll need to remember the correct triggers to force Project Server to retain the data.

Data       | Baseline Save Triggered By
Task       | Scalar Data Edit
Assignment | Scalar Data Edit & Timescaled Work Edit

You could apply the same approach to a budget resource assignment: create a budget resource, assign it to the summary task, and then copy the relevant data into the assignment row – in which case you may not have to mess with the baseline mechanics at all. I didn’t test this scenario against a custom timescaled budget cost field, but if you’re looking to create more custom timescaled fields, it wouldn’t be hard to create a budget resource called “BCWP” and use that to store the data.