Handling Uncertainty In Project Planning

Introduction

Most project planning today is still performed in a strictly deterministic manner: every task is assigned a well-defined timeframe and a fixed set of resources. Yet real life demands corrective actions: Resources are not always available, extra tasks pop up during the course of a project, and, most frequently, the initial estimate of the time required to perform a task turns out to be invalid. Such uncertainties are easy to grasp mentally, and accurate estimates are possible, when only a few variables are involved; in large-scale projects with many uncertainty factors of differing nature and character, however, the analysis cannot be performed manually, and accurate project assessment requires special effort.

Uncertainty in project planning is widely recognized across many industries, and a wide variety of tools exists to help optimize the planning process and minimize the associated risks. These tools range from mathematically rigorous, sophisticated solvers (mainly based on constraint programming) to Excel add-ins (VBA macros) that do similar things on a simpler level; you can even find freeware that performs basic optimization of Excel-based project plans and dependencies.

Approach

In this paper, I recommend an approach to planning, assessment, and optimization for large and cumbersome projects. I have focused on software development projects; however, the approach is valid for a much wider range of projects and industries.

The situation is very common in a product development organization: How do you rapidly develop an accurate project plan estimate that is critical to the overall development lifecycle (resources, relations to other projects, and dependencies on other deliverables should always be well defined), while minimizing the time spent performing "if-then-else" analysis? After all, your business and technical analysts and project managers have their hands full doing "real things": implementing new customer-requested functions and working on the next release of the product.

By adding a few extra steps to your planning process and applying a specialized tool to your project plan, you can find the right solution and define the correct approach to this problem.

The process of planning and assessment consists of five main steps:

Define the project: During this phase, you put together the regular project plan in MS Project, with the main tasks, timeframes, and deliverables. The project plan can be created manually or generated as the result of other efforts (preliminary use cases, object design, or work with cost estimation tools).

Identify uncertainty: In this phase, you determine which tasks or inputs in the plan are uncertain, what you know about these uncertainties, and how best to describe them: their behavior and the known range of values they might cover. During this phase, you also identify the most critical results or outputs of the plan that you want to analyze and optimize.

Apply the tool and implement the model: I've selected Palisade's @Risk for Project (http://www.palisade.com) as my project optimization tool from the wide variety of available packages. I believe this tool is a good match for my requirements: It integrates with MS Project, has an easy-to-set-up model, sufficient depth of simulation and outputs, and ease of analysis. The particular tool selected affects only the way the uncertainty model is applied; the general approach to the problem remains the same.

Run simulation: As the result of the Monte Carlo simulation (whose parameters can be configured as well), you obtain the ranges and probabilities of all outcomes for the outputs you identified during model setup. Typically, these are the duration ranges of particular phases, the phases with the most critical dependencies, the overall project timeframe, and so forth.

Make a decision: Based on the simulation results and the outputs of the model, you obtain information about the most critical characteristics of the project. This data serves as the basis for subsequent optimization and corrective actions.
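The five steps above can be sketched in miniature. The following Python example is illustrative only: the task names, durations, and triangular distributions are my own hypothetical assumptions, not figures from the model described in this article. It simulates a small sequential plan and reports the resulting range of total durations.

```python
import random

# Hypothetical tasks: (name, optimistic, most likely, pessimistic), in days.
# These estimates are illustrative, not taken from the article's project.
tasks = [
    ("Design",         5, 10, 20),
    ("Implementation", 15, 25, 45),
    ("Testing",        5,  8, 15),
]

def simulate_total_duration(tasks, trials=10_000, seed=42):
    """Monte Carlo: sample each task's duration and sum the sequence."""
    rng = random.Random(seed)
    totals = []
    for _ in range(trials):
        total = sum(rng.triangular(low, high, mode)
                    for _, low, mode, high in tasks)
        totals.append(total)
    return totals

totals = sorted(simulate_total_duration(tasks))
print(f"min={totals[0]:.1f}  median={totals[len(totals) // 2]:.1f}  "
      f"max={totals[-1]:.1f} days")
```

Even this toy run makes the point of the approach: instead of a single deterministic finish date, you get a distribution of outcomes to reason about.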

Working with the Tool

@Risk is tightly integrated with MS Project and appears as an additional tool set on the MS Project toolbar (see Figure 1). By selecting a particular project parameter (for example, the duration of a task) and clicking the "Define Distribution" icon on the @Risk toolbar, you can set up the model characteristics.

Figure 1: Model Setup

The tool is quite flexible in defining distributions: Out of the box, it offers anything from uniform or normal distributions to more exotic ones; see Figure 2.

Figure 2: Different available distribution functions


Use your past project experience and "historical" knowledge to define the possible ranges and distribution functions for the different tasks. For any distribution function, you can define its "fine structure," such as the average value, standard deviation, and so on. Another advanced feature lets you add additional "custom" uncertainties to a selected task or group of tasks.
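The "fine structure" of a distribution can also be mirrored in a hand-rolled simulation. This sketch (the function and figures are my own illustration, not part of @Risk's API) draws a task duration from a normal distribution with a given mean and standard deviation, truncated so it never falls below a plausible minimum:

```python
import random

def sample_duration(mean, std_dev, minimum, rng=random):
    """Sample a normally distributed duration, truncated at a lower bound.

    Resampling (rather than clamping) preserves the distribution's shape
    above the cutoff.
    """
    while True:
        value = rng.gauss(mean, std_dev)
        if value >= minimum:
            return value

# Example: a task estimated at 10 days, std dev of 2, never under 5 days.
rng = random.Random(7)
samples = [sample_duration(10, 2, 5, rng) for _ in range(1000)]
print(f"min={min(samples):.1f}  mean={sum(samples) / len(samples):.1f}")
```

The same pattern extends to whatever "historical" parameters you trust: swap in a different distribution, or tighten the bounds as your estimates improve.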

Model outputs are specified in the same manner: Click the selected cell (for example, finish date or duration) and specify it as an output. You can see the corresponding notations in the @Risk:Functions column of the project; see Figure 3.

Figure 3: Specifying model outputs

The implemented model can be enhanced with additional conditions (refer to Figure 3; the @Risk icon with "IF" on it, sixth from the left on the toolbar). For example, you may know that if one task consumes all of its available time, a dependent task must happen within a strictly limited timeframe; that is exactly what the model's "If-Then" function is meant to capture. See Figure 4.

Figure 4: Model enhancement with "If/Then" conditions
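An "If-Then" rule of this kind is straightforward to express in a simulation loop. The thresholds and durations below are hypothetical, chosen only to illustrate the idea: when the first task overruns its budget, the dependent task is forced into a fixed, limited window.

```python
import random

def simulate_with_condition(trials=10_000, seed=1):
    """Sketch of an 'If-Then' condition: when task A overruns its budget,
    the dependent task B must fit a strictly limited timeframe."""
    rng = random.Random(seed)
    totals = []
    for _ in range(trials):
        task_a = rng.triangular(10, 30, 18)  # low, high, most likely (days)
        if task_a > 25:        # If A consumed almost all available time...
            task_b = 5         # ...B is compressed to a fixed 5-day window.
        else:
            task_b = rng.triangular(8, 20, 12)
        totals.append(task_a + task_b)
    return totals

totals = simulate_with_condition()
print(f"mean total = {sum(totals) / len(totals):.1f} days")
```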

After running the simulation, the output window looks like Figure 5.

Figure 5: Simulation results

You can see the defined outputs: the finish dates for the elaboration and construction phases, as well as the total duration of the construction phase (as these were defined as the outputs of the model). The results show quite a significant range in the finish date for the construction phase (overall, more than a month of uncertainty in the final date), so you might want to re-evaluate the project setup: the sequence and task dependencies as well as the resource allocation. The simulation data also answers additional questions; for example, it gives the probability that the project or a phase will finish before a certain date, provides more granular information about project dynamics, and shows the most critical parameters of the project cycle (sensitivity analysis). See Figure 6.

Figure 6: Scenario and Sensitivity Analysis
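Both of those questions — the probability of finishing by a date, and which task drives the outcome — fall directly out of the simulation samples. In this hypothetical two-phase sketch (phase names, durations, and the 45-day deadline are my own assumptions), the Pearson correlation of each phase's duration with the total serves as a crude stand-in for sensitivity analysis:

```python
import random

def pearson(xs, ys):
    """Sample Pearson correlation: a crude sensitivity measure."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

rng = random.Random(3)
design, build, totals = [], [], []
for _ in range(10_000):
    d = rng.triangular(5, 20, 10)   # hypothetical design phase, days
    b = rng.triangular(15, 45, 25)  # hypothetical construction phase, days
    design.append(d)
    build.append(b)
    totals.append(d + b)

deadline = 45  # hypothetical target, in days
p_on_time = sum(t <= deadline for t in totals) / len(totals)
print(f"P(finish within {deadline} days) = {p_on_time:.2f}")
print(f"sensitivity: design {pearson(design, totals):.2f}, "
      f"build {pearson(build, totals):.2f}")
```

The phase with the wider duration range correlates more strongly with the total, which is exactly the kind of "most critical parameter" a tool's sensitivity chart highlights.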

There are multiple ways to represent the simulation results graphically, which makes the analysis tasks easier; see Figure 7.

Figure 7: Graphical representation of results

Since I mentioned Excel-based enhancements, it's worth saying a few words about them. (Palisade also has a version of @Risk that works with Excel.) With this kind of add-in, you import your project plan into an Excel spreadsheet and then apply the same logic: set up a model with the risks and uncertainties, and run the simulation. I've tried What's Best from Lindo Systems and OptiRisk from OptiGroup; they work in a similar fashion, although recognized leaders such as Solver or Crystal Ball offer more functionality and a better user interface.

As discussed above, the general approach stays the same regardless of the tool used. Two other tools are similar to @Risk: Risk+ from http://www.cs-solutions.com and Pertmaster from http://www.pertmaster.com/. For due diligence, I set up the same model in Pertmaster; see Figure 8.

Figure 8: Model setup in PERTMASTER

The simulation shows very similar results. The graphical output of this tool is slightly different, and, for the sake of illustration, I've selected a different model output and criterion to show. In Figure 9, you can see the projected finish date of the project and the associated probability of that event.

Figure 9: Results graphical representation in Pertmaster

Conclusion

This article has described an approach to handling uncertainties in project planning: develop and implement a risk model that describes the uncertainties, then run a simulation to generate the predefined project outcomes or the most critical parameters. It also touched briefly on several available tools and provided some examples of how they function (mostly @Risk from Palisade and Pertmaster).

About the Author

Dmitri Ilkaev

Dmitri Ilkaev has about 20 years of experience in software and technology development. He holds a Ph.D. in Computer Science from the Moscow Institute of Physics and Technology. Dmitri is the Chief Technologist at Tier Technologies (http://www.tier.com) in Scottsdale, AZ. He can be reached at DILkaev@tier.com.
