Wednesday, October 8, 2014

There are no facts about the future

And so you might ask: What are facts about?
And, of course, I would answer: facts are what we can observe, measure, sense, and conclude about the present -- and the same could be said about the past.

And that leaves the future fact-free ... where, by the way, a good deal of project activity will happen.
OMG! And, there are no facts out there!

Which brings me to a neat list of maladies (aka, uncertainties) -- all of which can apply to the future -- put together recently by Glen Alleman (Glen is prone to big words that drive me to my dictionary, so I paraphrase):

Statistical uncertainty - repeatable random events or outcomes, the range of which is best handled by buffers or margin in the spec, or some other way to immunize the project against outliers.
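To make the buffer idea concrete, here's a minimal Python sketch: simulate total project duration from a few hypothetical task estimates, then size the margin as the gap between the plan and a high-percentile outcome. The task list, distribution choice, and the 80th-percentile target are all assumptions made for illustration, not a prescription.

```python
import random

random.seed(42)  # repeatable runs for the illustration

# Hypothetical tasks: (most-likely, low, high) durations in days,
# each modeled with a simple triangular distribution.
tasks = [(10, 8, 15), (20, 16, 30), (5, 4, 9)]

def simulate_total(tasks):
    """One Monte Carlo trial: sum a random duration drawn for each task."""
    return sum(random.triangular(low, high, mode) for mode, low, high in tasks)

trials = sorted(simulate_total(tasks) for _ in range(10_000))

planned = sum(mode for mode, _, _ in tasks)   # sum of most-likely durations
p80 = trials[int(0.80 * len(trials))]         # 80th-percentile total
buffer = p80 - planned                        # margin that absorbs outliers

print(f"planned: {planned} days, P80: {p80:.1f} days, buffer: {buffer:.1f} days")
```

Because the low/high ranges here skew long (as task estimates usually do), the P80 total exceeds the sum of most-likely durations, and that gap is the buffer.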

Subjective judgment - bias in your thinking: anchoring yourself to something you know or have been told, and adjusting toward the least difficult, most easily retrieved, or nearest solution; these are all best understood by reading the work of Amos Tversky and Daniel Kahneman.

Systematic error - unwitting or misunderstood departures or biases from an acknowledged expert solution, reference model, or benchmark -- usually repeated similarly in similar situations.

Incomplete knowledge - You may know what you don't know, or you may not know what you don't know -- famously attributed to US Defense Secretary Don Rumsfeld. Fortunately, this lack of knowledge can be reduced with effort. Sometimes you have an epiphany; sometimes the answer falls in your lap; sometimes you can systematically learn what you don't know.

Temporal variation - or better: non-stationarity. A system under test is "not stationary" when its behavior is sensitive -- in time or location -- to when and/or where you make an observation or measurement, or when there is instability in the observed and measured system itself.

Inherent stochasticity (irregular, random, or unpredictable) - instability or random effects between and within system elements, some intended, and some not intended or even predicted. If the instability is wildly disproportionate to the stimulus, we call it a chaotic response.

Looking at this list, the really swell news for the "I hate, hate, hate statistics" crowd is that for most project managers on most projects, statistics play a relatively small role in the overall panoply of uncertainty and risk.

Probability -- that is, frequency -- is a bit more prominent in the PM domain because, when associated with impact, you get (yikes!) a statistic, to wit: expected value.

Well, not really. In projects the probability estimate is subject to all the maladies we just went over -- there are rarely any facts about probabilities -- so what we get is something discovered in the 18th century: expected utility (usefulness) value -- that is, the more or less subjective version of expected value.

And, here's more news you can use: expected utility value is not stationary! Ask me now, I'll give you one estimate; ask me much later, you'll get another estimate. Why? Because my underlying risk attitude (perception) is not stationary...
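The arithmetic above can be sketched in a few lines: expected value is probability times impact, while expected utility runs the same impact through a subjective utility function whose shape depends on the asker's current risk attitude. The numbers, the exponential disutility function, and the risk-aversion values below are all hypothetical, chosen only to show how the same facts yield different answers.

```python
import math

prob = 0.3          # subjective probability the risk event occurs
impact = 100_000    # cost impact in dollars if it does

# Classic expected value: probability times impact.
expected_value = prob * impact

def risk_adjusted_cost(prob, impact, risk_aversion):
    """Expected disutility of the loss, expressed back in dollars.

    Uses an exponential (risk-averse) disutility; larger risk_aversion
    inflates the felt cost of the same impact. Purely illustrative.
    """
    scaled = impact / 100_000  # normalize so exp() stays well-behaved
    disutility = (math.exp(risk_aversion * scaled) - 1) / risk_aversion
    return prob * disutility * 100_000

# Ask me now vs. ask me later: same probability, same impact,
# but a shifted risk attitude gives a different estimate.
now = risk_adjusted_cost(prob, impact, risk_aversion=0.5)
later = risk_adjusted_cost(prob, impact, risk_aversion=2.0)

print(f"expected value:          {expected_value:,.0f}")
print(f"risk-adjusted cost now:  {now:,.0f}")
print(f"risk-adjusted cost later:{later:,.0f}")
```

Both risk-adjusted answers exceed the plain expected value, and the "later" answer exceeds the "now" answer -- nothing about the project changed, only the (non-stationary) risk attitude did.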