Tuesday, September 27, 2011

The Planning Fallacy

We were assigned to read the article, "Intuitive Prediction: Biases and Corrective Procedures," by Daniel Kahneman and Amos Tversky, for my Federal Acquisitions class this week (one of seven!! articles, each about 45 pages long...woe is me! Thank goodness for 10 pages of endnotes/references.). Their premise is that forecasting most often goes wrong in two predictable ways: over-reliance on intuition rather than regressive analysis, and overconfidence in the precision of estimates. Quotes from the article are in italics.
"Our view of forecasting rests on the following notions. First, that most predictions and forecasts contain an irreducible intuitive component. Second, that the intuitive predictions of knowledgeable individuals contain much useful information. Third, that these intuitive judgments are often biased in a predictable manner. Hence, the problem is not whether to accept intuitive predictions at face value or to reject them, but rather how they can be debiased and improved."
Usually as I read stuff, especially conceptual stuff, I try to relate it to something with which I am familiar. The article did a good job of providing understandable examples, but for me, what resonated was trying to predict how long something, particularly engineering-related, will take to repair. When something breaks--and it's inevitable that something *will*--every operational planner knows that there's "real time" and "engineering time" for repairs.

Real time is what it actually takes to fix whatever is broken, and is never actually known until the piece of equipment is fixed. Engineering time is the EO/EPO's best estimate of how long it's going to take.
"...the element of uncertainty is typically underestimated in risky decisions. The elimination of overconfidence is therefore an important objective in an attempt to improve the quality of the intuitive judgments that serve decision making."
Under extreme duress and lots of nagging on my part, a first-rate, highly skilled, extremely talented EO shared with me his engineering time algorithm: 2*estimated repair time + 20 percent. So if he thought it would actually take an hour to, say, fix the fuel leak on the small boat, he'd tell me that the small boat would be FMC (fully mission capable) in about two and a half hours, give or take. That way he and his engineers looked like rock stars when it was done in an hour and a half, and they still had plenty of time to thwart the annoying gremlin trickery that is inherent to engineering repairs.
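The EO's rule of thumb can be sketched in a few lines (the function name is mine, not his, and I'm reading "+ 20 percent" as 20 percent of the doubled figure, which is what makes a 1-hour job come out to roughly two and a half hours):

```python
def engineering_estimate(real_hours):
    """Pad a real repair estimate the EO's way: double it,
    then add 20 percent on top of the doubled figure."""
    return 2 * real_hours * 1.2

# A 1-hour fuel-leak fix becomes a 2.4-hour quote -- "about two and a
# half hours, give or take" -- leaving slack for the inevitable gremlins.
print(engineering_estimate(1))  # 2.4
```

The same arithmetic explains the comment below about grading: a stack of papers honestly estimated at 4 hours pads out to 9.6, or "at least 10."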

Of course, I always tried to reverse-engineer his engineering time to get the real time...it usually only ended up annoying the hell out of both of us.
"A probability distribution that is conditioned on restrictive assumptions reflects only part of the existing uncertainty regarding the quantity, and is therefore likely to yield too many surprises."
Somehow, though, I was never able to effectively apply the same theory to predicting how long it would take to launch the small boat. Like, *never.* I would always underestimate it, and we'd be late (guaranteed to aggravate me), or overestimate it, and the boat crew would have to haul a mile, usually upswell, to get to the boarding target, arriving thoroughly soaked and more tired than they needed to be (and I always knew it was my fault). I think my "restrictive assumption" was that it would either take 15 minutes to launch the small boat, or 30 minutes (mostly because my brain thinks most easily in quarter-hour increments), when actually it takes, on average, 22 minutes to get the boat in the water and boat crew and boarding team loaded. It's really hard to take the Plan of the Day seriously when it says, "0938 - Set Boat Lowering Detail," for a 1000 arrival time.
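That awkward-looking 0938 is just honest back-planning from the arrival time using the observed 22-minute average instead of a tidy quarter-hour guess. A minimal sketch of the arithmetic, using the times from the anecdote:

```python
from datetime import datetime, timedelta

# Back-plan the Plan of the Day: start from the required arrival time
# and subtract the average launch-and-load time (22 minutes), rather
# than rounding to a comfortable quarter-hour increment.
arrival = datetime.strptime("1000", "%H%M")
set_detail = arrival - timedelta(minutes=22)
print(set_detail.strftime("%H%M"))  # 0938
```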
"In many problems of prediction and estimation, available information is limited, incomplete, and unreliable. If people derive almost as much confidence from poor data as from good data, they are likely to produce overly narrow confidence intervals when their information is of inferior quality."
I guess my point is that I like what the authors did with the article in trying to break down the nature of uncertainty in planning. I'm poking gentle fun at it because they take it so seriously, and turn it all scientific and statistical. But, in the end, they're right...the important thing about predictions is honestly recognizing where they are weak, and trying, despite ourselves, to compensate for those weaknesses.


Victoria said...

Heh. I think I will take your engineers' advice: 2*estimate + 20% for anything I do regarding grading or class planning. That sounds about right, since a stack of papers I think should take me 4 hours usually ends up taking at least 10.

Also, have you heard of the (strongly supported by evidence) hypothesis that humans are not thinking creatures who have feelings, but are feeling creatures who occasionally have a rational thought? That may help explain why planning is so impossible....

Victoria said...

Clearly MY problem is that I can't do math.