
Planning fallacy

From Wikipedia, the free encyclopedia

Daniel Kahneman, who, along with Amos Tversky, proposed the fallacy

The planning fallacy, first proposed by Daniel Kahneman and Amos Tversky in 1979,[1][2] is a phenomenon in which predictions about how much time will be needed to complete a future task display an optimism bias and underestimate the time needed.

This phenomenon sometimes occurs regardless of the individual's knowledge that past tasks of a similar nature have taken longer to complete than generally planned.[3][4][5] The bias only affects predictions about one's own tasks; when outside observers predict task completion times, they show a pessimistic bias, overestimating the time needed.[6][7] The planning fallacy requires that predictions of current tasks' completion times are more optimistic than the beliefs about past completion times for similar projects and that predictions of the current tasks' completion times are more optimistic than the actual time needed to complete the tasks. In 2003, Lovallo and Kahneman proposed an expanded definition as the tendency to underestimate the time, costs, and risks of future actions and at the same time overestimate the benefits of the same actions. According to this definition, the planning fallacy results in not only time overruns, but also cost overruns and benefit shortfalls.[8]
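The two defining conditions above can be sketched as a simple check. This is a hypothetical illustration only; the function name and the day figures are invented, not taken from the literature.

```python
# Hypothetical sketch of the two defining conditions of the planning fallacy.
# Times are in days here, but any unit works; all figures are invented.

def shows_planning_fallacy(predicted, believed_past, actual):
    """True when a prediction is more optimistic than beliefs about
    similar past tasks AND more optimistic than the actual time needed."""
    return predicted < believed_past and predicted < actual

# A student predicts 34 days for a thesis, believes similar past theses
# took about 50 days, and actually needs 55 days.
print(shows_planning_fallacy(34, 50, 55))  # prints True
# No fallacy if the prediction already matches past experience:
print(shows_planning_fallacy(50, 50, 55))  # prints False
```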

YouTube Encyclopedic

  • ✪ Why Everything Takes Longer Than You Expect
  • ✪ The Planning Fallacy
  • ✪ Planning Fallacy 101 with Professor Hardisty | Sauder Elevates Business Education


Every time I go to write or edit a video, I think it’s only going to take me a day. Or maybe two. I have plenty of time! But then I pull an all-nighter, don’t sleep, and sit here wondering where everything went wrong! And I’m sure you’ve done this too! With work or, let’s say, a paper you’re writing that never seems to get finished… and of course, it’s not your fault! The book you needed was checked out, your laptop was slow, you kept getting notifications for new YouTube videos! If these things hadn’t happened, you would have finished that paper, gotten a good night’s sleep, and made it to class the next morning with plenty of time to spare! In fact, you’re so sure that the paper would have only taken you five hours, instead of 15, that you budget only five hours for the next paper as well… and it takes you 15 again. This is called the “planning fallacy,” where we underestimate the amount of time a future task will take. This inherent optimism doesn’t just apply to time, though; it also applies to money. Take the Sydney Opera House, for example. The original estimate put its completion date less than five years in the future, for only seven million dollars. In the end, it was scaled down and still took over 15 years and cost $102 million. The opera house may be an extreme case, but it’s not all that exceptional. Building projects all over the world, throughout history, have cost more and taken longer than originally thought. So, since we have all of this historical data for how long it will take to build something, why haven’t we gotten better at estimating? One reason is that buildings and bridges are mostly built by the lowest bidder. To win a contract, you have to show that you can complete it faster and cheaper than anyone else. This incentivizes underestimation. If the project takes longer or costs more than the original estimate, the construction company is often still better off than if they hadn’t gotten the contract in the first place.
And these overly optimistic estimates don’t change. Parkinson’s law also contributes to how much time we spend on a task. It says that the amount of work expands to fill the time available for its completion. If you budget five hours for your paper, but it’s not due for another 15 hours, you might just continue to work right up until time’s up. Whether this is due to a lack of focus at the beginning because you’ve got “plenty of time” or procrastination, it’s hard to call a project done when you’ve still got time to make it better. Another reason we underestimate the amount of time a task will take is Hofstadter’s law, which says that things will always take longer than you think they’ll take, even when you take into account Hofstadter’s law. So, let’s say you think it’ll take 5 hours to write your paper. You know about the planning fallacy, so you double that estimate, just to be safe. And you know about Hofstadter’s law, so you add another hour to your estimate, just to be doubly safe. According to Hofstadter’s law, even though you took everything into account, your paper will still take longer than your new 11-hour estimate. In fact, if your last paper is any indication, it’ll probably take you 15 hours. And when a task takes longer than expected, we assume it’s because of outside forces, like a missing library book, or weather, or construction delays, but it’s more likely that your initial estimate was just too short to begin with. Studies have shown that time estimates can be improved by focusing on past data, rather than future plans. And when we’ve asked people to estimate how long it would take another person to complete a task, people tend to overestimate, rather than underestimate, their completion time. So committing the planning fallacy isn’t inevitable: there are a few ways to avoid it.
Instead of assuming a new plan or technique will mean a shorter completion time, look at the past experience of yourself and others; planning based on that will give you a more accurate estimate. And think about how long you would expect that task to take another person. By pretending you’re planning for others and taking that estimate into consideration, you can better plan your time and resources for all sorts of tasks, from writing papers, to big projects, to saving money. And now, I’ve finally finished editing this video. In a reasonable amount of planned time.


Empirical evidence

For individual tasks

In a 1994 study, 37 psychology students were asked to estimate how long it would take to finish their senior theses. The average estimate was 33.9 days. They also estimated how long it would take "if everything went as well as it possibly could" (averaging 27.4 days) and "if everything went as poorly as it possibly could" (averaging 48.6 days). The average actual completion time was 55.5 days, with only about 30% of the students completing their thesis in the amount of time they predicted.[9]

Another study asked students to estimate when they would complete their personal academic projects. Specifically, the researchers asked for estimated times by which the students thought it was 50%, 75%, and 99% probable their personal projects would be done.[7]

  • 13% of subjects finished their project by the time they had assigned a 50% probability level;
  • 19% finished by the time assigned a 75% probability level;
  • 45% finished by the time of their 99% probability level.

A survey of Canadian taxpayers, published in 1997, found that they mailed in their tax forms about a week later than they predicted. They had no misconceptions about their past record of getting forms mailed in, but expected that they would get it done more quickly next time.[10] This illustrates a defining feature of the planning fallacy: people recognize that their past predictions have been over-optimistic, while insisting that their current predictions are realistic.[6]

For group tasks

Carter and colleagues conducted three studies in 2005 that provide empirical support that the planning fallacy also affects predictions concerning group tasks. This research emphasizes how temporal frames and thoughts of successful completion contribute to the planning fallacy.[11]

Additional studies

Bent Flyvbjerg and Cass Sunstein argue that Albert O. Hirschman's Hiding Hand principle is the planning fallacy writ large, and they tested the empirical validity of the principle.[12] See the further reading section below for additional studies.

Proposed explanations

  • Kahneman and Tversky originally explained the fallacy by envisaging that planners focus on the most optimistic scenario for the task, rather than using their full experience of how much time similar tasks require.[6]
  • Roger Buehler and colleagues account for the fallacy by examining wishful thinking; in other words, people think tasks will be finished quickly and easily because that is what they want to be the case.[1]
  • In a different paper, Buehler and colleagues suggest an explanation in terms of the self-serving bias in how people interpret their past performance. By taking credit for tasks that went well but blaming delays on outside influences, people can discount past evidence of how long a task should take.[1] One experiment found that when people made their predictions anonymously, they did not show the optimistic bias. This suggests that people make optimistic estimates so as to create a favorable impression with others,[13] which is similar to the concepts outlined in impression management theory.
  • Another explanation proposed by Roy and colleagues is that people do not correctly recall the amount of time that similar tasks in the past had taken to complete; instead people systematically underestimate the duration of those past events. Thus, a prediction about future event duration is biased because memory of past event duration is also biased. Roy and colleagues note that this memory bias does not rule out other mechanisms of the planning fallacy.[14]
  • Sanna and colleagues examined temporal framing and thinking about success as a contributor to the planning fallacy. They found that when people were induced to think about a deadline as far away (i.e., lots of time remaining) vs. rapidly approaching (i.e., little time remaining), they made more optimistic predictions and had more thoughts of success. In their final study, they found that the ease of generating thoughts also caused more optimistic predictions.[11]
  • One explanation, focalism, proposes that people fall victim to the planning fallacy because they only focus on the future task and do not consider similar tasks of the past that took longer to complete than expected.[15]
  • As described by Fred Brooks in The Mythical Man-Month, adding new personnel to an already-late project incurs a variety of risks and overhead costs that tend to make it even later; this is known as Brooks's law.
  • The "authorization imperative" offers another possible explanation: much of project planning takes place in a context which requires financial approval to proceed with the project, and the planner often has a stake in getting the project approved. This dynamic may lead to a tendency on the part of the planner to deliberately underestimate the project effort required. It is easier to get forgiveness (for overruns) than permission (to commence the project if a realistic effort estimate were provided). Such deliberate underestimation has been named by Jones and Euske "strategic misrepresentation".[16]
  • Apart from psychological explanations, Taleb has explained the planning fallacy as a result of natural asymmetry and of scaling issues. The asymmetry arises because random events tend to produce delays and extra costs rather than savings; positive and negative surprises are not evenly balanced. The scaling issue is that the consequences of disruptions are not linear: as the size of an effort grows, the error grows much faster, a natural effect of larger efforts' reduced ability to react to inefficiency, particularly for efforts that are not divisible into increments. Taleb contrasts this with earlier efforts that were more commonly on time (e.g. the Empire State Building, The Crystal Palace, the Golden Gate Bridge) and concludes that modern planning systems have inherent flaws and modern efforts a hidden fragility; for example, modern efforts, being computerized and invisibly less localized, offer less insight and control and depend more on transportation.[17]

Methods for counteracting

Segmentation effect

The segmentation effect is defined as the tendency for the time allocated to a task as a whole to be significantly smaller than the sum of the time allocated to its individual sub-tasks. In a study performed by Forsyth in 2008, this effect was tested to determine whether it could be used to reduce the planning fallacy, and in three experiments the segmentation effect proved influential. However, segmentation demands a great deal of cognitive resources and is not very practical to use in everyday situations.[18]
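A minimal numerical sketch of the effect; the task names and hour figures below are invented for illustration, not data from the study.

```python
# Invented example: estimating sub-tasks separately tends to yield a
# larger (and typically more realistic) total than estimating the whole
# task at once.

whole_task_estimate = 5.0  # hours budgeted for "write the paper" as one task

subtask_estimates = {
    "outline": 1.5,
    "first draft": 6.0,
    "revise": 3.0,
    "format and references": 1.5,
}

segmented_total = sum(subtask_estimates.values())
print(segmented_total)                        # 12.0 hours
print(segmented_total > whole_task_estimate)  # the segmentation effect
```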

Implementation intentions

Implementation intentions are concrete plans that specify exactly how, when, and where one will act. Various experiments have shown that forming implementation intentions helps people become more aware of the overall task and consider all possible outcomes. Initially, this actually causes predictions to become even more optimistic. However, forming implementation intentions is believed to "explicitly recruit willpower" by having the person commit to completing the task. Those who had formed implementation intentions during the experiments began work on the task sooner, experienced fewer interruptions, and showed less optimistic bias in their later predictions than those who had not. The reduction in optimistic bias was mediated by the reduction in interruptions.[5]

Reference class forecasting

Reference class forecasting predicts the outcome of a planned action based on actual outcomes in a reference class of similar actions to that being forecast.
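As a rough sketch, the idea can be expressed in code. The function name, the overrun ratios, and the use of a simple nearest-rank percentile are all assumptions made for illustration; this is not a definitive implementation of the method.

```python
# Sketch of reference class forecasting: uplift a planner's base estimate
# using the distribution of actual/estimated outcome ratios observed in a
# reference class of similar past projects. All data below are invented.

def reference_class_forecast(past_ratios, base_estimate, percentile=0.8):
    """Return an estimate that `percentile` of comparable past projects
    would have come in at or under (simple nearest-rank lookup)."""
    ratios = sorted(past_ratios)  # each ratio = actual cost / estimated cost
    idx = min(len(ratios) - 1, int(percentile * len(ratios)))
    return base_estimate * ratios[idx]

# Invented reference class: cost overrun ratios from ten similar projects.
past = [1.0, 1.1, 1.2, 1.2, 1.4, 1.5, 1.7, 2.0, 2.5, 3.0]

# A planner's "inside view" estimate of $7 million becomes:
print(reference_class_forecast(past, base_estimate=7_000_000))  # 17500000.0
```

The outside view enters through `past_ratios`: the forecast ignores the specifics of the planner's own project and relies only on how similar projects actually turned out.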

Real-world examples

Sydney Opera House, still under construction in 1966, three years after its expected completion date

The Sydney Opera House was expected to be completed in 1963. A scaled-down version opened in 1973, a decade later. The original cost was estimated at $7 million, but its delayed completion led to a cost of $102 million.[11]

The Eurofighter Typhoon defense project took six years longer than expected, with an overrun cost of 8 billion euros.[11]

The Boston Central Artery was completed seven years later than planned, costing another $12 billion.[19]

The Denver International Airport opened sixteen months later than scheduled, with a total cost of $4.8 billion, over $2 billion more than expected.[20]

The Berlin Brandenburg Airport is another egregious case. After 15 years of planning, construction began in 2006, with the opening planned for October 2011. After numerous delays, the airport is currently estimated to open in 2021. The original budget was €2.83 billion; current projections are close to €10 billion.

References

  1. ^ a b c Pezzo, Mark V.; Litman, Jordan A.; Pezzo, Stephanie P. (2006). "On the distinction between yuppies and hippies: Individual differences in prediction biases for planning future tasks". Personality and Individual Differences. 41 (7): 1359–1371. doi:10.1016/j.paid.2006.03.029. ISSN 0191-8869.
  2. ^ Kahneman, Daniel; Tversky, Amos (1979). "Intuitive prediction: biases and corrective procedures". TIMS Studies in Management Science. 12: 313–327.
  3. ^ "Exploring the Planning Fallacy" (PDF). Journal of Personality and Social Psychology. 1994. Retrieved 7 November 2014.
  4. ^ Kruger, Justin; Evans, Matt (15 October 2003). "If you don't want to be late, enumerate: Unpacking Reduces the Planning Fallacy". Journal of Experimental Social Psychology. 40 (5): 586–598. doi:10.1016/j.jesp.2003.11.001.
  5. ^ a b "Overcoming the Planning Fallacy Through Willpower". European Journal of Social Psychology. November 2000. Retrieved 22 November 2014.
  6. ^ a b c Buehler, Roger; Griffin, Dale, & Ross, Michael (2002). "Inside the planning fallacy: The causes and consequences of optimistic time predictions". In Thomas Gilovich, Dale Griffin, & Daniel Kahneman (Eds.), Heuristics and biases: The psychology of intuitive judgment, pp. 250–270. Cambridge, UK: Cambridge University Press.
  7. ^ a b Buehler, Roger; Dale Griffin; Michael Ross (1995). "It's about time: Optimistic predictions in work and love". European Review of Social Psychology. 6: 1–32. doi:10.1080/14792779343000112.
  8. ^ Lovallo, Dan; Daniel Kahneman (July 2003). "Delusions of Success: How Optimism Undermines Executives' Decisions". Harvard Business Review: 56–63.
  9. ^ Buehler, Roger; Dale Griffin; Michael Ross (1994). "Exploring the "planning fallacy": Why people underestimate their task completion times". Journal of Personality and Social Psychology. 67 (3): 366–381. doi:10.1037/0022-3514.67.3.366.
  10. ^ Buehler, Roger; Dale Griffin; Johanna Peetz (2010). The Planning Fallacy: Cognitive, Motivational, and Social Origins (PDF). Advances in Experimental Social Psychology. 43. pp. 1–62. doi:10.1016/s0065-2601(10)43001-4. ISBN 9780123809469. Retrieved 2012-09-15.
  11. ^ a b c d Sanna, Lawrence J.; Parks, Craig D.; Chang, Edward C.; Carter, Seth E. (2005). "The Hourglass Is Half Full or Half Empty: Temporal Framing and the Group Planning Fallacy". Group Dynamics: Theory, Research, and Practice. 9 (3): 173–188. doi:10.1037/1089-2699.9.3.173.
  12. ^ Flyvbjerg, Bent; Sunstein, Cass R. (2015). "The Principle of the Malevolent Hiding Hand; or, the Planning Fallacy Writ Large". Rochester, NY. SSRN 2654423.
  13. ^ Pezzoa, Stephanie P.; Pezzob, Mark V.; Stone, Eric R. (2006). "The social implications of planning: How public predictions bias future plans". Journal of Experimental Social Psychology. 2006 (2): 221–227. doi:10.1016/j.jesp.2005.03.001.
  14. ^ Roy, Michael M.; Christenfeld, Nicholas J. S.; McKenzie, Craig R. M. (2005). "Underestimating the Duration of Future Events: Memory Incorrectly Used or Memory Bias?". Psychological Bulletin. 131 (5): 738–756. doi:10.1037/0033-2909.131.5.738. PMID 16187856.
  15. ^ "Focalism: A source of durability bias in affective forecasting". American Psychological Association. May 2000. Retrieved 21 November 2014.
  16. ^ Jones, Larry R; Euske, Kenneth J (October 1991). "Strategic misrepresentation in budgeting". Journal of Public Administration Research and Theory. 1 (4): 437–460. Retrieved 11 March 2013.
  17. ^ Taleb, Nassim (2012-11-27). Antifragile: Things That Gain from Disorder. ISBN 978-1-4000-6782-4.
  18. ^ Forsyth, D. K. (June 2008). "Allocating time to future tasks: The effect of task segmentation on planning fallacy bias". Memory & Cognition. 36 (4): 791–798. doi:10.3758/MC.36.4.791.
  19. ^ "No Light at the End of his Tunnel: Boston's Central Artery/Third Harbor Tunnel Project". Project on Government Oversight. 1 February 1995. Retrieved 7 November 2014.
  20. ^ "Denver International Airport" (PDF). United States General Accounting Office. September 1995. Retrieved 7 November 2014.

Further reading

This page was last edited on 28 May 2019, at 04:17
Basis of this page is in Wikipedia. Text is available under the CC BY-SA 3.0 Unported License. Non-text media are available under their specified licenses. Wikipedia® is a registered trademark of the Wikimedia Foundation, Inc. WIKI 2 is an independent company and has no affiliation with Wikimedia Foundation.