Don’t Trust Experts’ Forecasts, O.R. Study Says

Hanover, MD, August 28, 2007 – The forecasts of experts who use their unaided judgment to predict the outcomes of actual conflicts are little better than those of novices, according to a new study in a publication of the Institute for Operations Research and the Management Sciences (INFORMS®).

When presented with actual crises, such as a disguised version of a 1970s border dispute between Iraq and Syria and an unfolding dispute between football players and management, experts correctly forecast the decisions the parties made in only 32% of cases, little better than the 29% scored by undergraduate students. Chance guesses at the outcomes would have been right 28% of the time.

The study, “The Ombudsman: Value of Expertise for Forecasting Decisions in Conflicts,” is by Kesten C. Green of Monash University in Australia and J. Scott Armstrong of the Wharton School at the University of Pennsylvania. It appears in the INFORMS journal Interfaces, Volume 37, No. 3.

The research has serious implications for foreign policy and business. Green says, “Political leaders in the West are pondering how best to deal with the threat of the Iranian government’s nuclear ambitions. Forecasting problems such as this are the stuff not only of international relations but also of takeover battles, commercial competition, and labor-management disputes. In most cases, experts use their judgment to predict what will happen. How good are their forecasts?

“The short answer is that they are of little value in terms of accuracy. In addition, they lead people into false confidence.”

In the study, the authors question experts’ ability to forecast accurately without proven structured methods.

“Accurate prediction is difficult because conflicts tend to be too complex for people to think through in ways that realistically represent their actual progress,” the authors write. “Parties in conflict often act and react many times, and change because of their interactions.”

The authors designed an experiment to test their hypothesis. They wrote descriptions of eight diverse conflicts and presented them to conflict experts, domain experts, and forecasting experts, assigning each conflict to experts according to their specialties.

The cases were varied: a hostile takeover attempt, nations preparing for war, a controversial investment proposal, a nurses’ strike, an action by football players seeking a larger share of the gate, an employee resisting the downgrading of her job, artists demanding taxpayer funding, and a new distribution arrangement that a manufacturer proposed to retailers.

For each conflict, the authors provided both expert and novice participants with a list of three to six possible decisions. They received 106 responses from experts. Presenting the same material to undergraduate students yielded 169 responses.
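The 28% chance benchmark quoted above follows from these menu sizes: a forecaster guessing at random among k options succeeds with probability 1/k, so expected chance accuracy is the average of 1/k across the cases. Here is a minimal sketch of that arithmetic; the per-case option counts below are illustrative assumptions, not figures reported in the study.

```python
# Illustrative: expected accuracy of random guessing across the eight cases.
# The study reports only that each case offered three to six possible
# decisions; these per-case counts are assumptions for the sketch.
options_per_case = [3, 3, 3, 3, 4, 4, 5, 6]

# Average of 1/k across cases gives the expected chance accuracy.
chance_accuracy = sum(1 / k for k in options_per_case) / len(options_per_case)
print(f"Expected chance accuracy: {chance_accuracy:.1%}")  # ~27.5%, near the reported 28%
```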

Analysis of additional data produced similar results. In one instance, the authors attempted to determine whether veteran experts would be more likely to make accurate forecasts than less experienced ones. “Common sense expectations did not prove to be correct,” they write. “The 57 forecasts of experts with less than five years’ experience were more accurate (36%) than the 48 forecasts of experts with more experience (29%).”

The authors also asked experts about their previous experience with similar conflicts and examined its relationship to the accuracy of their forecasts. Again, the expected pattern did not emerge: those who considered themselves to have little experience with similar conflicts produced forecasts that were as accurate as those of long-time veterans in the field.

The authors examined the confidence that the experts had in their forecasts by asking how likely it was that they would have changed their forecasts had they spent more time on the task. Another surprise: the 68 high-confidence forecasts were less accurate (28%) than the 35 low-confidence forecasts (41%).

Based on this study and earlier research, the authors conclude that there are no good grounds for decision makers to rely on experts’ unaided judgments for forecasting decisions in conflicts. Such reliance discourages experts and decision makers from investigating alternative approaches.

Instead, they recommend that experts use reliable decision-support tools, and they cite two decision aids that can improve forecasts. In an earlier study, Green reported that simulated interaction, a type of role playing for forecasting behavior in conflicts, reduced forecast error by 47%.

The authors found favorable results with another technique, structured analogies, in which they asked experts to recall and analyze information on similar situations. When experts were able to think of at least two analogies, forecast error was reduced by 39%. This structured technique depends on experts, and those with greater expertise contributed more to making accurate forecasts.
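To put those error-reduction figures in perspective, here is a back-of-the-envelope calculation. It assumes (purely for illustration; the original studies computed reductions against their own baselines) the 32% unaided-expert accuracy from this study as the starting point, and shows what cutting the remaining error by 47% or 39% would imply.

```python
# Illustrative arithmetic only: what the reported error reductions imply,
# assuming a 32%-accurate unaided baseline (an assumption for this sketch;
# the original studies measured reductions against their own baselines).
baseline_error = 1 - 0.32  # 68% of unaided expert forecasts were wrong

for method, reduction in [("simulated interaction", 0.47),
                          ("structured analogies", 0.39)]:
    error = baseline_error * (1 - reduction)
    print(f"{method}: error {error:.0%} -> accuracy {1 - error:.0%}")
# simulated interaction: error 36% -> accuracy 64%
# structured analogies:  error 41% -> accuracy 59%
```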


About INFORMS

The Institute for Operations Research and the Management Sciences (INFORMS®) is an international scientific society with 10,000 members, including Nobel Prize laureates, dedicated to applying scientific methods to help improve decision-making, management, and operations. Members of INFORMS work in business, government, and academia. They are represented in fields as diverse as airlines, health care, law enforcement, the military, financial engineering, and telecommunications. The INFORMS website is www.informs.org. More information about operations research is at www.scienceofbetter.org.

INFORMS journals are highly cited in Journal Citation Reports, an industry source. The special MBA issue published by Business Week includes Operations Research and two other INFORMS journals in its list of 20 top academic journals used to evaluate business school programs. Financial Times includes five INFORMS journals in its list of academic journals used to evaluate MBA programs.
###

Media Contact

Ashley Smith
Public Affairs Coordinator
INFORMS
Catonsville, MD
[email protected]
443-757-3578
