Decision Analysis Software Survey

10th biennial survey demonstrates that visualization and analysis options continue to improve along with Web-based integration, but a wish list of “wants and needs” remains unmet.


By Don Buckshaw

This marks the 10th decision analysis software survey OR/MS Today has compiled over the past 18 years, and, as before, it features some exciting new software packages along with some old favorites. In introducing the first DA survey for OR/MS Today in 1993, Dennis Buede wrote, “Decision analysis is the discipline of evaluating complex alternatives in the light of uncertainty, value preferences and risk preference” [1]. Decision analysis software has certainly changed since 1993, when computational horsepower and memory were critical concerns. Today, vendors are producing a richer set of analysis tools and better visualization and analysis options. The functionality of tools seems to increase each year, thanks to the integration of optimization, simulation, forecasting and data analysis techniques. Opportunities for Web-based collaboration and integration into business analytics have never been better.

The Survey

The survey process remains the same as in previous years. Vendors are directed to an online questionnaire to fill out and return. There are no judgments on the quality of the software – this is not a contest. Instead it is an opportunity for a company to describe its DA software to interested users in a way that is helpful for comparisons. The reader can quickly see a list of the capabilities of each package as reported by the vendor to OR/MS Today. Interested vendors who did not respond by the deadline have the opportunity to add their software package to the online version of the survey.

For this year’s survey, questions on the limitations of the tools caused by memory or processing limitations were removed, and questions based on the input of decision analysis practitioners from the government, airline, oil and gas, and pharmaceutical sectors were added. Many of the new questions focus on additional functionality for analysis, visualization and documentation.

This year’s survey features 36 decision analysis products from 24 vendors, numbers that are consistent with previous surveys. For comparison purposes, there were 21 vendors and 30 packages in the 1994 survey. Six vendors or software packages from the 1994 survey appear in this year’s survey, including Logical Decisions, Lumina Decision Systems, Palisade, Banxia Software and Catalyze. One software package from 1994, DPL, is still around today, but is owned by a different company – Syncopation Software.

The Responses

The software listed in this survey seems to fall into four basic types, with quite a bit of overlap. The first is software designed for problem structuring and brainstorming, often with no quantitative component. This includes scenario planning and mind mapping software. The second type of software provides a prioritized list or portfolio of options from a group and typically implements a decision method such as multiple objective decision analysis, multiple criteria decision-making, the analytic hierarchy process or expert systems. The third type of software analyzes single attribute decision-making with uncertainty and often involves influence diagrams and decision trees. The last type of software concentrates on uncertainties and probabilistic analysis and includes Monte Carlo simulation and Bayesian belief nets. Some forecasting, optimization and general operations research analysis packages are also included. Of course, several packages combine these four ideas in varying degrees of detail.

Prices for software range from free to $10,000 for a professional version. Some Web-based decision analysis software not included in this survey can cost far in excess of $10,000 for enterprise versions.

All of the software in the survey runs on the Windows operating system, and several also run on the Macintosh and Unix operating systems. Ten of the packages are Web implementations, up from six packages two years ago. This trend will likely continue. Almost all packages account for uncertainty and have methods to import and export the models; most can export their data to Microsoft Excel.

What I Would Like to See in 2012

Looking back over time, one feature of decision analysis software hasn’t evolved as fast as the analytic side – the support for the new analyst to properly structure and build models that make analytic sense. Back in 1994, Buede encouraged software developers “to explore structuring support because these tools will not be truly valuable until users are able to structure valid representations of their decision problem for analysis” [2]. In the 2002 survey, another survey author, Dan Maxwell, said, “All of the modeling techniques encoded in software have their own underlying axioms, assumptions and limiting conditions. And, the risk of unwittingly creating an apparently elegant and informative model that violates these conditions — potentially misinforming decision-makers — has never been higher” [3]. It seems that software has not replaced the decision analyst, but it has simplified and enhanced the trained analyst’s capabilities to quickly and easily analyze very complex systems. Decision analysis software, however, does little to guide the new analyst along the path to success. The field of decision analysis is possibly unique in that the mathematical underpinnings are very simple, but the underlying assumptions and axioms that cause the model to have real meaning are often hard to understand. There has been some advancement in this area, especially in the areas of brainstorming value hierarchies and eliciting probabilities, but there is still much room for improvement.

As a decision analysis practitioner, I have the opportunity to review or participate in many different decision analysis projects. Recently, I observed a large organization struggle with a very difficult problem. This problem showed many of the traits of a complex decision problem: multiple stakeholders, multiple and conflicting objectives, uncertainty, portfolio of options, sequential decisions and interdependencies. In addition, this organization displayed a systemic inability to develop a process to solve this annual problem.

This year the organization implemented a new solution. Did they decide to take a value-focused modeling approach, create a decision-making process that accounted for conflicting objectives and enforced clarity of decisions? No. Instead they bought a Web-based software tool. Instead of discussing why stakeholders might have greatly different rationales for the weights of the decision variables, they can now quickly get consensus by averaging the opinions of many people over the Web. The result is that rather than having many confused and frustrated people in a room slowly making poor decisions, they now have many confused people in many different rooms making poor decisions at a faster pace than ever before! As Douglas Hubbard said, “An exercise that builds consensus to go down a completely disastrous path probably ensures only that the organization goes down the wrong path even faster” [4].

The lesson is obvious to decision analysis experts: decision analysis software is a tool that should be used to support smart analysis, not to replace it. Over the last few years, INFORMS introduced a Soft Skills Workshop at its annual business practice conference. The purpose of the workshop is to pass on the modeling tricks of the trade that practitioners have developed over the years to simplify and increase the accuracy of their models through smart, structured interactions with customers.

I would like to see software developers design their products in a way that can position both new and old analysts for success. Let’s automate some of these tricks of the trade so that the wisdom that has come so hard over the many years can be made available to another generation of decision analysts. Here are some ideas:

  1. Integrate brainstorming techniques, such as a virtual affinity diagramming process, with decision analytic tools. Include helpful hints for structuring that pop up to ensure that “ends,” not “means,” are being modeled, and that the objectives are mutually exclusive and collectively exhaustive.
  2. Ask for user input in a way that is easy to answer and helps calibrate the decision model with decisions that a person would realistically make. For elicitation, research has shown that humans are good at making comparative assessments (e.g., “I think A is better than B”) but poor at making ratio judgments (e.g., “I think A is five times better than B”) [5].
  3. Provide helpful hints while creating scales to remind the user that the low end of the scale does not include infeasible options (which would render an additive model useless), and remind the user to allow for feasible growth in the upper part of the measure to fully allow for the principles of value-focused thinking to be implemented.
  4. Create new ways to elicit swing weights that automatically form easy-to-answer preference questions based on the high and low ends of the scales. Also, integrate ways to display and explain weights, such as the swing weight matrix [6], so that weights have both a mathematical and a managerial justification.
  5. Create better displays to visually score alternatives that help ensure consistency across responses and avoid scale compression.
  6. Provide structured interview questions for probability distribution elicitation that are designed to identify and avoid decision biases such as anchoring, for example by structuring the interview to consider the highs and lows of the distribution before the most likely value.
  7. Encourage the use of value-focused thinking by identifying value before alternatives, structuring alternative generation around the value hierarchy, and creating better alternatives analytically or by combining existing alternatives into hybrids.
  8. Provide better aggregation of group input to highlight disagreement among stakeholders, and if the model is sensitive to these disagreements, alert the user and recommend options (e.g., different preference sets, better definitions, stating assumptions or simply displaying the differences as value ranges).
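Several of these ideas rest on the same additive value model that most multi-objective packages implement: normalize elicited swing weights so they sum to one, compute each alternative's total value as a weighted sum of its 0-1 criterion scores, and check whether different stakeholders' weights change the answer (idea No. 8). A minimal sketch, with all criteria, alternatives and numbers being hypothetical illustrations rather than output from any surveyed package:

```python
# Sketch of an additive multi-attribute value model with swing weights,
# plus a simple check that flags when stakeholders' weights flip the ranking.
# All criteria, alternatives and numbers below are hypothetical.

def normalize(raw_swings):
    """Turn raw swing weights (e.g., on a 100-point scale) into weights summing to 1."""
    total = sum(raw_swings.values())
    return {c: w / total for c, w in raw_swings.items()}

def total_value(scores, weights):
    """Additive value: weighted sum of an alternative's 0-1 criterion scores."""
    return sum(weights[c] * scores[c] for c in weights)

def best(alternatives, weights):
    """Name of the highest-value alternative under one stakeholder's weights."""
    return max(alternatives, key=lambda name: total_value(alternatives[name], weights))

# Hypothetical 0-1 value scores for two alternatives.
alternatives = {
    "A": {"cost": 0.9, "schedule": 0.4, "performance": 0.6},
    "B": {"cost": 0.5, "schedule": 0.8, "performance": 0.9},
}

# Hypothetical raw swing weights from two stakeholders (100 = most important swing).
stakeholder_1 = normalize({"cost": 100, "schedule": 20, "performance": 30})
stakeholder_2 = normalize({"cost": 30, "schedule": 80, "performance": 100})

# If the stakeholders' weights pick different winners, the model is sensitive
# to their disagreement and that should be surfaced, not averaged away.
picks = {best(alternatives, w) for w in (stakeholder_1, stakeholder_2)}
if len(picks) > 1:
    print("Ranking is sensitive to weight disagreement:", picks)
```

In this toy case the cost-focused stakeholder prefers A while the schedule- and performance-focused stakeholder prefers B, which is exactly the situation where averaging weights would hide a disagreement worth discussing.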


The decision analysis software field is going strong with some old favorites and interesting new additions. Memory and computation needs are no longer a concern, and more distributed, Web-based implementations are coming to the fore. However, I would like to see more effort by software vendors to build some form of coaching into their products so that even a novice can be confident that their models are producing sensible results. Finally, if you know of any software not represented in this survey, encourage the vendor to fill out the online questionnaire, and the software will be added to the online version of the survey.

Don Buckshaw is a senior principal analyst with Innovative Decisions, Inc., a management consulting firm serving business and government clients and specializing in the disciplines of decision and risk analysis, operations research and systems engineering. He has more than 20 years of experience as an intelligence and operations research analyst.


  1. Buede, Dennis, “Decision Analysis Software: Aiding the Development of Insight,” OR/MS Today, April 1993.
  2. Buede, Dennis, “Decision Analysis Software: Aiding Insight II,” OR/MS Today, April 1994.
  3. Maxwell, Daniel, “Decision Analysis Software: Aiding Insight VI,” OR/MS Today, June 2002.
  4. Hubbard, D., “The Failure of Risk Management,” John Wiley and Sons, Hoboken, N.J., 2009.
  5. Lai, S., “An Empirical Study of Equivalence Judgments vs. Ratio Judgments in Decision Analysis,” Decision Sciences, Vol. 32, No. 2, Spring 2001, pp. 277-302.
  6. Parnell, G., Driscoll, P., & Henderson, D., editors, “Decision Making for Systems Engineering and Management,” Wiley Series in Systems Engineering, Wiley & Sons, 2008.

View the results of the 2010 Decision Analysis Software Survey