Simulation as an implementation/support platform for O.R.

Two-step approach to model validation helps boost end-user acceptance.


By Kavitha Lakshmanan and Dayana Cope

Since its inception, operations research has played a significant role in solving industrial problems. As markets become increasingly globalized and competitive, more and more industries rely on O.R. models to support their decisions. Current O.R. methods can attack complex problems within short computational times and achieve efficient results, which allows solutions to be developed quickly.

However, when it comes to actual implementation of the proposed solutions, prospective users may have a hard time accepting, and therefore implementing, them. From a user’s perspective, the tool may simply not seem to address his or her needs, regardless of how much effort was put into the required analysis, validation and verification phases.

Even as the use and power of simulation grows, a wide range of industrial problems still cannot be addressed by simulation or simulation optimization alone because of their inherent complexity, constraints and nature. These problems are often solved using optimization techniques such as integer programming or mixed-integer programming. It is also common for industries to buy commercial off-the-shelf (COTS) solutions that, among other things, incorporate optimization algorithms as functionality within a much broader platform such as supply chain planning or execution.

Questions That Need Answers

The inherent advantages of COTS tools are that most of them work in real time, can pull data from multiple sources through excellent user interfaces, and produce optimal or near-optimal solutions within a short span of time. Their main disadvantages are that users have little visibility into the model or method used, and that it is difficult to evaluate the results these tools propose. As a result, when such solutions reach the people expected to use the models on a regular basis, they generate a number of questions that must be answered before users feel confident that the solution is valid and applicable to their situation.

As long as these issues are not addressed, users will not buy into the results, and the sophisticated techniques that experts spend considerable time developing will not be implemented.

Evaluation and testing of optimization tools are increasingly expected and required by end users. Some of the questions clients raise include:

  • How risky is the solution? What is the probability of extreme conditions occurring? Are there different solutions with different levels of risk for the same problem? (A simulation sketch addressing this question appears after this list.)
  • What are the underlying assumptions in the optimization model?
  • How robust is the model? Will it continuously adapt to new conditions, and how easy is it to incorporate new conditions into the model?
  • Is it possible to test the model against different scenarios, either by controlling the conditions being created or by testing future conditions?
  • Is the proposed solution truly optimal and feasible?
  • How does the proposed solution affect the different stakeholders involved? Could it increase the workload or resource requirements of any sub-team?
  • Does the solution shift the problem to a different area instead of solving it?
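
Many of these questions, especially the first, are precisely what a simulation test-bed answers: hold the optimized plan fixed, replay it under sampled uncertainty, and report a distribution of outcomes rather than a single optimal value. The following minimal sketch in Python illustrates the idea; the plan, the demand model and the cost figures are hypothetical, not taken from any particular study.

    import random
    import statistics

    def evaluate_plan(production_plan, demand_sample, unit_cost=5.0,
                      holding_cost=1.0, shortage_cost=20.0):
        """Total cost of running a fixed production plan against one demand sample."""
        cost, inventory = 0.0, 0.0
        for produced, demand in zip(production_plan, demand_sample):
            inventory += produced - demand
            cost += produced * unit_cost
            if inventory >= 0:
                cost += inventory * holding_cost    # cost of carrying surplus
            else:
                cost += -inventory * shortage_cost  # cost of missing demand
                inventory = 0.0                     # lost sales, not backorders
        return cost

    # Hypothetical plan an optimizer might propose for a mean demand of 100/period.
    plan = [100, 100, 100, 100]

    random.seed(42)  # reproducible replications
    costs = [
        evaluate_plan(plan, [random.gauss(100, 15) for _ in plan])
        for _ in range(10_000)
    ]

    print(f"mean cost: {statistics.mean(costs):,.0f}")
    print(f"95th percentile: {sorted(costs)[int(0.95 * len(costs))]:,.0f}")
    print(f"worst replication: {max(costs):,.0f}")

The spread between the mean and the extreme replications is the answer to the risk question: it shows the end user how much the “optimal” cost can degrade under realistic variability.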

“Analysis that does not address sensitivity is a sketchy work to be dismissed on the spot,” observes Patrick S. Noonan in OR/MS Today magazine’s February 2012 edition [1]. The focus on implementation of O.R. and R&D models is not new; researchers began exploring this area as early as 1957. Studies show that the majority of an O.R. practitioner’s creativity is absorbed by modeling, with little focus on implementation, and that the major disagreement between O.R. practitioners on one side and managers and end users on the other lay in their confidence in the solution. Today, the O.R. community is grappling with more complex and messy problems – coping with groups rather than single decision-makers on fuzzy problems – while the need for implementation methodologies continues to grow.

Case Study: Eastman Chemical

Eastman Chemical attacked this problem by using simulation during the implementation phase of an O.R. project. After the O.R. model was validated and verified, its results were fed to a high-level simulation model that incorporated practical uncertainty, risks and rules. The key performance statistics of interest to the end users were then evaluated from the simulation model and presented, along with the O.R. model results, to the end users for their feedback. When the users were confident in the solution, the project was implemented. Until then, the feedback was used to adjust the O.R. model to include more constraints or change its logic accordingly. The same revision process was applied to the simulation model when needed, and the entire process was iterated until the whole team reached a significant level of confidence in the results. Figure 1 schematically represents the method.
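
In outline, the iteration in Figure 1 is a feedback loop between an optimizer, a simulator and the end users. The sketch below shows one way such a loop can be structured; the stand-in functions (solve_or_model, simulate_solution, collect_feedback) and the toy service-level rule are illustrative placeholders for project-specific tooling, not Eastman’s actual implementation.

    import random
    from dataclasses import dataclass, field

    @dataclass
    class Feedback:
        accepted: bool
        new_constraints: list = field(default_factory=list)

    def solve_or_model(constraints):
        """Stand-in for the real optimizer (e.g., a MIP solver): choose the
        smallest production level that satisfies every minimum-level constraint."""
        return max([100] + constraints)

    def simulate_solution(production_level, replications=1000):
        """Stand-in simulation: replay the fixed plan against random demand and
        report the service level that the deterministic O.R. model cannot see."""
        rng = random.Random(0)
        met = sum(production_level >= rng.gauss(100, 15)
                  for _ in range(replications))
        return {"service_level": met / replications}

    def collect_feedback(solution, kpis):
        """Stand-in for the user review: insist on a 95 percent service level."""
        if kpis["service_level"] >= 0.95:
            return Feedback(accepted=True)
        return Feedback(accepted=False, new_constraints=[solution + 10])

    def two_step_validation(constraints=None, max_iterations=10):
        constraints = list(constraints or [])
        for _ in range(max_iterations):
            solution = solve_or_model(constraints)       # step 1: optimize
            kpis = simulate_solution(solution)           # step 2: simulate
            feedback = collect_feedback(solution, kpis)  # present both to users
            if feedback.accepted:
                return solution, kpis
            constraints += feedback.new_constraints      # tighten model, repeat
        raise RuntimeError("no accepted solution within the iteration budget")

    plan, kpis = two_step_validation()
    print(f"accepted plan: produce {plan}/period, "
          f"simulated service level {kpis['service_level']:.1%}")

Keeping the optimizer and the simulator behind narrow interfaces like these makes it straightforward to swap in a real solver or a detailed discrete-event model without changing the validation loop itself.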

Implementation Method

The two-step validation approach illustrated in Figure 1 provided us with a number of clear advantages. It helped us confirm the feasibility of the proposed solution and compare its performance against the existing situation, which reduced concerns about the inputs and assumptions used during model development and led to better-validated solutions. To understand the robustness and flexibility of the solution, users were encouraged to test future “what if” scenarios. Most important of all, we were able to address the concerns of the diverse group of people likely to be affected by the model and helped the clients better visualize the effect of the changes proposed by the optimization model on the whole system.
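
Such “what if” testing amounts to rerunning the simulation under controlled variations of the inputs while the proposed solution is held fixed. A minimal sketch in the same hypothetical setting as the earlier listings (the scenarios and numbers are illustrative only):

    import random

    def service_level(production_level, mean_demand, sd_demand,
                      replications=5000, seed=0):
        """Fraction of simulated periods in which the fixed plan covers demand."""
        rng = random.Random(seed)
        met = sum(production_level >= rng.gauss(mean_demand, sd_demand)
                  for _ in range(replications))
        return met / replications

    plan = 130  # the solution accepted in the iteration sketch above

    # Controlled "what if" scenarios: conditions vary, the plan stays fixed.
    scenarios = {
        "baseline": (100, 15),
        "demand grows 10%": (110, 15),
        "demand twice as noisy": (100, 30),
    }

    for name, (mean, sd) in scenarios.items():
        print(f"{name}: service level {service_level(plan, mean, sd):.1%}")

A comparison like the one this prints gives users a concrete view of how far the solution can be pushed before its performance erodes, which is exactly the robustness evidence they ask for.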

While this method is encouraging, it is prudent to take this approach only when optimization models are large and complex, when many stakeholders are involved, or when models must be transferred to clients for regular use. Some projects may require considerable detail in the simulation models built to test O.R. model solutions, further increasing the time and complexity of the O.R. project.

Many otherwise successful studies have never been implemented because their details and benefits were not conveyed effectively to management and users alike. While this is true of any project, it is especially significant in O.R. because of the field’s mathematical content, which users without a strong quantitative background may not fully understand. The time devoted to such a simulation “test-bed” is well worth the effort when, by building one, an analyst can greatly increase end-user confidence and therefore the probability of implementation success for home-grown or COTS O.R. solutions.

Kavitha Lakshmanan is an operations research analyst and Dayana Cope is the supervisor of operations research at Eastman Chemical Company, headquartered in Kingsport, Tenn.