Software Survey: Simulation — Back to the future

A brief history of discrete-event simulation and the state of simulation tools today.


By James J. Swain

The simulation software tools in the latest survey are impressive in their range, sophistication and capabilities. As a group they provide unparalleled support for the building, viewing and analysis of models for a wide range of applications. Many include the capability to communicate with other applications or to take advantage of optimization routines that adjust staffing or other simulated resources to reduce cost, risk or customer delay. They represent the culmination of more than a half-century of converging development in software, hardware and simulation research.

Simulation was implicit in the earliest computing machines, and Monte Carlo studies were implemented by some of the same people who laid the foundations of the modern stored-program computer and designed the first atomic weapons. The central precepts of repetitive sampling using pseudo-random numbers to provide statistical estimates of random processes were among the earliest applications of the new computing technology. Primitive programming tools, slow computing speeds and limited storage capacity constrained the range of those computations, but the basic principles were there from the beginning.
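
To make the precept concrete, here is a minimal Python sketch (a modern illustration, not period code): repeated sampling with pseudo-random numbers yields a statistical estimate of a fixed quantity, here the area of the unit quarter circle, pi/4.

    import random

    random.seed(42)  # fixed seed for the reproducibility von Neumann prized
    n = 100_000
    # Count sampled points of the unit square that land inside the quarter circle
    hits = sum(random.random()**2 + random.random()**2 <= 1.0 for _ in range(n))
    print("Estimate of pi:", 4 * hits / n)

The error of such an estimate shrinks only as the square root of the sample size, which is one reason the speed and storage limits of early machines constrained these studies so severely.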

The novelty of the tools and the specialized knowledge required to plan and execute these studies meant that the earliest users were specialists with access to advanced research tools. It took time to develop both the computing hardware, with sufficient storage and computational capability, and the basic programming infrastructure needed to support simulation-modeling software and make simulation tools available to a wider audience.

In fact, more than a decade would be required to lay down the basic infrastructure to make simulation practical outside of research labs. Geoffrey Gordon’s General Purpose Simulation System (GPSS) was based upon block diagrams that he thought would make the concepts readily accessible to engineers. GPSS was soon joined by other programs, and within a year both SIMSCRIPT and SIMULA were released. Each of these three tools has undergone many improvements, and all three are still in use today. By the end of the decade, interest in the field led to the Conference on Applications of Simulation Using the General Purpose Simulation System, which soon became the Winter Simulation Conference (www.wintersim.org), reflecting the growing diversity of tools and simulation-related research underway.

By the early 1970s, when I first encountered simulation, it was hardly widespread, nor were the tools very capable by today’s standards. In that first simulation course, I learned how to solve models described by differential equations by means of an electronic analog computer. Programming was done by plugging wires to connect the potentiometers and the amplifiers that set up the equations defining the model (the amplifiers performed the integration). Setting up a problem required rescaling the equations to meet the constraints of the computer. One salient advantage of analog computers over digital ones at that time was that the output presented on an oscilloscope was more detailed than anything that would appear on digital machines for years. Yet developments came fast; within a few years a digital program called the Continuous System Modeling Program (CSMP) surpassed the analog capabilities, performing numerical calculations of greater range and accuracy than the analog could achieve. I first encountered CSMP as an aid to help compute the necessary scaling for analog computers.
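
As a rough modern illustration of what the digital approach replaced the patch panel with (my sketch, using a hypothetical first-order equation, not CSMP syntax), numerical integration advances the model state step by step:

    # Integrate the model dx/dt = -a*x numerically (Euler's method),
    # the job the analog amplifiers performed continuously.
    a, x, dt = 0.5, 1.0, 0.01
    for _ in range(1000):   # simulate 10 time units in steps of dt
        x += dt * (-a * x)
    print(x)                # close to the exact value exp(-5) ~ 0.0067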

That same semester I was also introduced to GPSS V running on a campus mainframe. By current standards the capabilities of this program were limited. For instance, random variables such as the exponential were approximated by piecewise linear functions, and the program computed the variance of dependent series without compunction or warning. Initially IBM provided this software for free. However, the GPSS paradigm proved so popular that after IBM abandoned GPSS, several versions of the program were produced, and several still flourish, including Minuteman Software’s GPSS World, Wolverine Software’s GPSS/H and SLX, and Ingolf Ståhl’s WebGPSS.
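
The approximation worked roughly as follows (a Python sketch of the general idea, not GPSS code, with an illustrative table): tabulate the inverse distribution function of the exponential at a handful of points and interpolate linearly between them, instead of evaluating the logarithm for every draw.

    import bisect, math, random

    # Tabulated inverse CDF of the unit exponential at chosen U values
    us = [0.0, 0.5, 0.8, 0.9, 0.95, 0.99, 0.999]
    xs = [-math.log(1.0 - u) for u in us]

    def exp_piecewise(u):
        """Piecewise linear approximation of -ln(1 - u)."""
        i = min(bisect.bisect_right(us, u) - 1, len(us) - 2)
        t = (u - us[i]) / (us[i + 1] - us[i])
        return xs[i] + t * (xs[i + 1] - xs[i])

    u = random.random()
    print(exp_piecewise(u), -math.log(1.0 - u))  # approximate vs. exact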

Simulation programs benefited from research advances as much as from changes in hardware and software. John von Neumann quickly realized the need for pseudo-random numbers as a matter of both practicality and experimental reproducibility. Early attempts such as the mid-square technique proved problematic and were replaced by linear congruential generators in the early 1950s. The widespread deficiencies of one popular generator of this type, called RANDU, helped spark several decades of theoretical work on random numbers and their properties, which today provides a convenient, reliable and almost inexhaustible source of input random numbers. Even as late as the 1990s, users were still dependent upon generators whose cycles could be exhausted by a moderately sized problem.
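
The flaw is easy to demonstrate. Below is a minimal Python sketch of a linear congruential generator with RANDU’s constants (multiplier 65539, modulus 2^31); the final loop verifies the well-known defect that every triple of successive outputs satisfies a fixed linear relation, which confines the triples to just 15 planes in the unit cube.

    def lcg(seed, a=65539, c=0, m=2**31):
        """Linear congruential generator; defaults are RANDU's constants."""
        x = seed
        while True:
            x = (a * x + c) % m
            yield x

    gen = lcg(seed=1)
    xs = [next(gen) for _ in range(1000)]
    # Every successive triple satisfies x[k+2] = 6*x[k+1] - 9*x[k] (mod 2**31)
    for x0, x1, x2 in zip(xs, xs[1:], xs[2:]):
        assert (x2 - 6 * x1 + 9 * x0) % 2**31 == 0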

The analysis of simulation experiments has undergone considerable maturation, starting with the start-up problem and the question of variance estimation: for long-term analysis (or “steady-state” estimation) the answer should be independent of the starting conditions. In that case, how should the system be “warmed up” to provide estimates that are unbiased by starting conditions? Over time, procedures have evolved for comparing alternatives, for ranking and selection among them, and even for determining optimum solutions over a combination of discrete and continuous parameters.
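
A standard remedy, sketched below in Python (my illustration, assuming a generic output series from a long run), is to discard an initial warm-up segment and then estimate the variability of the mean from nonoverlapping batch means, which avoids the dependent-series trap that GPSS V ignored.

    import statistics

    def batch_means(series, warmup, n_batches):
        """Point estimate and standard error after warm-up deletion."""
        data = series[warmup:]                # discard warm-up observations
        b = len(data) // n_batches            # observations per batch
        means = [statistics.fmean(data[i * b:(i + 1) * b])
                 for i in range(n_batches)]
        grand = statistics.fmean(means)
        # Batch means are roughly independent, so their sample standard
        # deviation gives a usable standard error for the grand mean.
        se = statistics.stdev(means) / n_batches**0.5
        return grand, se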

By the 1990s, advances in computer operating systems made it possible to combine process models with graphical user interfaces, and the drag-and-drop graphical interface supplanted text as the predominant mode of model building. Even more significantly, graphical output and animation rapidly became the norm, so that visualization of the model processes could augment statistical summaries. This development expanded the number of people who could view and understand simulation, and made models much easier to develop and validate. Meanwhile, standard data formats for text, spreadsheets and databases made it increasingly easy to transfer data between programs, either to provide simulation models with access to data or to organize data provided by the simulation for further analysis or use by other tools.

Process models began largely as descriptions of the steps that “transaction” entities (customers, parts or messages, for instance) encountered during processing among “fixed” entities (servers, machines or routers, for instance). Of course, the fixed entities might move among locations themselves (transporters such as AGVs), and programs evolved to provide this enhanced functionality.

The distinction between these two sets of entities can be hard to draw. For instance, a maintenance tech could be considered a “resource” that is limited in its ability to perform repair activities, while having to negotiate various process steps that describe the allocation of its capacity to those services. More fundamentally, while parts might be constrained to follow a specified process, human entities may have to select a course of action that reflects either a prescribed hierarchy of actions or a set of priorities and preferences, acting as “agents” capable of choice. Simulation programs increasingly support modeling the behavior of such agents.

The early simulation language SIMULA pioneered object-oriented programming through the idea of classes of objects with particular properties, instances of which could be created in the model (a “customer” class, for instance). New classes could be created by combining classes, with the new classes inheriting the properties of the component classes. Object-oriented programming languages (such as C++ or Java) derive from this approach, and objects are increasingly implemented in simulations as well. Such descriptions provide enhanced modeling flexibility and can also support model reuse through the creation of object libraries. Thus objects with specific behaviors can be developed in one model and reused in others, without recreating them in each new model.
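
In modern terms the idea looks like the following Python sketch (hypothetical class names, not SIMULA syntax): a class defines the properties of its instances, and a new class inherits and specializes the behavior of an existing one, so it can be packaged in a library and reused across models.

    class Customer:
        """A transaction entity with properties shared by all instances."""
        def __init__(self, arrival_time):
            self.arrival_time = arrival_time

        def service_time(self):
            return 1.0                    # base service requirement

    class PriorityCustomer(Customer):     # inherits Customer's properties
        def service_time(self):
            return 0.5                    # specialized, reusable behavior

    c = PriorityCustomer(arrival_time=12.0)
    print(c.arrival_time, c.service_time())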

Today’s simulation products represent many advances in capabilities for building, studying and analyzing simulation models. The evolution of simulation products is intertwined with developments in simulation theory and practice, as well as advances in computer hardware and software. The limitations of one generation of modeling software have often been the basis of the next generation’s tools, as capabilities evolve to address those limitations. This development is roughly paralleled in the career of C. Dennis Pegden, developer of three modeling systems, SLAM, SIMAN/Arena and most recently Simio, which collectively span much of the period from the early process orientation to the present.

For readers interested in more detail about the history of simulation, a good starting point is Goldsman, Nance and Wilson [1].

Survey

This survey is the eighth biennial survey of simulation software for discrete-event systems simulation and related products [2]. All product information has been provided by the vendors. Products that run on personal computers to perform discrete-event simulation have been emphasized, since these are the most suitable for use in management science and operations research. Simulation products whose primary capability is continuous simulation (systems of differential equations observed in physical systems) or training (e.g., aircraft simulators) are omitted here.

The survey includes 55 products from the 29 vendors who submitted entries, once again surpassing the count in the previous survey. The range and variety of these products continues to grow, reflecting the robustness of the products and the increasing sophistication of the users. The information elicited from the vendors is intended to provide a general gauge of each product’s capability, special features and usage. This survey includes information about experimental run control (e.g., batch run or experimental design capabilities) and special viewing features, including the ability to produce animations or demonstrations that can run independently of the simulation software itself. A separate listing gives contact information for all of the vendors whose products appear in the survey. The survey is also available on the Lionheart Publishing website (www.lionhrtpub.com), where it will be updated to include vendors who missed the publishing deadline. Of course, most of the vendors provide their own websites with further details about their products. Many also have active user groups that share experience in the specialized use of the software and are provided with special access to training and program updates.

A number of technical and professional organizations and conferences are devoted to the application and methodology of simulation. The INFORMS publications Management Science, Operations Research and Interfaces publish articles on simulation. The INFORMS Simulation Society sponsors simulation sessions at the national INFORMS meeting and presents awards both for the best simulation publication and for service to the area, including the Lifetime Achievement Award. Further information about the Simulation Society can be obtained from the website www.informs-cs.org, which also contains links to many simulation product vendors and sources of information about simulation, simulation education and simulation references. The Society for Modeling and Simulation International (www.scs.org) is likewise devoted to all aspects of simulation. Its conferences include two annual multi-conferences, SpringSim and SummerSim, which cover all aspects of simulation practice and theory. Huntsville, Ala., will be the site of a new simulation conference, the AlaSim International Conference, May 1-3, 2012.

The INFORMS Simulation Society and the Society for Modeling and Simulation International are both sponsors of the annual Winter Simulation Conference. This year’s conference will be held Dec. 11-14 in Phoenix, Ariz. As in past years, the conference will be held together with the Modeling and Analysis of Semiconductor Manufacturing (MASM) conference. Further details and registration information are available from the site www.wintersim.org, which also archives the complete contents of the Proceedings of the Winter Simulation Conference from 1996 to 2010 for ready access to research and applications of simulation.

Applications

The range of simulation applications can be explored through published sources such as the Proceedings of the Winter Simulation Conference and many of the vendor websites. The latter illustrate the range of the products, how the software is applied, typical means of analysis and the successes gained from simulation.

Logistics, supply chains and transportation have been frequent objects of simulation study, reflecting the critical importance of these logistical issues to retailers and manufacturers, who must ensure the timely delivery of products and components at a competitive cost. Simulation has been used to study all aspects of transporting cargo within the supply chain, whether by ship, barge, truck or train, and of managing the operation of port terminals and distribution centers. Some studies focus on the overall operations of large networks, while others have examined the optimal way to route trucks to loading docks within regional distribution centers. Newer technologies such as RFID labeling have been studied using simulation, both within the warehouse and for their potential effect on the overall supply chain. Likewise, once items reach the factory, they may be stocked to await use; automated retrieval systems may be employed and are often optimized after study by simulation. Finally, simulation may be used to examine the information system that supports the supply chain, or to quantify risk within the logistics chain and devise strategies to ameliorate it.

Simulation has been used for decades to study, improve and optimize manufacturing processes and operations. This is reflected in the many products in this survey whose design and animation features most often support manufacturing problems. Manufacturing continues to be an important area of simulation analysis, one that now includes both lean manufacturing initiatives and Six Sigma studies.

The military has used simulation for decades to study all aspects of operations, doctrine and training. Simulation is increasingly used in non-military settings to examine security at airports, power plants and public venues such as large entertainment and sporting events. Simulation analysis has also been extended to emergency planning of all kinds, whether evaluating an evacuation plan or planning an emergency response in the aftermath of a hurricane, earthquake, tsunami, flood or tornado. The delivery of critical food and shelter items, or of telemedical support to designated receiving centers, is among the problems that have been studied using simulation.

Finally, healthcare has been an area of intense simulation study at every level. Simulation has been used to examine national policy, as in the case of transplant organ distribution within the United States [3], and to improve patient care at the local hospital. Simulation has looked at staffing and scheduling in emergency rooms, outpatient clinics, surgery units and lab-testing units, and even at medical supply logistics. Since healthcare makes a major contribution to GDP and is the subject of an ongoing national policy debate, it is safe to say that it will remain an area of active study for decades to come, and simulation will remain a valuable tool for that study. It is a challenge that simulation is more than capable of handling.

James J. Swain (jswain@ise.uah.edu) is professor and chair of the ISEEM department at the University of Alabama in Huntsville.

References

  1. Goldsman, D., Nance, R. E., and Wilson, J. R., 2010, “A Brief History of Simulation Revisited,” Proceedings of the 2010 Winter Simulation Conference, ed. B. Johansson, S. Jain, J. Montoya-Torres, J. Hugan, and E. Yücesan, pp. 567-574, Piscataway, N.J.: Institute of Electrical and Electronics Engineers.
  2. Swain, J.J., 2009, “To Boldly Go… Discrete Event Simulation Software Tools,” OR/MS Today, Vol. 35, No. 5, October, pp. 50-61.
  3. Pritsker, A.A.B. et al., 1995, “Organ transplantation policy evaluation,” in Proceedings of the 1995 Winter Simulation Conference, ed. C. Alexopoulos, K. Kang, W. Lilegdon, and D. Goldsman, pp. 1,314-1,323, Piscataway, N.J.: Institute of Electrical and Electronics Engineers.