WINFORMS: The Washington DC Chapter of INFORMS

Meetings and Events 2009

Past Meetings and Events, 2009

“DoD Analytic Agenda” by Charles D. “Chuck” Burdick

Date: Tuesday, December 15, 2009
Location: Arlington Central Library

A recording of this evening program is available; other recordings will be available exclusively to institutional sponsors.

Most of the large Government analysis organizations have traditionally maintained one or more large campaign models to evaluate all of their equipment and procedures simultaneously and to justify changes. To feed these models, the organizations had to assemble scenarios of likely situations in which both the concepts and the equipment interacted as they would be expected to in wartime. The results of simulating these scenarios have then been used to determine the relative value of the portrayed systems in these operational concepts and to make trade-offs among the alternatives that provide an optimal force.

All of these different campaign models are represented as being joint, meaning they involve the forces of the Army, Navy, Air Force, and Marines; however, the scenarios chosen for them have generally reflected the different points of view of each military service. Thus, when data was collected for the models, it often came from a variety of sources and timeframes, making it difficult to merge into a common simulation environment. The various analysis teams usually ended up employing their own Subject Matter Experts (SMEs) to build unique scenarios for each project and analysis task. These scenarios then had to be defended along with the analysis itself.

To confound the situation, warfare and the way we conduct it have been changing rapidly since the end of the Cold War, with new technologies, new missions, and unpredictable enemies engaging our forces not only in traditional warfare but also in counterterrorism, Homeland Defense, and irregular warfare. The Department of Defense (DoD) Analytic Community has recognized the problem and has implemented a formal scenario development process designed to cover current and future threats in the most likely environments. The scenarios that make up the Analytic Agenda are to be used by all the military components involved in the evaluation of existing and future forces. The objective is to provide a suite of scenarios of sufficient breadth to evaluate the force in its expected environments against a range of possible enemy situations. In conjunction with the scenarios, the Joint Data Support office was established to maintain and redistribute the data generated as part of the Analytical Baselines and to make it available for OSD, Agency, and Service studies, wargames, and simulations.

This presentation explains the process of developing the DoD Analytic Agenda scenarios, describes the current status of the program, and discusses some of the challenges that have been encountered along the way.

Chuck Burdick has over 25 years of experience in modeling and simulation of military systems and has applied combat simulations to analytical, training, and testing applications for the Services and OSD agencies. For the past ten years he has worked on the development and application of the Joint Analysis System (JAS). He is a retired Army Military Intelligence Reserve officer with seven years on active duty. Prior to joining Innovative Decisions, Inc. (IDI), he spent 17 years each with Lockheed Martin and BDM International. He has been a long-time member of the C4ISR Executive Committee of the National Defense Industrial Association and is a past Chairman of the Defense Modeling and Simulation Organization Industry Steering Group. Chuck has authored numerous articles and reports on combat simulation and the simulation of human factors in combat. He is a regular contributor of presentations to the Military Operations Research Society Symposia and has presented in several other forums.

Chuck received his undergraduate degree from Rensselaer Polytechnic Institute (RPI) in Troy, NY, and holds a master's degree in ORSA from George Washington University in Washington, DC. He is also a graduate of the Army Command and General Staff College and the National Defense University. He is currently serving as the Treasurer of WINFORMS.

“Design of Experiments for Simulation Modeling” by Averill M. Law, Ph.D.

Date: Thursday, November 19, 2009
Location: Holiday Inn Rosslyn at Key Bridge

Discrete-event and agent-based simulation models often have many input factors, and determining which ones have a significant impact on performance measures (responses) of interest can be a difficult task. The common approach of changing one factor at a time is statistically inefficient and, more importantly, very often simply incorrect, because in many models the factors interact in their effect on the responses. In this tutorial we present an introduction to design of experiments (DOE) specifically for simulation modeling, whose major goal is to determine the important factors with the least amount of simulating. We discuss a simple and widely applicable approach to performing DOE in the context of simulation modeling, in contrast to methods based on classical statistics (e.g., ANOVA), which make assumptions such as constant variances and normally distributed errors that are very often grossly violated.
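As a minimal illustration of the interaction point above (using a made-up response function, not an example from the talk), a 2×2 factorial design estimates both main effects and the interaction from just four simulation runs, which one-factor-at-a-time experimentation cannot do:

```python
# Toy sketch (hypothetical response, not from the tutorial): a 2^2 factorial
# experiment showing why changing one factor at a time misses interactions.
from itertools import product

def simulated_response(a, b):
    # Stand-in for a simulation run; note the a*b interaction term.
    return 10 + 3 * a + 2 * b + 4 * a * b

# Run the "model" at all four corners of the design (coded levels -1 and +1).
runs = {(a, b): simulated_response(a, b) for a, b in product((-1, 1), repeat=2)}

# Effect estimates: average difference between high- and low-level responses.
main_a = sum(r * a for (a, b), r in runs.items()) / 2
main_b = sum(r * b for (a, b), r in runs.items()) / 2
interaction_ab = sum(r * a * b for (a, b), r in runs.items()) / 2

print(main_a, main_b, interaction_ab)  # 6.0 4.0 8.0
```

A one-factor-at-a-time sweep would recover the two main effects but would never expose the interaction term, which here is larger than either main effect.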

Averill M. Law is President of Averill M. Law & Associates, a company specializing in simulation training, consulting, and software. Previously, he was a tenured faculty member at the University of Wisconsin-Madison and the University of Arizona. He has a Ph.D. in operations research from the University of California at Berkeley. Dr. Law is the author of numerous papers and books on simulation, operations research, and statistics, including the textbook Simulation Modeling and Analysis that is widely considered the “bible” of the simulation industry with 125,000 copies in print. He has presented more than 475 simulation short courses in 18 countries, including on-site seminars for Boeing, IBM, Lockheed Martin, Northrop Grumman, NSA, U.S. Air Force, U.S. Army, and U.S. Navy. Dr. Law has been a simulation consultant to organizations such as Booz Allen & Hamilton, Oak Ridge National Lab, Sandia, U.S. Air Force, and U.S. Army. His work on model validation has been funded by the Defense Modeling and Simulation Office and the Office of Naval Research. He is the developer of the ExpertFit® distribution-fitting software and two videotapes on simulation modeling.

“Informing the Findings of Truth and Reconciliation Commissions” by Jana Asher, President, StatAid

Date: Tuesday, September 29, 2009
Location: OMNI Engineering & Technology, Tysons Corner
A WINFORMS / IIE Joint Evening Program

A Truth and Reconciliation Commission determines who did what to whom after civil unrest by analyzing a complex system of inter-related violations that can span years and geography. In this talk, we describe work for the TRCs of Sierra Leone, Peru, and East Timor. We compare and contrast the data and mathematical models used: basic statistics, survey estimates, and multiple systems estimation. We also discuss difficulties in data acquisition and the politics of working within the TRC environment.
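In its simplest two-list form, multiple systems estimation uses the overlap between two independent documentation sources to estimate the total number of violations, including those documented by neither source. A sketch with invented numbers (not TRC data):

```python
# Illustrative sketch (hypothetical counts, not TRC data): the two-list
# Lincoln-Petersen estimator, the simplest case of multiple systems estimation.
def lincoln_petersen(n1, n2, m):
    """n1, n2: violations documented by each source; m: documented by both."""
    return n1 * n2 / m

# Suppose source A records 400 violations, source B records 300, and 60
# appear on both lists. The overlap rate implies many undocumented cases.
estimated_total = lincoln_petersen(400, 300, 60)
print(estimated_total)  # 2000.0
```

Here only 640 distinct violations were documented, yet the low overlap between lists suggests a true total around 2,000; real TRC work uses more than two lists and models that relax the independence assumption.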

Jana Asher, M.S., is a statistician who specializes in the collection and analysis of human rights violations data. She has worked on projects for the International Criminal Tribunal for the Former Yugoslavia, the Peruvian Truth and Reconciliation Commission, Physicians for Human Rights, Human Rights Watch, the East Timor Truth and Reconciliation Commission, the Sierra Leone Truth and Reconciliation Commission, the American Bar Association, the Science and Human Rights Program of the American Association for the Advancement of Science, and the U.S. Census Bureau. In 2009, the American Statistical Association selected her as a Fellow for her contributions to the profession, and for excellence in the application of statistical methodology to human rights and humanitarian measurement problems. Jana is currently completing her Ph.D. in Statistics with an Emphasis on Human Rights under the direction of Professor Stephen E. Fienberg at Carnegie Mellon University.

“Kill All the Quants? Models vs. Mania in the Current Financial Crisis” by Professor Andrew Lo, MIT Sloan School of Management

Date: Tuesday, July 14, 2009
Location: MITRE Bedford, MA and MITRE McLean, VA
Special Virtual Joint Evening Program: INFORMS Boston Chapter and WINFORMS

As the shockwaves of the financial crisis of 2008 propagate throughout the global economy, the “blame game” has begun in earnest, with some fingers pointing to the complexity of certain financial securities and the mathematical models used to manage them. In this talk, I will review the evidence for and against this view, and argue that a broader perspective shows a much different picture. Blaming quantitative analysis for the financial crisis is akin to blaming F = ma for a fallen mountain climber's death. A more productive line of inquiry is to look deeper into the underlying causes of the crisis, which ultimately leads to the conclusion that bubbles, crashes, and market dislocations are unavoidable consequences of hardwired human behavior coupled with free enterprise and modern capitalism. However, even though crises cannot be legislated away, there are many ways to reduce their disruptive effects, and I will conclude with a set of proposals for regulatory reform.

Andrew W. Lo is the Harris & Harris Group Professor of Finance at the MIT Sloan School of Management and the director of MIT’s Laboratory for Financial Engineering. He received his Ph.D. in economics from Harvard University in 1984, and taught at the University of Pennsylvania's Wharton School as the W.P. Carey Assistant Professor of Finance from 1984 to 1987, and as the W.P. Carey Associate Professor of Finance from 1987 to 1988.

His research interests include the empirical validation and implementation of financial asset pricing models; the pricing of options and other derivative securities; financial engineering and risk management; trading technology and market microstructure; statistics, econometrics, and stochastic processes; computer algorithms and numerical methods; financial visualization; nonlinear models of stock and bond returns; hedge-fund risk and return dynamics and risk transparency; and, most recently, evolutionary and neurobiological models of individual risk preferences and financial markets.

He has published numerous articles in finance and economics journals, and is a co-author of The Econometrics of Financial Markets and A Non-Random Walk Down Wall Street. He is currently an associate editor of the Financial Analysts Journal, the Journal of Portfolio Management, the Journal of Computational Finance, and the Review of Economics and Statistics. His awards include the Alfred P. Sloan Foundation Fellowship, the Paul A. Samuelson Award, the American Association for Individual Investors Award, the Graham and Dodd Award, the 2001 IAFE-SunGard Financial Engineer of the Year award, a Guggenheim Fellowship, the CFA Institute’s James R. Vertin Award, and awards for teaching excellence from both Wharton and MIT. He is a former governor of the Boston Stock Exchange, and currently a research associate of the National Bureau of Economic Research, a member of the NASD’s Economic Advisory Board, and founder and chief scientific officer of AlphaSimplex Group, LLC, a quantitative investment management company based in Cambridge, Massachusetts.

Tour of USPS Dulles Processing & Distribution Center

Date: Wednesday, June 17, 2009
Location: US Postal Service Dulles P&DC

The Flats Sequencing System advances flat mail processing by sorting flat mail in the order that postal carriers walk their routes. This significantly improves the efficiency of flat mail processing and allows postal carriers more time to serve customers. FSS is designed to automatically sequence flat mail at a rate of approximately 16,500 pieces per hour, and is capable of sorting and sequencing up to 75,000 pieces of flat mail in one sequencing session. The machine is designed to sequence 280,500 pieces to more than 125,000 delivery addresses in a typical 17-hour daily operating window.
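The quoted figures are mutually consistent, as a quick check shows:

```python
# Consistency check of the stated FSS figures: 16,500 pieces per hour over a
# 17-hour daily operating window matches the quoted 280,500 pieces per day.
pieces_per_hour = 16_500
window_hours = 17
daily_throughput = pieces_per_hour * window_hours
print(daily_throughput)  # 280500
```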

Donald E. Crone, Program Director for the Flats Sequencing System (FSS), Headquarters Engineering, U.S. Postal Service, will lead the tour featuring the breakthrough Flats Sequencing System as well as other operations at the Dulles mail distribution center.

“A Profitability Model of Military Organizations” by Dr. Brett D. Steele

Date: Tuesday, June 2, 2009
Location: OMNI Engineering & Technology, Tysons Corner
A WINFORMS / IIE Joint Evening Program

This talk will present an innovative quantitative approach for analyzing military organizations. It applies the return on investment (ROI) model, or objective function, of both industrial firms and terrorist organizations, which the speaker presented last year at WINFORMS. As a result, this talk will suggest a common logical foundation for military, terrorist, and industrial strategies. The presentation will commence by interpreting military organizations as producers of technological units of destruction, or "destructs." Like the output of a commercial industrial production process, or "products," these military destructs are associated with an array of losses and gains, which can be shaped by both supply-side and demand-side investment options. These investments include technological acquisitions and training programs to improve a military organization's capabilities in maneuvering and force protection, as well as intelligence data acquisition and knowledge distribution. Furthermore, the losses and gains associated with military destructs also depend on their delivery rates and destructive-performance levels. The persistent nature of these relationships, regardless of the technology being used, permits the derivation of a generalized profitability model. This model may therefore serve as an objective function for a particular military organization within the context of a particular mission. Assuming sufficiently accurate unit-attack-cost and unit-attack-benefit functions, it can be used to assess the optimum levels of the supply-side and demand-side investments, along with the destruct's delivery rate and destructive-performance level. The second half of the talk will address the tactical logic of modern combined-arms actions, followed by a review of classic military investment strategies throughout history.

Brett D. Steele, a senior business development analyst at Ideal Innovations, Inc., received his Ph.D. in the History of Science and Technology from the University of Minnesota. He has taught in the electrical engineering and history departments at UCLA, served as a lecturer at Stanford University and the Royal Institute of Technology in Stockholm, and more recently taught in the Security Studies Program of Georgetown University's School of Foreign Service. Dr. Steele has also held research positions at NASA-Ames, Hewlett-Packard, the RAND Corporation, and the Homeland Security Institute. More recently, he served as a Science and Technology Advisor at the Joint Improvised Explosive Device Defeat Organization (JIEDDO). Dr. Steele's publications include The Heirs of Archimedes: Science and the Art of War through the Age of Enlightenment (MIT Press, 2005), Military Reengineering between the World Wars (RAND, 2005), and “An Economic Theory of Technological Products” in Technological Forecasting and Social Change (March, 1995).

“Social Network Analysis for Homeland Defense and Civil Support” by Marjorie Greene

Date: Tuesday, May 5, 2009
Location: George Washington University, Department of Engineering Management and Systems Engineering

I discuss applying social network analysis to homeland defense and civil support operations. Several experiments have shown how social networks can “spread” information from person to person, contagiously, in the style of an epidemic. Analyses of large datasets shed light on principles of social network evolution over time. However, no models have thus far been able to apply these principles to inform critical decision making and policy for small-world phenomena. The true challenge is to bridge the gap between the massive and the detailed: to find the points where social activity observed in aggregate can be interpreted at a fine-grained level. Homeland security, homeland defense, and civil support operations are also looking for applications of social network analysis, especially when diverse organizations try to coordinate their activities across a broad spectrum of activity. I suggest a new approach to analyzing these operations that uses a self-organizing social network ontology to improve our modeling capability for future decision making.

Marjorie Greene is a Senior Program Manager with SAIC's Defense and Maritime Solutions Business Unit. She has served as Principal Investigator for projects to determine business requirements for Government clients; initiated programs to increase awareness of bio-threats; and facilitated exercises and war games to assist in the development of tools for non-traditional military mission planning. She has published widely on the impact of technology on society. She previously held positions with the Center for Naval Analyses, ANSER, and several organizations in the UK, where she pursued her interest in exploiting operational data for future planning. She has B.S. and M.A. degrees in mathematics and has completed coursework for her Ph.D. in Operations Research at Johns Hopkins University.

“Recent Agent-Based Modeling of Security Threats” by Mark Harmon, Jenish Joseph, and Peter Hottenstein

Date: Tuesday, March 31, 2009
Location: University of Maryland, College Park R. H. Smith School of Business
A WINFORMS / University of Maryland Joint Evening Program

The US Secret Service used the Evacuation Planning Tool (EPT), an agent-based simulation developed by Regal Decision Systems, Inc., to run evacuation scenarios for the Republican National Convention in the Xcel Energy Center. The 3D model of the building was populated with 16,250 entities, and several scenarios with unavailable exits and routes were run to determine overall evacuation time and the impact of closed exits. This study supported event planning. Modeling and running evacuation scenarios of Secret Service headquarters will support emergency planning, and prototyping the space allocation of field offices housed in rented space will support space planning by the agency. EPT runs on a laptop computer.
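The flavor of the exit-closure experiment can be sketched in miniature (a toy model with invented positions, not EPT itself): each agent follows a simple local rule, and closing an exit lengthens the longest walk, i.e., the overall evacuation time.

```python
# Toy sketch (hypothetical layout, not EPT): agents on a corridor each step
# one unit toward their nearest open exit; evacuation time is the number of
# steps until the last agent reaches an exit.
def evacuation_steps(agents, exits):
    positions = [p for p in agents if p not in exits]
    steps = 0
    while positions:
        moved = []
        for p in positions:
            target = min(exits, key=lambda e: abs(e - p))  # nearest open exit
            p += 1 if target > p else -1                   # one unit per step
            if p not in exits:                             # evacuated agents leave
                moved.append(p)
        positions = moved
        steps += 1
    return steps

agents = [5, 20, 35, 60, 80, 95]
baseline = evacuation_steps(agents, exits=[0, 50, 100])  # all exits open
degraded = evacuation_steps(agents, exits=[0, 100])      # middle exit closed
print(baseline, degraded)  # 20 40
```

Even in this one-dimensional caricature, removing the middle exit doubles the evacuation time; tools like EPT run the same kind of what-if comparison over a full 3D building model with thousands of agents.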

To support analysis of security-threatening incidents, Southwest Research Institute® (SwRI®) developed MAICE Station, a software analysis tool that uses agent-based modeling to evaluate emergent behaviors arising from the actions and interactions of simulated autonomous individuals in one or more groups, in order to re-create, analyze, and predict the outcomes of complex human interactions. The tool supports rapid prototyping of crowd or pedestrian scenarios: analysts can build scenarios from any map image background, populate them with fully customized individuals, and analyze lethal and non-lethal countermeasures. SwRI has conducted several successful experiments in collaboration with the U.S. military to validate behavior models and scenarios created with MAICE Station.

Southwest Research Institute® (SwRI®) developed Hydra, a prototype application for integrating a variety of simulation, analysis, and data fusion technologies. The resulting architecture addresses the challenges associated with detecting and assessing threats using intelligence gathered from a wide variety of sources. The application's processing stream takes in raw data from widely varying sources and, through plug-in technology, processes the data into a standardized format. The essence of the application is the fusion of multi-source intelligence data for informed and timely interpretation. This analytical merging of data allows the system to correlate interpreted threat intelligence with known locations of potential targets. The fused data represents linkages across many contextual dimensions in time and space, where connections between separate reports can represent potential threats.

Mark Harmon is the Program Manager for the Security and Incident Modeling Laboratory (SimLab) at the Secret Service's training center near Washington, DC. SimLab provides simulated tactical training for Secret Service personnel and security planning capabilities for National Special Security Events. In addition, the Lab hosts live simulated practical exercises to support training in the National Incident Management System (NIMS) and assists the Emergency Preparedness office with planning non-force-on-force response and mitigation strategies for incidents. He has a B.S. in Computer Science from the University of Maryland and 27 years of technical experience in the Secret Service, including ten years in the simulation arena.

Jenish Joseph is Senior Analyst and Vice President for Regal Decision Systems, Inc. He was the lead developer for the Evacuation Planning Tool (EPT), BorderWizard, CanSim (a Canadian border model based on BorderWizard), and several other simulation projects. He has a B.S. in engineering from the Indian Institute of Technology, Madras, and an M.S. in transportation planning from the University of Maryland.

Peter Hottenstein is the Manager for the Systems Modeling section in SwRI’s Training, Simulation and Performance Improvement Division in San Antonio, TX. He focuses on crowd and human behavior modeling. Recent technical developments include a desktop crowd modeling simulation toolkit that provides easy-to-use utilities for rapidly building training and planning scenarios for managing large crowds. Other focus technologies include visual analytics and a simulation-based prototype threat-identification desktop system that fuses data into a cohesive display. He has an M.S. in Software Engineering from Southern Methodist University and 32 years of experience in training and simulation.

“Is Game Theory's Assumed Dependence on Utility Theory Ever Realistic?” by Russell R. Vane III, PhD

Date: Tuesday, March 10, 2009
Location: George Washington University, Department of Engineering Management and Systems Engineering

Von Neumann and Morgenstern combined Cournot's idea of utility theory with a mathematical characterization of competition to create game theory, a field that has been featured in at least two Nobel Prizes in economics. But do game theory's required principles ever actually apply, beyond the well-known suppressive effects of the Cold War's “Mutually Assured Destruction”? We will review the six principles of game theory and its four axioms, and then compare them with utility theory's requirements. This will not require extensive prior knowledge of game theory or its underlying mathematics, but even experienced game theory practitioners may be surprised by the extent of game theory's limitations in other areas of application.

Dr. Russ Vane is a Senior Managing Consultant at IBM in the Operations Analytics practice of the Supply Chain Management Division of Global Business Services. He has been an officer in WINFORMS for about ten years. His main areas of expertise are decision-support technologies, including game theory, modeling and simulation, and agent-based systems. He is currently helping the Joint Improvised Explosive Device Defeat Organization's wargaming team. He is an ex-US Cavalry officer (ground) and was an Airborne Ranger.


Upcoming WINFORMS Event

How to Measure Efficiency using Data Envelopment Analysis (DEA)
Speaker: David Lengacher

Tuesday, December 11, 2012

Booz Allen Hamilton
3811 N. Fairfax Drive
Arlington, VA 22203