The state of military O.R.

Seventy-five years after its birth in the crucible of WWII, the profession stands at a critical crossroads.

By Greg Parlier

Photos courtesy of the U.S. Army.

Post-war drawdown is part of another defense resource boom-and-bust cycle.

Seventy-five years ago, near the beginning of World War II as the Battle of Britain loomed, the “operational research” profession was born in a manor house renamed Bawdsey Research Station, along Suffolk’s southeastern coast overlooking the North Sea. The term was first used by A. P. Rowe [1] to describe the “system of teams” supporting Air Chief Marshal Hugh Dowding and his Royal Air Force Fighter Command in integrating and employing newly invented radar. The fate of Western Civilization was indeed at stake in those early years of the war, which British Prime Minister Winston Churchill would later describe, after the fall of France, as Britain’s “darkest hour” [2].

Today, as the United States emerges from the longest sustained war in American history, the U.S. military faces a post-war drawdown as part of yet another defense resource boom-and-bust cycle. This time, however, as the budget pendulum inevitably swings back full force, we also find ourselves on the precarious edge of a domestic, inter-generational financial and economic abyss.

On this diamond anniversary of the marriage between the art of warfare and the application of the scientific method, these conditions warrant a critical, comprehensive, introspective evaluation of the current state of military operations research. Where exactly are we? Where have we been? Where are we going? Given the challenges the military now faces and the discernible opportunities that could be exploited, is the current trajectory of the military O.R. community aligned with what really needs to be done for national security? If such an undertaking is to be pursued, how might it be structured, organized and conducted to be a useful endeavor?

Present Challenges

Beyond the more apparent international geopolitical challenge and accompanying economic crises du jour, a broader national security perspective must also address several other worrisome trends, especially their interactions. This endeavor should also be viewed from the perspective of the current national financial predicament and the ongoing quest for solvency in public policy. Despite an incredible advantage in global military power, the United States faces enormous strategic resource challenges on a perilous cusp of history. Current trends within the federal budget, in both discretionary and “entitlement” programs, render current spending trajectories unsustainable and future programs unachievable.

A broader national security perspective must certainly address the foreseeable geopolitical environment, which a recent Army chief of staff describes as “an era of persistent conflict,” while the current strategic survey from the International Institute for Strategic Studies forecasts a “persistent state of sublimated strategic anarchy.” We are also afflicted with disconcerting socio-demographic patterns and education challenges, national infrastructure decay in energy, transportation and civil works, sagging personal savings and national investment levels, and increasingly intractable social entitlement policies. Realistic projections of these converging trends illuminate enormous risk to the nation – a potentially catastrophic gap between future expectations and unfolding realities.

The challenges of the post-Cold War drawdown two decades ago parallel those we must grapple with now: maintaining “balance” as we draw down our forces and ensuring that we do not negligently allow a “hollow force” to develop, much less precipitously break the Army and set it up for first-battle failures in subsequent wars. Twenty years ago, we wrestled with what balance really meant and how to define, measure, design, model and actually achieve it, something we had not done well throughout our history, as the Army’s post-war drawdown curves illuminate.

“Balance” was a multi-dimensional concept that military analysts pursued both strategically and analytically: over time by balancing current and future readiness (more precisely, current force readiness, mid-term modernization and long-term R&D investment) and within time by balancing force structure, modernization, manpower and infrastructure. These new methods for strategic analytics, crude back then, helped to chart, and then resource, a viable course for our leadership in what they called “an era of strategic uncertainty” [3].

The current geopolitical environment has been described as “an era of persistent conflict.”

Force Structure, Training, Skill and Technology

During the mid-1990s, much of the conventional wisdom held that the United States was in the midst of a so-called “Revolution in Military Affairs.” Technology would provide a global C4ISR (command, control, communications, computers, intelligence, surveillance and reconnaissance) framework and a precision strike capability that would give us, in the words of one defense analyst, “an ability to bomb any target on the planet with impunity, dominate any ocean, and move forces anywhere to defeat just about any army.” This suggested, in the midst of the post-Cold War drawdown and declining defense budgets (the “peace dividend”), that modernization and investment accounts should be protected, even expanded, at the expense of force structure, training and readiness during our “strategic pause.”

Nonetheless, while a major premise for “transformation” was that speed, agility and precision could substitute for “mass,” the unfolding reality of the emerging “new world order” was that eradicating the root causes of contemporary (“unconventional”) aggression required a completely different skill set.

We also needed to acknowledge and contend with the “base of sand” issue in our combat modeling methods and simulation practices. In 1991 a group of experienced defense analysts and senior scientists from several FFRDCs (federally funded research and development centers), defense agencies and the scientific community boldly challenged existing military combat modeling, defense simulation and wargaming methods. They concluded that the underlying concepts in analytical agencies, and the way they were being managed, were “so fatally flawed” that “structural changes in management and concept” would be essential to correct them. Their recommendations were comprehensive and compelling – a “wake-up call” to the establishment analytical community at the time.

One of the observations that spawned the effort was the enormous gap between “official” model outcomes and the actual evidence of our own recent history. The analysts attributed this gap to the lack of adequate theory and of a “vigorous military science,” which resulted in an excessive focus on modeling and simulation technology rather than the substance within [4]. For example, medical “planning factors” were derived from casualty projections produced by attrition-based, theater-level campaign models. When compared with empirical evidence from modern warfare, including Persian Gulf War results, these projections over-predicted aggregate casualties by orders of magnitude, thereby creating unnecessary and unaffordable requirements for medical force structure and supply support.

As a consequence of those conditions – declining budgets and an emphasis on “techno-centric” solutions without truly understanding, much less incorporating, critical cause-effect relationships into our models, simulations and wargames – the U.S. military embarked upon a comprehensive historical, theoretical and empirical evaluation of past operations. Initially undertaken as part of the first QDR (Quadrennial Defense Review), the effort was intended to identify and verify patterns and relationships that would provide significant predictive power for the future. Fundamentally, military analysts pursued an experimental design approach, grounded in relevant theory, to capture the statistically significant factors influencing future demand using all of the information in the empirical evidence of recent experience. It was predicated on the simple notion that practical operational enhancements emerge from the judicious study of discernible reality.
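The flavor of that approach can be suggested with a small sketch. The fragment below (illustrative Python; every factor, coefficient and data value is invented for exposition, not drawn from the actual studies) fits a simple log-linear rate model to notional historical campaign records, the kind of empirically grounded estimate intended to replace unvalidated attrition-model planning factors:

    import numpy as np

    # Hypothetical campaign records: each row is one historical operation.
    # Explanatory factors: force ratio, posture (0 = defense, 1 = offense),
    # and days of intense combat. All values are invented for illustration.
    factors = np.array([
        [1.5, 1, 12],
        [3.0, 1, 4],
        [0.8, 0, 20],
        [2.2, 1, 6],
        [1.0, 0, 30],
        [4.0, 1, 3],
    ])
    rates = np.array([2.1, 0.9, 3.5, 1.4, 2.8, 0.6])  # casualties per 1,000 per day

    # Log-linear model: log(rate) = b0 + b1*ratio + b2*posture + b3*duration.
    X = np.column_stack([np.ones(len(factors)), factors])
    coef, *_ = np.linalg.lstsq(X, np.log(rates), rcond=None)

    # Empirically grounded estimate for a planned operation (ratio 2.5,
    # offense, 10 days), to be contrasted with an attrition-model factor.
    planned = np.array([1.0, 2.5, 1.0, 10.0])
    print("fitted casualty rate: %.2f per 1,000 per day" % np.exp(planned @ coef))

The particular model matters less than the discipline it represents: rates are estimated from, and tested against, the empirical record rather than generated by simulation alone.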

These comprehensive analyses of the nature of modern warfare, and of the structure and patterns of the casualties that result, have yielded major improvements in forecast accuracy and an ability to design more responsive, effective medical support that, in many cases, requires smaller force structure investments and lower medical materiel and supply support costs (Class VIII). The resulting seminal analyses, which challenged traditional views and standard practices at the time, have since been validated and corroborated by our most recent experiences in both Afghanistan and Iraq, and they are now being further refined and extended [5].

Army analysts have since expanded this research to other classes of supply, focusing recently on Class IX (repair parts) for the materiel enterprise. Analysts have empirically identified spare part consumption patterns and readiness “drivers” for Army aviation fleets: patterns that dominate or differ significantly across operational missions and geographic locations, and that vary from peacetime training. The Army’s major hypothesis in this regard states: “If empirically derived Class IX usage patterns, profiles, and/or trends can be associated with various operational mission types, then operational planning, demand forecasting and budget requirements can be significantly improved to support a capabilities-based force.”

The potential for improvement is enormous.
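To make the hypothesis concrete, consider a minimal sketch (illustrative Python; the part, mission profiles, rates and flying hours are hypothetical, not Army data) that estimates per-mission demand rates from history and applies them to a planned mission mix:

    import numpy as np

    # Hypothetical demand history for one aviation spare part: observed
    # demands per 1,000 flying hours, grouped by mission profile.
    history = {
        "peacetime_training": [0.8, 1.1, 0.9, 1.0],
        "desert_deployment":  [3.2, 2.7, 3.5],
        "mountain_ops":       [2.0, 2.4],
    }
    rates = {mission: np.mean(obs) for mission, obs in history.items()}

    # Planned operation: projected flying hours by mission type.
    plan_hours = {"peacetime_training": 5000,
                  "desert_deployment": 12000,
                  "mountain_ops": 3000}

    # Mission-based forecast: weight each empirical rate by its planned hours.
    mission_based = sum(rates[m] * h / 1000 for m, h in plan_hours.items())

    # Naive alternative: one all-missions average rate applied to total hours.
    single_rate = np.mean(list(rates.values())) * sum(plan_hours.values()) / 1000

    print(f"mission-based forecast: {mission_based:.0f} demands")
    print(f"single-rate forecast:   {single_rate:.0f} demands")

Weighting empirical rates by the planned mission mix yields a materially different requirement than applying a single all-missions average, which is precisely the improvement the hypothesis asserts.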

Seminal analyses, which challenged traditional views and standard practices, have since been validated and corroborated by experiences in both Afghanistan and Iraq.

Strategic Analytics

One attribute of strategic analytics is the alignment of methods and models with the “ends-ways-means” strategy paradigm. Descriptive analytics are used to systematically diagnose structural disorders, perform root-cause analysis and identify enabling remedies (i.e., “means”). Integration challenges are addressed using prescriptive analytics to attain policy objectives for desired “end” states. Design and evaluation then incorporate predictive analytics to develop “analytical architectures” (i.e., “ways”) that guide the change management effort toward desired “ends.” One recurring observation from applying this “analysis, synthesis, design and evaluation” engineering and problem-solving approach to several strategic enterprise challenges is that confusion between ends (what is to be achieved) and ways (how it is to be achieved) can be uncovered and resolved.

For example, during the nation’s last major military recruiting crisis in the late 1990s, the all-volunteer force (AVF) was indeed at great risk, although this is not commonly known today. All the services failed either their annual quantity missions or their quality goals; the Army failed both, for multiple years, in both active and reserve components. Forecasts of future economic conditions, youth-market demographics and propensity to serve, available resources, and the near-term recruiting requirements needed to achieve manpower and readiness goals made it clear the military could not sustain the AVF in its existing form.

Projections were truly dire; the phrase used to describe the situation then was “imminent catastrophic failure.” Every effort was made to salvage the AVF concept; massive re-engineering was required and undertaken (and it was painful). But what became crystal clear then, and is worth noting now given current and foreseeable trends, is that while our internal objective is to “man the Army,” the Army’s larger purpose is to “serve the nation.” The AVF was then, and is now, a way to achieve those ends; it should not be viewed as an end unto itself. It is one of many military manpower systems (“ways”) that can be considered to reconcile means with ends. From World War II until 1999, at least five different manpower recruitment systems had been used or seriously considered; in 1999 we considered a sixth (and, no, it was not the draft). Although we were ultimately able to re-engineer and salvage the AVF, in the beginning there were no guarantees that those new concepts, initiatives and changes, many still in place today, would succeed. But they were also designed to allow a graceful transition to the alternative had we not been successful.

Once again, the AVF in its current form is becoming unsustainable. We may soon confront “imminent catastrophic failure” again. If so, will we use strategic analytics to illuminate a better “way” ahead?

In addition to cost growth in manpower and personnel (force structure capacity), we have also experienced considerable growth in operating and support costs (force readiness capability). These are the two largest categories in the defense budget (combined, they account for nearly 75 percent of the total), and, historically, they have been dramatically reduced in immediate post-war budgets to achieve the near-term savings needed to meet drawdown targets.

To the extent that significant savings can now be generated from within the institutional support enterprises (man and train, equip and sustain), at least some of the force structure that would otherwise be eliminated could instead be retained. For example, savings from a more efficient materiel enterprise (equip and sustain), obtained by transforming military supply chains, could be transferred internally to help preclude the re-emergence of a “hollow force,” the deleterious product, especially for the Army, of our boom-and-bust cycle of precipitous post-war drawdowns, an all-too-persistent pattern in our history.

Although they were masked by the massive infusion of resources over much of the last decade, the supply management problems and inadequacies that existed 10 years ago are again becoming increasingly apparent under mounting pressure to generate savings and find efficiencies. These persisting issues include an inability to relate resources to readiness for lack of strategic plans, poor inventory management and fragmented supply chain operations across the materiel enterprise. Furthermore, the performance improvements promised from large investments in enterprise resource planning (ERP) systems installed in recent years have not been realized (and are not likely to be), a shortfall that continues to plague all the services. Now, with overseas operations abating, the budget pendulum swinging back and the adverse effects of our national fiscal reality rapidly converging, the quest for more efficient, cost-effective operations is returning. A phrase used to describe this goal is “cost-wise readiness.”

Could operations research and management innovation be used in imaginative ways across the materiel enterprise to better relate resources to readiness, generate significant savings, improve readiness and preclude another inadequate, vulnerable (“hollow”) force that could invite rather than deter aggression? One recent comprehensive, extended study suggests this can be achieved using decision-support systems empowered with advanced analytics, including dramatically improved demand forecast methods (e.g., mission-based forecasting), sensor-based technologies for proactive part replacement (e.g., connecting condition-based maintenance, or CBM, to the supply chain) and integrated supply chain optimization methods (e.g., multi-echelon readiness-based sparing), in conjunction with enterprise IT data platforms and repositories (e.g., LMP and GCSS-A) [6].
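One of these methods, readiness-based sparing, can be sketched in a few lines. The fragment below (illustrative Python; the part names, pipeline demands, costs and budget are hypothetical, and a fielded system would be multi-echelon and multi-indenture) allocates a spares budget by marginal analysis, repeatedly buying the unit that most reduces expected backorders per dollar:

    import math

    def expected_backorders(pipeline, stock):
        # E[backorders] when demand in the resupply pipeline is Poisson(pipeline).
        ebo, p = 0.0, math.exp(-pipeline)        # p = P(k demands), k = 0
        for k in range(int(pipeline + 10 * math.sqrt(pipeline)) + 20):
            if k > stock:
                ebo += (k - stock) * p
            p *= pipeline / (k + 1)              # advance to P(k + 1)
        return ebo

    # Hypothetical parts: (expected pipeline demand, unit cost in $K).
    items = {"rotor_bearing": (4.0, 15.0),
             "servo_actuator": (1.5, 40.0),
             "gearbox": (0.6, 250.0)}
    budget, spent = 500.0, 0.0                   # spares budget in $K
    stock = {part: 0 for part in items}

    # Marginal analysis: repeatedly buy the unit yielding the largest
    # reduction in expected backorders per dollar, until funds run out.
    while True:
        best, best_ratio = None, 0.0
        for part, (pipe, cost) in items.items():
            if spent + cost > budget:
                continue
            gain = (expected_backorders(pipe, stock[part])
                    - expected_backorders(pipe, stock[part] + 1))
            if gain / cost > best_ratio:
                best, best_ratio = part, gain / cost
        if best is None:
            break
        stock[best] += 1
        spent += items[best][1]

    print(stock, f"-- ${spent:.0f}K of ${budget:.0f}K spent")

Because expected backorders fall off at a diminishing rate as stock grows, this greedy procedure traces out a cost-availability curve that lets decision-makers relate each marginal dollar to readiness.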

Cost savings from these various advancements have each been estimated on the order of several hundred million dollars. Their combined effect, once fully implemented, is likely to reach many billions of dollars, a return on investment of several orders of magnitude. Furthermore, it will become possible to better relate resources to readiness across the Department of Defense materiel enterprise, which annually consumes more than $150 billion in maintenance activities, inventory procurement, distribution and sustainment costs. Current-year budgets can then be credibly correlated with operational readiness, and programmed budgets with future capabilities.

Future Directions

As we look to the future, properly the temporal focus of transformation, what are some of the opportunities that could further empower O.R. across a broader context of national security?

During most of the profession’s existence, operations researchers have been constrained by both data scarcity and limited computational power (e.g., Bellman’s “curse of dimensionality” in dynamic programming). That is clearly changing in this new digital era of big data. Indeed, data has become ubiquitous; the challenge now is to make sense of it all. Just as diminishing returns finally seemed to be dampening Moore’s Law on computing power, Los Alamos National Laboratory demonstrated the first supercomputer to achieve a petaflop of sustained performance: a million billion calculations per second, a thousand times faster than the prevailing teraflop standard. And Stanford researchers recently announced a computer built from carbon nanotubes, which reportedly could improve performance by a further order of magnitude over silicon chips.

So these twin banes of our past, data and processing power, rather than hindering our future, are more likely to offer opportunities. We already have the link between big data and analytics: the extensive use of data, statistics and quantitative algorithms for descriptive (explanatory), predictive (forecasting) and prescriptive (optimization) modeling and analysis in support of fact-based, analytic management. Through sensor technology (digital source collectors), RFID, total asset visibility (TAV), ERP systems and the Internet, IT has expanded to capture, track, monitor and make visible near-real-time data across the disparate, dislocated entities that comprise an entire enterprise.

However, we have yet to fully integrate analytical architectures into our enterprise systems. Complementary decision-support systems have not yet been developed that could capitalize on all this (overwhelming) enterprise data and, using analytically based methods, make sense of it, enable improved decisions and dramatically improve enterprise performance. For large-scale, complex organizations, the greatest return on investment comes from coupling relevant analytical tools such as O.R. with the IT needed to provide decision support aligned to strategic plans, organizational vision and the purpose for which the enterprise exists. The goal should be the effective integration of analytics into organizational decision-making. How can this worthy goal be achieved?

What opportunities could further empower O.R. across a broader context of national security?

Transformational Analytics

While improvements in data storage and processing have been truly astonishing, most organizations struggle to manage, analyze, apply and transform data into useful information for knowledge creation and actionable decision options. The corporate world has come to realize that investing in new IT systems without first examining and implementing needed business process changes simply automates existing inefficiencies, yielding negligible benefits. The term “business intelligence” (BI) now encompasses both analytics and the data processes and technologies used for collecting, managing and reporting decision-oriented information. Nonetheless, analytic management is often impeded by organizational pathologies: conventional wisdom crowds out critical thinking; high-level managers fail to demand rigor and dispassionate analysis; and organizations lack the capacity for empirical work. What must be created is the analytical capacity for insight, refinement and better decision-making.

Although so-called “IT solutions” have ubiquitous appeal and attract enormous investment, without the analytical, integrative power of O.R. to focus business process reengineering on desired outcomes, the obsession with IT produces growing complexity and information overload that exceed the interpretive capacities of the organizations responsible for developing and using these systems – what has been termed an “ingenuity gap.” What is needed is a complementary relationship between decision-support systems (DSS) and management information systems (MIS), one that is both symbiotic and synergistic.

To fully capitalize on advances in IT amid rapidly growing “big data” challenges, applying the complementary power of operations research, advanced analytics and management innovation could deliver dramatic performance improvement, including cost savings on the order of many billions of dollars, at a crucial time. Transformational analytics can provide engines of innovation that generate and sustain continuous improvement in demanding, increasingly resource-constrained environments. Recognizing these needs, and then developing the capacity to achieve them, are the first steps toward management innovation as a strategic technology (MIST) for our defense enterprise bureaucracies and, perhaps, the broader national security community.

Finest Hour or Darkest Hour?

O.R. professionals must be challenged and encouraged to integrate their intellectual capacities, apply their considerable strategic planning acumen, and focus the power of their diverse, transformational analytical capabilities on these challenges of our time. I believe there are important enabling roles that MORS and INFORMS (certainly including the Military Applications Society and others as well; CPMS, Decision Science and Service Science come to mind) should play in attacking these challenges. We must develop guiding visions and plans for such endeavors, work closely with other officials to assemble the teams, create the “brain trust” and then challenge our uniquely talented professional membership to “organize for success” and achieve the contributions of which we are quite capable. In this disconcerting time, astride a perilous cusp of history, the United States and its partners across the globe will surely benefit from such a commitment on our part.

Twenty-five years from now we will (hopefully) celebrate the centennial of our unique profession. What, for example, will volumes 4 and 5 of “The History of O.R. in the U.S. Army” have to say? Will they exist? Would a retrospective then, a look back by our professional posterity, regard the period we are entering now as our “darkest hour,” our “finest hour” or perhaps both, as Churchill characterized the Battle of Britain early in World War II, when the O.R. profession was invented?

Retired U.S. Army Colonel Greg Parlier is immediate past president of the Military Applications Society (MAS) of INFORMS. A career Air Defense Artillery officer and combat veteran with multiple tours in the 82nd Airborne Division, he served on the West Point faculty as an engineering management instructor and assistant professor of operations research. His advanced degrees are in operations research, systems engineering and national security studies. Col. Parlier was a National Defense Fellow at MIT and concluded his military career as deputy commander for transformation at U.S. Army Aviation and Missile Command. He is currently an independent consultant and member of the adjunct research staff at the Institute for Defense Analyses (IDA).

Note:

This article is adapted from a much longer report available upon request from the author at gparlier@knology.net. A portion of the report also appeared as a guest editorial in Phalanx, a joint publication of the Military Operations Research Society (MORS) and the Military Applications Society (MAS) of INFORMS.

References

  1. “The Origins and Development of Operational Research in the Royal Air Force,” Air Ministry Publication 3368, Her Majesty’s Stationery Office, London, 1963; republished in the MORS Heritage Series as “Operational Research in the RAF.”
  2. David E. Fisher, 2005, “A Summer Bright and Terrible: Winston Churchill, Lord Dowding, Radar and the Impossible Triumph of the Battle of Britain,” Shoemaker and Hoard.
  3. “Dynamic Strategic Resource Planning: Toward Properly Resourcing the Army in an Uncertain Environment,” RPAD Technical Report 97-03, Office of the Chief of Staff, Army (PAED), 1997.
  4. Paul K. Davis and Donald Blumenthal, 1991, “The Base of Sand Problem: A White Paper on the State of Military Combat Modeling,” RAND N-3148-OSD/DARPA.
  5. HQDA Task Order, “Training, Skill, Technology and Combat Outcomes,” under IDA contract DASW01-94-C-0054, August 1997.
     Michael P. Fischerkeller, Wade D. Hinkle and Stephen D. Biddle, 2002, “The Interaction of Skill and Technology in Combat,” Military Operations Research, Vol. 7, No. 1, pp. 39-56.
     Stephen Biddle, 2004, “Military Power: Explaining Victory and Defeat in Modern Battle,” Princeton University Press.
  6. Greg H. Parlier, 2011, “Transforming U.S. Army Supply Chains: Strategies for Management Innovation,” Business Expert Press.