History of OR: Useful history of operations research

Long-term prospects and aspirations for historical preservation, research and communication.

By William Thomas


Ellis Johnson, appearing in a story on O.R. in the Aug. 5, 1951, issue of This Week, a Sunday newspaper supplement.

In the April issue of OR/MS Today, Mark Eisner and Arjang Assad reported on the new activities of the INFORMS History and Traditions Committee. The committee has rightly concentrated its efforts on those records and memories in the most imminent danger of being lost. However, writing here as a member of the committee and as a professional historian, I would like to consider the longer-term prospects for historical preservation, research and communication.

I believe that the ultimate challenge for any history program is to transcend the nostalgic and the fascinating to become genuinely useful. To meet this challenge, history needs to connect to the present, and it needs to be thorough enough to yield concrete answers to important questions. In the case of operations research, management science and analytics, such questions might include: What experiences do key members of this community bring to their work, and what agendas motivate them? How have different analytical methodologies developed, how have they been applied in practice, and by whom? What data sets do we have access to, what motivated their creation, and how are they generated, maintained and modified? How have O.R. and M.S. engaged with other fields prior to their present alliance with analytics, and why have such efforts succeeded, or not?

History that deals with too remote a past, or that is too piecemeal, simply cannot provide satisfactory answers to such specific queries. That said, the distant history of O.R. and M.S. does provide some evidence that such an approach to history could have significant value.


The author’s extensive study of the early history of O.R. and related fields resulted in the book, “Rational Action: The Sciences of Policy in Britain and America, 1940–1960.”

For more than a decade I have devoted a significant part of my time to studying the early histories of O.R. and related fields. That work culminated this spring with the publication of the book, “Rational Action: The Sciences of Policy in Britain and America, 1940–1960.” One of the most striking revelations of my research was just how radically different the perspectives and agendas of the various figures in this history were. In the beginning, terms such as “operations research” and “management science” had no fixed meaning. Instead, at a tumultuous moment in history, individuals struggled to match their skills and ideas to the various crises and opportunities that they saw around them. As they did so, the aggregate effect was that new activities and fields emerged, converged, diverged, succeeded, failed and evolved at a remarkable rate.

This process occurred more or less blindly, as people at that time naturally had a limited appreciation of perspectives other than their own. Through a careful study of the history of that moment, I have been able to reconstruct the complicated historical dynamics that were only hazily seen and understood by the participants. It stands to reason that a similar history of the present would be of real worth to those hoping today to drive the fields of O.R., M.S. and now analytics in productive new directions.

O.R. as Scientific Investigation

All serious accounts of the origins of O.R. agree that the term was initially applied in Britain just prior to World War II to distinguish research done to integrate radar technology into aerial combat operations from the research and development being done in laboratories and workshops. By early 1941, such “operational research” was found to have value for ordinary military decision-making. So, the military services began assembling scientists, engineers and mathematicians – not to mention a smattering of lawyers, actuaries and schoolteachers – into groups to conduct research directly in support of high-level officers. This work largely entailed gathering and parsing evidence to determine whether tactics and practices needed rethinking. Little of it involved mathematical modeling, and it was often vaguely described as the application of “scientific method.”

This very general description of O.R. also allowed the story of the wartime O.R. groups to be rolled into larger narratives told about the contribution of “science” to the war, alongside radar and the atomic bomb. In 1945 the British crystallographer and Marxist intellectual J. D. Bernal went so far as to suppose that wartime O.R. represented not a new profession, but a total realignment in the relations between science, the state and society. He reckoned that the moment marked the beginning of an entirely new epoch in history in which human progress could be intelligently planned [4].

This sort of grand significance attributed to wartime O.R. motivated many who had participated in it to regard their work as something that needed to be not only preserved but also expanded into civilian institutions. It was common in this period to suppose that O.R. might be made into a kind of super-profession that would absorb and integrate such established specialties as time-and-motion studies, cost accounting and market research. Ellis A. Johnson, head of the U.S. Army’s Operations Research Office, even suggested to a National Research Council committee dedicated to promoting O.R. that an O.R. group be immediately established at the White House to tackle “national political objectives” [9].

The then-prevailing view of O.R. as generic investigation in support of executive decision-making remained influential within the profession, particularly in Britain. In the 1960s and 1970s it motivated enthusiasm for “systems thinking” and “soft O.R.,” subjects that continue to attract adherents [7]. Ten years ago, the launch of the “Science of Better” publicity campaign for O.R. was premised on a similar idea. At that time I wrote in OR/MS Today that, as with the first attempts to “sell” operations research, success would be contingent on individuals’ abilities to integrate their thinking with that of the decision-makers they served [10]. That observation is true enough. But, had I been further along in my studies of the history of O.R., I might also have noted that, individual successes aside, O.R.’s earliest proponents by and large failed to realize their original, highly ambitious vision for the field.

The Ascent of Mathematical Modeling

By the mid-1950s, as O.R. assumed the mantle of a profession, it began to adopt into its methodology a variety of emerging mathematical methods such as linear programming, inventory theory, search theory and queuing theory. “Adopt” is an apt term here, as, except for search theory, none of these subjects originated under the banner of O.R. By and large, O.R.’s proponents welcomed these new methods, and provided them with a niche where they could flourish, when they might have otherwise languished on the margins of mathematics, statistics and economics.


Kenneth Arrow, David Blackwell and Abraham Girshick (l-r) in Santa Monica, Calif., 1948. Courtesy of the Institute of Mathematical Statistics [5].

Yet, within a few short years, these new methods began to dominate the identity of O.R. (as well as M.S., which was organized at this time as an alternative discipline). Those who viewed O.R. as a generalized form of scientific investigation often lamented this development. In their view, it unduly narrowed the scope of the profession’s methodology, and it often led academic research too far away from practical problems. A possibility they did not countenance was that the development actually represented a separation of wheat from chaff.

While O.R.’s proponents often seem to have felt that their field was destined to permeate high-level decision-making in industry and government, they tended to discount the sophistication and efficacy of existing professions, the consulting industry and traditional management. As a consequence, O.R. struggled to distinguish its contributions to high-level decision-making. In contrast, even if the new mathematical techniques were less broadly applicable, they did have clear novelty and value. There is a strong case to be made that, without them, the O.R. and M.S. professions would not have been able to sustain their vitality.

The Power of Theory

Oddly, as thoroughly as O.R. and M.S. ultimately embraced theory, the sources of the power of that theory are not often discussed. The optimal solutions to well-formulated engineering and management problems that theory produces are, of course, routinely celebrated. Yet, such methods were at least as important because they inaugurated an immensely productive era of reflection on what exactly constituted an “optimal” solution, and what it meant for a problem to be “well formulated.”

Take as an example “warfare analysis,” which originated in World War II outside of the military O.R. groups and was later expanded into defense systems analysis. Warfare analysis dealt primarily with engineers’ choices among different equipment designs, especially armament configurations. Traditionally, such decisions were made on the basis of a vague sense of how different design features – such as the caliber, rapidity of fire, accuracy and weight of guns installed on airplanes – should be incorporated into effective designs. Warfare analysis placed these factors into probabilistic models of their expected impact on the outcomes of combat scenarios. This allowed such decisions to be evaluated, compared and debated in more concrete terms.
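The flavor of such a probabilistic model can be suggested with a toy Monte Carlo comparison of two hypothetical armament configurations. Every number and parameter name below is invented for illustration; none comes from an actual wartime study.

```python
import random

def burst_kill_prob(shots, hit_prob, lethality, trials=50_000, seed=0):
    """Monte Carlo estimate of the chance that one firing burst downs the target.

    shots, hit_prob and lethality stand in for the factors analysts weighed:
    rate of fire, per-shot accuracy, and the destructiveness of each hit
    (heavier calibers: fewer shots, deadlier hits). All values are invented.
    """
    rng = random.Random(seed)
    kills = 0
    for _ in range(trials):
        for _ in range(shots):
            # A shot kills if it both hits and does decisive damage.
            if rng.random() < hit_prob and rng.random() < lethality:
                kills += 1
                break  # target already down; stop this burst
    return kills / trials

# Many light, fast-firing guns vs. fewer heavy, slower-firing ones
light = burst_kill_prob(shots=40, hit_prob=0.02, lethality=0.10)
heavy = burst_kill_prob(shots=12, hit_prob=0.02, lethality=0.45)
print(f"light config: {light:.3f}  heavy config: {heavy:.3f}")
```

Crude as it is, a model of this kind turns a vague argument about caliber versus rate of fire into a comparison of two numbers that can be debated, refined and re-run under different assumptions.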

A similar example is sequential analysis, also originally developed during World War II, in this case in response to the practices of quality control inspectors. The inspectors had intuited that a test with a fixed sample size could be stopped early once it seemed apparent whether the lot was going to pass or fail. This basic intuition allowed the statistician Abraham Wald to develop rigorous tests with no fixed sample size that take formal account of the information gained as testing proceeds [8].
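Wald’s sequential probability ratio test formalizes that intuition: evidence accumulates item by item as a log-likelihood ratio, and inspection stops as soon as it crosses an accept or reject boundary. A minimal sketch, with illustrative defect rates and error tolerances rather than any historically documented values:

```python
import math
import random

def sprt(stream, p0=0.05, p1=0.15, alpha=0.05, beta=0.05):
    """Wald's sequential probability ratio test for a lot's defect rate.

    Decides between H0 (defect rate p0, accept the lot) and H1 (rate p1,
    reject it) as soon as the accumulated evidence crosses a boundary,
    rather than always inspecting a fixed sample size.
    """
    upper = math.log((1 - beta) / alpha)  # cross above -> reject lot
    lower = math.log(beta / (1 - alpha))  # cross below -> accept lot
    llr = 0.0
    n = 0
    for n, defective in enumerate(stream, start=1):
        if defective:
            llr += math.log(p1 / p0)
        else:
            llr += math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "reject", n
        if llr <= lower:
            return "accept", n
    return "undecided", n  # stream exhausted before a boundary was hit

random.seed(1)
# A good lot (true defect rate 0.05) is usually accepted after a few dozen items.
good_lot = (random.random() < 0.05 for _ in range(1000))
print(sprt(good_lot))
```

On average such a test reaches a decision with far fewer inspections than the equivalent fixed-sample plan, which is precisely what made it attractive to wartime quality control.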

The crucial commonality among warfare analysis, sequential analysis and a number of related bodies of theory was that they were based on the idea that there were certain considerations bearing on the validity of decisions that had been tacit, but could be formally articulated and incorporated into a theoretical model. Postwar theoreticians took advantage of this fact, and used the modeling process to systematically explore models’ inadequacies. Considerations intrinsic to a decision, but extrinsic to an existing model of that decision, could always be incorporated into new iterations of the model. Questions such as whether a decision had to be made quickly, or in competition with other decision-makers, or if it had to deal only in discrete quantities, supplied an abundance of grist for theoreticians’ mills. Even when such models veered into the abstract and academic, they still constituted intellectually appealing interrogations into the structure of decision-making, which could potentially be steered back into specialized applications [2].

A few modelers, such as the economist Kenneth Arrow, seem to have had an especially intuitive grasp of this power within decision theory, pursuing it across mathematical statistics, economics and the new fields of O.R. and M.S. Thus, on a visit to the RAND Corporation in the summer of 1948 with David Blackwell and Abraham Girshick, he contributed one of the earliest elaborations on sequential analysis. While there, Arrow was also exposed to discussions on how to define the value of military technologies from the broad viewpoint of national interest. This, in turn, led him to formulate his “impossibility theorem” in social choice, which became a foundation stone for vast new lines of inquiry in political science [3]. Similarly, that same summer, Blackwell was exposed to RAND’s work on systems analysis, which led him to study game-theoretic models of duels. He went on to become a major authority on Bayesian statistics [1].

Bringing History, OR/MS and Analytics Together

When the term “operational research” was first used in the late 1930s, nobody could have predicted the radical turns its history would take in the next 20 years. These turns were guided by the individual experiences and agendas of many different people and institutions. Similarly, nobody can now predict what will come of the present alliance of OR/MS and analytics. But, if a history that attends carefully to individual experiences and agendas can decode the past, a thorough history of the people and institutions of our own time can make it easier to talk concretely about the opportunities and challenges of the present.

Some professions have made significant strides in developing knowledge of their histories. The IEEE has a History Center, while chemists have the Chemical Heritage Foundation. I myself have worked for the Center for History of Physics of the American Institute of Physics as part of a post-doc program the center has been running for many years. Yet, no profession has developed a systematic history that it can use.

OR/MS and analytics may be a good place to try the feat. In 1953 the British historian Margaret Gowing wrote in what was then called Operational Research Quarterly about the potential and challenges of large-scale history. Her experience was with assembling official histories of government policy during World War II, and she saw an affinity between her work and O.R., which was itself no better developed as a guide to policy [6].

Unfortunately, Gowing’s vision for such history was not borne out, but the match between OR/MS and history could yet be a productive one. History’s data is fragmentary and often difficult to locate, whereas OR/MS and analytics usually deal in the sophisticated analysis of well-quantified phenomena. Yet, these disciplines are united in seeking clarity in complex terrains of fact, and they hope to use that clarity to guide decision-making. If history could develop useful accounts of the contemporary OR/MS-analytics profession, that might provide a testing ground for a productive new alliance between complementary fields.

William Thomas (gwilliamthomas@gmail.com) is a senior historian at History Associates, Inc., in Rockville, Md., and the author of “Rational Action: The Sciences of Policy in Britain and America, 1940–1960.”


  1. Albers, Donald J., 1985, “David Blackwell,” in “Mathematical People: Profiles and Interviews,” ed. Donald J. Albers and G. L. Alexanderson, Birkhäuser, Boston, pp. 17-32.
  2. Arrow, Kenneth J., 1957, “Decision Theory and Operations Research,” Operations Research, Vol. 5, No. 6, pp. 765-774, is an intriguing reflection on these themes.
  3. Arrow, Kenneth J., 2002, “The Genesis of ‘Optimal Inventory Policy,’” Operations Research, Vol. 50, No. 1, pp. 1-2.
  4. Bernal, J. D., 1975, “Lessons of the War for Science [1945],” Proceedings of the Royal Society of London. Series A, Mathematical and Physical Sciences, Vol. 342, No. 1631, pp. 555-574.
  5. DeGroot, Morris H., 1986, “A Conversation with David Blackwell,” Statistical Science, Vol. 1, No. 1, pp. 40-53.
  6. Gowing, Margaret, 1953, “Historical Writing: Some Problems of Material Selection,” Operational Research Quarterly, Vol. 4, No. 2, pp. 35-36.
  7. Kirby, Maurice W., 2007, “Paradigm Change in Operations Research: Thirty Years of Debate,” Operations Research, Vol. 55, No. 1, pp. 1-13.
  8. Klein, Judy L., 2000, “Economics for a Client: The Case of Statistical Quality Control and Sequential Analysis,” in “Toward a History of Applied Mathematics,” ed. Roger E. Backhouse and Jeff Biddle, Duke University Press, Durham, pp. 27-69.
  9. “Minutes of the Committee on Operations Research, March 23, 1951,” papers of the Committee on Operations Research, National Academies Archives, Washington, D.C.
  10. Thomas, William, 2004, “Selling OR: An Historical Perspective,” OR/MS Today, Vol. 31, No. 5, pp. 30-36.