The OR/MS Ecosystem: Strengths, Weaknesses, Opportunities and Threats

In the March/April issue of Operations Research, ManMohan Sodhi and Christopher Tang look at operations research/management science and discuss how research, teaching, and practice interact in our field. From the abstract:


We believe that research, teaching, and practice are becoming increasingly disengaged from one another in the OR/MS ecosystem. This ecosystem comprises researchers, educators, and practitioners in its core along with end users, universities, and funding agencies. Continuing disengagement will result in OR/MS occupying only niche areas and disappearing as a distinct field even though its tools would live on. To understand the reasons for this disengagement better and to engender discussion among academics and practitioners on how to counter it, we present the ecosystem’s strengths, weaknesses, opportunities, and threats. Incorporated in this paper are insights from a cluster of sessions at the 2006 INFORMS meeting in Pittsburgh (“Where Do We Want to Go in OR/MS?”) and from the literature.


This article, dedicated to Art Geoffrion who, as the authors state, is a “role model of a great researcher, educator and practitioner” in OR/MS, is a call for increased interaction among all those interested in our field.


For an ecosystem to thrive, efforts have to be made in increasing healthy interaction on many fronts. Specifically,…, we believe that (1) academic journals editors could serve as catalysts for making the ecosystem healthier by publishing more multidisciplinary papers that reflect the core strengths and uniqueness of OR/MS, (2) researchers could initiate efforts for strengthening the links with end users and practitioners, and (3) educators (especially in business schools) could enlist support from practitioners and end users to motivate more students to become OR/MS practitioners or end users.


You can find the full paper (pdf) here.

The editors of Operations Research have invited two prominent educators and researchers to comment on this paper.

The first is Alexis Tsoukiàs of the Université Paris Dauphine. Dr. Tsoukiàs is the Immediate Past President of EURO, the Association of European Operational Research Societies. In his commentary, Dr. Tsoukiàs raises a number of points with regard to Sodhi and Tang’s SWOT analysis. For instance, he makes the point that the world of OR/MS is not well defined:


Who is our target? A rough check of the “eco-system” dimension tells that we have around the world 35000 people who actually recognise themselves as Operational Researchers (this is the world membership of IFORS). If we consider that only 1 out of 10 is “conscious” of being an Operational Researchers then our ecosystem has a potential dimension of 350000 world-wide. This is a fraction of any engineering branch in any major Economic Area in the world which makes us automatically a “niche” community at least quantitatively. Now the question is: to whom we are talking? The 35K existing community, the 350K potential community or a 35M target who should be diverted to our discipline?


He also identifies areas of research that he believes are understudied, particularly identifying what it is that OR/MS analysts do:


Research in OR/MS is very sophisticated today and this is a sign of the vitality of our research community. However, my impression is that among the research subjects to address we are missing the problem of analysing the decision aiding process from our perspective. With the noticeable exception of work done mainly in UK on Problem Structuring …, there is very little attention paid in analysing what exactly are the activities of an OR/MS analyst and the interactions with his/her client.


Dr. Tsoukiàs’s full commentary is available (pdf) here.

Our second commentary comes from Dr. Michael Rothkopf. Dr. Rothkopf spent a career furthering interactions between academia and industry in OR/MS. In his commentary, Dr. Rothkopf takes issue with “the overstatement of the weaknesses of the profession and, especially the role of academics in this weakness”.


For example, the paper states that “It does not help that practitioners are but little engaged in OR/MS journals. Even the practitioner-oriented journal Interfaces has only two practitioners on its 26 member editorial board.” The paper fails to appreciate that a substantial number of academics are also practitioners. This includes Art Geoffrion, to whom the paper is dedicated, and over half of the (non-author) academics credited in the paper’s acknowledgements section with being presenters at the motivating cluster at the Pittsburgh INFORMS meeting. In addition, while 24 of the 26 members of the Interfaces editorial board have academic affiliations, at least half of the 26, including Gene Woolsey of the Colorado School of Mines, are serious practitioners.


Dr. Rothkopf agrees that the publishing paradigm can work against better interactions between academia and industry, with some important caveats:


Unfortunately, the paper is correct that, “Improvement on something already published by garnering more mathematical results under slightly different or more general assumptions is one of the formulas for getting a paper published, which in turn advances a young researcher’s career.” However, this is a short-sighted strategy. It is counterbalanced, in part, by the fact that most such papers get few citations while seminal papers that open up important new areas of application get many citations, which in turn advances a researcher’s career.


Tragically, shortly after providing this commentary, Dr. Rothkopf passed away, leaving our field with one less voice bringing together industry and academia. We are glad to have his commentary, and regret that he will not be part of the ongoing discussion about the future of our field.

You can find Dr. Rothkopf’s full commentary (pdf) here.

We now invite you to read the Sodhi and Tang paper, and the commentaries, and continue the discussion. What is the state of our field? What steps are needed to strengthen it? What lies in the future?


One bit of evidence concerning the gap between INFORMS and practitioners is that we use the term “practitioners” as if it referred to a homogeneous group. I don’t know what the best typology is, but I suspect there are at least three groups:
1) Practitioners who use relatively sophisticated OR models to embed OR within ongoing operational decision making. (I believe yield management and crew scheduling at airlines would be examples.)
2) Practitioners whose OR analysis informs strategic or executive decision making.
3) Run of the mill, generic managerial decision making. (Which applicant to hire, what price to charge, how much to budget for an activity, etc.)
My sense is that the profession does best at supporting the first type of practitioner. We (sometimes grudgingly) acknowledge that most managerial decision making is not of that sort, so we talk about moving OR up within the organizational hierarchy to inform strategic decision making. However, I wonder if in doing so we underestimate the collective importance of injecting a little more quantitative analysis into the very large number of routine decisions, including the possibility that 30-year-olds who find OR useful for routine decisions might be more likely to call upon OR 20 years later when they are in the executive suite.
I don’t think any other discipline has claimed the intellectual space of scientific study (e.g., with lab experiments and field studies) of the best way to improve routine decision making by calling upon math and quantitative analysis. We invent tools and methods. For “high-end” applications we pit them against each other (e.g., documenting improvements in computational running time), but we do less of this for methods that support routine decision making. For example, our journals do not often report empirical evidence about the relative effectiveness of different approaches (e.g., teaching soft systems vs. spreadsheet modeling vs. a traditional OR tools textbook, or something yet to be invented) at improving routine decision making.
Doing so might (1) bring us closer to practice, (2) reduce pressures to be “too mathematical”, and (3) preserve OR’s generality, since routine decision making is relevant for managers in marketing and production, business and non-profit, etc.

Those of us academics not particularly engaged in consulting are fairly well sheltered from real-world problems. Meanwhile, I believe it is generally agreed that there is a substantial population of potential end users who do not recognize that OR methods might be fruitfully applied to their problems (and often have never heard of OR). Occasionally someone from the general public pops up on sci.op-research describing a problem and asking if anyone can classify it or suggest approaches, but that is rather rare. Perhaps if we could create and publicize a forum where anyone could post general descriptions of problems (should I say “messes”?) and members of the OR community could offer suggestions, we might bridge the gap a bit? Or is this already being done somewhere?

We very much would like this to be a conversation, so please feel free to add your thoughts about the paper, the commentaries, and the issues they raise.

The SWOT article’s observations regarding the typical content of OR/MS publications are consistent with my own observations and experiences regarding the INFORMS journals’ acceptance philosophy. But OR practitioners in industry and government also face obstacles related to the publication process, in which reviewers, even when finding the material to be new and technically correct, seem to feel obligated to insist that the paper be in some way expanded to include additional literature review, additional computational experiments, or consideration of alternate model assumptions. This feedback may well result in a better paper, but in the project-based model used in much of industrial OR, acting on the recommendations is simply not feasible.

Time to write a scholarly paper is not usually included in an industrial OR project plan; it’s just not something most clients are willing to pay for. Thus, when a project is completed, the team moves on to new work. A few ambitious team members may use their non-work hours to draft a paper describing the work, typically including a description of the business problem being addressed, a brief survey of some relevant literature, a model formulation, a description of one or more algorithms, some discussion of computational experience, and a discussion of the deployment and its impact. This draft then has to be reviewed and sometimes redacted by the project team and by management within both the performing and receiving organizations. By the time the paper is submitted for publication, the project has been over for several months.

The review process, on average, takes several additional months. Thus, by the time the reviews suggesting extensive additional work are received by the authors, a year or more has elapsed since project completion. Some of the suggestions, such as inclusion of more recent references, may simply require a few additional hours of the author’s time. Others, such as additional computational experiments (especially with alternate software packages) or consideration of alternate model assumptions or data sets, essentially define a new project: a project with no funding or clear return on investment. It is highly unlikely that these suggestions will be addressed by the non-academic authors. Having never worked in academia, I don’t know how such requests are dealt with by academic authors, but my sense is that the additional work often becomes part of a student’s research project.

A not uncommon response by industrial authors to such suggestions is to file the paper and the reports in the bottom drawer (or worse) and return to work on the next project. After a few iterations of this process, many practitioners simply stop trying to publish their work in INFORMS journals.

Even the INFORMS journal Interfaces, “dedicated to improving the practical application of OR/MS to decisions and policies in today’s organizations and industries,” has an editorial board that is dominated by academics. A quick review of recent issues (outside of the Edelman issue) revealed that a significant percentage of articles have at least one academic co-author. While I applaud the collaboration evidenced by such papers, the lack of reporting on how OR is actually used and at times misused in industry, and the absence of discussion of quantitative business problems that are not yet well addressed by available OR tools, seem to indicate a weak link in our profession.

My post announcing the paper has spawned some discussion at the USENET group sci.op-research. See that discussion here, but it would be great to see more discussion on this site. Logs say 200 people have downloaded the papers and commentaries today alone, so there are lots of people interested in this topic!