Why journal papers are hard to read

By Winfried Grassmann

Many practitioners do not read the papers that appear in our journals because they find them abstract and difficult to understand. I think I know the reason for this unfortunate state of affairs. Essentially, the gatekeepers for what appears in our journals are academics, and they tend to evaluate papers the same way they would evaluate a Ph.D. thesis. In fact, many papers that appear in our journals are Ph.D. theses, and some universities require Ph.D. candidates to publish at least one journal paper.

The problem: the criteria for judging the quality of a Ph.D. thesis are different from the criteria for judging the quality of a paper. A thesis has to demonstrate ability, knowledge of the subject and originality. A paper, on the other hand, should educate the reader. For instance, a readable explanation of an existing method that is not well known – possibly because its original authors were unable to explain it clearly – could prove very educational and useful to the journal reader. However, it would definitely not be acceptable as a Ph.D. thesis. The same is true for a literature review. On the other hand, an investigation of a complex system that shows mathematical ability and originality may make an excellent thesis, but if the system turns out, in the end, to have no practical applications, it should not be published.

Should the primary objective of a journal paper be to show the competence of the author or to educate the reader? In this sense, the choice is between “prestige” as measured by the paper’s advanced methodology or its “utility” as measured by its educational value and its influence on the practice of operations research. I claim that there is a strong bias toward prestige to the detriment of utility.

Clearly, nobody is opposed to excellence, and in mathematics, as in the arts and sports, excellence could very well be the main criterion. Art is created for art's sake, and the issue of utility is moot. The same is true for some branches of mathematics. There is nothing wrong with this, as striving for excellence should never be discouraged. Moreover, there is a point to be made that mathematics trains the mind, just as sport trains the body. However, operations research is an applied science, and it should always be done with a view toward application.

Judging papers solely by their mathematical sophistication has unintended consequences. This process leads to a bias toward complexity and the neglect of areas that are not challenging yet are of great practical importance. Suppose two papers are submitted to two different journals: one provides a simple solution, whereas the other provides a mathematically interesting and challenging way to solve the same problem. Which of the two has the better chance of being accepted?

Chances are that it is the more challenging paper. However, the simpler solution is obviously more useful. This is a real problem. I have discovered several cases where a complicated solution was published even after a simple solution had been suggested in the literature. Some people may even judge papers as trivial just because they are easy to read. The bias toward complexity is amplified because once a difficult method for solving a specific problem is generally accepted, those who have promoted the method have a vested interest in it, and they may not look favorably on competing methods, even if those are easier to understand and/or to implement. Hence, even a solution simpler than the one accepted in the literature may not be publishable.

Areas that are neglected because they are not considered very challenging include data collection and the development of effective programs. Unfortunately, models without data are like cars without wheels: They simply do not work. People claim that data collection is a minor issue and that papers in this area should be relegated to journals of minor quality. I do not know of any journal that aspires to be of minor quality, so papers dealing with this important issue are just not published. In many areas, effective programs are also essential. This is clearly demonstrated by the widespread use of the Solver software embedded in spreadsheets. In fact, the availability of programs using spreadsheets has clearly affected the way linear programming is taught today at universities.
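To make the point concrete: for small linear programs of the kind Solver handles in a spreadsheet, even a few lines of code suffice, because a bounded linear program always attains its optimum at a vertex of the feasible region. The sketch below is my own illustration with invented classroom-style numbers; it is not taken from the article or from any particular package.

```python
# Toy two-variable linear program solved by brute-force vertex enumeration.
# Maximize 3x + 5y subject to:
#   x        <= 4
#        2y  <= 12
#   3x + 2y  <= 18
#   x, y     >= 0
# (Hypothetical numbers, chosen only for illustration.)

from itertools import combinations

def solve_2var_lp(objective, constraints):
    """Maximize p*x + q*y, where objective = (p, q) and each constraint
    (a, b, c) means a*x + b*y <= c. Returns (value, x, y) or None."""
    p, q = objective
    best = None
    # Every optimum of a bounded LP lies at a vertex, i.e. at the
    # intersection of two constraint boundaries, so enumerate them all.
    for (a1, b1, c1), (a2, b2, c2) in combinations(constraints, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < 1e-12:          # parallel boundaries: no vertex
            continue
        x = (c1 * b2 - c2 * b1) / det  # Cramer's rule
        y = (a1 * c2 - a2 * c1) / det
        if all(a * x + b * y <= c + 1e-9 for a, b, c in constraints):
            value = p * x + q * y
            if best is None or value > best[0]:
                best = (value, x, y)
    return best

constraints = [
    (1, 0, 4),    #  x       <= 4
    (0, 2, 12),   #      2y  <= 12
    (3, 2, 18),   # 3x + 2y  <= 18
    (-1, 0, 0),   # -x <= 0, i.e. x >= 0
    (0, -1, 0),   # -y <= 0, i.e. y >= 0
]
value, x, y = solve_2var_lp((3, 5), constraints)
print(f"optimum {value} at x={x}, y={y}")   # optimum 36.0 at x=2.0, y=6.0
```

Vertex enumeration scales terribly, of course; real solvers use the simplex method or interior-point methods. The point is only that the machinery can be packaged so that the user supplies data and reads off an answer.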

Many areas in operations research are subject to the trend that leads from simple methods to difficult ones. This is, in some cases, counteracted by the demand for applicability, and the area is exposed as being too abstract to be practical. Often, this counteraction arises from outside the area. Consider, for example, queueing theory, which was started in 1909 with the work of Erlang, long before the term “operations research” even existed. Agner Krarup Erlang was extremely practical; he reportedly crawled into manholes to do his measurements. His findings greatly helped the Copenhagen Telephone Company improve its service to customers. His work can be considered as one of the first and most successful applications of what later became known as operations research.
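A centerpiece of that telephony work is Erlang's loss formula, which gives the probability that a call offered to a group of lines is blocked. It is a good example of how directly usable his results were; here is a short sketch using the standard recursion (the traffic figures are invented for illustration).

```python
def erlang_b(offered_load, servers):
    """Erlang's loss (Erlang B) formula: probability that a call
    arriving at a group of `servers` lines carrying `offered_load`
    erlangs is blocked. Uses the standard numerically stable recursion
    B(0) = 1, B(k) = a*B(k-1) / (k + a*B(k-1))."""
    b = 1.0
    for k in range(1, servers + 1):
        b = offered_load * b / (k + offered_load * b)
    return b

# Hypothetical example: 2 erlangs offered to 3 lines blocks
# roughly 21 percent of calls.
print(round(erlang_b(2.0, 3), 4))
```

A telephone engineer can run exactly this kind of calculation to size a trunk group for a target blocking probability, which is what made the formula so useful in practice.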

However, not long after Erlang’s groundbreaking work, queueing theory became much more mathematical and less applied. This was well expressed by Alec Lee [1], who stated that most of the literature on queueing explored “the remoter mysteries of the simpler queues.” Today, there are still people who do rather abstract queueing theory, and who, in my opinion, hinder rather than help progress in this area. This unfortunate turn of events called for a counteraction, which came from computer science, where queueing found very successful applications. Computer scientists made great advances in this area. One of the most exciting recent discoveries in queueing was that Internet traffic is long-range rather than short-range dependent. To discover this, one had to look at the data. Computer scientists also greatly improved the algorithms used to analyze queueing networks. Even the GI/G/1 queue found its applications in this context. Another new area, service systems, also arose, and it is devoted to more practical applications of waiting-time problems.

As in optimization, one needs efficient programs to make the theory accessible to the average user. Still, algorithmic considerations are often rejected by traditional queueing theorists. For instance, some years ago, I published a paper that gave an easy method to find the waiting-time distribution of the GI/G/1 queue, a method that is very efficient and simple enough to be taught to undergraduate students. It was used by others, and it received 50 citations; yet, when first submitted, it was rejected out of hand. It got published only after I complained vigorously. I do not claim the mathematics behind queueing systems is simple (indeed, the papers of Erlang are hard to read), but if good programs are available, the user does not need to know the underlying mathematical intricacies. He just enters his data and off he goes. With the help of programs, even the GI/G/1 queue becomes trivial.
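The GI/G/1 method itself is not reproduced in this column, but the "enter the data and off you go" experience is easy to demonstrate with the much simpler M/M/1 queue, whose waiting-time distribution has a textbook closed form. The sketch below uses that standard model with invented rates; it is not the author's algorithm.

```python
import math

def mm1_waiting_time_cdf(lam, mu, t):
    """P(wait in queue <= t) for an M/M/1 queue with arrival rate lam
    and service rate mu (requires lam < mu). Textbook result:
    W(t) = 1 - rho * exp(-(mu - lam) * t), where rho = lam / mu."""
    rho = lam / mu
    return 1.0 - rho * math.exp(-(mu - lam) * t)

def mm1_mean_wait(lam, mu):
    """Mean waiting time in queue: Wq = rho / (mu - lam)."""
    return (lam / mu) / (mu - lam)

# Hypothetical rates: arrivals at 1 per minute, service at 2 per minute.
# Half the customers never wait at all, and the mean wait is 0.5 minutes.
print(mm1_waiting_time_cdf(1.0, 2.0, 0.0))  # 0.5
print(mm1_mean_wait(1.0, 2.0))              # 0.5
```

The user supplies two rates and a time and reads off a probability; no knowledge of the underlying derivation is needed, which is precisely the point.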

Queueing theory is not the only area where the trend toward complex solutions can be observed. Just consider the development of operations research and its trend toward complexity … and the continuous fight to stem that trend. Russ Ackoff, for example, strongly objected to the theoretical tendencies of operations research. To give weight to his objections, he founded a new society, the International Society for the Systems Sciences. Gene Woolsey is another person strongly opposed to the trend. He wrote many columns on the topic, tellingly not in Operations Research, our flagship journal, but in Interfaces. More recently, the movement toward analytics could be considered a measure to counteract the tendency toward increasing abstraction.

Though excellence must never be neglected, the main criterion for accepting a paper for publication should, in my opinion, be its usefulness to the reader. This means that papers should not be evaluated like Ph.D. theses, and this should be clearly stated in the instructions to editors and referees. Specifically, a paper presenting a brilliant solution to an unimportant problem should not be accepted. On the other hand, a paper that requires a lot of work, even if the work is of a rather basic nature, may be a good candidate for publication. In addition, readers like clear explanations, and there is no harm in publishing papers that clearly explain important results discovered by others.

Finally, our journals should encourage discourse and differing opinions, which, in my experience, they do not. For instance, papers that go against established opinion are difficult to publish, even when they are right on target. This is too bad, because these are the papers that lead to discussions, and discussions move our discipline forward.

Winfried Grassmann is a professor in the Department of Computer Science at the University of Saskatchewan.


1. Alec Lee, 1966, “Applied Queueing Theory,” St. Martin’s Press, New York, N.Y.