Volume 54, Issue 2

In This Issue

Improving Treatment Quality for Cancer Patients
The efficacy of radiation therapy in curing or controlling cancer is limited by radiation damage to healthy, functional tissues. Clinical implementation of a modern treatment technique for cancer patients called intensity-modulated radiation therapy (IMRT) has been shown to reduce side effects by sparing salivary-gland function in head-and-neck radiation therapy and by reducing bowel complications in prostate radiation therapy. Despite the potential of IMRT, it is difficult to determine, even theoretically, the dosimetric benefits of this novel therapy. The quality and efficacy of clinical IMRT are intimately tied to the quality and efficacy of the optimization models and methods used to determine treatment plans. In “A New Linear Programming Approach to Radiation Therapy Treatment Planning Problems,” R. K. Ahuja, J. F. Dempsey, A. Kumar, and H. E. Romeijn propose a new linear programming model that is shown to provide high-quality solutions in very limited time, allowing physicians to design treatment plans for individual patients that offer a good chance of curing or controlling the disease while maintaining a high quality of life.

A New Method for Simulating Financial Models with Stochastic Volatility and Jumps
Many researchers have worked to extend the Black-Scholes option pricing model to make it consistent with observed market option prices. Recently, models involving stochastic volatility and jumps have gained popularity because they accommodate several empirical features of market data and provide semi-analytical formulas for European option prices. Closed-form solutions for the prices of many other derivative securities under these models are not available, and one needs to use numerical methods such as Monte Carlo simulation. However, it is usually difficult to obtain reliable estimates using classical simulation techniques that rely on time discretization, because of the bias introduced by their approximations. In “Exact Simulation of Stochastic Volatility and Other Affine Jump Diffusion Processes,” M. Broadie and O. Kaya develop an algorithm that simulates these models exactly, eliminating the time-discretization bias. Their method, which uses Fourier inversion techniques, converges to the true prices at a faster rate than discretization methods. This new method enables the calculation of reliable confidence intervals for the simulation estimates.
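
The discretization bias at issue can be seen in miniature with a model where exact simulation is trivial. The sketch below is not the Broadie-Kaya algorithm (which handles stochastic volatility and jumps via Fourier inversion); it prices a European call under plain Black-Scholes by sampling the terminal price from its exact lognormal law, so the only error left is Monte Carlo noise, whereas an Euler scheme on the price path would add a bias that shrinks only as the time step does. All parameter values are illustrative.

```python
import math
import random

def bs_call(s0, k, r, sigma, t):
    """Black-Scholes European call price (closed form)."""
    d1 = (math.log(s0 / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # standard normal CDF
    return s0 * phi(d1) - k * math.exp(-r * t) * phi(d2)

def mc_call_exact(s0, k, r, sigma, t, n_paths, rng):
    """Monte Carlo price sampling S_T from its exact lognormal law, so there
    is no time-discretization bias -- only statistical noise remains."""
    total = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        s_t = s0 * math.exp((r - 0.5 * sigma ** 2) * t + sigma * math.sqrt(t) * z)
        total += max(s_t - k, 0.0)
    return math.exp(-r * t) * total / n_paths

rng = random.Random(42)
exact = bs_call(100.0, 100.0, 0.05, 0.2, 1.0)
est = mc_call_exact(100.0, 100.0, 0.05, 0.2, 1.0, 200_000, rng)
```

With 200,000 paths the estimator lands within a few standard errors of the closed-form price, and the usual normal-theory confidence interval is valid because no approximation bias remains.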

How to Manage an Index Tracking Portfolio?
For small portfolios or funds that concentrate on a relatively small number of stocks, it is impractical to track a large market index (e.g., the S&P 500 or Russell 2000) literally, i.e., by holding all constituent stocks in proportion and continuously adjusting their weights. This tracking problem is, however, of interest to money managers, as their performance will usually be measured against certain financial benchmarks even though their funds are actively managed. In “Tracking a Financial Benchmark Using a Few Assets,” D. D. Yao, S. Zhang, and X. Y. Zhou present an approach to financial tracking that formulates the problem as an instance of stochastic linear-quadratic control and uses semidefinite programming as a computational tool to generate optimal real-time tracking strategies. Numerical examples using market data from the Hong Kong and New York Stock Exchanges illustrate the various features of the model and its robust performance with rather infrequent data and portfolio updates, regardless of whether the market is up or down and of which stocks are used in the portfolio.
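
As a much simpler static stand-in for the paper's stochastic linear-quadratic approach, the flavor of tracking with few assets can be seen in an ordinary least-squares fit of portfolio weights to index returns. The sketch below uses synthetic return data for two hypothetical assets; all numbers are invented for illustration.

```python
import random

def track_weights(index_rets, asset_rets):
    """Least-squares weights w minimizing sum_t (index_t - w . assets_t)^2
    for two assets, solved via the 2x2 normal equations."""
    a11 = sum(r[0] * r[0] for r in asset_rets)
    a12 = sum(r[0] * r[1] for r in asset_rets)
    a22 = sum(r[1] * r[1] for r in asset_rets)
    b1 = sum(r[0] * i for r, i in zip(asset_rets, index_rets))
    b2 = sum(r[1] * i for r, i in zip(asset_rets, index_rets))
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

rng = random.Random(7)
# synthetic daily returns: the "index" is a noisy mix of two tradable assets
asset_rets = [(rng.gauss(0, 0.01), rng.gauss(0, 0.02)) for _ in range(250)]
index_rets = [0.6 * a + 0.4 * b + rng.gauss(0, 0.001) for a, b in asset_rets]
w = track_weights(index_rets, asset_rets)
```

The fitted weights recover the mixing proportions closely; the paper's contribution is the dynamic version of this idea, with real-time rebalancing derived from optimal control rather than a one-shot regression.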

Model Fitting in a Call Center
When applying stochastic models in OR applications, there is a need to fit the model parameters to data. This requires identifying the parameters that need the greatest attention; hence, it is useful to know how much the performance predictions we intend to compute depend on changes in the model parameters. Motivated by call-center applications, W. Whitt investigates this question for the basic multi-server queue with customer abandonments. In “Sensitivity of Performance in the Erlang-A Queueing Model to Changes in the Model Parameters,” he shows that the standard steady-state performance measures in this Markovian model are much more sensitive to changes in the arrival and service rates than to changes in the abandonment rates. More importantly, he shows how this sensitivity question can be properly formulated in any context by using the notion of “elasticity” and systematically investigated by applying, first, elementary finite-difference numerical calculations and, second, fluid and diffusion approximations. Numerical and asymptotic methods both provide important insight.
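
The elasticity calculation is easy to reproduce numerically. The sketch below builds the Erlang-A (M/M/s+M) stationary distribution from its birth-death rates, takes the mean queue length as the performance measure, and estimates elasticities by central finite differences. The parameter values are an illustrative critically loaded example, not taken from the paper.

```python
def erlang_a_mean_queue(lam, mu, theta, s, n_max=500):
    """Mean queue length in the Erlang-A (M/M/s+M) model, computed from the
    stationary distribution of the birth-death chain, truncated at n_max."""
    weights = [1.0]
    for n in range(1, n_max + 1):
        death = min(n, s) * mu + max(n - s, 0) * theta
        weights.append(weights[-1] * lam / death)
    z = sum(weights)
    return sum(max(n - s, 0) * w for n, w in enumerate(weights)) / z

def elasticity(f, p, h=1e-5):
    """Elasticity (p / f(p)) * f'(p), with f'(p) from a central difference."""
    return p * (f(p + h) - f(p - h)) / (2 * h * f(p))

# illustrative critically loaded example: 20 servers, offered load 20
s, mu, theta, lam = 20, 1.0, 1.0, 20.0
e_lam = elasticity(lambda x: erlang_a_mean_queue(x, mu, theta, s), lam)
e_theta = elasticity(lambda x: erlang_a_mean_queue(lam, mu, x, s), theta)
```

In this example the elasticity with respect to the arrival rate dwarfs that with respect to the abandonment rate in absolute value, consistent with the paper's central finding.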

Network Design and Demand-Price Economics
Deregulation and rapid technological change over the past 20 years have had a profound impact on the telecommunications industry. Service offerings and growth are no longer determined by monopolistic economics but rather by competition, consumer demand, commoditized infrastructure components, and rapidly falling prices. Traditional mechanisms for network expansion and management are severely challenged, as are standard capacity expansion approaches that aim to meet fixed demand growth at minimum cost under nearly constant technology costs. In “Network Design and Multi-Period Pricing: Modeling, Solution Techniques, and Computation,” D. Bienstock, O. Raskina, I. Saniee, and Q. Wang consider an alternative optimization model for capacity expansion, in which fixed demand growth is replaced by flexible demand-price economics better in line with competitive markets and falling technology costs. The resulting model continues to accommodate important network design considerations such as reliability and robustness, but replaces fixed growth with pricing-dependent demand, where price is a new and explicit variable in the model. Unlike traditional capacity expansion, the resulting revenue-maximization model is nonlinear and does not lend itself to solution through standard techniques. The authors propose decomposition techniques and show that they scale to realistic network sizes. They also benchmark the quality of the solutions on a sample of realistic networks and synthetically large examples, and report on computational complexity and running time.
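
The shift from fixed demand to pricing-dependent demand can be illustrated on a single capacitated link, far simpler than the authors' multi-period network model. The toy sketch below maximizes revenue p · min(D(p), capacity) for a hypothetical linear demand curve D(p) = a − bp by grid search, showing how a tight capacity pushes the optimal price above the unconstrained revenue-maximizing level; all numbers are invented.

```python
def best_price(a, b, cap):
    """Maximize revenue p * min(a - b*p, cap) over a price grid.
    The unconstrained optimum is p = a / (2b); capacity may push price up."""
    best_p, best_rev = 0.0, 0.0
    for i in range(1, 10_000):
        p = i * (a / b) / 10_000          # prices keeping demand nonnegative
        rev = p * min(a - b * p, cap)
        if rev > best_rev:
            best_p, best_rev = p, rev
    return best_p, best_rev

p_uncap, r_uncap = best_price(100.0, 2.0, 1e9)   # ample capacity
p_cap, r_cap = best_price(100.0, 2.0, 20.0)      # tight capacity
```

With ample capacity the optimal price is a/(2b) = 25; with capacity 20 the revenue-maximizing price rises to the point where demand just fills the link. In a network with many links, periods, and expansion decisions, this interaction is what makes the model nonlinear and hard.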

Utility-Probability Analogy
Before analyzing any decision situation, we need to capture the alternatives, information, and preferences of the decision maker. Information is captured by a representative probability distribution, while preferences are captured by a utility function that describes the decision maker’s taste for risk. In “Maximum Entropy Utility,” A. E. Abbas introduces an analogy between utility and probability that transfers many tools and techniques from one domain to the other. The core idea of this approach is to define a utility density function as the derivative of a normalized utility function. Under this definition, a utility density function has the same mathematical properties as a probability density function. The author presents an application of this analogy and introduces the maximum entropy utility principle to assign utility values when only partial information is available about the decision maker’s preferences. This approach can be used in competitive bidding situations, where one tries to infer the utility values of others by observing their decisions. The utility-probability analogy leads to further research on utility inference, analogous to Bayes’ rule for probability inference, and on duals to expected utility formulations with the roles of probability and utility reversed.
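
A minimal instance of this idea: with the utility normalized to U(a) = 0 and U(b) = 1 and a single assessed point U(x0) = u0, treating U′ as a density and maximizing its entropy yields a piecewise-uniform density, i.e., a piecewise-linear utility. The sketch below constructs that utility; the assessed point and outcome range are hypothetical, and this is only an illustration of the principle, not the paper's general treatment.

```python
def max_entropy_utility(a, b, x0, u0):
    """Utility normalized to U(a) = 0, U(b) = 1, with one assessed point
    U(x0) = u0.  Maximizing the entropy of the utility density U' subject to
    these constraints gives a piecewise-uniform density, hence a
    piecewise-linear utility function."""
    def u(x):
        if x <= x0:
            return u0 * (x - a) / (x0 - a)
        return u0 + (1 - u0) * (x - x0) / (b - x0)
    return u

# hypothetical assessment: U(40) = 0.7 on outcomes in [0, 100]
u = max_entropy_utility(0.0, 100.0, 40.0, 0.7)
```

With no assessed point at all, the same principle yields a uniform density and hence a linear (risk-neutral) utility, mirroring the role of the uniform distribution under maximum entropy with no moment constraints.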

Strategies for Selling Goods Online
Several different strategies are currently available for selling goods online: online auctions, electronic catalogues with fixed or dynamic posted prices, and more. In “Dynamic Mechanism Design for Online Commerce,” J. Gallien derives the optimal selling strategy (or mechanism) from the seller’s perspective in a market model where a stream of time-sensitive online shoppers with unit demand arrive sequentially at a site where a finite supply of identical goods is put up for sale. The optimal strategy for this model, it turns out, is to use posted prices that progressively increase with each sale, i.e., as the available inventory depletes. Numerical experiments suggest, however, that the benefit of dynamic pricing over a single, well-chosen posted price may be small. Moreover, posted prices seem preferable to online auctions only when selling a large number of items or for very time-sensitive sellers. In other cases, auctions are close to optimal and, in situations where the understanding of the market is limited, significantly more robust.
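
The increasing-price structure can be reproduced with a simple dynamic program in the spirit of, though much cruder than, the paper's mechanism-design model. Assuming, for illustration, at most one shopper per period arriving with probability lam and an exponential willingness to pay, the revenue-maximizing posted price has the closed form p = 1/a + (marginal value of an item), and the computed prices rise as inventory falls; all parameters here are invented.

```python
import math

def optimal_posted_prices(n_items, n_periods, lam=0.5, a=1.0):
    """Backward-induction DP: each period one shopper arrives with
    probability lam and has an Exp(a) willingness to pay.  For exponential
    valuations the revenue-maximizing posted price is p = 1/a + dv, where
    dv is the opportunity cost of selling one unit now."""
    v = [0.0] * (n_items + 1)          # value with t - 1 periods remaining
    prices = {}
    for t in range(1, n_periods + 1):
        new_v = [0.0] * (n_items + 1)
        for n in range(1, n_items + 1):
            dv = v[n] - v[n - 1]       # marginal value of the n-th item
            p = 1.0 / a + dv           # optimal posted price (closed form)
            prices[(n, t)] = p
            new_v[n] = v[n] + lam * math.exp(-a * p) * (p - dv)
        v = new_v
    return prices

prices = optimal_posted_prices(n_items=3, n_periods=10)
```

With 10 periods remaining, the price quoted with one unit left exceeds the price with two left, which exceeds the price with three left: each sale depletes inventory and pushes the posted price up, exactly the qualitative behavior the paper establishes for its richer model.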

Off-Line Inspection Strategy in Lot Production
When production is conducted in “lots” involving setup, and the yield is random, high production and inspection costs give rise to several operational questions. In “An Optimal Lot Sizing and Off-Line Inspection Policy in the Case of Non-Rigid Demand,” S. Anily and A. Grosfeld-Nir consider industries, e.g., the printed circuit assembly industry, where production is in lots and 100% off-line inspection is required. More specifically, this study considers a production process that may go out of control (without signaling) at a constant failure rate. The quality of the items in the in-control and out-of-control states is random: the probability that a unit conforms if produced while the process is in control is constant and higher than the corresponding probability when the process is out of control. Based on the indirect information provided by the already-inspected items, the operator must decide whether to inspect the next unit or to stop inspecting because there is sufficient evidence to conclude that the process is out of control. The authors formulate the inspection problem as a partially observable Markov decision process and show that the optimal policy is of the control-limit (threshold) type.
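
The structure of such a policy can be sketched as a Bayesian belief update followed by a control-limit stopping rule. The probabilities below (conforming rates in and out of control, the per-item chance of a silent shift, and the stopping threshold) are all hypothetical, and this is only an illustration of the policy's shape, not the authors' POMDP solution.

```python
def update_belief(p_out, conforming, q_in=0.95, q_out=0.6, shift=0.05):
    """One Bayesian update of the probability that the process is out of
    control, after inspecting one item.  `shift` is the chance the process
    silently went out of control before this item was produced."""
    p_out = p_out + (1 - p_out) * shift         # account for a silent shift
    like_out = q_out if conforming else 1 - q_out
    like_in = q_in if conforming else 1 - q_in
    num = like_out * p_out
    return num / (num + like_in * (1 - p_out))

def inspect_until(observations, threshold=0.9):
    """Control-limit rule: stop as soon as the belief exceeds the threshold;
    returns (number of items inspected, final belief)."""
    p = 0.0
    for i, ok in enumerate(observations, start=1):
        p = update_belief(p, ok)
        if p > threshold:
            return i, p
    return len(observations), p

# two conforming items, then a run of nonconforming ones
n, p = inspect_until([True, True, False, False, False, False])
```

Each conforming item nudges the belief down relative to the drift from possible silent shifts, while nonconforming items push it up sharply; inspection stops once the belief crosses the control limit.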

What is the Impact of Letting Suppliers Manage Component Inventories?
The current trend towards outsourcing has increased the importance of understanding the implications of allowing outside firms to control part of the supply chain. In “Inventory Policies in a Decentralized Assembly System,” F. Bernstein and G. A. DeCroix explore this issue in a setting in which an assembler of finished products allows suppliers to control component inventories. They find that decentralizing decision making changes the incentives of the firms in a way that results in fundamental changes in system behavior and performance. For example, total supply chain costs are higher after outsourcing. In addition, while it is always beneficial to make use of information about in-transit component inventory when all decisions are made by a single firm, using such information may actually increase costs for some firms (including the firm directly using the information) when decision making is decentralized. The authors propose a set of transfer payments between the assembler and the suppliers that can eliminate these types of inefficiencies, allowing the outsourced system to perform as well as one that is controlled by a single firm.

Managing Partial Shipments in Large Supply Chains
Partial shipments are inevitable when suppliers allocate scarce inventories across many customers. Indeed, the frequency and magnitude of partial shipments in the supply chain have increased significantly in recent years due to a variety of quantitative and qualitative factors, so there is a real need for effective decision support tools in this domain. In “Effective Heuristics for Multi-Product Partial Shipment Models,” M. Dawande, S. Gavirneni, and S. Tayur recognized this need during their interactions with a Procter and Gamble distribution center in Mexico. For most real-world supply chains (with hundreds of products and customers), a straightforward application of mathematical modeling techniques results in problems that are not easily solvable by off-the-shelf software. The authors propose a scheme of heuristics that allows the user to select an appropriate balance between accuracy and computational effort. A detailed computational study establishes that these heuristics are very effective: they solve the problems in a matter of seconds and generate solutions that are typically within 2-3% of optimal. The authors also show that these heuristics provide reasonable feasible solutions for extremely large problems.

Determining Optimal Firefighter Staffing Levels
Fire departments face a continuing and important dilemma: how to ensure the social welfare of a community at the lowest possible cost. Complicating this decision is the fact that urban fire departments are often required to keep a minimum number of firefighters on duty at all times. Therefore, fire departments must rely on either hiring extra firefighters to be “on call” or using overtime to bring in additional firefighters whenever absences occur. In “Firefighter Staffing Including Temporary Absences and Wastage,” M. J. Fry, M. Magazine, and U. S. Rao, motivated by interactions with the Cincinnati Fire Department, develop models for determining the optimal number of firefighters to have on staff to minimize total staffing costs. The authors’ models balance the use of overtime against hiring additional firefighters. The models are similar to previous inventory models in operations research, but with significant complications due to the effects of temporary absences and permanent attrition in the firefighter workforce. The authors compare their optimal policies to existing heuristics used in firefighter staffing to show that significant savings are available, and they draw useful managerial insights from their parameter analysis to aid practical decision making.

A Sequencing Problem Arising in Aircraft Maintenance and Telecommunications
Two sequencing problems arising in aircraft maintenance and in the design of telecommunications networks can be modeled and solved through the same methodology. The first problem consists of determining a flying sequence for aircraft with upper limits on the number of takeoffs and landings, as well as on the number of flying hours, between two successive maintenance checks. The telecommunications problem is to design survivable networks using SONET technology with upper limits on both the number of links and the chain length between consecutive offices. In “The Black and White Traveling Salesman Problem,” G. Ghiani, G. Laporte, and F. Semet show that both problems can be viewed as the same variant of the famous Traveling Salesman Problem, with two vertex types (black and white) and limits on the chain length and the number of white vertices between two consecutive black vertices. They propose a powerful solution methodology capable of solving problems of realistic dimensions. The same approach can also be used to solve some cases of the Vehicle Routing Problem that arise frequently in distribution management.
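
On a tiny instance, the black-and-white variant can be stated and solved by brute force, which makes the two side constraints concrete. The sketch below enumerates Hamiltonian cycles on four vertices arranged on a unit square and checks, between consecutive black vertices, both the white-vertex count and the chain length. The instance is invented, and exhaustive search is obviously not the authors' methodology, which scales to realistic sizes.

```python
import math
from itertools import permutations

def feasible(tour, dist, colors, max_white, max_len):
    """Check, for each pair of consecutive black vertices on the cycle,
    the limits on the white-vertex count and chain length between them."""
    n = len(tour)
    blacks = [i for i in range(n) if colors[tour[i]] == "B"]
    for j, i in enumerate(blacks):
        k = blacks[(j + 1) % len(blacks)]
        whites, length, pos = 0, 0.0, i
        while pos != k:                      # walk the cycle from i to k
            nxt = (pos + 1) % n
            length += dist[tour[pos]][tour[nxt]]
            if colors[tour[nxt]] == "W":
                whites += 1
            pos = nxt
        if whites > max_white or length > max_len:
            return False
    return True

def bw_tsp(dist, colors, max_white, max_len):
    """Shortest feasible Hamiltonian cycle, by exhaustive enumeration."""
    n = len(dist)
    best_len, best_tour = float("inf"), None
    for perm in permutations(range(1, n)):
        tour = (0,) + perm
        length = sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))
        if length < best_len and feasible(tour, dist, colors, max_white, max_len):
            best_len, best_tour = length, tour
    return best_len, best_tour

# four vertices on a unit square, alternating black and white
s2 = math.sqrt(2.0)
dist = [[0, 1, s2, 1], [1, 0, 1, s2], [s2, 1, 0, 1], [1, s2, 1, 0]]
length, tour = bw_tsp(dist, colors=["B", "W", "B", "W"], max_white=1, max_len=10.0)
```

Here the perimeter tour of length 4 is optimal and feasible, while the shorter-looking diagonal tours either are longer or string two white vertices between consecutive black ones.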

Connected Reserve Networks
Conservation reserves have become increasingly important for protecting biodiversity and at-risk species. Budget limitations and alternative uses of land that could be set aside as conservation areas necessitate efficient allocation of scarce resources. Most previous studies addressed this problem with heuristic approaches, for modeling and computational convenience, although heuristics are known to yield economically inefficient solutions in general. Relatively few studies employed formal optimization, specifically linear integer programming using the set-covering and maximal-covering formulations. Conservation planners criticize the latter approaches as too simplistic, because many important aspects of practical reserve design, in particular spatial aspects, are not taken into account in those formulations. Incorporating spatial considerations in an exact optimization framework poses challenging problems and requires sophisticated mathematical programming formulations. In “Optimum Selection of a Connected Reserve Network,” R. A. Briers and H. Önal tackle an important spatial problem in site selection, namely contiguity of the reserve sites. They introduce an innovative approach based on graph theory and develop a linear integer programming model to determine a fully connected, efficient reserve configuration. The paper also presents an empirical application, which demonstrates significant improvements in efficiency over a heuristic solution to the problem.
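
Contiguity is exactly the constraint that standard covering formulations omit. The brute-force sketch below states the problem directly on a toy instance: pick a subset of sites that is connected in the adjacency graph, fits the budget, and covers the most species. The authors' contribution is a graph-theory-based integer programming formulation that avoids such enumeration; all data here are invented.

```python
from itertools import combinations

def is_connected(sites, adj):
    """Depth-first check that the selected sites induce a connected subgraph."""
    sites = set(sites)
    stack, seen = [next(iter(sites))], set()
    while stack:
        v = stack.pop()
        if v in seen:
            continue
        seen.add(v)
        stack.extend(u for u in adj[v] if u in sites and u not in seen)
    return seen == sites

def best_connected_reserve(cost, species, adj, budget):
    """Enumerate all site subsets (tiny instances only): maximize the number
    of species covered, subject to the budget and to contiguity."""
    n = len(cost)
    best = (0, ())
    for k in range(1, n + 1):
        for subset in combinations(range(n), k):
            if sum(cost[i] for i in subset) > budget:
                continue
            if not is_connected(subset, adj):
                continue
            covered = len(set().union(*(species[i] for i in subset)))
            if covered > best[0]:
                best = (covered, subset)
    return best

# toy instance: four sites on a path 0 - 1 - 2 - 3
cost = [1, 1, 1, 1]
species = [{"a"}, {"b", "c"}, {"d"}, {"a", "d"}]
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
best = best_connected_reserve(cost, species, adj, budget=2)
```

With a budget of two sites, the pair {0, 1} covers three species and is connected; the pair {1, 3} would also cover three but is rejected because it is not contiguous. Enumeration is exponential in the number of sites, which is why an exact integer programming formulation of contiguity matters.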

Purchasing from Two Suppliers
In most real-world applications, production inputs such as materials, inventory, parts, and labor can be sourced from more than one supplier. In “Optimal Inventory Policy with Two Suppliers,” E. Fox, R. Metters, and J. Semple determine the optimal policy when the firm can purchase from two different suppliers. In the scenario considered by the authors, the firm incurs supplier-dependent procurement costs. One supplier offers a lower price per unit, but the firm pays an additional fixed cost; fixed costs might include setup time, administrative costs, process modifications, etc. The other supplier offers a higher price per unit, but the firm avoids any additional fixed cost. Under mild conditions on future demand for the production input, the authors show that the firm’s purchasing policy should depend on at most three parameters: the first determines which supplier (if either) to buy from, and the other two determine how much to buy. The general policy applies regardless of whether unmet demand results in lost sales or backorders.

Integrating Pricing and Capacitated Production Planning
Planning production under economies of scale and capacity limitations presents substantial difficulties in matching production output with customer demand. The complexities of these production planning problems have long challenged researchers and practitioners. Moreover, suppliers that can influence production requirements through pricing or order-acceptance decisions face the problem of determining the best demand levels for their production system. While incorporating mechanisms for influencing demand in the planning process creates opportunities for enhancing profits, it also adds planning complexity. In “Requirements Planning with Pricing and Order Selection Flexibility,” J. Geunes, H. E. Romeijn, and K. Taaffe provide an optimization model and efficient solution methods that address the problem of simultaneously planning production and demand levels. Practitioners can apply these methods to determine pricing levels or order-selection decisions that provide the best match between production requirements and output under production economies of scale and time-invariant production capacities.