Volume 54, Issue 4

In This Issue

Electricity Contracts from an Electricity Retailer's Point of View
As today's electricity market transitions toward deregulation, market participants find themselves exposed to financial and operational risks related to rapidly changing electricity prices. An interruptible contract, which allows a retailer to interrupt a customer with advance notice, can be used to manage these risks. In "Interruptible Electricity Contracts from an Electricity Retailer's Point of View: Valuation and Optimal Interruption," R. Baldick, S. Kolos, and S. Tompaidis describe a model for forecasting the distribution of future electricity prices based on underlying supply and demand, and use the model to value interruptible contracts as well as to determine the optimal interruptible contracts. The framework, which is based on stochastic dynamic programming, is flexible enough to accommodate both the case of a monopsonistic retailer and that of multiple, competing retailers. Interruptible contracts turn out to be valuable tools for managing risk, especially in situations where the retailer has limited amounts of generation available.
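
As a rough illustration of the stochastic-dynamic-programming viewpoint only (a stylized sketch, not the authors' formulation): let $s_t$ be a state summarizing current price and demand conditions, $p_t$ the wholesale spot price, $r$ the contracted retail rate, $L$ the customer's load, and $\kappa$ a compensation paid to an interrupted customer, all of which are illustrative assumptions here. The retailer's value function then satisfies a Bellman recursion of the form
\[
V_t(s_t) \;=\; \max\Bigl\{ (r - p_t)\,L + \mathbb{E}\bigl[V_{t+1}(s_{t+1}) \mid s_t\bigr], \;\; -\kappa + \mathbb{E}\bigl[V_{t+1}(s_{t+1}) \mid s_t\bigr] \Bigr\},
\]
where the first branch corresponds to serving the customer and the second to exercising the interruption; a full treatment would also carry the advance-notice requirement and any limits on the number and duration of interruptions in the state.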

Multi-regional Economic Equilibrium Models
There are many useful energy market models that combine economic principles with engineering information about fuels, conversion processes, and energy-using devices in a mathematical programming framework. The first such models, from the 1970s, were linear or nonlinear programs, but the need to represent fuel substitution by consumers and noncompetitive behavior by producers quickly led to the more general notion of "equilibrium models," which include the class of constrained optimization models, and which share many familiar features of optimization models. Today, equilibrium models are usually formulated as complementarity problems, or as variational inequalities. Energy market models of this type can easily become extremely large, with many fuels, processes, devices, time periods, customer classes, geographical regions, etc. In “A New Decomposition Method for Multi-regional Economic Equilibrium Models,” J. D. Fuller, W. Chung, and Y. J. Wu show how to extend any optimization-based decomposition method to a type of equilibrium model that has several geographical regions, and they give the details, such as a convergence proof, for the Dantzig-Wolfe method. The benefits of such decomposition may include easier model development and maintenance, with the data and description in several pieces, i.e., one piece per region, and the ability to solve a huge model on an inexpensive computer, or on several different computers that communicate over a network.
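
For readers unfamiliar with the complementarity format mentioned above, a market equilibrium is typically stated in the generic mixed complementarity form (standard background, not the authors' specific regional model)
\[
0 \;\le\; x \;\perp\; F(x) \;\ge\; 0,
\]
meaning that, componentwise, $x_i \ge 0$, $F_i(x) \ge 0$, and $x_i F_i(x) = 0$, where $x$ collects prices and quantities and $F$ stacks each agent's optimality conditions together with the market-clearing conditions.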

Replenishment and Pricing under Random Yield
Many manufacturers, when selling directly to their end customers, price their products dynamically based on, among other factors, the availability of inventory on hand. Meanwhile, yield uncertainty in replenishment is a common occurrence in a wide range of production scenarios. A significant problem is thus to study the coordination of pricing and inventory replenishment strategies under yield uncertainty. In "Joint Inventory Replenishment and Pricing Control for Systems with Uncertain Yield and Demand," Q. Li and S. Zheng develop a model to study this problem. They explore various structural properties of the optimal policy and study the operational effect of uncertain yield. In particular, the authors show that the optimal replenishment policy is of a threshold type: an order is placed if and only if the starting inventory is lower than a threshold value, and that both the ordering threshold and the optimal price are higher in a system with uncertain yield than in one with certain yield.
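
As a minimal sketch of how such a threshold policy operates (illustrative only, not the authors' model; the threshold, order-up-to level, yield distribution, and demand distribution below are all assumptions chosen for the example):

import random

def simulate_threshold_policy(periods=52, threshold=40, order_up_to=100,
                              mean_yield=0.8, seed=0):
    """Simulate a reorder-threshold policy with multiplicative random yield."""
    rng = random.Random(seed)
    inventory = float(order_up_to)
    for _ in range(periods):
        # Order only when the starting inventory is below the threshold.
        if inventory < threshold:
            planned_order = order_up_to - inventory
            # Random yield: only a random fraction of the planned order arrives.
            yield_fraction = min(1.0, max(0.0, rng.gauss(mean_yield, 0.1)))
            inventory += planned_order * yield_fraction
        demand = rng.randint(10, 30)  # illustrative random demand
        inventory = max(0.0, inventory - demand)
    return inventory

print(simulate_threshold_policy())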

Ordering Systems with Multiple Delivery Options
Globalization and market uncertainties are forcing manufacturers to move toward more complex sourcing strategies. Inclusion of alternative sourcing partners has become a common practice in many industries. In "Are Base-stock Policies Optimal in Inventory Problems with Multiple Delivery Modes?" Q. Feng, S. P. Sethi, H. Yan, and H. Zhang present a model that allows the analysis of ordering decisions when there are multiple supply options with different ordering lead times and unit ordering costs. The authors discuss the complexity of the optimal policy and show via a counterexample that the base-stock policy fails to be optimal for all but the two fastest consecutive modes.
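
For context, the base-stock (order-up-to) rule that the title refers to can be stated in a few lines (a generic illustration for a single delivery mode; the function name and the inventory-position bookkeeping are assumptions, not the authors' notation):

def base_stock_order(inventory_position, base_stock_level):
    """Order-up-to rule: order just enough to raise the inventory position
    to the base-stock level; order nothing if already at or above it."""
    return max(0, base_stock_level - inventory_position)

print(base_stock_order(35, 50))  # orders 15
print(base_stock_order(60, 50))  # orders 0

The paper's counterexample shows that, in general, no such fixed order-up-to level exists for modes beyond the two fastest consecutive ones.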

Exact Analysis and Efficient Evaluation of Assemble-to-Order Systems
Assemble-to-order systems represent important supply chain strategies that allow manufacturers to better match supply and demand. In such a system, manufacturers keep inventory of components and assemble finished goods only after demand is realized. A long-standing challenge in the inventory control literature is the exact analysis of even simple assembly systems with random demand and stochastic sequential lead times. Real-world assemble-to-order systems, such as the one used by Dell Computers, pose even more substantial challenges both analytically and computationally because they involve both assembly and distribution systems. In "Performance Analysis and Evaluation of Assemble-to-Order Systems with Stochastic Sequential Lead Times," Y. Zhao and D. Simchi-Levi develop a unified modeling approach, namely, the "backward method," that facilitates exact analysis and efficient evaluation of various assemble-to-order systems with stochastic sequential lead times, with multiple products, and when component inventories are controlled by either base-stock or batch ordering policies.

Distilling Combinatorial Information from Mixed-Integer Linear Programming Models
Mixed-integer programming (MIP) plays a central role in modeling difficult-to-solve combinatorial problems. Exact MIP solvers are nowadays very sophisticated tools designed to deliver, within acceptable computing time, a provably optimal solution of the input MIP model, or at least a heuristic solution with a practically acceptable error. Unfortunately, in some hard cases a general-purpose MIP solver may prove inadequate even after clever tuning. This may lead one to quit the MIP framework and to design ad hoc heuristics for the specific problem at hand, thus losing the advantage of working in a generic framework. Fortunately, recent developments and advances in solution methods for hard MIP models have moved the edge between solvable and unsolvable problems in a significant way, hence the hope that more and more applications will benefit from the improved MIP technology. Among the MIP models that are still a challenge even for the best commercial solvers, those involving logical implications modeled through big-M coefficients are perhaps the most relevant ones. In "Combinatorial Benders' Cuts for Mixed-Integer Linear Programming," G. Codato and M. Fischetti address precisely this class of hard MIPs. They propose and analyze computationally an automatic problem reformulation of quite general applicability, aimed at removing the model's dependency on the big-M coefficients. Computational results on two specific classes of hard-to-solve MIPs indicate that the new method produces a reformulation that can be solved some orders of magnitude faster than the original MIP model.
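
To make the big-M issue concrete (a standard textbook construction, not an excerpt from the paper): a logical implication such as "if the binary variable $y$ equals 1, then the constraint $a^{\top} x \le b$ must hold" is commonly modeled as
\[
a^{\top} x \;\le\; b + M\,(1 - y), \qquad y \in \{0, 1\},
\]
where $M$ is a constant large enough to make the inequality vacuous when $y = 0$. Loose values of $M$ weaken the linear relaxation and invite numerical difficulties, which is precisely the dependency the proposed reformulation aims to remove.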

Power Production Problems with Ramping Constraints
Optimization problems connected to power production and distribution, such as Unit Commitment and its variants, have been studied for more than 30 years; recently, interest in these problems has further increased due to the switch from a monopolistic system to a free-market regime, which is under way in several countries. Historically, these problems have been successfully tackled by Lagrangian approaches in which the problem is decomposed by relaxing the constraints that link each generating unit's behavior with the others. However, the dynamic programming approaches typically used for solving the corresponding Lagrangian subproblems fail when "ramping constraints" are imposed on the power generation of each unit. In "Solving Nonlinear Single-Unit Commitment Problems with Ramping Constraints," A. Frangioni and C. Gentile present a new dynamic programming algorithm that efficiently solves those ramp-constrained Lagrangian subproblems. The new algorithm handles the types of nonlinear convex objective functions that are typically used in real-world applications.
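
For readers unfamiliar with ramping constraints (stated generically here, not in the paper's notation): if $p_t$ denotes the power output of a unit in period $t$, ramp-up and ramp-down limits $\Delta^{+}$ and $\Delta^{-}$ restrict how quickly the output can change between consecutive periods,
\[
p_t - p_{t-1} \;\le\; \Delta^{+}, \qquad p_{t-1} - p_t \;\le\; \Delta^{-}, \qquad t = 2, \dots, T.
\]
It is this coupling of the continuous power levels across consecutive periods that defeats the classical single-unit dynamic programs, whose state records essentially only the unit's recent on/off history.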

Optimizing Inventory Levels in Complex Networks
Companies are continuing to outsource more portions of their supply chain while at the same time expanding the global presence of their brands. These moves improve supply chain efficiency and effectiveness, but they also drive greater complexity in the resulting structure of the supply chain. Rather than a linear progression from raw materials through manufacturing through distribution, there are now numerous vendors providing parts to multiple outsourced manufacturers, which in turn feed a distribution network that includes region-specific packaging before the product ships to end customers. In "Optimizing Strategic Safety Stock Placement in Supply Chains with Clusters of Commonality," S. Humair and S. Willems present a computationally tractable dynamic programming algorithm that optimizes inventory levels and locations in networks that contain multiple sources of commonality. Frequent types of commonality include parts that supply multiple subassemblies or packaging that supports multiple end items. Beyond expanding the class of network topologies that can be optimized, the authors generalize the modeling framework to consider arbitrary stage costs, which arise in many practical settings, including inventory-dependent warehousing costs.

Finding What Matters in Your Simulation Model
When working with discrete-event simulation models of manufacturing or service systems, managers and engineers often need to know which controllable factors in their simulation model have important effects on critical output measures like cycle time or profitability. Simulation models typically incorporate uncertainty in their driving input processes, leading to variability in the output performance estimates. The statistical literature provides many techniques for designing efficient experiments to discover important factors in physical systems with uncontrolled random error. However, since simulations are run on computers, they have far fewer restrictions than physical experiments. For example, simulation experiments can be run automatically and many replications can be obtained cheaply. In "Controlled Sequential Bifurcation: A New Factor-Screening Method for Discrete-Event Simulation," H. Wan, B. E. Ankenman, and B. L. Nelson introduce a new screening method that uses the advantages of the discrete-event simulation setting to efficiently separate important factors from unimportant factors while providing statistical guarantees concerning the probability of misclassification of factor effects.
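
The bifurcation idea underlying the method can be sketched in a few lines (a simplified illustration of group screening by recursive halving, not the authors' controlled procedure, which adds the hypothesis tests that bound the misclassification probability; the group_effect_is_important callback stands in for simulation runs at the group's high and low settings):

def sequential_bifurcation(factors, group_effect_is_important):
    """Screen factors by testing whole groups at once: if a group's aggregate
    effect is unimportant, discard every factor in it; otherwise split the
    group in half and recurse until single important factors are isolated."""
    if not factors:
        return []
    if not group_effect_is_important(factors):
        return []  # the whole group is screened out with a single test
    if len(factors) == 1:
        return list(factors)  # an individual important factor is identified
    mid = len(factors) // 2
    return (sequential_bifurcation(factors[:mid], group_effect_is_important)
            + sequential_bifurcation(factors[mid:], group_effect_is_important))

# Hypothetical usage: pretend factors 3 and 7 are the only important ones.
important = {3, 7}
print(sequential_bifurcation(list(range(10)),
                             lambda group: any(f in important for f in group)))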

Performance Standards in DEA
The modeling of performance in organizations is a long-established practice, but has been directed mainly at pure industrial settings. There, production standards are used to allow for the comparison of actual to best possible performance, leading to the development of benchmarks. In operations such as those in service and not-for-profit settings, different approaches have been developed for evaluating performance. In their 1978 seminal work "Measuring the Efficiency of Decision Making Units," A. Charnes, W. W. Cooper, and E. Rhodes present an optimization procedure for such non-industrial situations. Their tool, data envelopment analysis (DEA), compares the performance of an organization such as a bank branch to that of other peer operations. While extensive research on the DEA model structure has appeared in the literature, the tool concentrates only on efficiency relative to peers, not on absolute efficiency measurement. As a result, attempted application of DEA in many organizations often meets with resistance. More recently, many service settings, such as banks and hospitals, have adopted the practice of developing standards comparable to those in pure industrial situations. In “Incorporating Multi-Process Performance Standards into the DEA Framework,” W. D. Cook and J. Zhu address the gap between relative and absolute performance measurement by incorporating available standards into the DEA methodology, thus providing not only relative benchmarks but also absolute targets to which the organization can aspire.
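
As background, the classical ratio form of the DEA model from the 1978 paper cited above (stated generically, not in Cook and Zhu's notation) evaluates a decision making unit $o$ with input vector $x_o$ and output vector $y_o$ by choosing the most favorable nonnegative weights $u$ and $v$:
\[
\max_{u \ge 0,\, v \ge 0} \;\; \frac{u^{\top} y_o}{v^{\top} x_o} \quad \text{subject to} \quad \frac{u^{\top} y_j}{v^{\top} x_j} \;\le\; 1 \;\; \text{for every peer unit } j.
\]
The resulting score is defined entirely relative to the observed peer set, which is exactly the limitation that incorporating external performance standards is meant to address.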

Analyzing Internet Interconnection Arrangements
The Internet is a global, complex, yet loosely organized collection of data networks owned by different competing service providers. Collaboration among providers is essential for the end-to-end delivery of network services. In addition, to support the transmission of delay-sensitive applications, providers need to establish interconnections among themselves to dynamically trade network capacity, so that high service quality can be maintained even if the network is congested. To make such interconnections possible in a competitive setting, financial agreements have to be carefully designed. In "An Economic Analysis of Interconnection Arrangements between Internet Backbone Providers," Y. Tan, R. I. Chiang, and V. S. Mookerjee derive equilibrium pricing schemes that consider network operational factors such as network utilization, link capacity, and the cost structure of the interconnecting participants. Their research offers guidelines of practical importance; for example, the authors show that the common SKA (Sender Keeps All) mode of settlement does not provide adequate incentives for collaboration.

New Planning Problems in Terrestrial Broadcasting
An important challenge faced by broadcasters is the planning of complex video broadcasting systems in which different networks share the available transmitters and frequencies. Each network has its own technical features (e.g., it can be either analog or digital) and commercial goals. Therefore, the problem consists of deciding the emission powers and frequencies of the transmitters of each network so as to optimize a multi-objective performance indicator, e.g., the population coverage for each network. In “The Network Packing Problem in Terrestrial Broadcasting,” S. Smriglio, C. Mannino, and F. Rossi propose a two-stage planning methodology. In the first stage, emission powers are assigned to each network separately. In the second stage, frequencies are assigned to all networks so as to minimize the loss from mutual interference. A software tool incorporating this methodology is currently in use at the major Italian (public) broadcaster to help discover and select high-quality alternatives for the deployment of digital television.

Maximizing Profit by Manipulating Quality
Conventional wisdom holds that sellers should strive to always offer the highest quality, but ethnographers report that sellers of illicit drugs systematically vary the quality of their drugs and, hence, of the deals they offer. They alternate between offering high-purity (high-value) deals that build up their reputation and attract customers, and milking that reputation for short-run profits by giving low-quality service in the form of diluted ("cut") drugs. In "Quality Cycles and the Strategic Manipulation of Value," J. Caulkins, G. Feichtinger, J. Haunschmied, and G. Tragler show that oscillatory variation in quality can be an optimal dynamic strategy for sellers of experience goods when there are information lags. With parameters drawn from US cocaine markets, the resulting oscillations help explain the mystery of persistent, extreme dispersion in drug prices in highly competitive retail markets that would otherwise seem to violate the law of one price. Using parameters that are more reasonable for conventional businesses can make the oscillatory behavior even more pronounced, not eliminate it.

Planning Machine Maintenance in Two-Machine Shop Scheduling
One of the extensions of standard scheduling models is related to so-called machine non-availability intervals. If such an interval is due to preventive equipment maintenance, the decision-maker is usually fairly free to choose the start time for that maintenance; additionally, the length of the maintenance period may depend on its start time (the sooner the maintenance is started, the better the equipment condition is and the less time is needed for the maintenance). In "Planning Machine Maintenance in Two-Machine Shop Scheduling," M. A. Kubzin and V. A. Strusevich study the situation in which jobs are processed on two machines, each machine has to be maintained, and the goal is to minimize the completion time of the last activity. For the open shop, they give a fast exact algorithm, while the flow shop problem appears harder, and several approximation algorithms are offered for its solution.

Understanding the Dynamics of New-Product Demands
The appropriate modeling of demand is important in manufacturing, services, and supply chains. This task is particularly challenging for new products (or for products with a short life cycle), since demands for such products are highly dynamic and unpredictable. Demand models that attempt to capture the effects of consumers' intrinsic propensity and word-of-mouth influence, price, and advertising have been studied extensively in the literature. However, most of the existing models are deterministic, and hence are unable to offer information on the variability of demand. This is a serious drawback. In "A Piecewise-Diffusion Model of New-Product Demands," S.-C. Niu formulates and empirically examines a stochastic new-product demand model. The proposed model is shown to be remarkably accurate. It is also capable of delivering a wealth of quantitative information on the growth of the force of influence between consumers, the joint impact of price and advertising on sales, the evolution of market penetration, and, in particular, the extent of the variability of the demand trajectory.
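
For context, the best-known deterministic model of this kind is the Bass diffusion equation (standard background, not the stochastic piecewise-diffusion model proposed in the paper):
\[
\frac{dN(t)}{dt} \;=\; \Bigl(p + q\,\frac{N(t)}{m}\Bigr)\bigl(m - N(t)\bigr),
\]
where $N(t)$ is the cumulative number of adopters, $m$ is the market potential, $p$ captures consumers' intrinsic propensity to adopt, and $q$ captures word-of-mouth influence. Deterministic models of this type describe only the mean trajectory, which is why a stochastic formulation is needed to quantify demand variability.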