Operations Research Forum

Ciamac C. Moallemi’s and Mehmet Saglam’s paper “The Cost of Latency in High Frequency Trading” appears in the September-October 2013 issue of Operations Research. In this paper, Moallemi and Saglam quantify the cost of delays in processing a sell order for a stock.  Technological advances in data networks and computing power have transformed the way securities are traded.  These advances have created the opportunity to process market information and to profit from momentary informational advantages, and have led to the rise of electronic trading platforms.  The widespread use of computerized trading algorithms in the financial markets and the importance of speedy decision making and trade execution make this a fertile area for Operations Research methods.  While it may be self-evident that being able to react quickly to market information is better than being slow, reducing reaction time requires significant investment.  As the authors point out, to reduce trade latency high-frequency traders must invest in algorithm development, in computing and communications hardware, and even in facilities co-located with the exchanges.  This paper helps give a theoretical foundation to these investments. 

Invited Comments

We solicited comments on this work from experts on financial market microstructure, high-frequency trading, and trading algorithms.  

Terrence Hendershott is the Cheryl and Christian Valentine Chair and Associate Professor at the Haas School of Business at the University of California, Berkeley.  He is a member of both the Finance and the Operations and Information Technology groups. He has also been a visiting economist at the New York Stock Exchange.  His areas of expertise and interest revolve around the role of information technology in financial markets.  He has a PhD in Operations and Information Technology from Stanford.


Robert Almgren is the Co-founder and Head of Research at Quantitative Brokers, a venture-funded algorithmic agency brokerage concentrating on fixed-income products and futures. He is also a Visiting Scholar and Adjunct Professor in Financial Mathematics at the Courant Institute of Mathematical Sciences, New York University. He has written on transaction cost measurement, high-frequency trading, and trading strategies.  He has a Ph.D. in Applied and Computational Mathematics from Princeton University.

Discussion

The authors take the perspective of a single seller over a short time horizon who is trying to use a limit order to sell a single unit of stock, using information on the current bids, the stochastic process governing the bids, and the arrival rate of impatient buyers.  The authors develop a benchmark model without latency, in which the seller knows the current bid at the time they set their limit order price.  They then develop a discrete-time model that captures the effect of latency by introducing a fixed time lag in setting the limit order price: a price level is decided at time t_i but only takes effect at time t_{i+1} = t_i + Δt, by which point the bid process will be at a different level.  The lag Δt represents the trading latency. The authors then derive a closed-form asymptotic approximation of the cost increase over the perfect-information benchmark created by low latencies.  Using parameters estimated from high-frequency data for their dynamic programming model, they then generate estimates of the cost of latency at different points in time over a ten-year period.  They find that latency costs, defined relative to the no-latency benchmark, have been increasing, and that the absolute latency cost is of comparable scale to other trading costs.
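The flavor of the latency penalty can be seen in a toy simulation (our illustrative sketch, not the authors' model): a seller pegs a limit order to a bid observed some number of ticks ago, and the typical mispricing of the stale order grows like the square root of the lag. All parameters below are illustrative.

```python
import random

random.seed(0)

def avg_mispricing(latency_steps, n_steps=200_000, sigma=0.01):
    """Average |current bid - lagged bid| when a limit order is pegged to a
    bid observed `latency_steps` ticks ago (toy random-walk bid, not the
    paper's model)."""
    bid = 100.0
    history = []
    total, count = 0.0, 0
    for t in range(n_steps):
        history.append(bid)
        bid += random.gauss(0.0, sigma)  # random-walk bid process
        if t >= latency_steps:
            total += abs(bid - history[t - latency_steps])
            count += 1
    return total / count

# Mispricing grows roughly like the square root of the lag:
# quadrupling the lag roughly doubles the average staleness.
for lag in (1, 4, 16):
    print(lag, round(avg_mispricing(lag), 4))
```

Under this random walk the average staleness of the order price scales like σ√(lag), the same √Δt scaling that drives the latency cost in the paper's asymptotics.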

 

"

Moallemi and Saglam examine a significant aspect of optimally implementing an investor’s trading decision: the order-type choice problem of how each individual piece of a larger order should be executed. Prior work typically assumes an investor must pay a transitory price impact generated by their trading, which includes the bid-ask spread. An ad hoc transitory price impact function is assumed and optimization proceeds. One way to think of Moallemi and Saglam’s contribution is as studying the details of how to minimize that transitory impact by using limit orders. The paper examines the optimal control problem for an investor wanting to capture the bid-ask spread by placing a limit order rather than paying the spread by placing a market order. The main challenge in using limit orders is that as the underlying stock price varies, the original limit order price becomes stale (no longer optimal). The selling investor would like to keep the order at the best ask price: as the stock price moves up, the limit order should be revised upwards to capture as much of the spread as possible; as the stock price moves down, the limit order should be revised downwards to remain at the best price and allow execution. Using their model, the authors quantify the benefits of being able to revise and reprice limit orders more quickly.

Hendershott

"

Hendershott questions whether the results indicate an increase in absolute latency costs or only in relative latency costs.

"

The paper’s relative definition of latency cost is clear in Definition 1, where the latency cost is defined as the percentage difference between the latency-free value of the optimal policy and the value of the optimal policy with latency. Figure 8 shows that this percentage difference increases over time. What is not clear from Figure 8 is whether the increase comes from an increase in the numerator or a decrease in the denominator. If the rise in Figure 8 comes from an increase in the numerator, then the cost of latency shown is increasing in both absolute and relative terms. If the trend in Figure 8 comes from a fall in the denominator, then absolute latency costs (the numerator) could be falling even though relative latency costs are rising.

The latency-free value of the optimal policy is given in Theorem 3 and is proportional to the bid-ask spread. Therefore, I calculate the bid-ask spread for Goldman Sachs from 1999 to 2005. The figure below shows the bid-ask spread both in dollar terms and as a percentage of the stock price. Goldman’s share price remains close to $100 throughout the period, so the two measures track each other closely. The figure shows the spread measures falling roughly seven-fold. This decline is substantially greater than the percentage increase shown in Figure 8. Combining my figure with Figure 8 in the paper suggests that latency costs at the beginning of the sample (pre-2001) represent roughly two cents per share: a 10% cost of latency times a 20-cent bid-ask spread. At the end of the sample, spreads have fallen to roughly three cents per share. Multiplying this by a 20% cost of latency from Figure 8 gives an absolute cost of latency of approximately 0.6 cents per share, a decline of roughly two thirds. Calculations done in basis points are similar. Hence, while I agree that latency costs as a portion of the costs of immediacy have increased, the absolute importance of latency appears to have declined. Thus, in the paper’s context, while latency is more important in the trading process, it may be less important for the investing process.

Hendershott

"

 

[Hendershott’s figure: Goldman Sachs bid-ask spread, 1999–2005, in dollars and as a percentage of the stock price]
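Hendershott's back-of-the-envelope comparison above can be reproduced directly (the spread and relative-cost figures are the ones stated in his comment; the code is just the arithmetic):

```python
# Figures as stated in Hendershott's comment (dollars per share).
early_spread, early_rel_cost = 0.20, 0.10  # pre-2001: ~20-cent spread, ~10% relative latency cost
late_spread, late_rel_cost = 0.03, 0.20    # end of sample: ~3-cent spread, ~20% relative latency cost

early_abs = early_rel_cost * early_spread  # absolute latency cost, early in the sample
late_abs = late_rel_cost * late_spread     # absolute latency cost, late in the sample

print(f"early: {early_abs * 100:.1f} cents/share")  # 2.0 cents
print(f"late:  {late_abs * 100:.1f} cents/share")   # 0.6 cents
print(f"decline: {1 - late_abs / early_abs:.0%}")   # 70%, roughly two thirds
```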

The authors respond:

"

With regard to the characterization of latency trends, you are absolutely right that we are empirically arguing that the relative latency cost has increased over time. We agree with your characterization that latency relates to "trading" rather than "investing". Hence, we feel that the relative metric is appropriate: it makes sense to understand the impact of latency in the context of a well-understood trading cost, the bid-offer spread.

Moreover, the trading problem we analyze considers the impact of latency on the difference in value between a limit order and a market order. This is bounded by the bid-offer spread, and if the bid-offer spread dramatically decreases, as in your example, the absolute latency cost must necessarily also become small. That doesn't mean, however, that latency is less important now than before in a practical sense, just as minimizing spread costs remains practically important to many investors in spite of the fact that spreads have decreased dramatically over time.

Moallemi and Saglam

"

Almgren questions whether the “cost of latency” observed in the model is just a by-product of time discretization: 

 

"

The effect observed in this paper is due to time discretization, not to latency. The authors have ignored the price change within the interval t_i to t_{i+1}, which is of the same asymptotic size, O(√Δt), as the price change attributed to latency. Correctly calculated, the cost for the discretization-only model in Section 4.3 is of the same order as the overall cost for the latency model.

Almgren

"

The authors respond:

 

"

Dr. Almgren brings up some good and subtle points relating to our model. As Dr. Almgren states, our main conclusion is that a latency of Δt asymptotically creates a cost of order O(√Δt log(1/Δt)). We believe this conclusion is robust to the particular assumptions made in the model of Section 4.2 (e.g., Bernoulli rather than Poisson arrivals, including or excluding changes in market price during the limit order lifetime, etc.). We do wish to clarify the role of the discrete-time model in Section 4.3, however. As Dr. Almgren observes, including changes in market price during the limit order lifetime would alter the asymptotic cost in this setting. However, the main point of Section 4.3 was to understand whether the cost gap between the model of Section 4.2 and the continuous-time model arises from:

(A) Discreteness of time, i.e., that decisions are made only at the beginning of n intervals of length Δt, as opposed to being made continuously; or

(B) Latency, i.e., lack of access to timely information: when a limit order is placed, the reservation price of an impatient buyer who might trade against it is not known, as opposed to the continuous-time case, where it is known.

The "more realistic" model that Dr. Almgren suggests, which includes changes in market price during the limit order lifetime, has both discreteness of time and latency. The reservation price of the next impatient buyer is unknown when the limit order price is set, and becomes known only after some random delay; this is a latency. Hence, it is not surprising that it achieves the same asymptotic cost as the model of Section 4.2, but it is also not an interesting point of comparison for disentangling the effects of (A) and (B).

Moallemi and Saglam

"
In the July-August 2013 issue of Operations Research, Alan Washburn writes about the optimal allocation of money to states in a presidential election campaign in the United States. He models the competition between two parties for control of the United States Electoral College. According to the Federal Election Commission (http://www.fec.gov/press/press2013/20130419_2012-24m-Summary.shtml), in the 2012 election cycle presidential candidates raised and spent approximately $1.4 billion; this does not count similar spending by the major political parties themselves or by Political Action Committees. As is well known, spending generally increases rapidly from election to election, although the degree to which primaries are contested can cause variation in this trend. There is considerable public concern about the amount of money spent on elections and its impact on the democratic process. There are laws that regulate election spending, and there may be a need for new ones. A better understanding of how money can be used optimally in an election campaign will inform such a discussion. Washburn’s article “Blotto Politics” is a small step toward developing this understanding: it models the competitive game the two major US political parties engage in when making spending decisions and shows the impact that funding imbalances can have. Elections are an area with rich potential for applications of Operations Research and over the years have attracted interest from a wide range of perspectives. This article will hopefully spur further research activity that can help inform the public discussion of campaign finances.
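The flavor of the funding-imbalance effect can be seen in a toy Blotto-style simulation (our sketch under simplifying assumptions, not Washburn's model: symmetric states of equal weight, random proportional allocations, winner-take-all per state):

```python
import random

random.seed(2)

def majority_win_prob(budget_a, budget_b, n_states=51, trials=5000):
    """Toy Blotto sketch: each side spreads its budget randomly over the
    states; each state goes to the bigger spender; return how often side A
    wins a majority of states."""
    wins = 0
    for _ in range(trials):
        a = [random.random() for _ in range(n_states)]  # random allocation weights
        b = [random.random() for _ in range(n_states)]
        sa, sb = sum(a), sum(b)
        states_a = sum(
            1 for x, y in zip(a, b) if budget_a * x / sa > budget_b * y / sb
        )
        if states_a > n_states // 2:
            wins += 1
    return wins / trials

print(majority_win_prob(1.0, 1.0))  # even budgets: about 0.5 by symmetry
print(majority_win_prob(1.2, 1.0))  # a 20% funding edge wins a clear majority of trials
```

In Washburn's richer game the states carry different electoral-vote weights and allocations are chosen strategically, but even this crude sketch shows how a modest budget imbalance compounds across many winner-take-all contests.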


In the 2012 November-December issue of Operations Research Ed Kaplan writes about the subject of his 2010 Philip McCord Morse Lecture, “Intelligence Operations Research” (http://or.journal.informs.org/content/early/2012/07/03/opre.1120.1059.full.pdf+html). Here he discusses applications of operations research to intelligence problems in national security and counterterrorism. As he illustrates in his review of the literature, this is a distinctive problem area to which he has made notable contributions but also offers many opportunities for new research with the potential to improve the security of our society.


In the September-October issue of Operations Research, Turgay Ayer, Oguzhan Alagoz and Natasha Stout write about personalizing protocols for breast cancer screening using mammography. (link to full paper in Articles in Advance) The purpose of a mammogram is to detect breast cancer at an early stage. When a cancer is detected early, there is greater flexibility in treatment modalities and an increased probability of cure. As a result, it has become standard for women to receive regular mammograms annually or biennially from the age of 40. However, mammograms have high false-positive rates, leading to unnecessary testing and treatments, as well as anxiety. Mammograms also expose women to radiation that over time may itself cause cancer. The goal of this paper is to develop a method for creating screening protocols that improve detection and survival rates through more timely detection while at the same time reducing the overall usage of mammography and false-positive rates. Current screening protocols are a one-size-fits-all approach, and this paper seeks to customize them to an individual woman’s personal risk characteristics and screening history. The paper is indicative of an important trend in healthcare: as medical researchers discover more genetic links to diseases, and as patient information profiles become richer and easier to store, communicate, and analyze, it will become easier to personalize healthcare, customizing diagnostic and treatment protocols to individuals. The challenges to personalizing healthcare that arise in the mammography context will apply to other settings as well.


In the May-June, 2012 issue of Operations Research, Professor Ramteen Sioshansi writes about Plug-In Hybrid Electric Vehicles, or PHEVs. PHEVs are a technology with the potential to revolutionize how we power transportation and to change the impact of personal transportation on the environment. Gasoline-powered cars have a well-established distribution network for the fuel they need. PHEVs will draw energy from the same electric power distribution system (or grid) that we use for all other electric power needs. In his paper, Sioshansi investigates different strategies for managing the impact of PHEV charging on the power grid.


As a follow-up to the discussion of Little’s Law as Viewed on its 50th Anniversary, John Little and Ron Wolff have provided a further discussion of issues related to Little’s Law entitled “The ‘Flaw’ in Little (1961), its identification, and its fixes”. In this commentary, Little and Wolff discuss the history and resolution of issues in Little’s original proof of L = λW.


In the May-June, 2011 issue of Operations Research, the journal revisits one of its most influential publications: “A Proof for the Queuing Formula: L = λW” by John Little. The formula, now known widely as Little’s Law, has been critical in many following results and applications.
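As a quick numerical illustration (a toy check, unrelated to Little's proof), L = λW can be verified in a simulated M/M/1 queue by comparing the time-average number in system with the arrival rate times the average sojourn time:

```python
import random

random.seed(1)

def little_check(lam=0.8, mu=1.0, n_customers=50_000):
    """Simulate a FIFO single-server queue with Poisson arrivals and
    exponential service, and return the time-average number in system L
    and the product lam * W (Little's Law says these should agree)."""
    t_arrive, t_depart = 0.0, 0.0
    arrivals, departures = [], []
    for _ in range(n_customers):
        t_arrive += random.expovariate(lam)        # Poisson arrivals at rate lam
        start = max(t_arrive, t_depart)            # wait for the server to free up
        t_depart = start + random.expovariate(mu)  # exponential service at rate mu
        arrivals.append(t_arrive)
        departures.append(t_depart)
    total_time_in_system = sum(d - a for a, d in zip(arrivals, departures))
    horizon = departures[-1]
    W = total_time_in_system / n_customers  # average sojourn time
    # The integral of N(t) over [0, horizon] equals the total sojourn time.
    L = total_time_in_system / horizon      # time-average number in system
    return L, lam * W

L, lam_W = little_check()
print(round(L, 2), round(lam_W, 2))  # the two agree closely (about 4 for these rates)
```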


In the November-December, 2010 issue of Operations Research, David Lane of the London School of Economics and Political Science examines three historical uses of operations research.


In the July-August, 2009 issue of Operations Research, Larry Wein of the Stanford Graduate School of Business provides his Philip McCord Morse Lecture, delivered in 2008.


In the May-June, 2009 issue of Operations Research, Marshall Fisher, UPS Transportation Professor for the Private Sector at the Wharton School, discusses his experiences with the Consortium for Operational Excellence in Retailing. This paper grew out of Fisher’s 2006 Philip McCord Morse Lecture.


About

The OR Forum is an area of the journal Operations Research, published by the Institute for Operations Research and the Management Sciences (INFORMS). The purpose of the Forum area is spelled out in its mission statement:

"

The purpose of the OR Forum area is to stimulate discussion about the field of Operations Research and interesting new research challenges. The OR Forum area invites thought-provoking work that challenges the reader to reconsider and re-evaluate past research streams as well as to consider newly emerging areas of research. Analyses of prospects in areas not traditionally covered by Operations Research are strongly encouraged, as are provocative papers that take a strong stand on policy issues. Possible submissions may also include critical reviews of research in a specialized field and closely reasoned commentary on practice within an area. The work should be accessible and of interest to a significant portion of the readership of Operations Research.

Published work will often be accompanied by supplemental commentary that enhances or disputes the theses developed, and an online forum will provide an opportunity to continue the discussion after publication. Authors are encouraged to contact the Area Editor early in the process of developing their work to determine suitability for consideration in this area. The Area Editor will also seek nominations from the other Area Editors at Operations Research to identify suitable papers to be published and discussed in the OR Forum from among those manuscripts already through the standard review process.

"

This site is an adjunct to the published papers in the journal. At this site, we invite commentary and discussion of each of the OR Forum’s papers. There is no set time-limit to this discussion, and interested readers are invited to check back periodically for updates.

All comments and posts are moderated for content by the Area Editor, Edieal Pinker (ed.pinker@simon.rochester.edu) .

Welcome to the OR Forum!
