# National Institute of Standards and Technology/NIST (National Bureau of Standards/NBS)


**Introduction**

The National Bureau of Standards (NBS) was established in 1901 as the first federal physical science laboratory, in response to the proliferation of inconsistent standards for weights and measures. During its early years, important work was carried out to address national needs in commerce, materials science, electronics, and public health. During and after World War II, the need for basic research to support science and technology was recognized, and some important advances in technology and computing were fostered at NBS. In particular, NBS played a crucial role in the development of operations research algorithms and their implementation at the end of the war. This is where our story begins, and it spans the years until NBS was reconstituted as NIST in 1988.

**Early Days and the Simplex Method**

As a test of the feasibility of the simplex method, in 1947 the computational group at NBS carried out the steps of the method on the classic diet problem posed by Stigler. Pieces of the computation were done manually (and in parallel) by clerks using hand-operated desk calculators and then assembled. The result was indeed the optimal solution, achieved after 24 iterations of the simplex method, and it was in fact a few cents cheaper than the solution proposed by Stigler.
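The diet problem is a linear program: choose food quantities that meet nutrient requirements at minimum cost. The following sketch uses a toy two-food, two-nutrient instance (the coefficients are made up for illustration, nothing like Stigler's actual 77-food data) and, for simplicity, solves it by brute-force enumeration of the feasible region's vertices rather than by the simplex method itself:

```python
# Toy diet problem:  minimize c.x  subject to  A x >= b,  x >= 0.
# (Illustrative coefficients only, not Stigler's data.)
from itertools import combinations

c = [0.50, 0.30]            # cost per unit of food 1, food 2
A = [[2.0, 1.0],            # nutrient 1 supplied per unit of each food
     [1.0, 3.0]]            # nutrient 2 supplied per unit of each food
b = [8.0, 9.0]              # minimum requirement for each nutrient

# All constraints, including non-negativity, as a1*x1 + a2*x2 >= rhs.
rows = [(A[i][0], A[i][1], b[i]) for i in range(2)]
rows += [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]

def intersect(r1, r2):
    """Point where two constraints hold with equality (2x2 solve)."""
    (a1, a2, e), (b1, b2, f) = r1, r2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None             # parallel constraint boundaries
    return ((e * b2 - a2 * f) / det, (a1 * f - e * b1) / det)

# An optimal LP solution occurs at a vertex: try every pair of
# constraint boundaries, keep feasible points, take the cheapest.
best = None
for r1, r2 in combinations(rows, 2):
    pt = intersect(r1, r2)
    if pt and all(a1 * pt[0] + a2 * pt[1] >= rhs - 1e-9
                  for a1, a2, rhs in rows):
        cost = c[0] * pt[0] + c[1] * pt[1]
        if best is None or cost < best[0]:
            best = (cost, pt)

print(best)   # minimum cost and the corresponding diet
```

For this instance the optimum is the vertex where both nutrient constraints are tight; the simplex method would reach the same vertex by pivoting along edges instead of enumerating all intersections.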

In the late 1940s the Pentagon began funding development of early computers. In particular, funds were provided to NBS by the Office of the Air Comptroller, US Air Force, to design and build the SEAC (Standards Eastern Automatic Computer). SEAC was a 1,024-word-memory, four-address machine that used 750 vacuum tubes and some 10,000 solid-state devices (germanium diodes). Completed in 1950, it is believed to be the first fully operational stored-program electronic computer in the US and the fastest such machine in the world at the time. It was the first computer to do all of its logic with solid-state devices.

The Air Force, through Project SCOOP (Scientific Computation of Optimum Programs) under the direction of George Dantzig, also supported implementation of and experimentation with the simplex method. The Applied Mathematics Division at NBS, headed by Alan Hoffman, programmed not only the simplex method but also two other competing methods (the fictitious play method of Brown and Motzkin’s relaxation method) that could be used for solving systems of linear inequalities. The results of this remarkably well executed study (Hoffman et al. 1953), the first known computational evaluation of mathematical programming software, showed the simplex method to be the clear winner.

**Enter Alan Goldman**

Alan Goldman was hired by Alan Hoffman in 1956 and was subsequently asked to create an Operations Research Section within the Applied Mathematics Division, which he led until 1979. In addition to carrying out research in functional analysis, matrix theory, mathematical logic, and game theory, Alan Goldman simultaneously worked on a number of applied problems in mail sorting, the range of a fleet of aircraft, and target detection. Alan had the vision to extend mathematical analysis to problems arising in both the corporate and government spheres. One of the first government collaborations was with the US Postal System, which introduced him to problems of facility location, to which he and his OR group made valuable contributions. Transportation was another area of longstanding interest to Alan, and the OR group provided expert advice in planning and forecasting for the Department of Transportation, specifically the Federal Aviation Administration (FAA), the Urban Mass Transportation Administration (UMTA), and the Maritime Administration (MA).

In 1960, Alan was able to attract Jack Edmonds to join the OR Section, which already included Bill Hall and Lambert Joel, members of the team that had implemented the simplex method on the SEAC for the Air Force. Jack became interested in combinatorial optimization while working on a project to assign radio frequencies to airplanes. In turn, Jack was able to attract Chris Witzgall to NBS, as they had been office mates at the high-powered 1961 RAND Summer Workshop (which featured such luminaries as George Dantzig, Alan Hoffman, D. R. Fulkerson, Claude Berge, and Bill Tutte). It was at this workshop that Jack argued strongly for the development of polynomial-time algorithms, not simply algorithms with finite running times. Many of Jack’s brilliant contributions to combinatorial optimization (matching problems, polyhedral theory, optimum branchings, and matroids) were developed while he was at NBS (through 1967). Edmonds organized an impressive conference on matroids, held at NBS in 1964, with papers subsequently appearing in the *NBS Journal of Research* (National Bureau of Standards 1965); attendees included Bill Tutte, Henry Crapo, Jack Edmonds, D. R. Fulkerson, Albert Lehman, George Minty, Neil Robertson, and Chris Witzgall. Jack subsequently received the John von Neumann Theory Prize in 1985.

Chris Witzgall (who was the first German to program the simplex method) made a number of important contributions to convex analysis as well as developing novel schemes for one-row linear programs and isotone regression. While at NBS, Karla Hoffman (past President of INFORMS and part of the 2018 Franz Edelman Award winning team) published influential work on global minimization of concave functions (Hoffman 1981). She also carried out important research (with Manfred Padberg) on polyhedral theory underlying combinatorial optimization problems (Hoffman and Padberg 1985). R.H.F. Jackson (with Garth McCormick) produced several contributions to nonlinear optimization methods.
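One of the techniques named above, isotone regression, fits a monotone non-decreasing sequence to observed data in a least-squares sense. As an illustrative sketch (not code from Witzgall’s papers), the classical pool-adjacent-violators algorithm solves it:

```python
# Pool-Adjacent-Violators Algorithm (PAVA) for isotone regression:
# given observations y_1..y_n, find non-decreasing fitted values
# minimizing the sum of squared deviations from the data.
def isotone_regression(y):
    # Each block stores [sum, count]; whenever the last block's mean
    # drops below the previous one's, the two are pooled (averaged).
    blocks = []
    for v in y:
        blocks.append([v, 1])
        while len(blocks) > 1 and \
                blocks[-2][0] / blocks[-2][1] > blocks[-1][0] / blocks[-1][1]:
            total, count = blocks.pop()
            blocks[-1][0] += total
            blocks[-1][1] += count
    # Expand each block back to one fitted value per observation.
    fit = []
    for total, count in blocks:
        fit.extend([total / count] * count)
    return fit

print(isotone_regression([1, 3, 2, 4]))   # [1.0, 2.5, 2.5, 4.0]
```

The violating pair (3, 2) is pooled to its mean 2.5, which preserves monotonicity while minimizing squared error; the algorithm runs in linear time overall since each observation is pooled at most once.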

Not only did the OR group provide consultation on applied problems arising at NBS, but it also provided interdisciplinary expertise in support of effective government decision-making. The summary below illustrates the breadth of such contributions made by the OR group.

**Location Analysis**

Early work for the US Postal System motivated a number of archival publications by Goldman and Witzgall on the optimal location of facilities as well as Witzgall’s seminal report on optimal facility location (Witzgall 1964). William A. Horn was also an early contributor to optimal design of facilities, particularly in connection with postal sorting problems.

Goldman published (in *Transportation Science*) foundational work on the generalized Weber plant location problem and on localization results for optimal facility location. Some of the very first work on obnoxious facility location, motivated by environmental concerns, was carried out by Goldman and P. M. Dearing in 1971. Fitting a circle (and later a sphere) to a given set of points was studied by Witzgall and Saul Gass, in the context of analyzing coordinate measuring machine data.
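The Weber problem mentioned above asks for a point minimizing the total weighted Euclidean distance to a set of fixed sites. As an illustrative sketch (not drawn from Goldman’s papers), the classical Weiszfeld iteration solves the planar case:

```python
# Weiszfeld's iteration for the planar Weber problem: repeatedly move
# to a distance-weighted average of the sites until a fixed point.
import math

def weber_point(sites, weights, iters=200):
    # Start from the weighted centroid.
    W = sum(weights)
    x = sum(w * sx for w, (sx, _) in zip(weights, sites)) / W
    y = sum(w * sy for w, (_, sy) in zip(weights, sites)) / W
    for _ in range(iters):
        num_x = num_y = denom = 0.0
        for (sx, sy), w in zip(sites, weights):
            d = math.hypot(x - sx, y - sy)
            if d < 1e-12:
                return (sx, sy)    # iterate landed on a site; stop there
            num_x += w * sx / d
            num_y += w * sy / d
            denom += w / d
        x, y = num_x / denom, num_y / denom
    return (x, y)

# Four equally weighted sites at the corners of a square:
# by symmetry the optimal facility location is the center.
print(weber_point([(0, 0), (2, 0), (0, 2), (2, 2)], [1, 1, 1, 1]))
```

Each step re-weights the sites by the reciprocal of their current distance, so nearby sites pull harder; the iteration converges to the optimum for points not coinciding with a site.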

Applied location studies were also performed on inland consolidation centers as well as for fire service location-allocation and fire vehicle replacement. Site selection models and design of cost-effective electronic messaging systems were some additional projects carried out for the US Postal System. Models were also developed to determine optimal locations for Internal Revenue Service (IRS) Posts-of-Duty. Subsequent work for the IRS involved the creation and validation of a simulation model for deploying IRS resources (Harris et al. 1987).

**Transportation**

A continuing series of collaborations occurred in the 1970s with the Department of Transportation. These included efficient calculation of nationwide railroad distances, simulation models for the Northeast Corridor project, and an analysis of automated transit information systems for UMTA. In addition, consultations for the FAA included the design of a comprehensive system for storing air traffic data, the assignment of transponder codes to aircraft, validation of simulation models for aircraft runway operations, and the study of aircraft vertical separation standards.

**Network Optimization**

In order to implement Edmonds’ breakthrough work on the maximum matching problem, Witzgall and C. T. Zahn designed an alternative data structure for computing such matchings in polynomial time. J. Gilsinn and Witzgall carried out a landmark analysis and computational comparison of shortest path algorithms (Gilsinn and Witzgall 1967), which continues to be a highly cited work. Network models for evacuation were also studied by NBS staff together with Richard Francis (University of Florida).
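Shortest path methods of the kind compared in such computational studies include label-setting algorithms; as an illustrative sketch (not one of the specific codes Gilsinn and Witzgall tested), here is Dijkstra’s algorithm with a binary heap:

```python
# Dijkstra's label-setting shortest path algorithm for graphs with
# non-negative edge weights, using a binary heap of tentative labels.
import heapq

def dijkstra(graph, source):
    # graph: {node: [(neighbor, nonnegative_weight), ...]}
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue               # stale heap entry; label already set
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd       # improve v's tentative label
                heapq.heappush(heap, (nd, v))
    return dist                    # shortest distances to reachable nodes

g = {"a": [("b", 2), ("c", 5)], "b": [("c", 1), ("d", 4)], "c": [("d", 1)]}
print(dijkstra(g, "a"))   # {'a': 0, 'b': 2, 'c': 3, 'd': 4}
```

Computational comparisons of such algorithms turn largely on the data structure holding the tentative labels (here a heap; contemporaneous codes used buckets or simple lists), which is exactly the kind of implementation question those NBS studies examined.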

**Energy and the Environment**

Witzgall and P. B. Saunders developed simulation models for studying the production and distribution of natural gas reserves in response to the oil embargo in the early 1970s. Saul Gass carried out model evaluation and assessment for the Department of Energy (DOE) in a series of NBS reports and conferences during the late 1970s and early 1980s; see the compilation appearing at the end of the conference listings.

Important environmental work began with the collection of data on the prevalence of lead-based paint hazards in housing and lead poisoning in resident children, which was followed by the collection of blood lead levels in a large-scale study (178,000 children) in New York City during 1970-1976. NBS analyses of these data showed a highly significant association between pediatric blood lead levels and the amount of lead present in gasoline sold during that same time period (Billick et al. 1980). This study reinforced the decision by the Environmental Protection Agency to phase out lead in gasoline.

**Computational Testing**

Beginning in 1977, innovative work was undertaken to develop suites of test problems and rigorous test protocols for conducting computational experiments on mathematical programming algorithms. Initial work focused on developing test problems with known optimal solutions, as well as test problem generators that could be used to explore a wider space of test problems. An important survey and assessment of mathematical programming algorithms and software (spanning the period 1953 to 1977) was carried out by R. H. F. Jackson and J. M. Mulvey (Princeton) (Jackson and Mulvey 1978). Interestingly enough, this assessment ranked highly the 1953 paper by Hoffman et al. on the testing of three codes for solving systems of linear inequalities using the SEAC. A conference on *Evaluating Mathematical Programming Techniques* was held in January 1981 at the NBS Boulder campus (Mulvey 1982).

R. H. F. Jackson and Karla Hoffman were founding members of the Mathematical Programming Society Committee on Algorithms, formed in 1974. A subcommittee, chaired by Jackson, developed an updated set of guidelines (Jackson, et al. 1991) for reporting the results of computational experiments. The issues of test suites, reproducibility and performance claims so identified remain just as valid today.

**Conferences**

The Bureau of Standards played an important role in organizing and hosting various conferences on both the theory and practice of operations research:

- June 1951: *First Symposium on Linear Inequalities and Programming*
- January 1955: *Second Symposium in Linear Programming*
- August 1964: *Conference on Matroids*
- October 1981: *Symposium on the Matching Problem: Theory, Algorithms and Applications*

Saul Gass organized a series of conferences dealing with large-scale models and energy models:

- April 1977: *Utility and Use of Large-Scale Mathematical Models*
- January 1979: *Validation and Assessment Issues of Energy Models*
- May 1980: *Validation and Assessment Issues of Energy Models*
- June 1980: *Oil and Gas Supply Modeling*
- August 1982: *Intermediate Future Forecasting System*

- Douglas Shier