Benchmarking and Performance Improvement Tools for Manufacturing and Service Processes

Richard S. Barr, Southern Methodist University, Department of Computer Science and Engineering, Dallas, Texas 75275, email:

Lawrence M. Seiford, University of Massachusetts at Amherst, Department of Mechanical Engineering, Amherst, Massachusetts, email:

Paper prepared for the 1996 National Science Foundation Design and Manufacturing Grantees Conference, Albuquerque, New Mexico, January, 1996.



An integral part of the Malcolm Baldrige quality-award guidelines is the process of benchmarking. The central concept is that, by measuring and comparing the products, services, or work processes representing best practices, an organization can assess its relative position, establish targets and goals, and use the knowledge to meet or surpass industry-best practices.

A key step in benchmarking is the comparative analysis of key metrics to establish what constitutes "best practice," the standard against which all others are compared. Armed with only primitive data-analysis tools, today's benchmarking analysts lack a structured means to evaluate the data, characterize and measure performance gaps, and project future performance levels. This project shows that the benchmarking process can be significantly enhanced by a new, unifying data-analysis methodology.

Objectives and Overview of the Research

The objectives of this project are to develop a powerful and practical framework for benchmarking analysis and to validate its usefulness on a challenging service-sector application. We have developed a quantitative foundation for the benchmarking process, grounded in mathematical and economic theory, that provides the capabilities described below.

To this end, we have extended Data Envelopment Analysis (DEA) for use as an integrative benchmarking methodology. This relatively new frontier-estimation technique measures the relative efficiency of each member of a set of decision-making units, provides a quantitative foundation for benchmarking, and exhibits the desired features enumerated below (see [1]).

At the heart of this approach is a production function which describes the input-output relationships of an organization. That is, the function shows the maximum amount of outputs that can be achieved by combining various quantities of inputs; it also describes the minimum amount of inputs required to achieve the given output levels. While theoretical production functions are inherently unknowable, empirical production functions---or efficient frontiers---can be constructed from observed data by DEA.
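As a concrete illustration, the basic input-oriented, constant-returns-to-scale DEA model of Charnes, Cooper, and Rhodes (surveyed in [1]) estimates the efficiency of a unit from observed data by solving one linear program per unit. A sketch of this standard envelopment formulation, for n units with m inputs and s outputs, is:

```latex
\begin{align*}
\min_{\theta,\,\lambda}\ & \theta \\
\text{subject to}\ & \sum_{j=1}^{n} \lambda_j x_{ij} \le \theta\, x_{io},
    && i = 1,\dots,m, \\
& \sum_{j=1}^{n} \lambda_j y_{rj} \ge y_{ro},
    && r = 1,\dots,s, \\
& \lambda_j \ge 0, && j = 1,\dots,n,
\end{align*}
```

where x_ij and y_rj denote the i-th input and r-th output of unit j, and o indexes the unit under evaluation. The optimal value theta* lies in (0, 1], and unit o lies on the empirical efficient frontier exactly when theta* = 1.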

DEA analyzes each decision-making unit separately and measures its relative efficiency with respect to the entire set of units being evaluated. Further, this approach (1) emphasizes best practice, rather than distance-from-average practice as with regression, (2) does not require parametric assumptions about the underlying data relationships, (3) accommodates different assumptions about returns-to-scale, (4) provides metrics of inefficiency for those units not exhibiting best practice, (5) permits incorporation of user preferences into the analysis, and (6) suggests routes of improvement for inefficient performers.
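The relative-efficiency computation described above can be sketched as a small linear program solved once per decision-making unit. The following Python sketch (using SciPy's linprog; the two-input, one-output data are hypothetical, standing in for real benchmarking metrics) implements the constant-returns-to-scale, input-oriented envelopment model:

```python
# Minimal input-oriented, constant-returns-to-scale DEA sketch:
# one linear program per decision-making unit.
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: rows = inputs (e.g., staff, operating cost),
# columns = the four units being benchmarked.
X = np.array([[ 5.0,  8.0,  7.0,  4.0],
              [14.0, 15.0, 12.0, 10.0]])
# Rows = outputs (e.g., transactions processed); columns = units.
Y = np.array([[ 9.0,  5.0,  4.0,  6.0]])

def ccr_efficiency(X, Y, o):
    """Efficiency of unit o relative to the frontier spanned by all units."""
    m, n = X.shape                      # m inputs, n units
    s = Y.shape[0]                      # s outputs
    # Decision vector: [theta, lambda_1, ..., lambda_n]; minimize theta.
    c = np.zeros(n + 1)
    c[0] = 1.0
    A_ub, b_ub = [], []
    for i in range(m):                  # sum_j lambda_j x_ij <= theta * x_io
        A_ub.append(np.concatenate(([-X[i, o]], X[i, :])))
        b_ub.append(0.0)
    for r in range(s):                  # sum_j lambda_j y_rj >= y_ro
        A_ub.append(np.concatenate(([0.0], -Y[r, :])))
        b_ub.append(-Y[r, o])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]                     # theta* in (0, 1]; 1 means efficient

scores = [ccr_efficiency(X, Y, o) for o in range(X.shape[1])]
```

A score below 1 quantifies the proportional input reduction needed to reach the frontier, directly measuring the performance gap to best practice; at least one unit always attains a score of 1.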

Research Results

This research has resulted in the development of new DEA models and algorithms that can help practitioners integrate and synthesize their benchmarking efforts.

In progress is a benchmarking study of banking units, based on total organizational performance [2], to validate our DEA models for benchmarking support. Other tests of the new approach are under way at Pier 1 Imports and at the manufacturing firm Lockheed-Martin.


This work is sponsored by the National Science Foundation, grant DMII 93-13346, and by the Federal Reserve Bank of Dallas.


  1. A. Charnes, W.W. Cooper, Arie Y. Lewin, and Lawrence M. Seiford (editors), 1995. Data Envelopment Analysis: Theory, Methodology and Applications, Kluwer Academic Press, Boston.

  2. Richard S. Barr, Lawrence M. Seiford, and Thomas F. Siems, 1993. An Envelopment-Analysis Approach to Measuring the Managerial Quality of Banks, Annals of Operations Research, 42, 1-19.

  3. Richard S. Barr and Matthew L. Durchholz, 1995. Parallel and Hierarchical Decomposition Approaches for Solving Large-Scale Data Envelopment Analysis Models, Annals of Operations Research, to appear.
