Benchmarking and Performance Improvement Tools for Manufacturing
and Service Processes
Richard S. Barr, Southern Methodist University, Department of
Computer Science and Engineering, Dallas, Texas 75275, email:
barr@smu.edu
Lawrence M. Seiford, University of Massachusetts at Amherst, Department
of Mechanical Engineering, Amherst, Massachusetts, email:
seiford@ecs.umass.edu
Paper prepared for the 1996 National Science Foundation Design and
Manufacturing Grantees Conference, Albuquerque, New Mexico, January, 1996.
Abstract
Benchmarking is the global search for industry's best practices and
their use in improving an organization's products and processes. The growing use
of this process within TQM programs has highlighted the need for a
unified approach to capturing and analyzing benchmarking data. This
research has developed (a) analytical models for automatic
identification of best-practice companies from within a group, (b)
metrics for evaluating both the degree of leadership by these
companies and the degree of inefficiency of aspiring firms, (c) a
framework for process improvement that prescribes realistic goals and
directions for aspiring companies to reach best practice, and (d)
efficient new optimization algorithms to support the framework. A pilot study is underway to
validate this new approach on a large-scale industrial benchmarking
application.
Introduction
An integral part of the Malcolm Baldrige quality-award guidelines is
the process of benchmarking. The central concept is that, by measuring and
comparing its products, services, or work processes against those
representing best practices, an organization can assess its relative
position, establish targets and goals, and use this knowledge to meet
or surpass industry-best practices.
A key step in the benchmarking process is the comparative
analysis of key metrics to establish what constitutes ``best
practice,'' the standard against which all others are compared. Armed with
only the most primitive data-analysis tools, today's benchmarking analysts have no
structured means to evaluate the data, characterize and measure
performance gaps, and project future performance levels. This project
shows that the benchmarking process can be significantly enhanced by
a new, unifying data-analysis methodology.
Objectives and Overview of the Research
The objectives of this project are to develop a powerful and
practical framework for benchmarking analysis,
and to validate its usefulness on
a challenging service sector application.
We have developed a quantitative foundation for the benchmarking process,
grounded in mathematical and economic theory, that provides the following capabilities.
- Models: Robust models of operational processes that
clearly identify best practice. These models also uncover possible
economies of scale and scope, and recognize substitution opportunities between
factors, processes, and products.
- Metrics: Distance measures to explicitly gauge
shortcomings and formalize the traditional gap analysis. These
metrics also prioritize the dimensions for
improvement (movement toward best practice benchmarks) in the
same way that Pareto charts prioritize items for problem-solving.
- Layering: New models and metrics to
permit stratification of companies exhibiting performance gaps into tiers
and subgroups. This tiering procedure sets
short- and intermediate-term goals for aspiring organizations,
as checkpoints in a continuous improvement program.
- Automated strawman: Automatic
construction of virtual benchmarks that capture the essence of many
``best'' performers. By producing an amalgamation of several
actual organizations, these empirically derived reference units
remain attainable entities, not ad hoc theoretical standards
of perfection.
To this end, we have extended Data Envelopment Analysis (DEA)
for use as an integrative benchmarking methodology. This relatively
new frontier estimation technique measures the relative
efficiency of each of a set of decision-making units,
provides a quantitative foundation for
benchmarking, and exhibits all of the desired features described
above (see [1]).
At the heart of this approach is a production
function which describes the input-output relationships of an
organization. That is, the function shows the maximum amount of
outputs that can be achieved by combining various quantities of
inputs; it also describes the minimum amount of inputs required
to achieve the given output levels. While theoretical
production functions are inherently unknowable, empirical
production functions---or efficient frontiers---can be
constructed from observed data by DEA.
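For concreteness, one standard DEA formulation (the input-oriented,
constant-returns-to-scale model; see [1]) constructs the empirical
frontier directly from the data. For n observed units with input
vectors x_j and output vectors y_j, the efficiency of a unit o is the
optimal value \theta^* of the linear program

\[
\begin{array}{ll}
\min_{\theta,\,\lambda} & \theta \\
\text{s.t.} & \sum_{j=1}^{n} \lambda_j x_j \le \theta\, x_o, \\
            & \sum_{j=1}^{n} \lambda_j y_j \ge y_o, \qquad \lambda_j \ge 0.
\end{array}
\]

A unit with \theta^* = 1 lies on the efficient frontier; \theta^* < 1
measures the proportional input reduction available to it, and the
optimal weights \lambda_j identify the frontier units whose
amalgamation forms its virtual benchmark.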
DEA analyzes each decision-making unit separately
and measures its relative efficiency with respect to the entire
set of units being evaluated. Further, this approach (1) emphasizes
best practice, rather than the distance-from-average practice measured
by regression, (2) does not require parametric assumptions about the
underlying data relationships, (3) accommodates differing assumptions
about returns to scale, (4) provides metrics of inefficiency for those
units that are not exhibiting best practice, (5) permits incorporation
of user preferences into the analysis, and (6) suggests routes of
improvement for inefficient performers.
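As an illustration only, the following minimal sketch solves the
linear program above in Python with scipy.optimize.linprog; the
function names and toy data are our assumptions for exposition, not
part of the project's software.

    import numpy as np
    from scipy.optimize import linprog

    def dea_efficiency(X, Y, o):
        """Input-oriented, constant-returns DEA score of unit o.
        X is m-by-n (inputs), Y is s-by-n (outputs); columns are units."""
        m, n = X.shape
        s = Y.shape[0]
        c = np.r_[1.0, np.zeros(n)]                # minimize theta
        A_in = np.hstack([-X[:, [o]], X])          # X@lam <= theta * x_o
        A_out = np.hstack([np.zeros((s, 1)), -Y])  # Y@lam >= y_o
        b = np.r_[np.zeros(m), -Y[:, o]]
        bounds = [(None, None)] + [(0, None)] * n  # theta free, lam >= 0
        res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=b,
                      bounds=bounds, method="highs")
        return res.fun                             # theta*; 1.0 = best practice

    # Hypothetical example: five units, two inputs, one output.
    X = np.array([[4., 7., 8., 4., 2.],
                  [3., 3., 1., 2., 4.]])
    Y = np.ones((1, 5))
    scores = [dea_efficiency(X, Y, o) for o in range(5)]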
Research Results
This research has resulted in the development of new DEA models and algorithms that
can help practitioners integrate and synthesize their benchmarking
efforts.
- New models.
Two new DEA models were formulated to enhance the capabilities of
conventional approaches. An extremal DEA (EDEA) model provides insight
into the effect of an efficient decision unit on the shape of the
efficient frontier; previously, there was no way to differentiate
among frontier-efficient units. A new tiered DEA (TDEA) model
partitions decision units into layers of relative efficiency (a sketch
of this tiering procedure appears after this list); this new
categorization of inefficient units has significant meaning from a
process-improvement standpoint. Beyond these models, the incorporation
of user-defined constraints on the emphasis a DEA model can place on
its input/output combinations allows management to focus the analysis
and limit the degree of tradeoffs that the DEA would otherwise permit.
- New, computationally efficient algorithms.
The algorithms for solving the varieties of EDEA mirror those for
conventional DEA, but TDEA models require substantially greater
computational effort, simply because of the number of linear
programming problems involved: in the worst case, an analysis of n
decision units requires solving roughly n^2/2 linear programs.
To address this computational burden, a new solution approach was
developed. The hierarchical DEA (HDEA) solution methodology is a
divide-and-conquer algorithm similar in nature to merge sorting and
sorting networks (a sketch appears after this list). The HDEA
procedure yields a 6- to 12-fold speedup over non-hierarchical
procedures and permits solution of problems with thousands of
organizations, not the usual dozens [3].
- Empirical testing and validation.
The ability to handle large problem sets makes possible our
large-scale benchmarking of the U.S. banking industry. Our industrial
partner, the Federal Reserve Bank of Dallas, has constructed a massive
data set for this project containing quarterly observations of 29
variables for every U.S. bank over a ten-year period. This unusually
rich source of benchmarking data, with over half a million
observations, will not only allow testing of the benchmarking system
but will also fuel a variety of empirical studies for many years.
A benchmarking of all banking units, based on total organizational
performance [2], is in progress to validate our DEA models for
benchmarking support. Other tests of the new approach are underway at
Pier 1 Imports and the manufacturing firm of Lockheed-Martin.
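To make the tiered model concrete, here is one possible reading of the
TDEA partitioning, reusing the dea_efficiency sketch above: score the
remaining units, peel off the efficient frontier as the next tier, and
repeat. This peel-and-resolve loop is an illustration consistent with
the roughly n^2/2 worst case noted earlier, not necessarily the exact
algorithm developed in this research.

    def tdea_tiers(X, Y, tol=1e-6):
        """Partition units into layers of relative efficiency by
        repeatedly removing the current efficient frontier."""
        remaining = list(range(X.shape[1]))
        tiers = []
        while remaining:
            Xs, Ys = X[:, remaining], Y[:, remaining]
            tier = [remaining[j] for j in range(len(remaining))
                    if dea_efficiency(Xs, Ys, j) >= 1.0 - tol]
            tiers.append(tier)                  # next layer of the frontier
            remaining = [u for u in remaining if u not in tier]
        return tiers

Units in one tier can treat the tier above them as a set of
attainable, intermediate-term benchmarks, as described earlier.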
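Likewise, a minimal sketch of the divide-and-conquer idea behind HDEA,
resting on the property that a unit found inefficient within any
subset of units remains inefficient in the full set. The block size
and tolerance here are arbitrary illustrative choices; [3] gives the
actual procedure.

    def hdea_frontier(X, Y, units, block=64, tol=1e-6):
        """Locate the efficient frontier by divide and conquer:
        score small blocks of units, discard each block's inefficient
        members, then merge the survivors and score them jointly,
        much as merge sort combines sorted sublists."""
        def frontier(members):
            Xs, Ys = X[:, members], Y[:, members]
            return [members[j] for j in range(len(members))
                    if dea_efficiency(Xs, Ys, j) >= 1.0 - tol]
        if len(units) <= block:
            return frontier(units)
        mid = len(units) // 2
        survivors = (hdea_frontier(X, Y, units[:mid], block, tol) +
                     hdea_frontier(X, Y, units[mid:], block, tol))
        return frontier(survivors)

Each inefficient unit then needs only a single scoring pass against
this much smaller frontier set, which is the kind of saving that makes
analyses of thousands of organizations practical.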
Acknowledgements
This work is sponsored by the National Science
Foundation, grant DMII 93-13346, and by the
Federal Reserve Bank of Dallas.
References
[1] A. Charnes, W.W. Cooper, Arie Y. Lewin, and Lawrence M. Seiford
(editors), 1995. Data Envelopment Analysis: Theory, Methodology and
Applications, Kluwer Academic Publishers, Boston.
[2] Richard S. Barr, Lawrence M. Seiford, and Thomas F. Siems, 1993.
An Envelopment-Analysis Approach to Measuring the Managerial Quality
of Banks, Annals of Operations Research, 42, 1-19.
[3] Richard S. Barr and Matthew L. Durchholz, 1995. Parallel and
Hierarchical Decomposition Approaches for Solving Large-Scale Data
Envelopment Analysis Models, Annals of Operations Research, to appear.