Computational Linear Programming
John Tomlin (IBM Corporation)
This survey will cover some topics in the formulation of
large and complex linear programs and discuss solution
strategies for these problems.
Chair: Ramesh Sharda (Oklahoma State University)
1. On Sharing Decision Technologies Over the WWW
Hemant Bhargava (Naval Postgraduate School), Ramayya Krishnan
(Carnegie Mellon University) and Rudolf Muller (Humboldt
University, Germany)
The emergence and popularity of the World Wide Web offers an
exciting opportunity for the management science (MS) and
information systems (IS) communities to offer their ideas and
technologies to a larger audience than ever before, as well as
to share these technologies within the MS and IS communities.
Our efforts in this area center on developing an environment
called DecisionNet, which creates an electronic market of
decision technologies in which consumers can use, over the
Internet, technologies that reside and execute on providers'
machines, without having to "own" them. User functionality in
this market is provided by agents that mediate the
interactions (e.g., contact, data exchange, billing) between
consumers and providers.
The set of problems that we will address can be described as
follows. A principal objective of our research is to allow
decision technology providers---presumably mathematical
modelers and decision scientists---to offer their technologies
in the market without having to engage in significant
re-programming of the technology. This feature is critical,
for otherwise there may be no market at all. Our original
premise, since substantiated by our prototype implementation,
was that the additional effort required to integrate an
existing decision technology into a ``use-oriented''
electronic market can be performed, in a general
automated manner, by agents in the market. Given that
these agents must perform a variety of functions (such as
registration, execution, and billing) for different players
over a diverse collection of technologies---effectively
without having the technology itself---what information do
they need about the technologies, how should they reason with
the information, and how should they be designed and
programmed?
2. Using Many Idle Workstations for Solving GAMS Models
Michael C. Ferris (University of Wisconsin)
The Computer Sciences Department at the University of
Wisconsin harnesses the power of its workstations using the
CONDOR system, a mechanism that allows jobs to be run on idle
workstations in the department. This work exploits that
enormous computational resource.
In many real applications, a sequence of costly but
independent mathematical programs must be solved, typically
using a loop statement within a modeling language. Typical
examples include scenario analysis, sensitivity analysis, and
cross-validation in machine learning.
We show how a simple modification to the GAMS source file
allows such programs to be solved independently on other
workstations and the results recovered in the original model.
Some details of how this procedure is carried out will be
given, along with some examples of the use of this procedure
in sensitivity analysis for equilibrium models.
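The pattern the abstract describes, farming a loop of independent solves out to many machines and gathering the results, can be sketched in plain Python. In this sketch, `solve_scenario` is a hypothetical stand-in for one costly GAMS model solve, and a local process pool stands in for Condor's pool of idle workstations; neither GAMS nor Condor is actually invoked.

```python
from multiprocessing import Pool

def solve_scenario(demand):
    # Stand-in for one costly, independent model solve: pick the
    # cheaper of two hypothetical supply options for this demand level.
    cost = min(3.0 * demand, 2.0 * demand + 10.0)
    return demand, cost

if __name__ == "__main__":
    scenarios = [5.0, 10.0, 20.0, 40.0]   # the "loop" in the modeling language
    with Pool() as pool:                  # one worker per available core
        results = dict(pool.map(solve_scenario, scenarios))
    for d in sorted(results):
        print(f"demand={d:5.1f}  cost={results[d]:6.1f}")
```

Because the solves share no state, the only coordination needed is dispatching the scenario data and collecting the answers back into the original model, which is exactly the modification to the GAMS source file that the talk describes.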
3. Introduction of the Interactive Transactions of ORMS
Ramesh Sharda (Oklahoma State University)
We describe the purpose and scope of the new electronic
journal to be published by INFORMS. Interactive Transactions
of ORMS will publish scholarly articles that take advantage
of the interactivity offered by the World Wide Web.
Bibliographic papers, as well as papers that use the
additional capabilities of the WWW beyond the print medium,
will be the primary focus of this journal. This session describes
the journal, and includes demonstrations of sample papers
that have been written for the journal.
Chair: Stavros Zenios (University of Cyprus)
1. Productivity in Financial Services: Empirical Analysis of
Process, Human Resource and Technology Management.
Patrick T. Harker (University of Pennsylvania) and Francis Frei
(University of Rochester)
This paper summarizes the first three years of a multi-year
effort to understand the drivers of performance in financial
services organizations with particular attention given to the
role of technology, human resource and process management in
creating high-performance financial institutions. Financial
services are the largest single consumer of information
technology in the economy, investing $38.7 billion in 1991.
This investment has had a profound effect on the
overall structure of the industry. However, its effect on
financial performance of the industry remains elusive. Why
this productivity paradox exists is an important part of the
project described herein. Our approach to studying the
performance of technology in financial services relies heavily
on a process-oriented methodology. That is, we must look at
the production processes that technology is meant to support
in order to fully understand the key success factors in
technology management. In addition, we shall not focus on
traditional measures of productivity (e.g., transactions per
FTE) but rather, on the key competitive measure of
performance in order to highlight the process/technology
management success factors for financial institutions. This
paper presents a detailed analysis of the management of key
production processes in retail banking and the role that
technology can and must play in their efficiency. More
specifically, a sample of key production processes in over
100 major U.S. banks is used to ascertain the efficiency of
processes relative to the "best in class." By using a variation
on frontier-estimation (DEA-like) techniques, the impact of
technology and process management practices on a variety of
value-added measures is explained.
2. Predicting Bank Failure Using DEA to Assess Managerial Quality
Thomas F. Siems (Federal Reserve Bank) and Richard S. Barr
(Southern Methodist University)
Presented are new failure-prediction models for detecting a
bank's troubled status up to two years prior to insolvency
using publicly available data and a new category of
explanatory variable to capture the elusive, yet critical,
element of institutional success: management quality. Quality
is assessed using Data Envelopment Analysis (DEA), which views
a bank as transforming multiple inputs into multiple outputs,
focusing on the primary functions of attracting deposits and
making loans. The new models are robust and accurate, as
verified by in-depth empirical evaluations, cost sensitivity
analyses, and comparisons with other published approaches.
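The DEA building block behind such models can be made concrete with a small sketch: the input-oriented CCR envelopment LP scores each decision-making unit against the frontier spanned by its peers, treating a bank as transforming multiple inputs into multiple outputs. The three-bank data set below is hypothetical, and SciPy's `linprog` is used as a generic LP solver; this is an illustration of standard DEA, not the authors' specific failure-prediction model.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o (columns of X, Y are DMUs)."""
    m, n = X.shape                              # m inputs, n DMUs
    s = Y.shape[0]                              # s outputs
    c = np.r_[1.0, np.zeros(n)]                 # minimise theta
    A_in = np.hstack([-X[:, [o]], X])           # X @ lam <= theta * x_o
    A_out = np.hstack([np.zeros((s, 1)), -Y])   # Y @ lam >= y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.fun                              # theta in (0, 1]; 1.0 = efficient

# Three hypothetical banks: inputs = staff, branches; outputs = deposits, loans.
X = np.array([[10.0, 20.0, 20.0],
              [ 5.0,  5.0, 10.0]])
Y = np.array([[100.0, 200.0, 150.0],
              [ 60.0, 120.0,  80.0]])
for o in range(3):
    print(f"bank {o}: efficiency = {ccr_efficiency(X, Y, o):.3f}")
```

A score below 1.0 means some convex combination of peer banks produces at least the same outputs from proportionally fewer inputs, which is the sense in which DEA proxies management quality here.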
3. A DEA Approach For Assessing Intertemporal Operating Efficiency
in Seasonal Environments
Andreas C. Soteriou, Stavros A. Zenios, Emmy Gabriel (University
of Cyprus)
Over the last few years, Data Envelopment Analysis (DEA) has
been gaining considerable attention as an efficiency-evaluation
technique among for-profit organizations where the underlying
input-output transformation relationship is neither known nor
easily identified. DEA is typically used to compare a group of
decision-making units (DMUs), identify relatively inefficient
units, and provide insights toward reducing their
inefficiencies. In this paper we present a DEA formulation
which can be used to identify inefficient units when DMUs
operate in highly seasonal environments. It is important to
establish whether efficient DMUs maintain their efficiency
throughout a time horizon or whether their efficiency erodes
depending on the seasonality pattern. This formulation does
not necessarily assume the existence of a different
transformation process across time periods, a basic assumption
in the traditional "window analysis" approach. It can be used
to provide a better picture about the efficiency of a DMU
across time, as well as provide inter-temporal strategic
insights on benchmarking the effects of the external
environment. Results from an empirical study undertaken in a
banking institution are reported.
Chair: Andrew Boyd (Texas A&M University)
1. Visualizing Mixed Integer Programming
Ed Rothberg (SiliconGraphics)
When solving mixed integer programming problems, prudent
choices of a few important parameters (node selection
strategies, tree traversal strategies, branching priorities,
etc.) can dramatically reduce the time required to solve a
problem. Unfortunately, the best choices for these parameters
are problem dependent. This talk will describe a tool we are
building, called MIPVIS, that depicts information about the
progress of a MIP solution in graphical form. This graphical
information allows users to build intuition about the nature
of their particular problems, often allowing them to make
better decisions about the strategies used to solve the
problem.
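The claim that node-selection strategy can dramatically change solution effort is easy to illustrate on a toy problem. The sketch below (not MIPVIS, and with hypothetical data) runs a tiny 0/1 knapsack branch and bound in which the node-selection strategy is switchable between best-first (a heap ordered by the LP-relaxation bound) and depth-first (a LIFO stack), so the node counts of the two strategies can be compared on the same instance.

```python
import heapq

def knapsack_bnb(values, weights, cap, best_first=True):
    """Toy 0/1 knapsack branch and bound; returns (best value, nodes explored)."""
    n = len(values)
    order = sorted(range(n), key=lambda i: values[i] / weights[i], reverse=True)

    def bound(level, value, room):
        # Fractional (LP-relaxation) upper bound from the remaining items.
        b = value
        for i in order[level:]:
            if weights[i] <= room:
                room -= weights[i]
                b += values[i]
            else:
                b += values[i] * room / weights[i]
                break
        return b

    best, nodes = 0, 0
    frontier = [(-bound(0, 0, cap), 0, 0, cap)]   # (-bound, level, value, room)
    while frontier:
        node = heapq.heappop(frontier) if best_first else frontier.pop()
        ub, level, value, room = node
        nodes += 1
        if -ub <= best:
            continue                              # pruned by bound
        best = max(best, value)
        if level == n:
            continue
        i = order[level]
        children = []
        if weights[i] <= room:                    # branch: take item i
            children.append((-bound(level + 1, value + values[i],
                                    room - weights[i]),
                             level + 1, value + values[i], room - weights[i]))
        children.append((-bound(level + 1, value, room),  # branch: skip item i
                         level + 1, value, room))
        for child in children:
            if best_first:
                heapq.heappush(frontier, child)
            else:
                frontier.append(child)
    return best, nodes

best, nodes = knapsack_bnb([60, 100, 120], [10, 20, 30], 50, best_first=True)
print(best, nodes)   # optimal value 220
```

Plotting how `nodes` evolves under each strategy is precisely the kind of picture a visualization tool can give a user to build intuition about their own problem.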
2. How Much Communication Does Parallel Branch and Bound Need?
Jonathan Eckstein (Rutgers University)
Earlier work using the CM-5 and a selection of MIPLIB problems
established the broad scalability of the classical general MIP
branch-and-bound algorithm in parallel environments with fast,
low-overhead, multipath communication.
If one modifies the implementation to greatly reduce
communication, how does scalability suffer?
3. New Lower Bounds for Job-Shop Scheduling
E. Andrew Boyd and Rusty Burlingame (Texas A&M University)
The job-shop scheduling problem was deemed by Applegate and
Cook to be "one of the most computationally stubborn
combinatorial problems considered to date." Many problems
containing as few as 15 jobs and 15 machines remain unsolved
to this day. We take steps toward solving these problems by
introducing a new lower bounding procedure. Computational
results are presented.
Chair: Bruce L. Golden (University of Maryland)
1. A Large-Scale Neural Network for Airline Forecasting in Revenue
Management
Sharon Hormby, Xiaoyun Sun, and Erik Brauner (BehavHeuristics,
Inc.)
A large-scale neural network forecasting system for decision
support in airline seat inventory management has been
implemented by BehavHeuristics, Inc. and is in use at USAir
and Icelandair. This presentation will discuss the various
issues related to the application of neural network
technology, including knowledge representation, dynamic model
configuration, multi-network architectures, and our unique,
non-traditional training algorithms. The forecasting accuracy
and revenue benefit over years of performance monitoring and
data trials will be discussed.
2. Tractable Theories for the Synthesis of Neural Networks
V. Chandru (Indian Institute of Science), M. Vidyasagar, and V.
Vinay (Centre for AI and Robotics, Bangalore).
The radius of direct attraction of a discrete neural network
is a measure of the stability of the network. It is known that
Hopfield networks designed using Hebb's rule have a radius of
direct attraction of O(n/p), where n is the size of the input
patterns and p is the number of them. This lower bound is
tight if p is no larger than 4. We construct a family of such
networks with radius of direct attraction O(n/sqrt(p log p))
for any p > 5. The techniques used to prove the result
led us to the first polynomial-time algorithm for designing a
neural network with maximum radius of direct attraction
around arbitrary input patterns. The optimal synaptic matrix
is computed using the ellipsoid method of linear programming
in conjunction with an efficient separation oracle.
Restrictions of symmetry and non-negative diagonal entries in
the synaptic matrix can be accommodated within this scheme.
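A minimal sketch of the Hebb's-rule construction the abstract starts from, assuming bipolar (+/-1) patterns: the synaptic matrix is the sum of outer products of the stored patterns with the diagonal zeroed, and a probe corrupted within the radius of direct attraction is pulled back to the stored pattern. The two stored patterns below are hypothetical.

```python
import numpy as np

def hebb_weights(patterns):
    """Hebb's rule: W = sum over stored patterns of p p^T, zero diagonal."""
    W = patterns.T.astype(float) @ patterns
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, max_steps=10):
    """Iterate synchronous sign updates until a fixed point is reached."""
    state = state.copy()
    for _ in range(max_steps):
        new = np.where(W @ state >= 0, 1, -1)
        if np.array_equal(new, state):
            break
        state = new
    return state

# Two orthogonal +/-1 patterns of length n = 8 (so p = 2).
p1 = np.array([1, 1, 1, 1, -1, -1, -1, -1])
p2 = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = hebb_weights(np.vstack([p1, p2]))

probe = p1.copy()
probe[0] = -probe[0]            # corrupt one bit: inside the basin of p1
print(np.array_equal(recall(W, probe), p1))   # True: p1 is recovered
```

The designed-synaptic-matrix question the authors address is how to choose W, beyond this simple outer-product rule, so that the basin around each stored pattern is as large as possible.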
We also present a stability theory for generalized analog
neural networks with energy functions that are multilinear
polynomials. The main conclusions are that (i) the network is
totally stable, and (ii) "almost all" trajectories of the
network converge to a local minimum of the energy function.
This is the largest class of functions for which sharp
convergence properties are known. As a consequence, many
combinatorial optimization problems, particularly in the class
PLS (Polynomial Local Search), may be modeled as analog neural
networks in a natural way. We illustrate these ideas on the
WEIGHTED CNF SAT problem which is known to be PLS-complete.
3. Neural Networks In Practice: Survey Results
Bruce L. Golden (University of Maryland), Edward A. Wasil
(American University), Steven P. Coy (University of Maryland),
and Cihan H. Dagli (University of Missouri-Rolla)
Neural networks will soon lose their "sex appeal."
Neural networks will rapidly "infiltrate" everyday life.
-- Alianna J. Maren (1990)
In early 1995, we prepared a questionnaire to determine how
companies are using neural networks to solve their problems.
We mailed this questionnaire to approximately 1050
individuals. Some of these were academics, but most worked in
industry. The questions focus on these key issues: (1) neural
network environment, (2) neural network results, and (3)
recent project success. We received 131 responses. In this
paper, we report on our findings.