The CONOPT Story

By Arne Stolbjerg Drud, July 2025.

CONOPT has been a project in progress for over 50 years. It started as part of my Ph.D. project with the title “Methods for control of complex dynamic systems – illustrated by econometric models” from 1976.

Some fellow students at IMSOR (Institute of Mathematical Statistics and Operations Research, Technical University of Denmark) tried to apply optimization to SMEC (Simulation Model of the Economic Council), a new economic model that at the time had around 50 equations per year. The initial work used Jean Abadie’s GRG66 code on an IBM-360 mainframe. The challenge was that models with more than 50 constraints exceeded the daytime memory limit of 104 Kbytes and had to be submitted for overnight processing.

After participating in a course given by John Reid on “Sparse Matrix Techniques”, I decided to try to combine Abadie’s Optimal Control version of the GRG method with these new sparse matrix techniques. The theoretical part was developed in my Ph.D. project (where Abadie was one of the external examiners) and the first software implementation was started in 1976 in a project funded by the Danish Research Council. It was not finished on time (systematic software development was yet to come) and the project continued for several years at various departments at the Technical University. An optimal control package called OptCon existed already; my software was therefore named CONOPT.

The first tests with practical models required information on the sparsity pattern of the model and subroutines that could compute function values and derivatives. Everything was handwritten, and everything had to be consistent. It could take several weeks to make a small model ready for optimization, and there was no way to make sure the model was implemented correctly. Therefore, I started to use Formula Translation software with built-in differentiation capabilities to automate the implementation.

Many econometric models were developed in the 1970s, and many people were working on applying optimization to these models; we met at conferences. Leon Lasdon (author of one of my textbooks, Optimization Theory for Large Systems) was an advisor for Joe Mantell in a Ph.D. project that worked on optimizing an 8000-equation model from the US Federal Reserve. In general, the econometric models grew much faster than the capabilities of the optimization software, so there were few real successful applications. I did not solve a model with 8000 equations for another 20 years.

Through Leon Lasdon, I met Alex Meeraus from the Analytic Support Unit in the Development Research Center (DRC) at the World Bank. Alex had just started the GAMS project, and his interest in general model development, model representation, and optimization combined with my interests in econometric models and optimal control resulted in a 6-month research assistant job at the World Bank in 1979. It was followed by a permanent position at DRC from 1981 to 1987. During this period the first practical versions of GAMS and CONOPT were made available, both for users inside and outside the World Bank. The appearance of Personal Computers, and of software that could be used to formulate and solve models on them, gave a big boost to the practical use of optimization.

In the late 1980s the World Bank reduced, for political reasons, the use of planning and optimization models, and the Analytic Support Unit was closed. I returned to Denmark, where in 1987 I started ARKI Consulting and Development as a one-person business. Shortly after, Alex started GAMS Development Corp.

GAMS and ARKI worked closely together for the next 10-15 years to improve the usefulness of our software. A large part of the development, both of GAMS and CONOPT, was financed by large industrial users and driven by their practical needs. We were only paid when certain performance targets were reached. At the same time, intensive collaboration with researchers proved the usefulness of numerical optimization and created new demand when students finished at university and moved to industry. We had users in many areas, e.g. in economics, the oil business, finance, chemical engineering, and agricultural economics. Demand fluctuated wildly within each group, but there were always users somewhere, which gave us a stable business.

During this period the optimal control part of CONOPT was removed. The special code for dynamic models was difficult to maintain and general sparse procedures proved to be more efficient than the specialized code. Today, CONOPT is, despite the name, a code for general nonlinear programming models without any optimal control components.

ARKI has worked with modeling systems other than GAMS, and CONOPT has over the years been available as a solver under AIMMS, AMPL, ASCEND, LINDO, MPL, and TOMLAB. There was also a subroutine version, but its interface was more difficult to use, and since ARKI, with a single employee, could only provide limited support, the number of licenses was limited.

CONOPT has many components that are similar to those used in the Simplex method for Linear Programming (LP), and the many improvements in LP techniques quickly found their way into CONOPT. In 1987 a model with 300 constraints was considered large; a decade later models with 5000 constraints were routinely being solved. Better hardware and increased memory helped, but most of the improvements came from better software that could take advantage of the hardware and memory.

Since the World Bank years, GAMS has had many users of Economic General Equilibrium Models, and the special CNS model type (Constrained Nonlinear Systems) has been widely used. CONOPT implemented special procedures for CNS that allow very large models to be solved efficiently. Around the year 2000 ARKI got a local client with large models. The DREAM group (Danish Rational Economic Agent Model) wanted to solve CNS models with more than 1 million constraints and variables. Work over the next decade removed many computational bottlenecks and pushed the limits for both CONOPT and GAMS. The software is now being used for the recent Danish GREEN REFORM model with more than 7 million constraints and variables.
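In generic notation (mine, not taken from the GAMS or CONOPT manuals), a CNS model can be sketched roughly as a square system of equations with bounds but no objective:

\[
F(x) = 0, \qquad \ell \le x \le u, \qquad F : \mathbb{R}^n \to \mathbb{R}^n ,
\]

i.e. there are as many constraints as variables, so the solver only has to find a feasible point rather than optimize over a feasible region; the bounds mainly keep the iterates in a region where the functions are well behaved.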

Throughout the years there has been continuous development based on user demand, and there have so far been four major versions. The first, CONOPT1, was a GRG implementation with sparse matrix techniques, but otherwise similar to contemporary dense-matrix solvers. Later versions added scaling and an interface to routines that could provide 2nd derivatives. The routines used to find search directions were updated to use LP and Quadratic Programming (QP) techniques, and the selection between these techniques was based on statistics from the solution process. Pre-processing was introduced early on to identify simple pre-triangular and post-triangular components. Pre-processing also split the model into components that could be solved independently, which made models easier to solve. The pre-processor has recently been improved so that intermediate variables (defined variables, identified through definitional constraints) are logically eliminated and the resulting model becomes smaller. Interval arithmetic now allows CONOPT to find more pre-triangular, post-triangular, and definitional parts, and an even smaller resulting model.
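As a purely illustrative sketch of what the pre-processor looks for, with made-up variable names and functions rather than anything from a real model, consider constraints of the form

\[
\begin{aligned}
x &= 5, && \text{pre-triangular: } x \text{ can be fixed before the iterations start,}\\
y &= g(x, z), && \text{definitional: the intermediate variable } y \text{ can be eliminated by substitution,}\\
w &= h(x, y, z), && \text{post-triangular: } w \text{ appears in no other constraint and can be computed afterwards.}
\end{aligned}
\]

Only the part involving the remaining variables, here z, is left for the iterative optimization, and bound information propagated with interval arithmetic helps identify more constraints of these three kinds.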

CONOPT was for many years a one-man show, at least for the internal parts. The interfaces to modeling systems were always joint efforts. With many loyal users and user groups, some of whom I have known and worked with for ten or twenty years, it has been important for me to secure a future for CONOPT. Handing over a large piece of software from a single developer to a group is a large and risky undertaking. Discussions with potential partners started in the late 2010s but were delayed during the Corona years. GAMS was, after many years of collaboration, a prime candidate. To explore the possibilities and evaluate the difficulties, I worked in parallel with a group of talented people from GAMS with many different backgrounds. They moved the development process from a simple single-user, serial environment into a multi-user, parallel, and distributed environment. I learned about efficient software development and continuous automatic testing, and they learned about the many internal secrets of CONOPT. After a year of joint work, we decided it would work, and we are now in the middle of a very successful two-year transfer process.

I am happy that the life of CONOPT is secure and no longer limited by me.