CONOPT
CONOPT is a solver for large-scale nonlinear optimization problems (NLP). CONOPT is a feasible path solver based on the proven GRG method with many newer extensions, and it has been designed to be efficient and reliable for a broad class of models. The original GRG method helps achieve reliability and speed for models with a large degree of nonlinearity, i.e., difficult models, and CONOPT is often preferable for very nonlinear models and for models where feasibility is difficult to achieve. Extensions to the GRG method include preprocessing, a special phase 0, linear mode iterations, sequential linear programming, and sequential quadratic programming. These extensions make CONOPT efficient on easier and mildly nonlinear models as well. The multi-method architecture of CONOPT, combined with built-in logic for dynamic algorithm selection, makes CONOPT a strong all-round NLP solver.
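To make the feasible-path / reduced-gradient idea concrete, the following is a minimal, self-contained sketch on a three-variable toy problem. It is not CONOPT code and omits everything that makes CONOPT practical (preprocessing, basis selection, bounds handling, sparse linear algebra, SLP/SQP steps); the problem, variable names, and step rule are all invented for illustration.

```python
import numpy as np

# Toy problem (invented for illustration, not from the CONOPT manual):
#   minimize  (x1-1)^2 + (x2-2)^2 + (x3-3)^2
#   subject to x1 + x2^2 + x3 = 4
# x1 is chosen as the basic variable (its constraint derivative is always 1);
# x2 and x3 are the superbasic variables, i.e. the degrees of freedom.

def f(x):
    return (x[0] - 1)**2 + (x[1] - 2)**2 + (x[2] - 3)**2

def grad_f(x):
    return np.array([2*(x[0] - 1), 2*(x[1] - 2), 2*(x[2] - 3)])

def restore(xs):
    """Restore feasibility: solve the constraint for the basic variable x1."""
    x2, x3 = xs
    return np.array([4.0 - x2**2 - x3, x2, x3])

def reduced_gradient(x):
    g = grad_f(x)
    jac_basic = 1.0                        # dh/dx1
    jac_super = np.array([2*x[1], 1.0])    # dh/d(x2, x3)
    # r = grad_S f - J_S^T J_B^{-T} grad_B f
    return g[1:] - jac_super * (g[0] / jac_basic)

xs = np.array([1.0, 1.0])                  # initial superbasic values
x = restore(xs)                            # start from a feasible point
for it in range(200):
    r = reduced_gradient(x)
    if np.linalg.norm(r) < 1e-6:
        break
    # Backtracking line search on the reduced objective; feasibility is
    # restored before the objective is evaluated, so the path stays feasible.
    step = 1.0
    while f(restore(xs - step * r)) > f(x) - 1e-4 * step * r @ r:
        step *= 0.5
    xs = xs - step * r
    x = restore(xs)

print("solution:", x, "objective:", f(x))  # expect about [0.5, 1.0, 2.5], 1.5
```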
All components of CONOPT have been designed for large and sparse models. Models with over 10,000 constraints are routinely solved, and specialized models with up to 1 million constraints have also been solved with CONOPT. The limiting factor is difficult to define precisely: it is a combination of the number of constraints or variables and the number of superbasic variables, a measure of the degrees of freedom around the optimal point. Models with over 500 superbasic variables can sometimes be slow.
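In standard reduced-gradient terminology (a textbook description, not a statement about CONOPT's internal bookkeeping), the variables of a model with \(n\) variables (including slacks) and \(m\) constraints are partitioned at any trial point into \(m\) basic variables solved from the constraints, a set of nonbasic variables held at their bounds, and the remaining superbasic variables, which are free to move:

\[
n_{\text{superbasic}} \;=\; n - m - n_{\text{nonbasic}} ,
\]

so the superbasic count is exactly the local degrees of freedom referred to above.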
CONOPT is designed to solve smooth, continuous nonlinear optimization problems. This means CONOPT requires that all variables are continuous and all constraints are smooth, i.e., with well-defined first derivatives. The Jacobian (the matrix of first derivatives) is expected to be sparse, which allows for efficient numerical handling. Under these assumptions, CONOPT aims to find a local optimum that satisfies the Karush-Kuhn-Tucker (KKT) conditions.
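For reference, writing the model in the generic smooth form (a textbook formulation, not the exact internal form CONOPT uses)

\[
\min_{x \in \mathbb{R}^n} f(x) \quad \text{subject to} \quad h(x) = 0, \quad g(x) \le 0 ,
\]

a local optimum \(x^*\) satisfies the KKT conditions: there exist multipliers \(\lambda\) and \(\mu \ge 0\) such that

\[
\nabla f(x^*) + \nabla h(x^*)^{T} \lambda + \nabla g(x^*)^{T} \mu = 0, \qquad
h(x^*) = 0, \quad g(x^*) \le 0, \quad \mu_i \, g_i(x^*) = 0 \ \text{for all } i .
\]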
To ensure reliable and efficient performance, we assume:
- all variables are continuous,
- all constraint functions are smooth, with well-defined first derivatives, and
- the Jacobian of the constraints is sparse.
Note that second-order derivatives do not need to be provided; if they are, they will be used internally by some solver components. For models with many degrees of freedom, providing second derivatives can significantly improve performance.
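To see where second-order information enters, recall the sequential quadratic programming iterations mentioned above. In the standard textbook form (the subproblem CONOPT actually builds is an internal detail), each SQP step solves a quadratic model of the Lagrangian over a linearization of the constraints,

\[
\min_{d} \; \nabla f(x_k)^{T} d + \tfrac{1}{2}\, d^{T} \nabla^2_{xx} L(x_k, \lambda_k)\, d
\quad \text{subject to} \quad h(x_k) + \nabla h(x_k)\, d = 0 ,
\]

and the Hessian of the Lagrangian \(\nabla^2_{xx} L\) is where exact second derivatives can replace the approximations the solver would otherwise have to build up.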
While CONOPT is a powerful tool for a wide range of problems, the following should be kept in mind: