library.glopt
Submodule
Module Summary
Interfaces for the NAG Mark 30.2 glopt Chapter.
glopt
- Global Optimization of a Function
Global optimization involves finding the absolute maximum or minimum value of a function (the objective function) of several variables, possibly subject to restrictions (defined by a set of bounds or constraint functions) on the values of the variables.
Whereas the opt submodule treats local nonlinear optimization problems, as well as convex optimization problems (for which local optima are by definition also global), this module handles the search for global optima of non-convex nonlinear optimization problems.
Such problems can be much harder to solve than local optimization problems because it is difficult to determine whether a potential optimum found is global, and because of the nonlocal methods required to avoid becoming trapped near local optima.
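The trap described above, and the multi-start remedy that some solvers in this module employ (for example nlp_multistart_sqp()), can be illustrated with a small self-contained sketch. Note this is a generic, pure-Python illustration of the multi-start idea only, not NAG code: the objective, the crude finite-difference descent, and the grid of start points are all illustrative assumptions.

```python
import math

def f(x):
    """A non-convex 1-D objective: global minimum f(0) = -1, plus local minima."""
    return 0.1 * x * x - math.cos(2.0 * x)

def local_descent(x, step=0.01, iters=2000, h=1.0e-6):
    """Crude steepest descent using a central finite-difference gradient."""
    for _ in range(iters):
        grad = (f(x + h) - f(x - h)) / (2.0 * h)
        x -= step * grad
    return x

def multistart(starts):
    """Run the local search from every start point and keep the best result."""
    best_x, best_f = None, math.inf
    for x0 in starts:
        x = local_descent(x0)
        if f(x) < best_f:
            best_x, best_f = x, f(x)
    return best_x, best_f

# A single local search started at x = 3 gets trapped in a shallow local
# minimum (f approx -0.06); multi-start recovers the global minimum at x = 0.
trapped = local_descent(3.0)
x_star, f_star = multistart([float(i) for i in range(-10, 11)])
```

A local method can only report the stationary point it happens to reach; running it from many starting points and keeping the best result is the simplest nonlocal strategy, and it is the one the multi-start solvers in this module build upon.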
This introduction is a brief guide to the subject of global optimization, designed for the casual user. For further details it may be helpful to consult a more detailed text such as Neumaier (2004). Furthermore, much of the material in the E04 Introduction is also relevant in this context, and it is strongly recommended that you read it.
Some solvers in this module have been integrated into the NAG optimization modelling suite, which provides an easy-to-use and flexible way of defining optimization problems for numerous solvers of the NAG Library. All utility routines from the suite can be used to define a compatible problem to be solved by the integrated solvers; see the E04 Introduction for more details.
See Also
naginterfaces.library.examples.glopt
This subpackage contains examples for the glopt module. See also the Examples subsection.
Functionality Index
Nonlinear programming (NLP) – global optimization
bound constrained
branching algorithm, multi-level coordinate search:
handle_solve_mcs()
branching algorithm, multi-level coordinate search (D):
bnd_mcs_solve()
heuristic algorithm, particle swarm optimization (PSO):
bnd_pso()
generic, including nonlinearly constrained
heuristic algorithm, particle swarm optimization (PSO):
nlp_pso()
multi-start:
nlp_multistart_sqp()
Nonlinear least squares, data fitting – global optimization
generic, including nonlinearly constrained
multi-start:
nlp_multistart_sqp_lsq()
Service functions
option setting functions
initialization:
bnd_mcs_init()
check whether option has been set:
bnd_mcs_option_check()
retrieve character option values:
bnd_mcs_optget_char()
retrieve integer option values:
bnd_mcs_optget_int()
retrieve real option values:
bnd_mcs_optget_real()
supply character option values:
bnd_mcs_optset_char()
supply integer option values:
bnd_mcs_optset_int()
supply option values from character string:
bnd_mcs_optset_string()
supply option values from external file:
bnd_mcs_optset_file()
supply real option values:
bnd_mcs_optset_real()
bnd_pso(), nlp_pso(), nlp_multistart_sqp() and nlp_multistart_sqp_lsq()
For full information please refer to the NAG Library document
https://support.nag.com/numeric/nl/nagdoc_30.2/flhtml/e05/e05intro.html
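The index above lists particle swarm optimization (PSO) as the heuristic behind bnd_pso() and nlp_pso(). As a rough, self-contained sketch of the heuristic itself, the following pure-Python minimizer is a minimal textbook PSO for a bound-constrained problem; it is not NAG's implementation, and the inertia and acceleration coefficients, the swarm size, and the Rastrigin test problem are all illustrative assumptions.

```python
import math
import random

def pso(f, bounds, n_particles=40, iters=300, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimize f over box bounds [(lo, hi), ...] with a basic PSO."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # each particle's best position
    pbest_f = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]     # swarm's best position so far

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Inertia + pull toward personal best + pull toward swarm best.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d], lo), hi)  # clip to the box
            fx = f(pos[i])
            if fx < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fx
                if fx < gbest_f:
                    gbest, gbest_f = pos[i][:], fx
    return gbest, gbest_f

def rastrigin(x):
    """Classic multimodal test function; global minimum 0 at the origin."""
    return sum(xi * xi - 10.0 * math.cos(2.0 * math.pi * xi) + 10.0 for xi in x)

best, best_f = pso(rastrigin, [(-5.12, 5.12)] * 2)
```

Like multi-start, PSO carries no guarantee of reaching the global optimum; it trades that guarantee for a derivative-free search that explores many basins simultaneously, which is why the NAG routines classify it as a heuristic.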
Examples
- naginterfaces.library.examples.glopt.handle_solve_mcs_ex.main()

  Example for naginterfaces.library.glopt.handle_solve_mcs(). Global optimization by multi-level coordinate search.

  >>> main()
  naginterfaces.library.glopt.handle_solve_mcs Python Example Results.
  Global optimization of the Peaks objective function.
  ...
  Final objective value is -6.55113
  Global optimum is (0.22828, -1.62553)
- naginterfaces.library.examples.glopt.nlp_multistart_sqp_lsq_ex.main()

  Example for naginterfaces.library.glopt.nlp_multistart_sqp_lsq(). Global optimization of a sum of squares problem using multi-start.

  Demonstrates catching a NagAlgorithmicWarning and accessing its return_data attribute.

  >>> main()
  naginterfaces.library.glopt.nlp_multistart_sqp_lsq Python Example Results.
  Minimizes the sum of squares function based on Problem 57 in Hock and Schittkowski (1981).
  Solution number 1.
  Final objective value = 0.0142298.