Note: this function uses optional parameters to define choices in the problem specification and in the details of the algorithm. If you wish to use default settings for all of the optional parameters, you need only read Sections 1 to 10 of this document. If, however, you wish to reset some or all of the settings please refer to Section 11 for a detailed description of the algorithm and to Section 12 for a detailed description of the specification of the optional parameters.
e05sac is designed to search for the global minimum or maximum of an arbitrary function, using Particle Swarm Optimization (PSO). Derivatives are not required, although these may be used by an accompanying local minimization function if desired. e05sac is essentially identical to e05sbc, but with a simpler interface and with various optional parameters removed; otherwise most arguments are identical. In particular, e05sac does not handle general constraints.
The function may be called by the names: e05sac or nag_glopt_bnd_pso.
Before calling e05sac, e05zkc must be called with optstr set to ‘Initialize = e05sac’. Optional parameters may also be specified by calling e05zkc before the call to e05sac.
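For example, a minimal initialization might look like the sketch below; the argument order shown for e05zkc and the array lengths of 100 are assumptions, so consult the e05zkc document for the exact specification and the minimum lengths required.

/* Illustrative sketch: mandatory initialization of the option arrays
 * before e05sac is called.  Array lengths of 100 are an assumption. */
#include <nag.h>
#include <nage05.h>

#define LIOPTS 100
#define LOPTS  100

void init_e05sac_options(Integer iopts[], double opts[])
{
  NagError fail;
  INIT_FAIL(fail);

  /* Mandatory initialization before e05sac is called. */
  e05zkc("Initialize = e05sac", iopts, LIOPTS, opts, LOPTS, &fail);

  /* Further optional parameters may then be set, e.g.: */
  e05zkc("Repeatability = ON", iopts, LIOPTS, opts, LOPTS, &fail);
}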
3 Description
e05sac uses a stochastic method based on Particle Swarm Optimization (PSO) to search for the global optimum of a nonlinear function, subject to a set of bound constraints on the variables. In the PSO algorithm (see Section 11), a set of particles is generated in the search space, and each particle advances at each iteration to (hopefully) better positions using a heuristic velocity based upon inertia, cognitive memory and global memory. The inertia is provided by a decreasingly weighted contribution from a particle's current velocity, the cognitive memory refers to the best candidate found by an individual particle and the global memory refers to the best candidate found by all the particles. This allows for a global search of the domain in question.
Further, e05sac may be coupled with a selection of local minimization functions. These may be called during the iterations of the heuristic algorithm (the interior phase) to hasten the discovery of locally optimal points, and after the heuristic phase has completed (the exterior phase) to attempt to refine the final solution. Different options may be set for the local optimizer in each phase.
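As an informal illustration of this heuristic (a schematic sketch only, not the exact update rule used by e05sac, which is described in Section 11), a basic PSO advance of a single particle might be written as follows; the helper uniform01 is a placeholder for the internal random number generator.

/* Schematic advance of a single particle: a simplified illustration of
 * the heuristic only, not the precise update used by e05sac.
 * w is the inertia weight, cs and cg the cognitive and global advance
 * coefficients; xbest is the particle's own best position (cognitive
 * memory) and xb the best position found by the whole swarm (global
 * memory). */
#include <stdlib.h>

static double uniform01(void)     /* placeholder random draw on (0,1) */
{
  return (rand() + 1.0) / ((double) RAND_MAX + 2.0);
}

static void advance_particle(int ndim, double x[], double v[],
                             const double xbest[], const double xb[],
                             double w, double cs, double cg)
{
  int j;
  for (j = 0; j < ndim; j++) {
    v[j] = w * v[j]                              /* inertia              */
         + cs * uniform01() * (xbest[j] - x[j])  /* cognitive attraction */
         + cg * uniform01() * (xb[j] - x[j]);    /* global attraction    */
    x[j] += v[j];                                /* move the particle    */
  }
}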
Without loss of generality, the problem is assumed to be stated in the following form:
minimize F(x) subject to l ≤ x ≤ u, x ∈ R^ndim,
where the objective F(x) is a scalar function, x is a vector in R^ndim and the vectors l and u are, respectively, lower and upper bounds for the variables. The objective function may be nonlinear. Continuity of F is not essential. For functions which are smooth and primarily unimodal, faster solutions will almost certainly be achieved by using Chapter E04 functions directly.
For functions which are smooth and multi-modal, gradient dependent local minimization functions may be coupled with e05sac.
For multi-modal functions for which derivatives cannot be provided, particularly functions with a significant level of noise in their evaluation, e05sac should be used either alone, or coupled with e04cbc.
The lower and upper box bounds on the variables are included to initialize the particle swarm into a finite hypervolume, although their subsequent influence on the algorithm is user-determinable (see the optional parameter Boundary in Section 12). It is strongly recommended that sensible bounds are provided for all variables.
e05sac may also be used to maximize the objective function (see the optional parameter Optimize).
Due to the nature of global optimization, unless a predefined target is provided, there is no definitive way of knowing when to end a computation. As such, several stopping heuristics have been implemented in the algorithm. If any of these is achieved, e05sac will exit with NW_SOLUTION_NOT_GUARANTEED, and the argument inform will indicate which criterion was reached. See inform for more information.
In addition, you may provide your own stopping criteria through monmod and objfun.
e05sbc provides a comprehensive interface, allowing for the inclusion of general nonlinear constraints.
4 References
Gill P E, Murray W and Wright M H (1981) Practical Optimization Academic Press
Kennedy J and Eberhart R C (1995) Particle Swarm Optimization Proceedings of the 1995 IEEE International Conference on Neural Networks 1942–1948
Koh B, George A D, Haftka R T and Fregly B J (2006) Parallel Asynchronous Particle Swarm Optimization International Journal for Numerical Methods in Engineering 67 (4) 578–595
Vaz A I and Vicente L N (2007) A Particle Swarm Pattern Search Method for Bound Constrained Global Optimization Journal of Global Optimization 39 (2) 197–219 Kluwer Academic Publishers
5 Arguments
Note: for descriptions of the symbolic variables, see Section 11.
1: ndim – Integer Input
On entry: the number of dimensions.
Constraint:
.
2: npar – Integer Input
On entry: , the number of particles to be used in the swarm. Assuming all particles remain within bounds, each complete iteration will perform at least npar function evaluations. Otherwise, significantly fewer objective function evaluations may be performed.
Suggested value:
.
Constraint:
.
3: xb – double Output
On exit: the location of the best solution found in the search space.
4: fb – double * Output
On exit: the objective value of the best solution.
5: bl – const double Input
6: bu – const double Input
On entry: bl is the array of lower bounds and bu is the array of upper bounds. The ndim entries in bl and bu must contain the lower and upper simple (box) bounds of the variables respectively. These must be provided to initialize the sample population into a finite hypervolume, although their subsequent influence on the algorithm is user-determinable (see the optional parameter Boundary in Section 12).
If bl[i-1] = bu[i-1] for any i, variable i will remain locked to that value regardless of the Boundary option selected.
It is strongly advised that you place sensible lower and upper bounds on all variables, even if your model allows for variables to be unbounded (using the optional parameter Boundary), since these define the initial search space.
Constraints:
bl[i-1] ≤ bu[i-1], for i = 1, 2, …, ndim;
bl[i-1] < bu[i-1] for at least one i.
7: objfun – function, supplied by the user External Function
objfun must, depending on the value of mode, calculate the objective function and/or calculate the gradient of the objective function for an ndim-variable vector x. Gradients are only required if a local minimizer has been chosen which requires gradients. See the optional parameter Local Minimizer for more information.
1: mode – Integer * Input/Output
On entry: mode indicates which functionality is required from this call to objfun:
F(x) should be returned in objf. The value of objf on entry may be used as an upper bound for the calculation: any expected value of F(x) that is greater than objf may be approximated by this upper bound; that is, objf can remain unaltered.
First derivatives only: available first derivatives can be evaluated and returned in vecout. Any unaltered elements of vecout will be approximated using finite differences.
F(x) must be calculated and returned in objf, and available first derivatives can be evaluated and returned in vecout. Any unaltered elements of vecout will be approximated using finite differences.
F(x) must be calculated and returned in objf. The value of objf on entry may not be used as an upper bound.
First derivatives only: all first derivatives must be evaluated and returned in vecout.
F(x) must be calculated and returned in objf, and all first derivatives must be evaluated and returned in vecout.
On exit: if the value of mode is set to be negative, e05sac will exit as soon as possible with NE_USER_STOP and this value returned in inform.
2: ndim – Integer Input
On entry: the number of dimensions.
3: x – const double Input
On entry: x, the point at which the objective function and/or its gradient are to be evaluated.
4: objf – double * Input/Output
On entry: the value of objf passed to objfun varies with the argument mode.
objf is an upper bound for the value of , often equal to the best value of found so far by a given particle. Only objective function values less than the value of objf on entry will be used further. As such this upper bound may be used to stop further evaluation when this will only increase the objective function value above the upper bound.
5: vecout – double Input/Output
On exit: for values of mode requesting available derivatives only, vecout can contain components of the gradient of the objective function, or acceptable approximations; any unaltered elements of vecout will be approximated using finite differences. For values of mode requiring all first derivatives, vecout must contain the full gradient of the objective function; in this case approximation of the gradient is strongly discouraged, and no finite difference approximations will be performed internally (see e04dgc).
6: nstate – Integer Input
On entry: nstate indicates various stages of initialization throughout the function. This allows for permanent global arguments to be initialized the least number of times. For example, you may initialize a random number generator seed.
objfun is called for the very first time. You may save computational time if certain data must be read or calculated only once.
objfun is called for the first time by a NAG local minimization function. You may save computational time if certain data required for the local minimizer need only be calculated at the initial point of the local minimization.
Used in all other cases.
7: comm – Nag_Comm *
Pointer to structure of type Nag_Comm; the following members are relevant to objfun.
user – double *
iuser – Integer *
p – Pointer
The type Pointer will be void *. Before calling e05sac you may allocate memory and initialize these pointers with various quantities for use by objfun when called from e05sac (see Section 3.1.1 in the Introduction to the NAG Library CL Interface).
Note: objfun should not return floating-point NaN (Not a Number) or infinity values, since these are not handled by e05sac. If your code inadvertently does return any NaNs or infinities, e05sac is likely to produce unexpected results.
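As an illustrative sketch only (the prototype follows the argument list above; the quadratic objective, and the absence of any gradient computation, are assumptions appropriate to the default option settings), an objfun might look like:

/* Illustrative objfun sketch.  With the default options no gradient-based
 * local minimizer is used, so vecout need never be set. */
#include <nag.h>

static void NAG_CALL objfun(Integer *mode, Integer ndim, const double x[],
                            double *objf, double vecout[], Integer nstate,
                            Nag_Comm *comm)
{
  Integer j;
  double  f = 0.0;

  /* nstate flags the first call(s); one-off set-up could be keyed on it. */

  /* Evaluate F(x).  For values of mode for which objf is an upper bound
   * on entry, the evaluation could be abandoned once f exceeds it. */
  for (j = 0; j < ndim; j++)
    f += (x[j] - 1.0) * (x[j] - 1.0);

  *objf = f;

  /* Setting *mode to a negative value would force e05sac to terminate
   * with NE_USER_STOP. */
}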
8: monmod – function, supplied by the user External Function
A user-specified monitoring and modification function. monmod is called once every complete iteration after a finalization check. It may be used to modify the particle locations that will be evaluated at the next iteration. This permits the incorporation of algorithmic modifications such as including additional advection heuristics and genetic mutations. monmod is only called during the main loop of the algorithm, and as such will be unaware of any further improvement from the final local minimization. If no monitoring and/or modification is required, monmod may be NULLFN.
Note: the th component of the th particle, , is stored in .
On entry: the npar particle locations, , which will currently be used during the next iteration unless altered in monmod.
On exit: the particle locations to be used during the next iteration.
4: xb – const double Input
On entry: the location of the best solution yet found.
5: fb – double Input
On entry: the objective value of the best solution yet found.
6: xbest – const double Input
Note: the th component of the position of the th particle's cognitive memory, , is stored in .
On entry: the locations currently in the cognitive memory (see Section 11).
7: fbest – const double Input
On entry: the objective values currently in the cognitive memory.
8: itt – const Integer Input
On entry: iteration and function evaluation counters (see description of itt below).
9: comm – Nag_Comm *
Pointer to structure of type Nag_Comm; the following members are relevant to monmod.
user – double *
iuser – Integer *
p – Pointer
The type Pointer will be void *. Before calling e05sac you may allocate memory and initialize these pointers with various quantities for use by monmod when called from e05sac (see Section 3.1.1 in the Introduction to the NAG Library CL Interface).
10: inform – Integer * Input/Output
On entry:
On exit: setting inform to a negative value will cause near immediate exit from e05sac. This value will be returned as inform with NE_USER_STOP. You need not set inform unless you wish to force an exit.
Note: monmod should not return floating-point NaN (Not a Number) or infinity values, since these are not handled by e05sac. If your code inadvertently does return any NaNs or infinities, e05sac is likely to produce unexpected results.
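An illustrative sketch of a monitoring function is given below; the first three arguments (ndim, npar and the particle positions x) are assumed to precede the arguments listed above.

/* Illustrative monmod sketch: report progress and show how an exit may
 * be forced.  Argument names other than those listed above are assumed. */
#include <stdio.h>
#include <nag.h>

static void NAG_CALL monmod(Integer ndim, Integer npar, double x[],
                            const double xb[], double fb,
                            const double xbest[], const double fbest[],
                            const Integer itt[], Nag_Comm *comm,
                            Integer *inform)
{
  /* Report the best objective value found so far. */
  printf("Best objective value so far: %13.5e\n", fb);

  /* The particle positions in x may be modified here before the next
   * iteration (e.g. to apply a problem-specific mutation).            */

  /* Setting *inform to a negative value would force e05sac to exit
   * with NE_USER_STOP. */
}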
9: iopts – Integer Communication Array
Note: the dimension, , of this array is dictated by the requirements of associated functions that must have been previously called. This array MUST be the same array passed as argument iopts in the previous call to e05zkc.
On entry: optional parameter array as generated and possibly modified by calls to e05zkc. The contents of iopts MUST NOT be modified directly between calls to e05sac, e05zkc or e05zlc.
10: opts – double Communication Array
Note: the dimension, , of this array is dictated by the requirements of associated functions that must have been previously called. This array MUST be the same array passed as argument opts in the previous call to e05zkc.
On entry: optional parameter array as generated and possibly modified by calls to e05zkc. The contents of opts MUST NOT be modified directly between calls to e05sac, e05zkc or e05zlc.
11: comm – Nag_Comm *
The NAG communication argument (see Section 3.1.1 in the Introduction to the NAG Library CL Interface).
12: itt – Integer Output
On exit: integer iteration counters for e05sac.
itt[0]: Number of complete iterations.
itt[1]: Number of complete iterations without improvement to the current optimum.
itt[2]: Number of particles converged to the current optimum.
itt[3]: Number of improvements to the optimum.
itt[4]: Number of function evaluations performed.
itt[5]: Number of particles reset.
13: inform – Integer * Output
On exit: indicates which finalization criterion was reached. The possible values of inform are:
inform = 1
The provided objective target (Target Objective Value) has been achieved.
inform = 2
The standard deviation of the location of all the particles is below the set threshold (Swarm Standard Deviation). If the solution returned is not satisfactory, you may try setting a smaller value of Swarm Standard Deviation, or try adjusting the options governing the repulsive phase (the ‘Repulsion’ options).
inform = 3
The total number of particles converged (Maximum Particles Converged) to the current global optimum has reached the set limit. This is the number of particles which have moved to within the Distance Tolerance of the optimum. If the solution is not satisfactory, you may consider lowering the Distance Tolerance. However, this may hinder the global search capability of the algorithm.
inform = 4
The maximum number of iterations without improvement (Maximum Iterations Static) has been reached, and the required number of particles (Maximum Iterations Static Particles) have converged to the current optimum. Increasing either of these options will allow the algorithm to continue searching for longer. Alternatively, if the solution is not satisfactory, re-starting the application several times with Repeatability = OFF may lead to an improved solution.
inform = 5
The maximum number of iterations (Maximum Iterations Completed) has been reached. If the number of iterations since improvement is small, then a better solution may be found by increasing this limit, or by using a local minimizer with corresponding exterior options. Otherwise, if the solution is not satisfactory, you may try re-running the application several times with Repeatability = OFF and a lower iteration limit, or adjusting the options governing the repulsive phase (the ‘Repulsion’ options).
inform = 6
The maximum allowed number of function evaluations (Maximum Function Evaluations) has been reached. As with inform = 5, increasing this limit if the number of iterations without improvement is small, or decreasing this limit and running the algorithm multiple times with Repeatability = OFF, may provide a superior result.
14: fail – NagError * Input/Output
The NAG error argument (see Section 7 in the Introduction to the NAG Library CL Interface).
e05sac will return NE_NOERROR if and only if a finalization criterion has been reached which can guarantee success. This may only happen if the optional parameter Target Objective Value has been set and reached at a point within the search domain. This finalization criterion is not activated using default option settings, and must be explicitly set using e05zkc if required.
e05sac will return NW_SOLUTION_NOT_GUARANTEED if no error has been detected, and a finalization criterion has been achieved which cannot guarantee success. This does not indicate that the function has failed, merely that the returned solution cannot be guaranteed to be the true global optimum.
The value of inform should be examined to determine which finalization criterion was reached.
6 Error Indicators and Warnings
NE_ALLOC_FAIL
Dynamic memory allocation failed.
See Section 3.1.2 in the Introduction to the NAG Library CL Interface for further information.
NE_BAD_PARAM
On entry, argument had an illegal value.
NE_BOUND
On entry, bl[i-1] = bu[i-1] for all i.
Constraint: bu[i-1] > bl[i-1] for at least one i.
On entry, bl[i-1] > bu[i-1] for some i.
Constraint: bu[i-1] ≥ bl[i-1] for all i.
NE_DERIV_ERRORS
Derivative checks indicate possible errors in the supplied derivatives.
Gradient checks may be disabled by setting Verify Gradients = OFF.
NE_INT
On entry, .
Constraint: .
On entry, .
Constraint: , where num_threads is the value returned by the OpenMP environment variable OMP_NUM_THREADS, or num_threads is for a serial version of this function.
NE_INTERNAL_ERROR
An internal error has occurred in this function. Check the function call and any array sizes. If the call is correct then please contact NAG for assistance.
See Section 7.5 in the Introduction to the NAG Library CL Interface for further information.
NE_INVALID_OPTION
Either the option arrays have not been initialized for e05sac, or they have become corrupted.
NE_NO_LICENCE
Your licence key may have expired or may not have been installed correctly.
See Section 8 in the Introduction to the NAG Library CL Interface for further information.
NW_FAST_SOLUTION
If the option Target Warning has been activated, this indicates that the Target Objective Value has been achieved to specified tolerances at a sufficiently constrained point, either during the initialization phase, or during the first two iterations of the algorithm. While this is not necessarily an error, it may occur if:
(i) the target was achieved at the first point sampled by the function. This will be the mean of the lower and upper bounds.
(ii) the target may have been achieved at a randomly generated sample point. This will always be a possibility provided that the domain under investigation contains a point with a target objective value.
(iii) if a local minimizer has been set, then a sample point may have been inside the basin of attraction of a satisfactory point. If this occurs repeatedly when the function is called, it may imply that the objective is largely unimodal, and that it may be more efficient to use the function selected as the local minimizer directly.
Assuming that objfun is correct, you may wish to set a better Target Objective Value, or a stricter Target Objective Tolerance.
NW_SOLUTION_NOT_GUARANTEED
A finalization criterion was reached that cannot guarantee success.
On exit, .
7 Accuracy
If NE_NOERROR (or NW_FAST_SOLUTION) or NW_SOLUTION_NOT_GUARANTEED on exit, a finalization criterion has been reached, depending on user-selected options. As with all global optimization software, the solution achieved may not be the true global optimum. Various options allow for either greater search diversity or faster convergence to a (local) optimum (see Sections 11 and 12).
Provided the objective function and constraints are sufficiently well behaved, if a local minimizer is used in conjunction with e05sac, then it is more likely that the final result will at least be in the near vicinity of a local optimum, and due to the global search characteristics of the particle swarm, this solution should be superior to many other local optima.
Caution should be used in accelerating the rate of convergence, as with faster convergence, less of the domain will remain searchable by the swarm, making it increasingly difficult for the algorithm to detect the basin of attraction of superior local optima. Using the ‘Repulsion’ options described in Section 12 will help to overcome this, by causing the swarm to diverge away from the current optimum once no more local improvement is likely.
On successful exit with guaranteed success, NE_NOERROR is returned. This may only happen if a Target Objective Value is assigned and is reached by the algorithm.
On successful exit without guaranteed success, NW_SOLUTION_NOT_GUARANTEED is returned. This will happen if another finalization criterion is achieved without the detection of an error.
In both cases, the value of inform provides further information as to the cause of the exit.
8 Parallelism and Performance
The code for e05sac is not directly threaded for parallel execution. In particular, none of the user-supplied functions will be called from within a parallel region generated by e05sac.
9 Further Comments
The memory used by e05sac is relatively static throughout. As such, e05sac may be used in problems with a high number of dimensions without concern over computational resource exhaustion, although the probability of successfully locating the global optimum will decrease dramatically with increasing dimensionality.
Due to the stochastic nature of the algorithm, the result will vary over multiple runs. This is particularly true if arguments and options are chosen to accelerate convergence at the expense of the global search. However, the option Repeatability = ON may be set to initialize the internal random number generator using a preset seed, which will result in identical solutions being obtained.
10 Example
This example uses a particle swarm to find the global minimum of the Schwefel function:
In two dimensions the optimum is , located at .
The example demonstrates how to initialize and set the options arrays using e05zkc, how to query options using e05zlc, and finally how to search for the global optimum using e05sac. The function is minimized several times to demonstrate using e05sac alone, and coupled with local minimizers. This program uses the non-default option Repeatability = ON to produce repeatable solutions.
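A skeleton of the call sequence (a sketch only, not the full example program distributed with the Library) might look as follows; the objective function schwefel_obj is a hypothetical name for an objfun such as the one sketched in Section 5, the number of particles and array lengths are assumptions, and the bounds of ±500 are the conventional search domain for the Schwefel function.

/* Skeleton call sequence for e05sac (illustrative sketch only). */
#include <stdio.h>
#include <nag.h>
#include <nage05.h>

#define NDIM   2
#define NPAR   20
#define LIOPTS 100
#define LOPTS  100

static void NAG_CALL schwefel_obj(Integer *mode, Integer ndim,
                                  const double x[], double *objf,
                                  double vecout[], Integer nstate,
                                  Nag_Comm *comm);

int main(void)
{
  double   bl[NDIM], bu[NDIM], xb[NDIM], opts[LOPTS], fb;
  Integer  iopts[LIOPTS], itt[6], inform, j;   /* itt sized for the six counters listed in Section 5 */
  Nag_Comm comm;
  NagError fail;

  INIT_FAIL(fail);

  /* Conventional search domain for the Schwefel function. */
  for (j = 0; j < NDIM; j++) { bl[j] = -500.0; bu[j] = 500.0; }

  /* Initialize the option arrays and request repeatable results. */
  e05zkc("Initialize = e05sac", iopts, LIOPTS, opts, LOPTS, &fail);
  e05zkc("Repeatability = ON", iopts, LIOPTS, opts, LOPTS, &fail);

  /* Global search; NULLFN indicates that no monitoring function is used. */
  e05sac(NDIM, NPAR, xb, &fb, bl, bu, schwefel_obj, NULLFN,
         iopts, opts, &comm, itt, &inform, &fail);

  if (fail.code == NE_NOERROR || fail.code == NW_SOLUTION_NOT_GUARANTEED)
    printf("Best value found: %13.5e (inform = %ld)\n", fb, (long) inform);
  else
    printf("e05sac returned an error: %s\n", fail.message);

  return 0;
}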
11 Algorithmic Details
In outline, each iteration of the main loop of the PSO algorithm advances every particle using its heuristic velocity, applies the required behaviour to any particle outside the bounding box (see the optional parameter Boundary), evaluates the objective at the new particle positions, records any new best cognitive and global positions, and flags whether the current optimum was updated at this iteration.
Additionally, a repulsion phase can be introduced by changing from their default values the options Repulsion Initialize, Repulsion Finalize and Repulsion Particles. The repulsive modification depends on the number of particles that have remained static about the current optimum, and takes effect immediately after the check for an updated optimum in each iteration.
12 Optional Parameters
This section can be skipped if you wish to use the default values for all optional parameters. Otherwise, the following is a list of the optional parameters available; a full description of each optional parameter is provided in Section 12.1.
For each option, we give a summary line, a description of the optional parameter and details of constraints.
The summary line contains:
the keywords;
a parameter value,
where the letters a, i and r denote options that take character, integer and real values respectively;
the default value, where the symbol ε is a generic notation for machine precision (see X02AJC), and Imax represents the largest representable integer value (see X02BBC).
All options accept the value ‘DEFAULT’ in order to return single options to their default states.
Keywords and character values are case insensitive, however they must be separated by at least one space.
For e05sac the maximum length of the argument cvalue used by e05zlc is .
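For example, optional parameters are supplied to e05zkc as single strings of the form ‘keyword = value’. The following sketch (with purely illustrative values) sets a selection of the options listed below, assuming the option arrays have already been initialized with ‘Initialize = e05sac’:

/* Sketch: setting a selection of the optional parameters listed below. */
e05zkc("Boundary = HYPERSPHERICAL", iopts, liopts, opts, lopts, &fail);
e05zkc("Maximum Iterations Completed = 500", iopts, liopts, opts, lopts, &fail);
e05zkc("Distance Tolerance = 1.0e-6", iopts, liopts, opts, lopts, &fail);
/* A single option may be returned to its default state: */
e05zkc("Boundary = DEFAULT", iopts, liopts, opts, lopts, &fail);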
Advance Cognitive
Default
The cognitive advance coefficient, . When larger than the global advance coefficient, this will cause particles to be attracted toward their previous best positions. Setting will cause e05sac to act predominantly as a local optimizer. Setting may cause the swarm to diverge, and is generally inadvisable. At least one of the global and cognitive coefficients must be nonzero.
Advance Global
Default
The global advance coefficient, . When larger than the cognitive coefficient this will encourage convergence toward the best solution yet found. Values will inhibit particles overshooting the optimum. Values cause particles to fly over the optimum some of the time. Larger values can prohibit convergence. Setting will remove any attraction to the current optimum, effectively generating a Monte Carlo multi-start optimization algorithm. At least one of the global and cognitive coefficients must be nonzero.
Boundary
Default
Determines the behaviour if particles leave the domain described by the box bounds. This only affects the general PSO algorithm, and will not pass down to any NAG local minimizers chosen.
This option is only effective in those dimensions for which the lower and upper bounds differ (bl[i-1] ≠ bu[i-1]).
IGNORE
The box bounds are ignored. The objective function is still evaluated at the new particle position.
RESET
The particle is re-initialized inside the domain. and are not affected.
FLOATING
The particle position remains the same, however the objective function will not be evaluated at the next iteration. The particle will probably be advected back into the domain at the next advance due to attraction by the cognitive and global memory.
HYPERSPHERICAL
The box bounds are wrapped around an -dimensional hypersphere. As such a particle leaving through a lower bound will immediately re-enter through the corresponding upper bound and vice versa. The standard distance between particles is also modified accordingly.
FIXED
The particle rests on the boundary, with the velocity in the corresponding dimension set to zero.
Distance Scaling
Default
Determines whether distances should be scaled by box widths.
ON
When a distance is calculated between and , a scaled norm is used.
OFF
Distances are calculated as the standard norm without any rescaling.
Distance Tolerance
Default
This is the distance between particles and the global optimum which must be reached for a particle to be considered converged, i.e., such that any subsequent movement of the particle cannot significantly alter the global optimum. Once achieved the particle is reset into the box bounds to continue searching.
Constraint:
.
Function Precision
Default
This parameter defines ε_r, which is intended to be a measure of the accuracy with which the problem function F(x) can be computed. If the supplied value lies outside the permitted range, the default value is used.
The value of ε_r should reflect the relative precision of 1 + |F(x)|; i.e., ε_r acts as a relative precision when |F| is large, and as an absolute precision when |F| is small. For example, if F(x) is typically of order 1000 and the first six significant digits are known to be correct, an appropriate value for ε_r would be 1.0e−6. In contrast, if F(x) is typically of order 1.0e−4 and the first six significant digits are known to be correct, an appropriate value for ε_r would be 1.0e−10. The choice of ε_r can be quite complicated for badly scaled problems; see Chapter 8 of Gill et al. (1981) for a discussion of scaling techniques. The default value is appropriate for most simple functions that are computed with full accuracy. However, when the accuracy of the computed function values is known to be significantly worse than full precision, the value of ε_r should be large enough so that no attempt will be made to distinguish between function values that differ by less than the error inherent in the calculation.
Local Boundary Restriction
Default
Contracts the box bounds used by a box-constrained local minimizer to a smaller box containing the local start point.
Smaller values of this parameter thereby restrict the size of the domain exposed to the local minimizer, possibly reducing the amount of work done by the local minimizer.
Constraint:
.
Local Interior Iterations
Local Interior Major Iterations
Local Exterior Iterations
Local Exterior Major Iterations
The maximum number of iterations or function evaluations the chosen local minimizer will perform inside (outside) the main loop if applicable. For the NAG minimizers these correspond to:
Unless set, these are functions of the parameters passed to e05sac.
Setting an iteration limit to 0 will disable the local minimizer in the corresponding algorithmic region. For example, setting Local Interior Iterations to 0 and Local Exterior Iterations to a positive value will cause the algorithm to perform no local minimizations inside the main loop of the algorithm, and a local minimization with up to that number of iterations after the main loop has been exited.
Constraint:
, .
Local Interior Tolerance
Default
Local Exterior Tolerance
Default
This is the tolerance provided to a local minimizer in the interior (exterior) of the main loop of the algorithm.
Constraint:
,.
Local Interior Minor Iterations
Local Exterior Minor Iterations
Where applicable, the secondary number of iterations the chosen local minimizer will use inside (outside) the main loop. Currently the relevant default values are:
Accurate derivatives must be provided, and will not be approximated internally. Additionally, each call to objfun during a local minimization will require either the objective to be evaluated alone, or both the objective and its gradient to be evaluated. Hence on a call to objfun, or .
Use e04ucc as the local minimizer.
This operates such that any derivatives of the objective function that you cannot supply, will be approximated internally using finite differences.
Either the objective, the objective gradient, or both may be requested during a local minimization, and the value of mode on a call to objfun will vary accordingly.
The box bounds forwarded to this function from e05sac will have been acted upon by the Local Boundary Restriction. As such, the domain exposed may be greatly smaller than that provided to e05sac.
Maximum Function Evaluations
Default
The maximum number of evaluations of the objective function. When reached this will return NW_SOLUTION_NOT_GUARANTEED and inform = 6.
Constraint:
.
Maximum Iterations Completed
Default
The maximum number of complete iterations that may be performed. Once exceeded e05sac will exit with NW_SOLUTION_NOT_GUARANTEED and inform = 5.
Unless set, this adapts to the parameters passed to e05sac.
Constraint:
.
Maximum Iterations Static
Default
The maximum number of iterations without any improvement to the current global optimum. If exceeded e05sac will exit with NW_SOLUTION_NOT_GUARANTEED and inform = 4. This exit will be hindered by setting Maximum Iterations Static Particles to larger values.
Constraint:
.
Maximum Iterations Static Particles
Default
The minimum number of particles that must have converged to the current optimum before the function may exit due to Maximum Iterations Static with NW_SOLUTION_NOT_GUARANTEED and inform = 4.
Constraint:
.
Maximum Particles Converged
Default
The maximum number of particles that may converge to the current optimum. When achieved, e05sac will exit with NW_SOLUTION_NOT_GUARANTEED and inform = 3. This exit will be hindered by setting ‘Repulsion’ options, as these cause the swarm to re-expand.
Constraint:
.
Maximum Particles Reset
Default
The maximum number of particles that may be reset after converging to the current optimum. Once achieved no further particles will be reset, and any particles within the Distance Tolerance of the global optimum will continue to evolve as normal.
Constraint:
.
Maximum Variable Velocity
Default
Along any dimension , the absolute velocity is bounded above by . Very low values will greatly increase convergence time. There is no upper limit, although larger values will allow more particles to be advected out of the box bounds, and values greater than may cause significant and potentially unrecoverable swarm divergence.
Constraint:
.
Optimize
Default
Determines whether to maximize or minimize the objective function.
MINIMIZE
The objective function will be minimized.
MAXIMIZE
The objective function will be maximized. This is accomplished by minimizing the negative of the objective.
Repeatability
Default
Allows for the same random number generator seed to be used for every call to e05sac. is recommended in general.
OFF
The internal generation of random numbers will be nonrepeatable.
ON
The same seed will be used.
Repulsion Finalize
Default
The number of iterations performed in a repulsive phase before re-contraction. This allows a re-diversified swarm to contract back toward the current optimum, allowing for a finer search of the near optimum space.
Constraint:
.
Repulsion Initialize
Default
The number of iterations without any improvement to the global optimum before the algorithm begins a repulsive phase. This phase allows the particle swarm to re-expand away from the current optimum, allowing more of the domain to be investigated. The repulsive phase is automatically ended if a superior optimum is found.
Constraint:
.
Repulsion Particles
Default
The number of particles required to have converged to the current optimum before any repulsive phase may be initialized. This will prevent repulsion before a satisfactory search of the near optimum area has been performed, which may happen for large dimensional problems.
Constraint:
.
Seed
Default
Sets the random number generator seed to be used when Repeatability = ON. If set to 0, the default seed will be used. Otherwise, the absolute value of Seed will be used to generate the random number generator seed.
Swarm Standard Deviation
Default
The target standard deviation of the particle distances from the current optimum. Once the standard deviation is below this level, e05sac will exit with NW_SOLUTION_NOT_GUARANTEED and inform = 2. This criterion will be penalized by the use of ‘Repulsion’ options, as these cause the swarm to re-expand, increasing the standard deviation of the particle distances from the best point.
Constraint:
.
Target Objective
Default
Target Objective Value
Default
Activate or deactivate the use of a target value as a finalization criterion. If active, then once the supplied target value for the objective function is found (beyond the first iteration if Target Warning is active) e05sac will exit with NE_NOERROR and inform = 1. Other than checking for feasibility only, this is the only finalization criterion that guarantees that the algorithm has been successful. If the target value was achieved at the initialization phase or first iteration and Target Warning is active, e05sac will exit with NW_FAST_SOLUTION. This option may take any real value, or the character ON/OFF as well as DEFAULT. If this option is queried using e05zlc, the current target value will be returned in rvalue, and cvalue will indicate whether this option is ON or OFF. The behaviour of the option is as follows; a usage sketch is given after this list.
(real value)
Once a point is found with an objective value within the Target Objective Tolerance of the supplied value, e05sac will exit successfully with NE_NOERROR and inform = 1.
OFF
The stored target value will remain stored, however it will not be used as a finalization criterion.
ON
The stored target value will be used as a finalization criterion.
DEFAULT
The stored target value will be reset to its default value, and this finalization criterion will be deactivated.
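For instance, the following sketch (with a purely illustrative target value, and assuming the option arrays have been initialized) stores a target value, temporarily deactivates the criterion, and later re-activates it with the stored value:

/* Sketch: the stored target persists while the criterion is switched OFF,
 * and is used again once the option is set back to ON. */
e05zkc("Target Objective Value = -837.96", iopts, liopts, opts, lopts, &fail);
e05zkc("Target Objective = OFF", iopts, liopts, opts, lopts, &fail);
e05zkc("Target Objective = ON", iopts, liopts, opts, lopts, &fail);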
Target Objective Safeguard
Default
If you have given a target objective value to reach (the value of the optional parameter Target Objective Value), Target Objective Safeguard sets your desired safeguarded termination tolerance for use when the target value is close to zero.
Constraint:
.
Target Objective Tolerance
Default
The optional tolerance to a user-specified target value.
Constraint:
.
Target Warning
Default
Activates or deactivates the error exit associated with the target value being achieved before entry into the main loop of the algorithm, NW_FAST_SOLUTION.
OFF
No error will be returned, and the function will exit normally.
ON
An error will be returned if the target objective is reached prematurely, and the function will exit with NW_FAST_SOLUTION.
Verify Gradients
Default
Adjusts the level of gradient checking performed when gradients are required. Gradient checks are only performed on the first call to the chosen local minimizer if it requires gradients. There is no guarantee that the gradient check will be correct, as the finite differences used in the gradient check are themselves subject to inaccuracies.
OFF
No gradient checking will be performed.
ON
A cheap gradient check will be performed on the gradients corresponding to the objective, as computed by objfun.
OBJECTIVE FULL
A more expensive gradient check will be performed on the gradients corresponding to the objective, as computed by objfun.
Weight Decrease
Default
Determines how particle weights decrease.
OFF
Weights do not decrease.
INTEREST
Weights decrease through compound interest as , where is the and is the current number of iterations.
LINEAR
Weights decrease linearly following , where is the iteration number and is the maximum number of iterations as set by .
Weight Initial
Default
The initial value of any particle's inertial weight, , or the minimum possible initial value if initial weights are randomized. When set, this will override or , and as such these must be set afterwards if so desired.
Constraint:
.
Weight Initialize
Default
Determines how the initial weights are distributed.
INITIAL
All weights are initialized at the initial weight, , if set. If has not been set, this will be the maximum weight, .
MAXIMUM
All weights are initialized at the maximum weight, .
RANDOMIZED
Weights are uniformly distributed in or if has been set.
Weight Maximum
Default
The maximum particle weight, .
Constraint:
(If has been set then .)
Weight Minimum
Default
The minimum achievable weight of any particle, . Once achieved, no further weight reduction is possible.
Constraint:
(If has been set then .)
Weight Reset
Default
Determines how particle weights are re-initialized.
INITIAL
Weights are re-initialized at the initial weight if set. If has not been set, this will be the maximum weight.
MAXIMUM
Weights are re-initialized at the maximum weight.
RANDOMIZED
Weights are uniformly distributed in or if has been set.