NAG FL Interface
h05abf (best_​subset_​given_​size)

1 Purpose

Given a set of m features and a scoring mechanism for any subset of those features, h05abf selects the best n subsets of size p using a direct communication branch and bound algorithm.

2 Specification

Fortran Interface
Subroutine h05abf ( mincr, m, ip, nbest, la, bscore, bz, f, mincnt, gamma, acc, iuser, ruser, ifail)
Integer, Intent (In) :: mincr, m, ip, nbest, mincnt
Integer, Intent (Inout) :: iuser(*), ifail
Integer, Intent (Out) :: la, bz(m-ip,nbest)
Real (Kind=nag_wp), Intent (In) :: gamma, acc(2)
Real (Kind=nag_wp), Intent (Inout) :: ruser(*)
Real (Kind=nag_wp), Intent (Out) :: bscore(nbest)
External :: f
C Header Interface
#include <nag.h>
void  h05abf_ (const Integer *mincr, const Integer *m, const Integer *ip, const Integer *nbest, Integer *la, double bscore[], Integer bz[],
void (NAG_CALL *f)(const Integer *m, const Integer *drop, const Integer *lz, const Integer z[], const Integer *la, const Integer a[], double score[], Integer iuser[], double ruser[], Integer *info),
const Integer *mincnt, const double *gamma, const double acc[], Integer iuser[], double ruser[], Integer *ifail)
The routine may be called by the names h05abf or nagf_mip_best_subset_given_size.

3 Description

Given Ω = {xi : i ∈ ℤ, 1 ≤ i ≤ m}, a set of m unique features, and a scoring mechanism f(S) defined for all S ⊆ Ω, h05abf is designed to find So1 ⊆ Ω, |So1| = p, an optimal subset of size p. Here |So1| denotes the cardinality of So1, the number of elements in the set.
The definition of the optimal subset depends on the properties of the scoring mechanism. If
f(Si) ≤ f(Sj), for all Sj ⊆ Ω and Si ⊆ Sj   (1)
then the optimal subset is defined as one of the solutions to
maximize f(S) over S ⊆ Ω   subject to   |S| = p
else if
f(Si) ≥ f(Sj), for all Sj ⊆ Ω and Si ⊆ Sj   (2)
then the optimal subset is defined as one of the solutions to
minimize f(S) over S ⊆ Ω   subject to   |S| = p.
If neither of these properties holds then h05abf cannot be used.
As well as returning the optimal subset, So1, h05abf can return the best n solutions of size p. If Soi denotes the ith best subset, for i = 1, 2, …, n−1, then the (i+1)th best subset is defined as the solution to either
maximize f(S) over S ⊆ Ω − {Soj : j ∈ ℤ, 1 ≤ j ≤ i}   subject to   |S| = p
or
minimize f(S) over S ⊆ Ω − {Soj : j ∈ ℤ, 1 ≤ j ≤ i}   subject to   |S| = p
depending on the properties of f.
The solutions are found using a branch and bound method, where each node of the tree is a subset of Ω. Assuming that (1) holds, a particular node, defined by subset Si, can be trimmed from the tree if f(Si) < f^(Son), where f^(Son) is the nth highest score we have observed so far for a subset of size p, i.e., our current best guess of the score for the nth best subset. In addition, because of (1) we can also drop all nodes defined by any subset Sj where Sj ⊆ Si, thus avoiding the need to enumerate the whole tree. Similar short cuts can be taken if (2) holds. A full description of this branch and bound algorithm can be found in Ridout (1988).
Rather than calculate the score at a given node of the tree h05abf utilizes the fast branch and bound algorithm of Somol et al. (2004), and attempts to estimate the score where possible. For each feature, xi, two values are stored, a count ci and μ^i, an estimate of the contribution of that feature. An initial value of zero is used for both ci and μ^i. At any stage of the algorithm where both f(S) and f(S-{xi}) have been calculated (as opposed to estimated), the estimated contribution of the feature xi is updated to
(ci μ^i + [f(S) − f(S−{xi})]) / (ci + 1)
and ci is incremented by 1. Therefore, at each stage μ^i is the mean contribution of xi observed so far and ci is the number of observations used to calculate that mean.
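For example, if xi has been observed ci = 3 times with a current estimate of μ^i = 2.0, and a newly calculated pair of scores gives a contribution f(S) − f(S−{xi}) = 4.0, the estimate is updated to (3 × 2.0 + 4.0)/4 = 2.5 and ci becomes 4.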
As long as ci ≥ k, for the user-supplied constant k, then rather than calculating f(S−{xi}) this routine estimates it using f^(S−{xi}) = f(S) − γ μ^i, or f^(S) − γ μ^i if f(S) has itself been estimated, where γ is a user-supplied scaling factor. An estimated score is never used to trim a node or returned as the optimal score.
Setting k = 0 in this routine will cause the algorithm to always calculate the scores, reverting to the branch and bound algorithm of Ridout (1988). In most cases it is preferable to use the fast branch and bound algorithm, by setting k > 0, unless the score function is iterative in nature, i.e., f(S) must have been calculated before f(S−{xi}) can be calculated.
h05abf is a direct communication version of h05aaf.
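As an illustration of the scoring protocol described above (and specified in detail in Section 5), the following is a minimal sketch of one possible user-supplied subroutine f. It is not part of the Library: the score used here, the sum of a fixed nonnegative weight per feature with the weights assumed to have been placed in ruser(1:m) by the caller, is purely illustrative. Such a score satisfies property (1), so it would be used with mincr = 1. The module name nag_library (for the working precision kind nag_wp) is the usual one for FL Interface example programs and is an assumption of this sketch.

Subroutine f(m,drop,lz,z,la,a,score,iuser,ruser,info)
! Illustrative scoring routine: score of a subset = sum of the
! per-feature weights assumed to be held in ruser(1:m).
  Use nag_library, Only: nag_wp
  Implicit None
  Integer, Intent (In) :: m, drop, lz, z(lz), la, a(la)
  Integer, Intent (Inout) :: iuser(*), info
  Real (Kind=nag_wp), Intent (Inout) :: ruser(*)
  Real (Kind=nag_wp), Intent (Out) :: score(max(la,1))
  Integer :: i, j
  Real (Kind=nag_wp) :: base, total

  total = sum(ruser(1:m))
  ! Combined weight of the features listed in z(1:lz)
  base = 0.0_nag_wp
  Do i = 1, lz
    base = base + ruser(z(i))
  End Do

  If (la==0) Then
    ! Score the single subset defined by z alone
    If (drop==1) Then
      score(1) = total - base              ! subset = full set minus z
    Else
      score(1) = base                      ! subset = z only
    End If
  Else
    Do j = 1, la
      If (drop==1) Then
        ! Subset j = full set minus z(1:lz) minus a(j)
        score(j) = total - base - ruser(a(j))
      Else
        ! Subset j = z(1:lz) plus a(j)
        score(j) = base + ruser(a(j))
      End If
    End Do
  End If
  info = 0                                 ! no error; leave at zero
End Subroutine f

Note that with nonnegative weights, removing a feature (drop=1) never increases the score and adding a feature (drop=0) never decreases it, which is exactly the monotonicity required by property (1).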

4 References

Narendra P M and Fukunaga K (1977) A branch and bound algorithm for feature subset selection IEEE Transactions on Computers 9 917–922
Ridout M S (1988) Algorithm AS 233: An improved branch and bound algorithm for feature subset selection Journal of the Royal Statistical Society, Series C (Applied Statistics) (Volume 37) 1 139–147
Somol P, Pudil P and Kittler J (2004) Fast branch and bound algorithms for optimal feature selection IEEE Transactions on Pattern Analysis and Machine Intelligence (Volume 26) 7 900–912

5 Arguments

1: mincr Integer Input
On entry: flag indicating whether the scoring function f is increasing or decreasing.
mincr=1
f(Si) ≤ f(Sj), i.e., the subsets with the largest score will be selected.
mincr=0
f(Si) ≥ f(Sj), i.e., the subsets with the smallest score will be selected.
For all Sj ⊆ Ω and Si ⊆ Sj.
Constraint: mincr=0 or 1.
2: m Integer Input
On entry: m, the number of features in the full feature set.
Constraint: m ≥ 2.
3: ip Integer Input
On entry: p, the number of features in the subset of interest.
Constraint: 1 ≤ ip ≤ m.
4: nbest Integer Input
On entry: n, the maximum number of best subsets required. The actual number of subsets returned is given by la on final exit. If on final exit la ≠ nbest then ifail=42 is returned.
Constraint: nbest ≥ 1.
5: la Integer Output
On exit: the number of best subsets returned.
6: bscore(nbest) Real (Kind=nag_wp) array Output
On exit: holds the score for the la best subsets returned in bz.
7: bz(m-ip,nbest) Integer array Output
On exit: the jth best subset is constructed by dropping the features specified in bz(i,j), for i=1,2,,m-ip and j=1,2,,la, from the set of all features, Ω. The score for the jth best subset is given in bscore(j).
8: f Subroutine, supplied by the user. External Procedure
f must evaluate the scoring function f.
The specification of f is:
Fortran Interface
Subroutine f ( m, drop, lz, z, la, a, score, iuser, ruser, info)
Integer, Intent (In) :: m, drop, lz, z(lz), la, a(la)
Integer, Intent (Inout) :: iuser(*), info
Real (Kind=nag_wp), Intent (Inout) :: ruser(*)
Real (Kind=nag_wp), Intent (Out) :: score(max(la,1))
C Header Interface
void  f (const Integer *m, const Integer *drop, const Integer *lz, const Integer z[], const Integer *la, const Integer a[], double score[], Integer iuser[], double ruser[], Integer *info)
1: m Integer Input
On entry: m=|Ω|, the number of features in the full feature set.
2: drop Integer Input
On entry: flag indicating whether the intermediate subsets should be constructed by dropping features from the full set (drop=1) or adding features to the empty set (drop=0). See score for additional details.
3: lz Integer Input
On entry: the number of features stored in z.
4: z(lz) Integer array Input
On entry: z(i), for i=1,2,,lz, contains the list of features which, along with those specified in a, define the subsets whose score is required. See score for additional details.
5: la Integer Input
On entry: if la>0, the number of subsets for which a score must be returned.
If la=0, the score for a single subset should be returned. See score for additional details.
6: a(la) Integer array Input
On entry: a(j), for j=1,2,,la, contains the list of features which, along with those specified in z, define the subsets whose score is required. See score for additional details.
7: score(max(la,1)) Real (Kind=nag_wp) array Output
On exit: the value f(Sj) , for j=1,2,,la, the score associated with the jth subset. Sj is constructed as follows:
drop=1
Sj is constructed by dropping the features specified in the first lz elements of z and the single feature given in a(j) from the full set of features, Ω. The subset will, therefore, contain m-lz-1 features.
drop=0
Sj is constructed by adding the features specified in the first lz elements of z and the single feature specified in a(j) to the empty set, ∅. The subset will, therefore, contain lz+1 features.
In both cases the individual features are referenced by the integers 1 to m, with 1 indicating the first feature, 2 the second, etc., for some arbitrary ordering of the features, chosen by you prior to calling h05abf. For example, 1 might refer to the first variable in a particular set of data, 2 the second, etc.
If la=0, the score for a single subset should be returned. This subset is constructed by adding or removing only those features specified in the first lz elements of z. If lz=0, this subset will either be Ω or ∅.
8: iuser(*) Integer array User Workspace
9: ruser(*) Real (Kind=nag_wp) array User Workspace
f is called with the arguments iuser and ruser as supplied to h05abf. You should use the arrays iuser and ruser to supply information to f.
10: info Integer Input/Output
On entry: info=0.
On exit: set info to a nonzero value if you wish h05abf to terminate with ifail=82.
f must either be a module subprogram USEd by, or declared as EXTERNAL in, the (sub)program from which h05abf is called. Arguments denoted as Input must not be changed by this procedure.
Note: f should not return floating-point NaN (Not a Number) or infinity values, since these are not handled by h05abf. If your code inadvertently does return any NaNs or infinities, h05abf is likely to produce unexpected results.
9: mincnt Integer Input
On entry: k, the minimum number of times the effect of each feature, xi, must have been observed before f(S-{xi}) is estimated from f(S) as opposed to being calculated directly.
If k=0 then f(S-{xi}) is never estimated. If mincnt<0 then k is set to 1.
10: gamma Real (Kind=nag_wp) Input
On entry: γ, the scaling factor used when estimating scores. If gamma<0 then γ=1 is used.
11: acc(2) Real (Kind=nag_wp) array Input
On entry: a measure of the accuracy of the scoring function, f.
Letting ai = ε1 |f(Si)| + ε2, when confirming whether the scoring function is strictly increasing or decreasing (as described in mincr), or when assessing whether a node defined by subset Si can be trimmed, any values in the range f(Si) ± ai are treated as being numerically equivalent.
If 0 ≤ acc(1) ≤ 1 then ε1 = acc(1), otherwise ε1 = 0.
If acc(2) ≥ 0 then ε2 = acc(2), otherwise ε2 = 0.
In most situations setting both ε1 and ε2 to zero should be sufficient. Using a nonzero value, when one is not required, can significantly increase the number of subsets that need to be evaluated.
12: iuser(*) Integer array User Workspace
13: ruser(*) Real (Kind=nag_wp) array User Workspace
iuser and ruser are not used by h05abf, but are passed directly to f and may be used to pass information to this routine.
14: ifail Integer Input/Output
On entry: ifail must be set to 0, −1 or 1 to set behaviour on detection of an error; these values have no effect when no error is detected.
A value of 0 causes an error message to be printed and program execution to be halted; otherwise program execution continues. A value of −1 means that an error message is printed while a value of 1 means that it is not.
If halting is not appropriate, the value −1 or 1 is recommended. If message printing is undesirable, then the value 1 is recommended. Otherwise, the value 0 is recommended. When the value −1 or 1 is used it is essential to test the value of ifail on exit.
On exit: ifail=0 unless the routine detects an error or a warning has been flagged (see Section 6).
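The following driver sketch (distinct from the Library example program of Section 10) shows how the arguments described above fit together: it calls h05abf with the illustrative weight-based scoring subroutine f sketched in Section 3 and then decodes bz to recover the la best subsets. The feature weights, the problem sizes, the module name nag_library and the use of an EXTERNAL declaration for f are all assumptions of the sketch; in practice it is more robust to place f in a module, as the example program does.

Program h05abf_sketch
! Illustrative driver: find the nbest highest-scoring subsets of size ip
! using the weight-based scoring routine f sketched in Section 3.
  Use nag_library, Only: nag_wp, h05abf
  Implicit None
  Integer, Parameter :: m = 8, ip = 3, nbest = 2, mincr = 1, mincnt = 1
  Real (Kind=nag_wp), Parameter :: gam = -1.0_nag_wp  ! gamma < 0, so the default of 1 is used
  Integer :: la, ifail, i, j
  Integer :: bz(m-ip,nbest), iuser(1)
  Logical :: keep(m)
  Real (Kind=nag_wp) :: bscore(nbest), ruser(m), acc(2)
  External :: f

  ! Hypothetical per-feature weights, passed to f through ruser
  ruser(1:m) = (/ 8.0, 5.0, 3.0, 7.0, 1.0, 2.0, 6.0, 4.0 /)
  acc(1:2) = 0.0_nag_wp        ! scores are exact, no numerical slack needed

  ifail = 0
  Call h05abf(mincr,m,ip,nbest,la,bscore,bz,f,mincnt,gam,acc,iuser,ruser,ifail)

  ! Decode bz: the jth best subset is the full feature set with the
  ! features listed in bz(1:m-ip,j) removed
  Do j = 1, la
    keep(1:m) = .True.
    Do i = 1, m - ip
      keep(bz(i,j)) = .False.
    End Do
    Write (*,*) 'Best subset', j, 'score:', bscore(j)
    Write (*,*) '  features kept:', pack((/(i,i=1,m)/),keep)
  End Do
End Program h05abf_sketch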

6 Error Indicators and Warnings

If on entry ifail=0 or −1, explanatory error messages are output on the current error message unit (as defined by x04aaf).
Errors or warnings detected by the routine:
ifail=11
On entry, mincr=value.
Constraint: mincr=0 or 1.
ifail=21
On entry, m=value.
Constraint: m ≥ 2.
ifail=31
On entry, ip=value and m=value.
Constraint: 1 ≤ ip ≤ m.
ifail=41
On entry, nbest=value.
Constraint: nbest ≥ 1.
ifail=42
On entry, nbest=value.
But only value best subsets could be calculated.
ifail=81
On exit from f, score(value)=value, which is inconsistent with the score for the parent node. Score for the parent node is value.
ifail=82
A nonzero value for info has been returned: info=value.
ifail=-99
An unexpected error has been triggered by this routine. Please contact NAG.
See Section 7 in the Introduction to the NAG Library FL Interface for further information.
ifail=-399
Your licence key may have expired or may not have been installed correctly.
See Section 8 in the Introduction to the NAG Library FL Interface for further information.
ifail=-999
Dynamic memory allocation failed.
See Section 9 in the Introduction to the NAG Library FL Interface for further information.

7 Accuracy

The subsets returned by h05abf are guaranteed to be optimal up to the accuracy of the calculated scores.

8 Parallelism and Performance

Background information to multithreading can be found in the Multithreading documentation.
h05abf is threaded by NAG for parallel execution in multithreaded implementations of the NAG Library.
Please consult the X06 Chapter Introduction for information on how to control and interrogate the OpenMP environment used within this routine. Please also consult the Users' Note for your implementation for any additional implementation-specific information.

9 Further Comments

The maximum number of unique subsets of size p from a set of m features is N = m! / ((m−p)! p!). The efficiency of the branch and bound algorithm implemented in h05abf comes from evaluating subsets at internal nodes of the tree, that is subsets with more than p features, and where possible trimming branches of the tree based on the scores at these internal nodes as described in Narendra and Fukunaga (1977). Because of this it is possible, in some circumstances, for more than N subsets to be evaluated. This will tend to happen when most of the features have a similar effect on the subset score.
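For example, with m = 20 features and subsets of size p = 5 there are N = 20!/(15! 5!) = 15504 candidate subsets; the branch and bound search will usually evaluate far fewer than this, but in the unfavourable circumstances described above it may evaluate more.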
If multiple optimal subsets exist with the same score, and nbest is too small to return them all, then the choice of which of these optimal subsets is returned is arbitrary.

10 Example

This example finds the three linear regression models, with five variables, that have the smallest residual sums of squares when fitted to a supplied dataset. The data used in this example was simulated.

10.1 Program Text

Program Text (h05abfe.f90)

10.2 Program Data

Program Data (h05abfe.d)

10.3 Program Results

Program Results (h05abfe.r)