NAG FL Interface
g05pvf (kfold_xyw)


1 Purpose

g05pvf generates training and validation datasets suitable for use in cross-validation or jack-knifing.

2 Specification

Fortran Interface
Subroutine g05pvf ( k, fold, n, m, sordx, x, ldx, usey, y, usew, w, nt, state, ifail)
Integer, Intent (In) :: k, fold, n, m, sordx, ldx, usey, usew
Integer, Intent (Inout) :: state(*), ifail
Integer, Intent (Out) :: nt
Real (Kind=nag_wp), Intent (Inout) :: x(ldx,*), y(*), w(*)
C Header Interface
#include <nag.h>
void  g05pvf_ (const Integer *k, const Integer *fold, const Integer *n, const Integer *m, const Integer *sordx, double x[], const Integer *ldx, const Integer *usey, double y[], const Integer *usew, double w[], Integer *nt, Integer state[], Integer *ifail)
The routine may be called by the names g05pvf or nagf_rand_kfold_xyw.

3 Description

Let Xo denote a matrix of n observations on m variables and yo and wo each denote a vector of length n. For example, Xo might represent a matrix of independent variables, yo the dependent variable and wo the associated weights in a weighted regression.
g05pvf generates a series of training datasets, denoted by the (matrix, vector, vector) triplet (Xt,yt,wt) of nt observations, and validation datasets, denoted (Xv,yv,wv), with nv observations. These training and validation datasets are generated as follows.
Each of the original n observations is randomly assigned to one of K equally sized groups or folds. For the kth sample the validation dataset consists of those observations in group k and the training dataset consists of all those observations not in group k. Therefore, at most K samples can be generated.
If n is not divisible by K then the observations are assigned to groups as evenly as possible; therefore, any group will be at most one observation larger or smaller than any other group.
When using K=n the resulting datasets are suitable for leave-one-out cross-validation, or the training dataset on its own for jack-knifing. When using K&lt;n the resulting datasets are suitable for K-fold cross-validation. Datasets suitable for reversed cross-validation can be obtained by switching the training and validation datasets, i.e., use the kth group as the training dataset and the rest of the data as the validation dataset.
One of the initialization routines g05kff (for a repeatable sequence if computed sequentially) or g05kgf (for a non-repeatable sequence) must be called prior to the first call to g05pvf.

4 References

None.

5 Arguments

1: k Integer Input
On entry: K, the number of folds.
Constraint: 2≤k≤n.
2: fold Integer Input
On entry: the number of the fold to return as the validation dataset.
On the first call to g05pvf fold should be set to 1 and then incremented by one at each subsequent call until all K sets of training and validation datasets have been produced. See Section 9 for more details on how a different calling sequence can be used.
Constraint: 1≤fold≤k.
3: n Integer Input
On entry: n, the number of observations.
Constraint: n≥1.
4: m Integer Input
On entry: m, the number of variables.
Constraint: m≥1.
5: sordx Integer Input
On entry: determines how variables are stored in x.
Constraint: sordx=1 or 2.
6: x(ldx,*) Real (Kind=nag_wp) array Input/Output
Note: the second dimension of the array x must be at least m if sordx=1 and at least n if sordx=2.
The way the data is stored in x is defined by sordx.
If sordx=1, x(i,j) contains the ith observation for the jth variable, for i=1,2,…,n and j=1,2,…,m.
If sordx=2, x(j,i) contains the ith observation for the jth variable, for i=1,2,…,n and j=1,2,…,m.
On entry: if fold=1, x must hold Xo, the values of X for the original dataset, otherwise, x must not be changed since the last call to g05pvf.
On exit: values of X for the training and validation datasets, with Xt held in observations 1 to nt and Xv in observations nt+1 to n.
7: ldx Integer Input
On entry: the first dimension of the array x as declared in the (sub)program from which g05pvf is called.
Constraints:
  • if sordx=2, ldx≥m;
  • otherwise ldx≥n.
8: usey Integer Input
On entry: if usey=1, the original dataset includes yo and yo will be processed alongside Xo.
Constraint: usey=0 or 1.
9: y(*) Real (Kind=nag_wp) array Input/Output
Note: the dimension of the array y must be at least n if usey=1.
If usey=0, y is not referenced on entry and will not be modified on exit.
On entry: if fold=1, y must hold yo, the values of y for the original dataset, otherwise y must not be changed since the last call to g05pvf.
On exit: values of y for the training and validation datasets, with yt held in elements 1 to nt and yv in elements nt+1 to n.
10: usew Integer Input
On entry: if usew=1, the original dataset includes wo and wo will be processed alongside Xo.
Constraint: usew=0 or 1.
11: w(*) Real (Kind=nag_wp) array Input/Output
Note: the dimension of the array w must be at least n if usew=1.
If usew=0, w is not referenced on entry and will not be modified on exit.
On entry: if fold=1, w must hold wo, the values of w for the original dataset, otherwise w must not be changed since the last call to g05pvf.
On exit: values of w for the training and validation datasets, with wt held in elements 1 to nt and wv in elements nt+1 to n.
12: nt Integer Output
On exit: nt, the number of observations in the training dataset.
13: state(*) Integer array Communication Array
Note: the actual argument supplied must be the array state supplied to the initialization routines g05kff or g05kgf.
On entry: contains information on the selected base generator and its current state.
On exit: contains updated information on the state of the generator.
14: ifail Integer Input/Output
On entry: ifail must be set to 0, −1 or 1 to set behaviour on detection of an error; these values have no effect when no error is detected.
A value of 0 causes the printing of an error message and program execution will be halted; otherwise program execution continues. A value of −1 means that an error message is printed while a value of 1 means that it is not.
If halting is not appropriate, the value −1 or 1 is recommended. If message printing is undesirable, then the value 1 is recommended. Otherwise, the value −1 is recommended since useful values can be provided in some output arguments even when ifail≠0 on exit. When the value −1 or 1 is used it is essential to test the value of ifail on exit.
On exit: ifail=0 unless the routine detects an error or a warning has been flagged (see Section 6).

6 Error Indicators and Warnings

If on entry ifail=0 or −1, explanatory error messages are output on the current error message unit (as defined by x04aaf).
Errors or warnings detected by the routine:
Note: in some cases g05pvf may return useful information.
ifail=11
On entry, k=⟨value⟩ and n=⟨value⟩.
Constraint: 2≤k≤n.
ifail=21
On entry, fold=⟨value⟩ and k=⟨value⟩.
Constraint: 1≤fold≤k.
ifail=31
On entry, n=⟨value⟩.
Constraint: n≥1.
ifail=41
On entry, m=⟨value⟩.
Constraint: m≥1.
ifail=51
On entry, sordx=⟨value⟩.
Constraint: sordx=1 or 2.
ifail=61
More than 50% of the data did not move when the data was shuffled. ⟨value⟩ of the ⟨value⟩ observations stayed put.
ifail=71
On entry, ldx=⟨value⟩ and n=⟨value⟩.
Constraint: if sordx=1, ldx≥n.
ifail=72
On entry, ldx=⟨value⟩ and m=⟨value⟩.
Constraint: if sordx=2, ldx≥m.
ifail=81
Constraint: usey=0 or 1.
ifail=101
Constraint: usew=0 or 1.
ifail=131
On entry, state vector has been corrupted or not initialized.
ifail=-99
An unexpected error has been triggered by this routine. Please contact NAG.
See Section 7 in the Introduction to the NAG Library FL Interface for further information.
ifail=-399
Your licence key may have expired or may not have been installed correctly.
See Section 8 in the Introduction to the NAG Library FL Interface for further information.
ifail=-999
Dynamic memory allocation failed.
See Section 9 in the Introduction to the NAG Library FL Interface for further information.

7 Accuracy

Not applicable.

8 Parallelism and Performance

g05pvf is threaded by NAG for parallel execution in multithreaded implementations of the NAG Library.
g05pvf makes calls to BLAS and/or LAPACK routines, which may be threaded within the vendor library used by this implementation. Consult the documentation for the vendor library for further information.
Please consult the X06 Chapter Introduction for information on how to control and interrogate the OpenMP environment used within this routine. Please also consult the Users' Note for your implementation for any additional implementation-specific information.

9 Further Comments

g05pvf will be computationally more efficient if each observation in x is contiguous, that is, when sordx=2.
Because of the way g05pvf stores the data you should usually generate the K training and validation datasets in order, i.e., set fold=1 on the first call and increment it by one at each subsequent call. However, there are times when a different calling sequence is beneficial, for example, when performing different cross-validation analyses on different threads. This is possible provided that g05pvf is first called with fold=1 and each subsequent call works on its own copy of the data returned by that first call.
For example, if you have three threads, you would call g05pvf once with fold=1. You would then copy the returned x (and y and w, if used) onto each thread and generate the remaining K−1 sets of data by splitting them between the threads, say, the first thread running with fold=2,…,L1, the second with fold=L1+1,…,L2 and the third with fold=L2+1,…,K.

10 Example

This example uses g05pvf to facilitate K-fold cross-validation.
A set of simulated data is split into 5 training and validation datasets. g02gbf is used to fit a logistic regression model to each training dataset and then g02gpf is used to predict the response for the observations in the validation dataset.
The counts of true and false positives and negatives, along with the sensitivity and specificity, are then reported.

10.1 Program Text

Program Text (g05pvfe.f90)

10.2 Program Data

Program Data (g05pvfe.d)

10.3 Program Results

Program Results (g05pvfe.r)