NAG FL Interface
g05pvf (kfold_xyw)
1
Purpose
g05pvf generates training and validation datasets suitable for use in cross-validation or jack-knifing.
2
Specification
Fortran Interface
Subroutine g05pvf (k, fold, n, m, sordx, x, ldx, usey, y, usew, w, nt, state, ifail)
Integer, Intent (In) :: k, fold, n, m, sordx, ldx, usey, usew
Integer, Intent (Inout) :: state(*), ifail
Integer, Intent (Out) :: nt
Real (Kind=nag_wp), Intent (Inout) :: x(ldx,*), y(*), w(*)
C Header Interface
#include <nag.h>
void |
g05pvf_ (const Integer *k, const Integer *fold, const Integer *n, const Integer *m, const Integer *sordx, double x[], const Integer *ldx, const Integer *usey, double y[], const Integer *usew, double w[], Integer *nt, Integer state[], Integer *ifail)
C++ Header Interface
#include <nag.h>
extern "C" {
void |
g05pvf_ (const Integer &k, const Integer &fold, const Integer &n, const Integer &m, const Integer &sordx, double x[], const Integer &ldx, const Integer &usey, double y[], const Integer &usew, double w[], Integer &nt, Integer state[], Integer &ifail)
}
The routine may be called by the names g05pvf or nagf_rand_kfold_xyw.
3
Description
Let X_o denote a matrix of n observations on m variables and y_o and w_o each denote a vector of length n. For example, X_o might represent a matrix of independent variables, y_o the dependent variable and w_o the associated weights in a weighted regression.
g05pvf generates a series of training datasets, denoted by the matrix, vector, vector triplet (X_t, y_t, w_t) of n_t observations, and validation datasets, denoted (X_v, y_v, w_v), with n_v observations. These training and validation datasets are generated as follows.
Each of the original n observations is randomly assigned to one of k equally sized groups or folds. For the ith sample the validation dataset consists of those observations in group i and the training dataset consists of all those observations not in group i. Therefore, at most k samples can be generated.
If n is not divisible by k then the observations are assigned to the groups as evenly as possible, therefore any group will be at most one observation larger or smaller than any other group.
When using k = n the resulting datasets are suitable for leave-one-out cross-validation, or the training dataset on its own for jack-knifing. When using k < n the resulting datasets are suitable for k-fold cross-validation. Datasets suitable for reversed cross-validation can be obtained by switching the training and validation datasets, i.e., use the ith group as the training dataset and the rest of the data as the validation dataset.
One of the initialization routines
g05kff (for a repeatable sequence if computed sequentially) or
g05kgf (for a non-repeatable sequence) must be called prior to the first call to
g05pvf.
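As an informal illustration of this calling sequence, the sketch below (it is not the documented example program: the data-generation step is omitted and the generator, seed and problem sizes are arbitrary) initializes a repeatable generator with g05kff and then requests all k training and validation splits in order. It assumes the nag_library module supplied with your implementation.

    Program g05pvf_sketch
      Use nag_library, Only: g05kff, g05pvf, nag_wp
      Implicit None
      Integer, Parameter   :: genid = 1, subid = 1, lseed = 1
      Integer, Parameter   :: k = 5, m = 3, n = 40
      Integer, Parameter   :: ldx = n, sordx = 1, usew = 0, usey = 1
      Integer              :: fold, ifail, lstate, nt
      Integer              :: seed(lseed)
      Integer, Allocatable :: state(:)
      Real (Kind=nag_wp)   :: w(n), x(ldx,m), y(n)

!     Query the required length of the state array, then initialize the
!     base generator to a repeatable sequence
      seed(1) = 1762543
      lstate = 0
      Allocate (state(lstate))
      ifail = 0
      Call g05kff(genid,subid,seed,lseed,state,lstate,ifail)
      Deallocate (state)
      Allocate (state(lstate))
      ifail = 0
      Call g05kff(genid,subid,seed,lseed,state,lstate,ifail)

!     ... fill x(1:n,1:m) and y(1:n) with the original dataset here ...
      x = 0.0_nag_wp
      y = 0.0_nag_wp

!     Generate the k training and validation splits in order
      Do fold = 1, k
        ifail = 0
        Call g05pvf(k,fold,n,m,sordx,x,ldx,usey,y,usew,w,nt,state,ifail)
!       Observations 1 to nt of x and y now hold the training data for
!       this fold; observations nt+1 to n hold the validation data
      End Do
    End Program g05pvf_sketch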
4
References
None.
5
Arguments
1: k – Integer Input
On entry: k, the number of folds.
Constraint: 2 ≤ k ≤ n.
2: fold – Integer Input
On entry: the number of the fold to return as the validation dataset.
On the first call to g05pvf, fold should be set to 1 and then incremented by one at each subsequent call until all k sets of training and validation datasets have been produced. See Section 9 for more details on how a different calling sequence can be used.
Constraint: 1 ≤ fold ≤ k.
3: n – Integer Input
On entry: n, the number of observations.
Constraint: n ≥ 1.
4: m – Integer Input
On entry: m, the number of variables.
Constraint: m ≥ 1.
5: sordx – Integer Input
On entry: determines how variables are stored in x.
Constraint: sordx = 1 or 2.
6: x(ldx,*) – Real (Kind=nag_wp) array Input/Output
Note: the second dimension of the array x must be at least m if sordx = 1 and at least n if sordx = 2.
The way the data is stored in x is defined by sordx (illustrative declarations for both storage orders are given in the sketch following this argument list).
If sordx = 1, x(i,j) contains the ith observation for the jth variable, for i = 1,2,...,n and j = 1,2,...,m.
If sordx = 2, x(j,i) contains the ith observation for the jth variable, for i = 1,2,...,n and j = 1,2,...,m.
On entry: if fold = 1, x must hold X_o, the values of X for the original dataset, otherwise x must not be changed since the last call to g05pvf.
On exit: values of X for the training and validation datasets, with X_t held in observations 1 to nt and X_v in observations nt + 1 to n.
7: ldx – Integer Input
On entry: the first dimension of the array x as declared in the (sub)program from which g05pvf is called.
Constraints:
- if sordx = 2, ldx ≥ m;
- otherwise ldx ≥ n.
8: usey – Integer Input
On entry: if usey = 1, the original dataset includes y_o and y_o will be processed alongside X_o.
Constraint: usey = 0 or 1.
9: y(*) – Real (Kind=nag_wp) array Input/Output
Note: the dimension of the array y must be at least n if usey = 1.
If usey = 0, y is not referenced on entry and will not be modified on exit.
On entry: if fold = 1, y must hold y_o, the values of y for the original dataset, otherwise y must not be changed since the last call to g05pvf.
On exit: values of y for the training and validation datasets, with y_t held in elements 1 to nt and y_v in elements nt + 1 to n.
10: usew – Integer Input
On entry: if usew = 1, the original dataset includes w_o and w_o will be processed alongside X_o.
Constraint: usew = 0 or 1.
11: w(*) – Real (Kind=nag_wp) array Input/Output
Note: the dimension of the array w must be at least n if usew = 1.
If usew = 0, w is not referenced on entry and will not be modified on exit.
On entry: if fold = 1, w must hold w_o, the values of w for the original dataset, otherwise w must not be changed since the last call to g05pvf.
On exit: values of w for the training and validation datasets, with w_t held in elements 1 to nt and w_v in elements nt + 1 to n.
12: nt – Integer Output
On exit: n_t, the number of observations in the training dataset.
13: state(*) – Integer array Communication Array
Note: the actual argument supplied must be the array state supplied to the initialization routines g05kff or g05kgf.
On entry: contains information on the selected base generator and its current state.
On exit: contains updated information on the state of the generator.
14: ifail – Integer Input/Output
On entry: ifail must be set to 0, -1 or 1 to set behaviour on detection of an error; these values have no effect when no error is detected.
A value of 0 causes the printing of an error message and program execution will be halted; otherwise program execution continues. A value of -1 means that an error message is printed while a value of 1 means that it is not.
If halting is not appropriate, the value -1 or 1 is recommended. If message printing is undesirable, then the value 1 is recommended. Otherwise, the value -1 is recommended since useful values can be provided in some output arguments even when ifail ≠ 0 on exit.
When the value -1 or 1 is used it is essential to test the value of ifail on exit.
On exit: ifail = 0 unless the routine detects an error or a warning has been flagged (see Section 6).
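As an informal illustration of the two storage orders accepted by sordx, x would typically be declared as in the fragment below; the sizes n and m are hypothetical and nag_wp is taken from the nag_library module.

    Use nag_library, Only: nag_wp
    Integer, Parameter :: m = 5, n = 100   ! hypothetical sizes

!   sordx = 1: x has n rows and at least m columns (ldx >= n);
!   x(i,j) holds the ith observation of the jth variable
    Real (Kind=nag_wp) :: x1(n,m)

!   sordx = 2: x has m rows and at least n columns (ldx >= m);
!   x(j,i) holds the ith observation of the jth variable, so each
!   observation is contiguous in memory (see Section 9)
    Real (Kind=nag_wp) :: x2(m,n)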
6
Error Indicators and Warnings
If on entry ifail = 0 or -1, explanatory error messages are output on the current error message unit (as defined by x04aaf).
Errors or warnings detected by the routine:
Note: in some cases g05pvf may return useful information.
-
On entry, k = ⟨value⟩ and n = ⟨value⟩.
Constraint: 2 ≤ k ≤ n.
-
On entry, fold = ⟨value⟩ and k = ⟨value⟩.
Constraint: 1 ≤ fold ≤ k.
-
On entry, n = ⟨value⟩.
Constraint: n ≥ 1.
-
On entry, m = ⟨value⟩.
Constraint: m ≥ 1.
-
On entry, sordx = ⟨value⟩.
Constraint: sordx = 1 or 2.
-
More than 50% of the data did not move when the data was shuffled. ⟨value⟩ of the ⟨value⟩ observations stayed put.
-
On entry, ldx = ⟨value⟩ and n = ⟨value⟩.
Constraint: if sordx = 1, ldx ≥ n.
-
On entry, ldx = ⟨value⟩ and m = ⟨value⟩.
Constraint: if sordx = 2, ldx ≥ m.
-
Constraint: usey = 0 or 1.
-
Constraint: usew = 0 or 1.
-
On entry, state vector has been corrupted or not initialized.
An unexpected error has been triggered by this routine. Please
contact
NAG.
See
Section 7 in the Introduction to the NAG Library FL Interface for further information.
Your licence key may have expired or may not have been installed correctly.
See
Section 8 in the Introduction to the NAG Library FL Interface for further information.
Dynamic memory allocation failed.
See
Section 9 in the Introduction to the NAG Library FL Interface for further information.
7
Accuracy
Not applicable.
8
Parallelism and Performance
g05pvf is threaded by NAG for parallel execution in multithreaded implementations of the NAG Library.
g05pvf makes calls to BLAS and/or LAPACK routines, which may be threaded within the vendor library used by this implementation. Consult the documentation for the vendor library for further information.
Please consult the
X06 Chapter Introduction for information on how to control and interrogate the OpenMP environment used within this routine. Please also consult the
Users' Note for your implementation for any additional implementation-specific information.
9
Further Comments
g05pvf will be computationally more efficient if each observation in x is contiguous, that is sordx = 2.
Because of the way g05pvf stores the data you should usually generate the k training and validation datasets in order, i.e., set fold = 1 on the first call and increment it by one at each subsequent call. However, there are times when a different calling sequence would be beneficial, for example, when performing different cross-validation analyses on different threads. This is possible, as long as the following is borne in mind:
- g05pvf must be called with fold = 1 first.
- Other than the first set, you can obtain the training and validation datasets in any order, but for a given x you can only obtain each once.
For example, if you have three threads, you would call g05pvf once with fold = 1. You would then copy the x returned onto each thread and generate the remaining k - 1 sets of data by splitting them between the threads, for example, the first thread processing one third of the remaining values of fold, the second thread the next third and the third thread the rest.
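A sequential sketch of such an out-of-order calling sequence is given below; it reuses the declarations and the initialized state array from the sketch in Section 3 (so k = 5 here), and the ordering held in order is purely illustrative.

    Integer :: i, order(k-1)     ! additional local variables

!   fold = 1 must always be requested first
    ifail = 0
    Call g05pvf(k,1,n,m,sordx,x,ldx,usey,y,usew,w,nt,state,ifail)

!   The remaining folds may then be requested in any order, but each
!   only once for a given copy of x (and y, w)
    order = (/ 3, 5, 2, 4 /)
    Do i = 1, k - 1
      ifail = 0
      Call g05pvf(k,order(i),n,m,sordx,x,ldx,usey,y,usew,w,nt,state,ifail)
    End Do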
10
Example
This example uses g05pvf to facilitate k-fold cross-validation.
A set of simulated data is split into k training and validation datasets.
g02gbf is used to fit a logistic regression model to each training dataset and then
g02gpf is used to predict the response for the observations in the validation dataset.
The counts of true and false positives and negatives along with the sensitivity and specificity are then reported.
10.1
Program Text
10.2
Program Data
10.3
Program Results