SPALeRAStepsize Class Reference

Definition of the SPALeRA stepsize technique, which implements a change detection mechanism with an agnostic adaptation scheme. More...

Public Member Functions

 SPALeRAStepsize (const double alpha=0.001, const double epsilon=1e-6, const double adaptRate=3.10e-8)
 Construct the SPALeRAStepsize object with the given parameters. More...

 
double AdaptRate () const
 Get the agnostic learning rate update rate. More...

 
double & AdaptRate ()
 Modify the agnostic learning rate update rate. More...

 
double Alpha () const
 Get the agnostic learning rate adaptation parameter. More...

 
double & Alpha ()
 Modify the agnostic learning rate adaptation parameter. More...

 
void Initialize (const size_t rows, const size_t cols, const double lambda)
 The Initialize method is called by the SPALeRASGD optimizer before the start of the iteration update process. More...

 
bool Update (const double stepSize, const double objective, const size_t batchSize, const size_t numFunctions, arma::mat &iterate, const arma::mat &gradient)
 This function is called in each iteration. More...

 

Detailed Description

Definition of the SPALeRA stepsize technique, which implements a change detection mechanism with an agnostic adaptation scheme.

For more information, please refer to:

@article{Schoenauer2017,
  title   = {Stochastic Gradient Descent:
             Going As Fast As Possible But Not Faster},
  author  = {Schoenauer-Sebag, Alice and Schoenauer, Marc and Sebag, Michele},
  journal = {CoRR},
  year    = {2017},
  url     = {https://arxiv.org/abs/1709.01427},
}

Definition at line 38 of file spalera_stepsize.hpp.

Constructor & Destructor Documentation

◆ SPALeRAStepsize()

SPALeRAStepsize (const double alpha = 0.001,
                 const double epsilon = 1e-6,
                 const double adaptRate = 3.10e-8)
inline

Construct the SPALeRAStepsize object with the given parameters.

The defaults here are not necessarily good for the given problem, so it is suggested that the values used be tailored to the task at hand.

Parameters
    alpha      Memory parameter of the agnostic learning rate adaptation.
    epsilon    Numerical stability parameter.
    adaptRate  Agnostic learning rate update rate.

Definition at line 51 of file spalera_stepsize.hpp.
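
A minimal construction sketch; the header path and the mlpack::optimization namespace below follow the usual mlpack source layout but are assumptions, and the parameter values are placeholders rather than recommendations:

#include <mlpack/core/optimizers/spalera_sgd/spalera_stepsize.hpp>

using namespace mlpack::optimization;

int main()
{
  // Tailor the parameters to the task instead of relying on the defaults.
  SPALeRAStepsize stepsize(0.01 /* alpha: memory parameter */,
                           1e-6 /* epsilon: numerical stability */,
                           1e-8 /* adaptRate: agnostic update rate */);
  return 0;
}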

Member Function Documentation

◆ AdaptRate() [1/2]

double AdaptRate ( ) const
inline

Get the agnostic learning rate update rate.

Definition at line 179 of file spalera_stepsize.hpp.

Referenced by SPALeRASGD< DecayPolicyType >::AdaptRate().

◆ AdaptRate() [2/2]

double& AdaptRate ( )
inline

Modify the agnostic learning rate update rate.

Definition at line 181 of file spalera_stepsize.hpp.

◆ Alpha() [1/2]

double Alpha ( ) const
inline

Get the agnostic learning rate adaptation parameter.

Definition at line 174 of file spalera_stepsize.hpp.

Referenced by SPALeRASGD< DecayPolicyType >::Alpha().

◆ Alpha() [2/2]

double& Alpha ( )
inline

Modify the agnostic learning rate adaptation parameter.

Definition at line 176 of file spalera_stepsize.hpp.
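
The const overloads read the current values and the non-const overloads return modifiable references, so both parameters can be adjusted after construction. A short sketch:

SPALeRAStepsize stepsize;

// Read the current settings through the const accessors.
const double alpha = stepsize.Alpha();
const double rate = stepsize.AdaptRate();

// Adjust them in place through the non-const overloads.
stepsize.Alpha() = 0.005;
stepsize.AdaptRate() = 1e-8;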

◆ Initialize()

void Initialize (const size_t rows,
                 const size_t cols,
                 const double lambda)
inline

The Initialize method is called by the SPALeRASGD optimizer before the start of the iteration update process.

Parameters
    rows    Number of rows in the gradient matrix.
    cols    Number of columns in the gradient matrix.
    lambda  Page-Hinkley update parameter.

Definition at line 69 of file spalera_stepsize.hpp.
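
Initialize() is normally invoked by SPALeRASGD itself rather than by user code; a sketch of the call, where the lambda value is only a placeholder:

arma::mat iterate(10, 5, arma::fill::randu);  // Parameters being optimized.

SPALeRAStepsize stepsize;

// Size the internal state to match the parameter/gradient matrix and set
// the Page-Hinkley update parameter.
stepsize.Initialize(iterate.n_rows, iterate.n_cols, 0.01 /* lambda */);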

◆ Update()

bool Update (const double stepSize,
             const double objective,
             const size_t batchSize,
             const size_t numFunctions,
             arma::mat & iterate,
             const arma::mat & gradient)
inline

This function is called in each iteration.

Parameters
    stepSize      Step size to be used for the given iteration.
    objective     The current function loss.
    batchSize     Batch size to be used for the given iteration.
    numFunctions  The number of functions.
    iterate       Parameters that minimize the function.
    gradient      The gradient matrix.
Returns
Stop or continue the learning process.

Definition at line 91 of file spalera_stepsize.hpp.
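
A sketch of how an optimizer's inner loop might drive Update(). The header path and namespace are the same assumptions as above, the mini-batch objective and gradient are placeholders, and the reading of the boolean return (false meaning the learning process should stop) is also an assumption:

#include <mlpack/core/optimizers/spalera_sgd/spalera_stepsize.hpp>

using namespace mlpack::optimization;

int main()
{
  arma::mat iterate(10, 5, arma::fill::randu);  // Parameters being optimized.

  SPALeRAStepsize stepsize;
  stepsize.Initialize(iterate.n_rows, iterate.n_cols, 0.01 /* lambda */);

  const double stepSize = 0.01;
  const size_t batchSize = 32;
  const size_t numFunctions = 1000;
  const size_t maxIterations = 100;

  for (size_t i = 0; i < maxIterations; ++i)
  {
    // In a real optimizer, the loss and gradient come from the current
    // mini-batch; zeros stand in for them here.
    const double objective = 0.0;
    arma::mat gradient(iterate.n_rows, iterate.n_cols, arma::fill::zeros);

    // Apply the SPALeRA update to `iterate` and run the change-detection
    // mechanism.
    if (!stepsize.Update(stepSize, objective, batchSize, numFunctions,
                         iterate, gradient))
      break;  // Assumed: a false return means learning should stop.
  }

  return 0;
}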


The documentation for this class was generated from the following file:

spalera_stepsize.hpp