mlpack  git-master
AdamUpdate Class Reference

Adam is an optimizer that computes individual adaptive learning rates for different parameters from estimates of the first and second moments of the gradients, as given in Section 2 of the following paper. More...

Public Member Functions

 AdamUpdate (const double epsilon=1e-8, const double beta1=0.9, const double beta2=0.999)
 Construct the Adam update policy with the given parameters. More...

 
double Beta1 () const
 Get the smoothing parameter. More...

 
double & Beta1 ()
 Modify the smoothing parameter. More...

 
double Beta2 () const
 Get the second moment coefficient. More...

 
double & Beta2 ()
 Modify the second moment coefficient. More...

 
double Epsilon () const
 Get the value used to initialise the squared gradient parameter. More...

 
double & Epsilon ()
 Modify the value used to initialise the squared gradient parameter. More...

 
void Initialize (const size_t rows, const size_t cols)
 The Initialize method is called by the SGD optimizer before the start of the iteration update process. More...

 
void Update (arma::mat &iterate, const double stepSize, const arma::mat &gradient)
 Update step for Adam. More...

 

Detailed Description

Adam is an optimizer that computes individual adaptive learning rates for different parameters from estimates of the first and second moments of the gradients, as given in Section 2 of the following paper.

For more information, see the following.

@article{Kingma2014,
author = {Diederik P. Kingma and Jimmy Ba},
title = {Adam: {A} Method for Stochastic Optimization},
journal = {CoRR},
year = {2014},
url = {http://arxiv.org/abs/1412.6980}
}

Definition at line 42 of file adam_update.hpp.

Constructor & Destructor Documentation

◆ AdamUpdate()

AdamUpdate (const double epsilon = 1e-8,
            const double beta1 = 0.9,
            const double beta2 = 0.999)
inline

Construct the Adam update policy with the given parameters.

Parameters
    epsilon  The epsilon value used to initialise the squared gradient parameter.
    beta1    The smoothing parameter.
    beta2    The second moment coefficient.

Definition at line 53 of file adam_update.hpp.

Member Function Documentation

◆ Beta1() [1/2]

double Beta1 ( ) const
inline

Get the smoothing parameter.

Definition at line 116 of file adam_update.hpp.

◆ Beta1() [2/2]

double& Beta1 ( )
inline

Modify the smoothing parameter.

Definition at line 118 of file adam_update.hpp.

◆ Beta2() [1/2]

double Beta2 ( ) const
inline

Get the second moment coefficient.

Definition at line 121 of file adam_update.hpp.

◆ Beta2() [2/2]

double& Beta2 ( )
inline

Modify the second moment coefficient.

Definition at line 123 of file adam_update.hpp.

◆ Epsilon() [1/2]

double Epsilon ( ) const
inline

Get the value used to initialise the squared gradient parameter.

Definition at line 111 of file adam_update.hpp.

◆ Epsilon() [2/2]

double& Epsilon ( )
inline

Modify the value used to initialise the squared gradient parameter.

Definition at line 113 of file adam_update.hpp.

◆ Initialize()

void Initialize (const size_t rows,
                 const size_t cols)
inline

The Initialize method is called by the SGD optimizer before the start of the iteration update process.

Parameters
    rows  Number of rows in the gradient matrix.
    cols  Number of columns in the gradient matrix.

Definition at line 71 of file adam_update.hpp.

◆ Update()

void Update (arma::mat& iterate,
             const double stepSize,
             const arma::mat& gradient)
inline

Update step for Adam.

Parameters
    iterate   Parameters that minimize the function.
    stepSize  Step size to be used for the given iteration.
    gradient  The gradient matrix.

Note that the term m / (arma::sqrt(v) + eps) in the update expression is an approximation of the exact term m / (arma::sqrt(v) + arma::sqrt(biasCorrection2) * eps).

Definition at line 84 of file adam_update.hpp.


The documentation for this class was generated from the following file:
adam_update.hpp