adam.hpp (mlpack git-master)
bool Shuffle() const
Get whether or not the individual functions are shuffled.
Definition: sgd.hpp:154
bool & Shuffle()
Modify whether or not the individual functions are shuffled.
Definition: adam.hpp:173
size_t MaxIterations() const
Get the maximum number of iterations (0 indicates no limit).
Definition: adam.hpp:161
double Epsilon() const
Get the value used to initialise the mean squared gradient parameter.
Definition: adam.hpp:156
mlpack/prereqs.hpp
The core includes that mlpack expects; standard C++ includes and Armadillo.
size_t & MaxIterations()
Modify the maximum number of iterations (0 indicates no limit).
Definition: adam.hpp:163
double Optimize(DecomposableFunctionType &function, arma::mat &iterate)
Optimize the given function using Adam.
Definition: adam.hpp:130
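Optimize() is templated on DecomposableFunctionType, so the function to be optimized only needs to expose the separable-function interface used by mlpack's SGD-style optimizers: NumFunctions(), Evaluate(), and Gradient(). The class below is a minimal, hypothetical sketch of that interface; the name LinearRegressionExampleFunction and the least-squares objective are illustrative, not part of mlpack.

#include <mlpack/prereqs.hpp>

// Hypothetical least-squares objective, decomposed into one term per column
// of the data matrix. Only the three methods below are required by Optimize().
class LinearRegressionExampleFunction
{
 public:
  LinearRegressionExampleFunction(const arma::mat& data,
                                  const arma::rowvec& responses) :
      data(data), responses(responses) { }

  // Number of separable terms in the objective (one per data point).
  size_t NumFunctions() const { return data.n_cols; }

  // Value of the i-th term at the given coordinates.
  double Evaluate(const arma::mat& coordinates, const size_t i) const
  {
    const double residual = arma::dot(coordinates, data.col(i)) - responses(i);
    return residual * residual;
  }

  // Gradient of the i-th term, written into `gradient`.
  void Gradient(const arma::mat& coordinates,
                const size_t i,
                arma::mat& gradient) const
  {
    const double residual = arma::dot(coordinates, data.col(i)) - responses(i);
    gradient = 2.0 * residual * data.col(i);
  }

 private:
  const arma::mat& data;
  const arma::rowvec& responses;
};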
size_t MaxIterations() const
Get the maximum number of iterations (0 indicates no limit).
Definition: sgd.hpp:144
Adam is an optimizer that computes individual adaptive learning rates for different parameters from estimates of first and second moments of the gradients.
Definition: adam.hpp:87
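As a rough usage sketch: assuming the pre-ensmallen layout where this class lives in the mlpack::optimization namespace with an Adam typedef and is included from mlpack/core/optimizers/adam/adam.hpp (both assumptions; check your installed headers), construction and optimization follow the signatures documented on this page. The example reuses the hypothetical LinearRegressionExampleFunction sketched after Optimize() above.

#include <iostream>
#include <mlpack/core/optimizers/adam/adam.hpp>  // assumed include path

int main()
{
  arma::mat data(10, 1000, arma::fill::randu);
  arma::rowvec responses(1000, arma::fill::randu);
  LinearRegressionExampleFunction f(data, responses);  // sketched above

  // stepSize, batchSize, beta1, beta2, eps, maxIterations, tolerance, shuffle.
  mlpack::optimization::Adam optimizer(0.001, 32, 0.9, 0.999, 1e-8, 100000,
      1e-5, true);

  arma::mat iterate(10, 1, arma::fill::zeros);  // starting coordinates
  const double objective = optimizer.Optimize(f, iterate);
  std::cout << "Final objective: " << objective << std::endl;
  return 0;
}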
bool Shuffle() const
Get whether or not the individual functions are shuffled.
Definition: adam.hpp:171
AdamType(const double stepSize=0.001, const size_t batchSize=32, const double beta1=0.9, const double beta2=0.999, const double eps=1e-8, const size_t maxIterations=100000, const double tolerance=1e-5, const bool shuffle=true)
Construct the Adam optimizer with the given parameters (the function to be optimized is passed to Optimize()).
double & Epsilon()
Modify the value used to initialise the mean squared gradient parameter.
Definition: adam.hpp:158
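Since every constructor argument has a default and the non-const accessors documented on this page return references, a default-constructed optimizer can also be tuned after the fact. A small sketch, under the same namespace and include-path assumptions as above:

#include <mlpack/core/optimizers/adam/adam.hpp>  // assumed include path

int main()
{
  // All constructor parameters take their defaults (stepSize = 0.001,
  // batchSize = 32, beta1 = 0.9, beta2 = 0.999, ...).
  mlpack::optimization::Adam optimizer;

  // The non-const overloads return references, so the parameters can be
  // modified after construction.
  optimizer.MaxIterations() = 50000;  // 0 would mean no iteration limit
  optimizer.Shuffle() = false;        // visit the separable functions in order
  optimizer.Epsilon() = 1e-6;         // value used to initialise the mean
                                      // squared gradient parameter
  return 0;
}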
double Optimize(DecomposableFunctionType &function, arma::mat &iterate)
Optimize the given function using stochastic gradient descent.