ELUType< MatType > Class Template Reference

The ELU activation function, defined by. More...

Inheritance diagram for ELUType< MatType >:

Public Member Functions

 ELUType ()
 Create the ELU object. More...

 
 ELUType (const double alpha)
 Create the ELU object using the specified parameter. More...

 
 ELUType (const ELUType &other)
 
 ELUType (ELUType &&other)
 
virtual ~ELUType ()
 
double const & Alpha () const
 Get the non-zero gradient. More...

 
double & Alpha ()
 Modify the non-zero gradient. More...

 
void Backward (const MatType &input, const MatType &gy, MatType &g)
 Ordinary feed backward pass of a neural network, computing the gradient of f with respect to its input by propagating the error backward through f. More...

 
ELUType * Clone () const
 Clone the ELUType object. This handles polymorphism correctly. More...

 
void Forward (const MatType &input, MatType &output)
 Ordinary feed forward pass of a neural network, evaluating the function f(x) by propagating the activity forward through f. More...

 
double const & Lambda () const
 Get the lambda parameter. More...

 
ELUType & operator= (const ELUType &other)
 
ELUType & operator= (ELUType &&other)
 
template<typename Archive >
void serialize (Archive &ar, const uint32_t)
 Serialize the layer. More...

 
- Public Member Functions inherited from Layer< MatType >
 Layer ()
 Default constructor. More...

 
 Layer (const Layer &layer)
 Copy constructor. This is not responsible for copying weights! More...

 
 Layer (Layer &&layer)
 Move constructor. This is not responsible for moving weights! More...

 
virtual ~Layer ()
 Default destructor. More...

 
virtual void ComputeOutputDimensions ()
 Compute the output dimensions. More...

 
virtual void CustomInitialize (MatType &, const size_t)
 Override the weight matrix of the layer. More...

 
virtual void Forward (const MatType &, const MatType &)
 Takes an input and output object, and computes the corresponding loss of the layer. More...

 
virtual void Gradient (const MatType &, const MatType &, MatType &)
 Compute the gradient of the layer with respect to its own input. More...

 
const std::vector< size_t > & InputDimensions () const
 Get the input dimensions. More...

 
std::vector< size_t > & InputDimensions ()
 Modify the input dimensions. More...

 
virtual double Loss ()
 Get the layer loss. More...

 
virtual Layer & operator= (const Layer &layer)
 Copy assignment operator. This is not responsible for copying weights! More...

 
virtual Layer & operator= (Layer &&layer)
 Move assignment operator. This is not responsible for moving weights! More...

 
const std::vector< size_t > & OutputDimensions ()
 Get the output dimensions. More...

 
virtual size_t OutputSize () final
 Get the number of elements in the output from this layer. More...

 
virtual const MatType & Parameters () const
 Get the parameters. More...

 
virtual MatType & Parameters ()
 Set the parameters. More...

 
template<typename Archive >
void serialize (Archive &ar, const uint32_t)
 Serialize the layer. More...

 
virtual void SetWeights (typename MatType::elem_type *)
 Reset the layer parameter. More...

 
virtual bool const & Training () const
 Get whether the layer is currently in training mode. More...

 
virtual bool & Training ()
 Modify whether the layer is currently in training mode. More...

 
virtual size_t WeightSize () const
 Get the total number of trainable weights in the layer. More...

 

Additional Inherited Members

- Protected Attributes inherited from Layer< MatType >
std::vector< size_t > inputDimensions
 Logical input dimensions of each point. More...

 
std::vector< size_t > outputDimensions
 Logical output dimensions of each point. More...

 
bool training
 If true, the layer is in training mode; otherwise, it is in testing mode. More...

 
bool validOutputDimensions
 This is true if ComputeOutputDimensions() has been called, and outputDimensions can be considered to be up-to-date. More...

 

Detailed Description


template<typename MatType = arma::mat>

class mlpack::ann::ELUType< MatType >

The ELU activation function, defined by.

\begin{eqnarray*}
f(x) &=& \left\{ \begin{array}{lr} x & : x > 0 \\ \alpha(e^x - 1) & : x \le 0 \end{array} \right. \\
f'(x) &=& \left\{ \begin{array}{lr} 1 & : x > 0 \\ f(x) + \alpha & : x \le 0 \end{array} \right.
\end{eqnarray*}
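As a quick numeric check of the two cases above, the function and its derivative can be sketched in plain C++ (a standalone illustration, not mlpack's implementation; the function names are hypothetical):

```cpp
#include <cassert>
#include <cmath>

// Sketch of the ELU function f(x) from the equations above.
double eluForward(const double x, const double alpha)
{
  return (x > 0) ? x : alpha * (std::exp(x) - 1.0);
}

// Sketch of the derivative f'(x); note that for x <= 0,
// f'(x) = f(x) + alpha = alpha * e^x.
double eluDeriv(const double x, const double alpha)
{
  return (x > 0) ? 1.0 : eluForward(x, alpha) + alpha;
}
```

Expressing the negative-branch derivative as f(x) + alpha is what lets the layer compute gradients from the stored activation alone, without re-evaluating the exponential.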

For more information, read the following paper:

@article{Clevert2015,
author = {Djork{-}Arn{\'{e}} Clevert and Thomas Unterthiner and
Sepp Hochreiter},
title = {Fast and Accurate Deep Network Learning by Exponential Linear
Units (ELUs)},
journal = {CoRR},
year = {2015},
url = {https://arxiv.org/abs/1511.07289}
}

The SELU activation function is defined by

\begin{eqnarray*}
f(x) &=& \left\{ \begin{array}{lr} \lambda * x & : x > 0 \\ \lambda * \alpha(e^x - 1) & : x \le 0 \end{array} \right. \\
f'(x) &=& \left\{ \begin{array}{lr} \lambda & : x > 0 \\ f(x) + \lambda * \alpha & : x \le 0 \end{array} \right.
\end{eqnarray*}
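The same sketch for SELU; the lambda and alpha constants below are the commonly cited self-normalizing values derived in the paper above, and the function names are hypothetical, not mlpack's API:

```cpp
#include <cassert>
#include <cmath>

// Self-normalizing constants from Klambauer et al. (2017).
const double kLambda = 1.0507009873554805;
const double kAlpha  = 1.6732632423543772;

// Sketch of the SELU function f(x) from the equations above.
double seluForward(const double x)
{
  return (x > 0) ? kLambda * x : kLambda * kAlpha * (std::exp(x) - 1.0);
}

// Sketch of the derivative; for x <= 0, f'(x) = f(x) + lambda * alpha.
double seluDeriv(const double x)
{
  return (x > 0) ? kLambda : seluForward(x) + kLambda * kAlpha;
}
```

With these constants, activations driven through many SELU layers tend toward zero mean and unit variance, which is the property the default (SELU) constructor relies on.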

For more information, read the following paper:

@article{Klambauer2017,
author = {Gunter Klambauer and Thomas Unterthiner and
Andreas Mayr},
title = {Self-Normalizing Neural Networks},
journal = {Advances in Neural Information Processing Systems},
year = {2017},
url = {https://arxiv.org/abs/1706.02515}
}

In testing mode, there is no computation of the derivative.

Note
Make sure to use the SELU activation function with normalized inputs and with weights initialized by Lecun normal initialization.
Template Parameters
MatType The matrix type used for the layer's inputs, computations, and outputs (Default: arma::mat). Inputs are automatically cast to this type, which also allows the computation and weight type to differ from the caller's input type.

Definition at line 108 of file elu.hpp.

Constructor & Destructor Documentation

◆ ELUType() [1/4]

ELUType ( )

Create the ELU object.

NOTE: Use this constructor for the SELU activation function.

Referenced by ELUType< MatType >::Clone(), and ELUType< MatType >::~ELUType().

◆ ELUType() [2/4]

ELUType ( const double  alpha)

Create the ELU object using the specified parameter.

The non-zero gradient for negative inputs can be adjusted by specifying the ELU hyperparameter alpha (alpha > 0).

Note
Use this constructor for the ELU activation function.
Parameters
alpha Scale parameter for the negative factor.

◆ ~ELUType()

virtual ~ELUType ( )

◆ ELUType() [3/4]

ELUType ( const ELUType< MatType > &  other)

◆ ELUType() [4/4]

ELUType ( ELUType< MatType > &&  other)

Member Function Documentation

◆ Alpha() [1/2]

double const& Alpha ( ) const
inline

Get the non-zero gradient.

Definition at line 167 of file elu.hpp.

◆ Alpha() [2/2]

double& Alpha ( )
inline

Modify the non-zero gradient.

Definition at line 169 of file elu.hpp.

◆ Backward()

void Backward ( const MatType &  input,
const MatType &  gy,
MatType &  g 
)
virtual

Ordinary feed backward pass of a neural network, computing the gradient of f with respect to its input by propagating the error backward through f, using the results from the feed forward pass.

Parameters
input The propagated input activation f(x).
gy The backpropagated error.
g The calculated gradient.

Reimplemented from Layer< MatType >.

Referenced by ELUType< MatType >::~ELUType().
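The elementwise computation Backward() performs can be sketched on plain vectors (a standalone illustration, not mlpack's actual code; the function name is hypothetical). Because the layer receives the activation y = f(x) rather than the raw input, the derivative is recovered from y directly: 1 for y > 0, y + alpha otherwise.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Sketch of an elementwise ELU backward pass: g = gy * f'(x),
// with f'(x) expressed in terms of the stored activation y = f(x).
std::vector<double> eluBackward(const std::vector<double>& activation,
                                const std::vector<double>& gy,
                                const double alpha)
{
  std::vector<double> g(activation.size());
  for (std::size_t i = 0; i < activation.size(); ++i)
  {
    const double deriv = (activation[i] > 0) ? 1.0 : activation[i] + alpha;
    g[i] = gy[i] * deriv;  // Chain rule: scale the incoming error.
  }
  return g;
}
```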

◆ Clone()

ELUType* Clone ( ) const
inlinevirtual

Clone the ELUType object. This handles polymorphism correctly.

Implements Layer< MatType >.

Definition at line 129 of file elu.hpp.

References ELUType< MatType >::ELUType().

◆ Forward()

void Forward ( const MatType &  input,
MatType &  output 
)
virtual

Ordinary feed forward pass of a neural network, evaluating the function f(x) by propagating the activity forward through f.

Parameters
input Input data used for evaluating the specified function.
output Resulting output activation.

Reimplemented from Layer< MatType >.

Referenced by ELUType< MatType >::~ELUType().

◆ Lambda()

double const& Lambda ( ) const
inline

Get the lambda parameter.

Definition at line 172 of file elu.hpp.

References ELUType< MatType >::serialize().

◆ operator=() [1/2]

ELUType& operator= ( const ELUType< MatType > &  other)

◆ operator=() [2/2]

ELUType& operator= ( ELUType< MatType > &&  other)

◆ serialize()

void serialize ( Archive &  ar,
const uint32_t   
)

Serialize the layer.

Referenced by ELUType< MatType >::Lambda().


The documentation for this class was generated from the following file:
  • /home/jenkins-mlpack/mlpack.org/_src/mlpack-git/src/mlpack/methods/ann/layer/elu.hpp