FlexibleReLUType< InputType, OutputType > Class Template Reference

The FlexibleReLU activation function, defined by f(x) = max(0, x) + alpha. More...

Inheritance diagram for FlexibleReLUType< InputType, OutputType >:

Public Member Functions

 FlexibleReLUType (const double alpha=0)
 Create the FlexibleReLU object using the specified alpha parameter. More...

 
const double & Alpha () const
 Get the parameter controlling the range of the ReLU function. More...

 
double & Alpha ()
 Modify the parameter controlling the range of the ReLU function. More...

 
void Backward (const InputType &input, const OutputType &gy, OutputType &g)
 Ordinary feed backward pass of a neural network, calculating the function f(x) by propagating x backwards through f. More...

 
FlexibleReLUType * Clone () const
 Clone the FlexibleReLUType object. This handles polymorphism correctly. More...

 
void Forward (const InputType &input, OutputType &output)
 Ordinary feed forward pass of a neural network, evaluating the function f(x) by propagating the activity forward through f. More...

 
void Gradient (const InputType &input, const OutputType &error, OutputType &gradient)
 Calculate the gradient using the output delta and the input activation. More...

 
OutputType const & Parameters () const
 Get the parameters. More...

 
OutputType & Parameters ()
 Modify the parameters. More...

 
template<typename Archive >
void serialize (Archive &ar, const uint32_t)
 Serialize the layer. More...

 
void SetWeights (typename OutputType::elem_type *weightsPtr)
 Reset the layer parameter (alpha). More...

 
const size_t WeightSize () const
 Get the total number of trainable weights in the layer. More...

 
- Public Member Functions inherited from Layer< InputType, OutputType >
 Layer ()
 Default constructor. More...

 
 Layer (const Layer &layer)
 Copy constructor. This is not responsible for copying weights! More...

 
 Layer (Layer &&layer)
 Move constructor. This is not responsible for moving weights! More...

 
virtual ~Layer ()
 Default destructor. More...

 
virtual void Backward (const InputType &, const InputType &, InputType &)
 Performs a backpropagation step through the layer, with respect to the given input. More...

 
virtual void ComputeOutputDimensions ()
 Compute the output dimensions. More...

 
virtual void CustomInitialize (InputType &, const size_t)
 Override the weight matrix of the layer. More...

 
virtual void Forward (const InputType &, InputType &)
 Takes an input object, and computes the corresponding output of the layer. More...

 
virtual void Forward (const InputType &, const InputType &)
 Takes an input and output object, and computes the corresponding loss of the layer. More...

 
virtual void Gradient (const InputType &, const InputType &, InputType &)
 Computing the gradient of the layer with respect to its own input. More...

 
const std::vector< size_t > & InputDimensions () const
 Get the input dimensions. More...

 
std::vector< size_t > & InputDimensions ()
 Modify the input dimensions. More...

 
virtual double Loss ()
 Get the layer loss. More...

 
virtual Layer & operator= (const Layer &layer)
 Copy assignment operator. This is not responsible for copying weights! More...

 
virtual Layer & operator= (Layer &&layer)
 Move assignment operator. This is not responsible for moving weights! More...

 
const std::vector< size_t > & OutputDimensions ()
 Get the output dimensions. More...

 
virtual size_t OutputSize () final
 Get the number of elements in the output from this layer. More...

 
void serialize (Archive &ar, const uint32_t)
 Serialize the layer. More...

 
virtual void SetWeights (typename InputType ::elem_type *)
 Reset the layer parameter. More...

 
virtual bool const & Training () const
 Get whether the layer is currently in training mode. More...

 
virtual bool & Training ()
 Modify whether the layer is currently in training mode. More...

 

Additional Inherited Members

- Protected Attributes inherited from Layer< InputType, OutputType >
std::vector< size_t > inputDimensions
 Logical input dimensions of each point. More...

 
std::vector< size_t > outputDimensions
 Logical output dimensions of each point. More...

 
bool training
 If true, the layer is in training mode; otherwise, it is in testing mode. More...

 
bool validOutputDimensions
 This is true if ComputeOutputDimensions() has been called, and outputDimensions can be considered to be up-to-date. More...

 

Detailed Description


template<typename InputType = arma::mat, typename OutputType = arma::mat>
class mlpack::ann::FlexibleReLUType< InputType, OutputType >

The FlexibleReLU activation function, defined by.

\begin{eqnarray*} f(x) &=& \max(0, x) + \alpha \\ f'(x) &=& \left\{ \begin{array}{lr} 1 & : x > 0 \\ 0 & : x \le 0 \end{array} \right. \end{eqnarray*}

For more information, read the following paper:

@article{Qiu2018,
  author  = {Suo Qiu and Xiangmin Xu and Bolun Cai},
  title   = {FReLU: Flexible Rectified Linear Units for Improving
             Convolutional Neural Networks},
  journal = {arXiv preprint},
  url     = {https://arxiv.org/abs/1706.08098},
  year    = {2018}
}
Template Parameters
InputType   The type of the layer's inputs. The layer automatically casts inputs to this type (Default: arma::mat).
OutputType  The type used for computation, and therefore for the output; this allows the computation and weight types to differ from the input type (Default: arma::mat).

Definition at line 58 of file flexible_relu.hpp.

Constructor & Destructor Documentation

◆ FlexibleReLUType()

FlexibleReLUType ( const double  alpha = 0)

Create the FlexibleReLU object using the specified alpha parameter.

The trainable alpha parameter controls the range of the ReLU function. (Default alpha = 0).

Parameters
alpha  Parameter to adjust the range of the ReLU function.

Referenced by FlexibleReLUType< InputType, OutputType >::Clone().

Member Function Documentation

◆ Alpha() [1/2]

const double& Alpha ( ) const
inline

Get the parameter controlling the range of the ReLU function.

Definition at line 116 of file flexible_relu.hpp.

◆ Alpha() [2/2]

double& Alpha ( )
inline

Modify the parameter controlling the range of the ReLU function.

Definition at line 118 of file flexible_relu.hpp.

◆ Backward()

void Backward ( const InputType &  input,
const OutputType &  gy,
OutputType &  g 
)

Ordinary feed backward pass of a neural network, calculating the function f(x) by propagating x backwards through f, using the results from the feed forward pass.

Parameters
input  The propagated input activation.
gy     The backpropagated error.
g      The calculated gradient.

Referenced by FlexibleReLUType< InputType, OutputType >::Clone().

◆ Clone()

FlexibleReLUType * Clone ( ) const

Clone the FlexibleReLUType object. This handles polymorphism correctly.

◆ Forward()

void Forward ( const InputType &  input,
OutputType &  output 
)

Ordinary feed forward pass of a neural network, evaluating the function f(x) by propagating the activity forward through f.

Parameters
input   Input data used for evaluating the specified function.
output  Resulting output activation.

Referenced by FlexibleReLUType< InputType, OutputType >::Clone().

◆ Gradient()

void Gradient ( const InputType &  input,
const OutputType &  error,
OutputType &  gradient 
)

Calculate the gradient using the output delta and the input activation.

Parameters
input     The input parameter used for calculating the gradient.
error     The calculated error.
gradient  The calculated gradient.

Referenced by FlexibleReLUType< InputType, OutputType >::Clone().

◆ Parameters() [1/2]

OutputType const& Parameters ( ) const
inline virtual

Get the parameters.

Reimplemented from Layer< InputType, OutputType >.

Definition at line 111 of file flexible_relu.hpp.

◆ Parameters() [2/2]

OutputType& Parameters ( )
inline virtual

Modify the parameters.

Reimplemented from Layer< InputType, OutputType >.

Definition at line 113 of file flexible_relu.hpp.

◆ serialize()

void serialize ( Archive &  ar,
const uint32_t   
)

Serialize the layer.

Referenced by FlexibleReLUType< InputType, OutputType >::WeightSize().

◆ SetWeights()

void SetWeights ( typename OutputType::elem_type *  weightsPtr)

Reset the layer parameter (alpha).

This method is called to assign the allocated memory to the learnable layer parameter.

Referenced by FlexibleReLUType< InputType, OutputType >::Clone().

◆ WeightSize()

const size_t WeightSize ( ) const
inline virtual

Get the total number of trainable weights in the layer.

Reimplemented from Layer< InputType, OutputType >.

Definition at line 120 of file flexible_relu.hpp.

References FlexibleReLUType< InputType, OutputType >::serialize().


The documentation for this class was generated from the following file:
  • /home/jenkins-mlpack/mlpack.org/_src/mlpack-git/src/mlpack/methods/ann/layer/not_adapted/flexible_relu.hpp