LayerNormType< InputType, OutputType > Class Template Reference

Declaration of the Layer Normalization class. More...

Inheritance diagram for LayerNormType< InputType, OutputType >:

Public Member Functions

 LayerNormType ()
 Create the LayerNormType object. More...

 
 LayerNormType (const size_t size, const double eps=1e-8)
 Create the LayerNorm object for a specified number of input units. More...

 
void Backward (const InputType &input, const OutputType &gy, OutputType &g)
 Backward pass through the layer. More...

 
LayerNormType< InputType, OutputType > * Clone () const
 Clone the LayerNormType object. This handles polymorphism correctly. More...

 
double Epsilon () const
 Get the value of epsilon. More...

 
void Forward (const InputType &input, OutputType &output)
 Forward pass of Layer Normalization. More...

 
void Gradient (const InputType &input, const OutputType &error, OutputType &gradient)
 Calculate the gradient using the output delta and the input activations. More...

 
size_t InSize () const
 Get the number of input units. More...

 
OutputType Mean ()
 Get the mean computed across a single training example. More...

 
OutputType const & Parameters () const
 Get the parameters. More...

 
OutputType & Parameters ()
 Modify the parameters. More...

 
void Reset ()
 Reset the layer parameters. More...

 
template<typename Archive>
void serialize (Archive &ar, const uint32_t)
 Serialize the layer. More...

 
OutputType Variance ()
 Get the variance computed across a single training example. More...

 
const size_t WeightSize () const
 Get the total number of trainable weights in the layer. More...

 
- Public Member Functions inherited from Layer< InputType, OutputType >
 Layer ()
 Default constructor. More...

 
 Layer (const Layer &layer)
 Copy constructor. This is not responsible for copying weights! More...

 
 Layer (Layer &&layer)
 Move constructor. This is not responsible for moving weights! More...

 
virtual ~Layer ()
 Default destructor. More...

 
virtual void Backward (const InputType &, const InputType &, InputType &)
 Performs a backpropagation step through the layer, with respect to the given input. More...

 
virtual void ComputeOutputDimensions ()
 Compute the output dimensions. More...

 
virtual void CustomInitialize (InputType &, const size_t)
 Override the weight matrix of the layer. More...

 
virtual void Forward (const InputType &, InputType &)
 Takes an input object, and computes the corresponding output of the layer. More...

 
virtual void Forward (const InputType &, const InputType &)
 Takes an input and output object, and computes the corresponding loss of the layer. More...

 
virtual void Gradient (const InputType &, const InputType &, InputType &)
 Computing the gradient of the layer with respect to its own input. More...

 
const std::vector< size_t > & InputDimensions () const
 Get the input dimensions. More...

 
std::vector< size_t > & InputDimensions ()
 Modify the input dimensions. More...

 
virtual double Loss ()
 Get the layer loss. More...

 
virtual Layer & operator= (const Layer &layer)
 Copy assignment operator. This is not responsible for copying weights! More...

 
virtual Layer & operator= (Layer &&layer)
 Move assignment operator. This is not responsible for moving weights! More...

 
const std::vector< size_t > & OutputDimensions ()
 Get the output dimensions. More...

 
virtual size_t OutputSize () final
 Get the number of elements in the output from this layer. More...

 
template<typename Archive>
void serialize (Archive &ar, const uint32_t)
 Serialize the layer. More...

 
virtual void SetWeights (typename InputType::elem_type *)
 Reset the layer parameter. More...

 
virtual bool const & Training () const
 Get whether the layer is currently in training mode. More...

 
virtual bool & Training ()
 Modify whether the layer is currently in training mode. More...

 

Additional Inherited Members

- Protected Attributes inherited from Layer< InputType, OutputType >
std::vector< size_t > inputDimensions
 Logical input dimensions of each point. More...

 
std::vector< size_t > outputDimensions
 Logical output dimensions of each point. More...

 
bool training
 If true, the layer is in training mode; otherwise, it is in testing mode. More...

 
bool validOutputDimensions
 This is true if ComputeOutputDimensions() has been called, and outputDimensions can be considered to be up-to-date. More...

 

Detailed Description


template<typename InputType = arma::mat, typename OutputType = arma::mat>

class mlpack::ann::LayerNormType< InputType, OutputType >

Declaration of the Layer Normalization class.

The layer transforms the input data to zero mean and unit variance, then scales and shifts the result by the parameters gamma and beta, which are learned by the network. Layer Normalization differs from Batch Normalization in that normalization is performed independently for each training example: the mean and standard deviation are computed across the layer's dimensions rather than across the batch.

For more information, refer to the following papers,

@article{Ba16,
  author  = {Jimmy Lei Ba and Jamie Ryan Kiros and Geoffrey E. Hinton},
  title   = {Layer Normalization},
  journal = {CoRR},
  volume  = {abs/1607.06450},
  year    = {2016},
  url     = {http://arxiv.org/abs/1607.06450},
  eprint  = {1607.06450},
}
@article{Ioffe15,
  author  = {Sergey Ioffe and Christian Szegedy},
  title   = {Batch Normalization: Accelerating Deep Network Training by
             Reducing Internal Covariate Shift},
  journal = {CoRR},
  volume  = {abs/1502.03167},
  year    = {2015},
  url     = {http://arxiv.org/abs/1502.03167},
  eprint  = {1502.03167},
}
Template Parameters
    InputType     Type of the input data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).
    OutputType    Type of the output data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).

Definition at line 65 of file layer_norm.hpp.
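
The transform can be stated compactly. A sketch in standard layer-norm notation (the symbol names are illustrative, not taken from the mlpack source): for a single training example x with n elements,

\[
\mu = \frac{1}{n}\sum_{i=1}^{n} x_i, \qquad
\sigma^2 = \frac{1}{n}\sum_{i=1}^{n} (x_i - \mu)^2, \qquad
y_i = \gamma_i \, \frac{x_i - \mu}{\sqrt{\sigma^2 + \epsilon}} + \beta_i,
\]

where gamma and beta are the learned scale and shift parameters and epsilon is the stability constant described below.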

Constructor & Destructor Documentation

◆ LayerNormType() [1/2]

LayerNormType ( )

Create the LayerNormType object.

◆ LayerNormType() [2/2]

LayerNormType (const size_t size, const double eps = 1e-8)

Create the LayerNorm object for a specified number of input units.

Parameters
    size    The number of input units.
    eps     The epsilon added to variance to ensure numerical stability.
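
A minimal construction sketch, assuming the include path suggested by the source file listed at the bottom of this page; in a real program the layer is typically added to an FFN network, which manages weight allocation before Forward() and Backward() are called:

#include <mlpack/methods/ann/layer/layer_norm.hpp>

using namespace mlpack::ann;

int main()
{
  // Layer normalization over 10 input units with the default epsilon.
  LayerNormType<arma::mat, arma::mat> layerNorm(10, 1e-8);

  // Query the layer's configuration.
  const size_t inSize = layerNorm.InSize();   // 10
  const double eps = layerNorm.Epsilon();     // 1e-8
  (void) inSize; (void) eps;
}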

Member Function Documentation

◆ Backward()

void Backward (const InputType &input, const OutputType &gy, OutputType &g)

Backward pass through the layer.

Parameters
    input    The input activations.
    gy       The backpropagated error.
    g        The calculated gradient.

Referenced by LayerNormType< InputType, OutputType >::Clone().
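
For reference, the input gradient produced by a layer-norm backward pass can be sketched from the forward equation above (standard layer-norm algebra, not copied from the mlpack source). With \(\hat{x} = (x - \mu)/\sqrt{\sigma^2 + \epsilon}\) and \(\hat{g} = \gamma \odot gy\),

\[
g = \frac{1}{\sqrt{\sigma^2 + \epsilon}}
    \left( \hat{g} - \operatorname{mean}(\hat{g})
         - \hat{x} \odot \operatorname{mean}(\hat{g} \odot \hat{x}) \right),
\]

where the means are taken over the layer dimensions of each example.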

◆ Clone()

LayerNormType< InputType, OutputType > * Clone ( ) const

Clone the LayerNormType object. This handles polymorphism correctly.

◆ Epsilon()

double Epsilon ( ) const
inline

Get the value of epsilon.

Definition at line 134 of file layer_norm.hpp.

◆ Forward()

void Forward (const InputType &input, OutputType &output)

Forward pass of Layer Normalization.

Transforms the input data into zero mean and unit variance, scales the data by a factor gamma and shifts it by beta.

Parameters
    input     Input data for the layer.
    output    Resulting output activations.

Referenced by LayerNormType< InputType, OutputType >::Clone().
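
For intuition, the forward computation can be written directly in Armadillo. This is a sketch of the transform for a batch stored one example per column, not the mlpack implementation; the function name and the gamma/beta arguments are illustrative:

#include <armadillo>

// Normalize each column (one training example) of `input` to zero mean and
// unit variance, then scale by gamma and shift by beta (one entry per feature).
arma::mat LayerNormForward(const arma::mat& input,
                           const arma::vec& gamma,
                           const arma::vec& beta,
                           const double eps = 1e-8)
{
  // Per-example statistics, computed across the feature dimension.
  const arma::rowvec mean = arma::mean(input, 0);
  const arma::rowvec var = arma::var(input, 1, 0);     // normalize by N
  const arma::rowvec stddev = arma::sqrt(var + eps);

  // Zero mean, unit variance.
  arma::mat normalized = input.each_row() - mean;
  normalized.each_row() /= stddev;

  // Learned scale and shift, broadcast down each column.
  normalized.each_col() %= gamma;
  normalized.each_col() += beta;
  return normalized;
}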

◆ Gradient()

void Gradient (const InputType &input, const OutputType &error, OutputType &gradient)

Calculate the gradient using the output delta and the input activations.

Parameters
    input       The input activations.
    error       The calculated error.
    gradient    The calculated gradient.

Referenced by LayerNormType< InputType, OutputType >::Clone().
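
The parameter gradients that accompany this step can be sketched in the same notation as above (standard layer-norm results, not taken from the source): with \(\hat{x}\) the normalized input,

\[
\frac{\partial L}{\partial \gamma} = \sum_{\text{examples}} \text{error} \odot \hat{x},
\qquad
\frac{\partial L}{\partial \beta} = \sum_{\text{examples}} \text{error}.
\]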

◆ InSize()

size_t InSize ( ) const
inline

Get the number of input units.

Definition at line 131 of file layer_norm.hpp.

◆ Mean()

OutputType Mean ( )
inline

Get the mean computed across a single training example.

Definition at line 125 of file layer_norm.hpp.

◆ Parameters() [1/2]

OutputType const& Parameters ( ) const
inlinevirtual

Get the parameters.

Reimplemented from Layer< InputType, OutputType >.

Definition at line 120 of file layer_norm.hpp.

◆ Parameters() [2/2]

OutputType& Parameters ( )
inlinevirtual

Modify the parameters.

Reimplemented from Layer< InputType, OutputType >.

Definition at line 122 of file layer_norm.hpp.

◆ Reset()

void Reset ( )

Reset the layer parameters.

Referenced by LayerNormType< InputType, OutputType >::Clone().

◆ serialize()

template<typename Archive>
void serialize (Archive &ar, const uint32_t)

Serialize the layer.

Referenced by LayerNormType< InputType, OutputType >::WeightSize().

◆ Variance()

OutputType Variance ( )
inline

Get the variance computed across a single training example.

Definition at line 128 of file layer_norm.hpp.

◆ WeightSize()

const size_t WeightSize ( ) const
inlinevirtual

Get the total number of trainable weights in the layer.

Reimplemented from Layer< InputType, OutputType >.

Definition at line 136 of file layer_norm.hpp.

References LayerNormType< InputType, OutputType >::serialize().
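
Assuming the standard parameterization described above (one gamma and one beta per input unit), a layer over n input units has 2n trainable weights.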


The documentation for this class was generated from the following file:
  • /home/jenkins-mlpack/mlpack.org/_src/mlpack-git/src/mlpack/methods/ann/layer/not_adapted/layer_norm.hpp