WeightNormType< InputType, OutputType > Class Template Reference

Declaration of the WeightNorm layer class. More...

Inheritance diagram for WeightNormType< InputType, OutputType >:

Public Member Functions

 WeightNormType ()
 Create an empty WeightNorm layer. More...

 
 WeightNormType (Layer< InputType, OutputType > *layer)
 Create the WeightNorm layer object. More...

 
 WeightNormType (const WeightNormType &other)
 Create a WeightNorm layer by copying the given layer. More...

 
 WeightNormType (WeightNormType &&other)
 Create a WeightNorm layer by taking ownership of the other layer. More...

 
 ~WeightNormType ()
 Destructor to release allocated memory. More...

 
void Backward (const InputType &input, const OutputType &gy, OutputType &g)
 Backward pass through the layer. More...

 
WeightNormType * Clone () const
 Clone the WeightNormType object. This handles polymorphism correctly. More...

 
void Forward (const InputType &input, OutputType &output)
 Forward pass of the WeightNorm layer. More...

 
void Gradient (const InputType &input, const OutputType &error, OutputType &gradient)
 Calculate the gradient using the output delta, input activations and the weights of the wrapped layer. More...

 
WeightNormType & operator= (const WeightNormType &other)
 Copy the given layer. More...

 
WeightNormType & operator= (WeightNormType &&other)
 Take ownership of the data in the given layer. More...

 
const std::vector< size_t > OutputDimensions () const
 
OutputType const & Parameters () const
 Get the parameters. More...

 
OutputType & Parameters ()
 Modify the parameters. More...

 
template<typename Archive>
void serialize (Archive &ar, const uint32_t)
 Serialize the layer. More...

 
void SetWeights (typename OutputType::elem_type *weightsPtr)
 Reset the layer parameters. More...

 
const size_t WeightSize () const
 Get the total number of trainable weights in the layer. More...

 
Layer< InputType, OutputType > *const & WrappedLayer ()
 Get the wrapped layer. More...

 
- Public Member Functions inherited from Layer< InputType, OutputType >
 Layer ()
 Default constructor. More...

 
 Layer (const Layer &layer)
 Copy constructor. This is not responsible for copying weights! More...

 
 Layer (Layer &&layer)
 Move constructor. This is not responsible for moving weights! More...

 
virtual ~Layer ()
 Default destructor. More...

 
virtual void Backward (const InputType &, const InputType &, InputType &)
 Performs a backpropagation step through the layer, with respect to the given input. More...

 
virtual void ComputeOutputDimensions ()
 Compute the output dimensions. More...

 
virtual void Forward (const InputType &, InputType &)
 Takes an input object, and computes the corresponding output of the layer. More...

 
virtual void Forward (const InputType &, const InputType &)
 Takes an input and output object, and computes the corresponding loss of the layer. More...

 
virtual void Gradient (const InputType &, const InputType &, InputType &)
 Computing the gradient of the layer with respect to its own input. More...

 
const std::vector< size_t > & InputDimensions () const
 Get the input dimensions. More...

 
std::vector< size_t > & InputDimensions ()
 Modify the input dimensions. More...

 
virtual double Loss ()
 Get the layer loss. More...

 
virtual Layer & operator= (const Layer &layer)
 Copy assignment operator. This is not responsible for copying weights! More...

 
virtual Layer & operator= (Layer &&layer)
 Move assignment operator. This is not responsible for moving weights! More...

 
const std::vector< size_t > & OutputDimensions ()
 Get the output dimensions. More...

 
virtual size_t OutputSize () final
 Get the number of elements in the output from this layer. More...

 
void serialize (Archive &ar, const uint32_t)
 Serialize the layer. More...

 
virtual void SetWeights (typename InputType ::elem_type *)
 Reset the layer parameters. More...

 
virtual bool const & Training () const
 Get whether the layer is currently in training mode. More...

 
virtual bool & Training ()
 Modify whether the layer is currently in training mode. More...

 

Additional Inherited Members

- Protected Attributes inherited from Layer< InputType, OutputType >
std::vector< size_t > inputDimensions
 Logical input dimensions of each point. More...

 
std::vector< size_t > outputDimensions
 Logical output dimensions of each point. More...

 
bool training
 If true, the layer is in training mode; otherwise, it is in testing mode. More...

 
bool validOutputDimensions
 This is true if ComputeOutputDimensions() has been called, and outputDimensions can be considered to be up-to-date. More...

 

Detailed Description


template<typename InputType = arma::mat, typename OutputType = arma::mat>

class mlpack::ann::WeightNormType< InputType, OutputType >

Declaration of the WeightNorm layer class.

The layer reparameterizes the weight vectors in a neural network, decoupling the length of those weight vectors from their direction. This reparameterization does not introduce any dependencies between the examples in a mini-batch.

This class is a wrapper around an existing layer: it only modifies how the weights of the wrapped layer are calculated and updated.
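The reparameterization can be sketched in plain C++, independent of mlpack. The helper below (`WeightNormReparam` is a hypothetical name for illustration, not part of the mlpack API) applies the formula w = (g / ‖v‖) · v from the paper cited below, so that ‖w‖ equals |g| regardless of the direction of v:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Weight normalization reparameterizes a weight vector w as
// w = (g / ||v||) * v, decoupling the length of w (controlled
// solely by the scalar g) from its direction (controlled by v).
std::vector<double> WeightNormReparam(const std::vector<double>& v, double g)
{
  double norm = 0.0;
  for (double x : v)
    norm += x * x;
  norm = std::sqrt(norm);

  std::vector<double> w(v.size());
  for (std::size_t i = 0; i < v.size(); ++i)
    w[i] = (g / norm) * v[i];
  return w;
}
```

For example, with v = (3, 4) and g = 2, the result is (1.2, 1.6), whose Euclidean norm is exactly 2 = g.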

For more information, refer to the following paper,

@inproceedings{Salimans2016WeightNorm,
title = {Weight Normalization: A Simple Reparameterization to Accelerate
Training of Deep Neural Networks},
author = {Tim Salimans and Diederik P. Kingma},
booktitle = {Neural Information Processing Systems 2016},
year = {2016},
url = {https://arxiv.org/abs/1602.07868},
}
Template Parameters
InputType    Type of the input data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).
OutputType    Type of the output data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).

Definition at line 52 of file weight_norm.hpp.

Constructor & Destructor Documentation

◆ WeightNormType() [1/4]

Create an empty WeightNorm layer.

Referenced by WeightNormType< InputType, OutputType >::Clone().

◆ WeightNormType() [2/4]

WeightNormType ( Layer< InputType, OutputType > *  layer)

Create the WeightNorm layer object.

Parameters
layer    The layer whose weights are to be normalized.

◆ ~WeightNormType()

Destructor to release allocated memory.

◆ WeightNormType() [3/4]

WeightNormType ( const WeightNormType< InputType, OutputType > &  other)

Create a WeightNorm layer by copying the given layer.

◆ WeightNormType() [4/4]

WeightNormType ( WeightNormType< InputType, OutputType > &&  other)

Create a WeightNorm layer by taking ownership of the other layer.

Member Function Documentation

◆ Backward()

void Backward ( const InputType &  input,
const OutputType &  gy,
OutputType &  g 
)

Backward pass through the layer.

This function calls the Backward() function of the wrapped layer.

Parameters
input    The input activations.
gy    The backpropagated error.
g    The calculated gradient.

Referenced by WeightNormType< InputType, OutputType >::Clone().

◆ Clone()

◆ Forward()

void Forward ( const InputType &  input,
OutputType &  output 
)

Forward pass of the WeightNorm layer.

Calculates the weights of the wrapped layer from the parameter vector v and the scalar parameter g, then calculates the output of the wrapped layer using those weights.

Parameters
input    Input data for the layer.
output    Resulting output activations.
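The reparameterization applied in the forward pass, as given in the Salimans and Kingma (2016) paper cited above, can be written as:

```latex
\mathbf{w} = \frac{g}{\lVert \mathbf{v} \rVert} \, \mathbf{v}
```

so the norm of the effective weight vector w equals |g| independently of the direction encoded by v.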

Referenced by WeightNormType< InputType, OutputType >::Clone().

◆ Gradient()

void Gradient ( const InputType &  input,
const OutputType &  error,
OutputType &  gradient 
)

Calculate the gradient using the output delta, input activations and the weights of the wrapped layer.

Parameters
input    The input activations.
error    The calculated error.
gradient    The calculated gradient.
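The gradients with respect to the new parameters g and v follow from applying the chain rule to w = (g / ‖v‖) v; these are the expressions given in the Salimans and Kingma (2016) paper cited above, where ∇_w L is the gradient with respect to the wrapped layer's weights:

```latex
\nabla_g L = \frac{\nabla_{\mathbf{w}} L \cdot \mathbf{v}}{\lVert \mathbf{v} \rVert},
\qquad
\nabla_{\mathbf{v}} L = \frac{g}{\lVert \mathbf{v} \rVert} \nabla_{\mathbf{w}} L
  - \frac{g \, \nabla_g L}{\lVert \mathbf{v} \rVert^{2}} \, \mathbf{v}
```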

Referenced by WeightNormType< InputType, OutputType >::Clone().

◆ operator=() [1/2]

WeightNormType& operator= ( const WeightNormType< InputType, OutputType > &  other)

Copy the given layer.

◆ operator=() [2/2]

WeightNormType& operator= ( WeightNormType< InputType, OutputType > &&  other)

Take ownership of the data in the given layer.

◆ OutputDimensions()

◆ Parameters() [1/2]

OutputType const& Parameters ( ) const
inline virtual

Get the parameters.

Reimplemented from Layer< InputType, OutputType >.

Definition at line 123 of file weight_norm.hpp.

◆ Parameters() [2/2]

OutputType& Parameters ( )
inline virtual

Modify the parameters.

Reimplemented from Layer< InputType, OutputType >.

Definition at line 125 of file weight_norm.hpp.

◆ serialize()

void serialize ( Archive &  ar,
const uint32_t   
)

◆ SetWeights()

void SetWeights ( typename OutputType::elem_type *  weightsPtr)

Reset the layer parameters.

Referenced by WeightNormType< InputType, OutputType >::Clone().

◆ WeightSize()

const size_t WeightSize ( ) const
inline virtual

Get the total number of trainable weights in the layer.

Reimplemented from Layer< InputType, OutputType >.

Definition at line 130 of file weight_norm.hpp.

References Layer< MatType >::WeightSize().

◆ WrappedLayer()

Layer<InputType, OutputType>* const& WrappedLayer ( )
inline

Get the wrapped layer.

Definition at line 128 of file weight_norm.hpp.


The documentation for this class was generated from the following file:
  • /home/jenkins-mlpack/mlpack.org/_src/mlpack-git/src/mlpack/methods/ann/layer/not_adapted/weight_norm.hpp