SoftShrinkType< InputType, OutputType > Class Template Reference

The Soft Shrink operator is defined as

\begin{eqnarray*}
f(x) &=& \begin{cases}
  x - \lambda & : x > \lambda \\
  x + \lambda & : x < -\lambda \\
  0 & : \text{otherwise}
\end{cases} \\
f'(x) &=& \begin{cases}
  1 & : x > \lambda \\
  1 & : x < -\lambda \\
  0 & : \text{otherwise}
\end{cases}
\end{eqnarray*}

. More...

Inheritance diagram for SoftShrinkType< InputType, OutputType >:

Public Member Functions

 SoftShrinkType (const double lambda=0.5)
 Create SoftShrink object using specified hyperparameter lambda. More...

 
void Backward (const InputType &input, const OutputType &gy, OutputType &g)
 Ordinary feed backward pass of a neural network, calculating the gradient of f(x) by propagating the error backwards through f. More...

 
SoftShrinkType * Clone () const
 Clone the SoftShrinkType object. This handles polymorphism correctly. More...

 
void Forward (const InputType &input, OutputType &output)
 Ordinary feed forward pass of a neural network, evaluating the function f(x) by propagating the activity forward through f. More...

 
double const & Lambda () const
 Get the hyperparameter lambda. More...

 
double & Lambda ()
 Modify the hyperparameter lambda. More...

 
template<typename Archive>
void serialize (Archive &ar, const uint32_t)
 Serialize the layer. More...

 
- Public Member Functions inherited from Layer< InputType, OutputType >
 Layer ()
 Default constructor. More...

 
 Layer (const Layer &layer)
 Copy constructor. This is not responsible for copying weights! More...

 
 Layer (Layer &&layer)
 Move constructor. This is not responsible for moving weights! More...

 
virtual ~Layer ()
 Default destructor. More...

 
virtual void Backward (const InputType &, const InputType &, InputType &)
 Performs a backpropagation step through the layer, with respect to the given input. More...

 
virtual void ComputeOutputDimensions ()
 Compute the output dimensions. More...

 
virtual void CustomInitialize (InputType &, const size_t)
 Override the weight matrix of the layer. More...

 
virtual void Forward (const InputType &, InputType &)
 Takes an input object, and computes the corresponding output of the layer. More...

 
virtual void Forward (const InputType &, const InputType &)
 Takes an input and output object, and computes the corresponding loss of the layer. More...

 
virtual void Gradient (const InputType &, const InputType &, InputType &)
 Compute the gradient of the layer with respect to its own input. More...

 
const std::vector< size_t > & InputDimensions () const
 Get the input dimensions. More...

 
std::vector< size_t > & InputDimensions ()
 Modify the input dimensions. More...

 
virtual double Loss ()
 Get the layer loss. More...

 
virtual Layer & operator= (const Layer &layer)
 Copy assignment operator. This is not responsible for copying weights! More...

 
virtual Layer & operator= (Layer &&layer)
 Move assignment operator. This is not responsible for moving weights! More...

 
const std::vector< size_t > & OutputDimensions ()
 Get the output dimensions. More...

 
virtual size_t OutputSize () final
 Get the number of elements in the output from this layer. More...

 
virtual const InputType & Parameters () const
 Get the parameters. More...

 
virtual InputType & Parameters ()
 Set the parameters. More...

 
void serialize (Archive &ar, const uint32_t)
 Serialize the layer. More...

 
virtual void SetWeights (typename InputType::elem_type *)
 Reset the layer parameter. More...

 
virtual bool const & Training () const
 Get whether the layer is currently in training mode. More...

 
virtual bool & Training ()
 Modify whether the layer is currently in training mode. More...

 
virtual size_t WeightSize () const
 Get the total number of trainable weights in the layer. More...

 

Additional Inherited Members

- Protected Attributes inherited from Layer< InputType, OutputType >
std::vector< size_t > inputDimensions
 Logical input dimensions of each point. More...

 
std::vector< size_t > outputDimensions
 Logical output dimensions of each point. More...

 
bool training
 If true, the layer is in training mode; otherwise, it is in testing mode. More...

 
bool validOutputDimensions
 This is true if ComputeOutputDimensions() has been called, and outputDimensions can be considered to be up-to-date. More...

 

Detailed Description


template<typename InputType = arma::mat, typename OutputType = arma::mat>

class mlpack::ann::SoftShrinkType< InputType, OutputType >

The Soft Shrink operator is defined as

\begin{eqnarray*}
f(x) &=& \begin{cases}
  x - \lambda & : x > \lambda \\
  x + \lambda & : x < -\lambda \\
  0 & : \text{otherwise}
\end{cases} \\
f'(x) &=& \begin{cases}
  1 & : x > \lambda \\
  1 & : x < -\lambda \\
  0 & : \text{otherwise}
\end{cases}
\end{eqnarray*}

.

Template Parameters
InputType	Type of the input data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).
OutputType	Type of the output data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).

Definition at line 49 of file softshrink.hpp.
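
As an illustration of this element-wise mapping, here is a minimal standalone sketch written directly against Armadillo rather than the mlpack layer API; the function name SoftShrink and its free-function signature are illustrative only.

    #include <armadillo>

    // Illustrative standalone implementation of the element-wise soft shrink
    // mapping f(x) above (not mlpack's actual layer code).
    arma::mat SoftShrink(const arma::mat& x, const double lambda = 0.5)
    {
      arma::mat y = x;
      y.transform([lambda](double v)
      {
        if (v > lambda)  return v - lambda;   // x - lambda for x > lambda
        if (v < -lambda) return v + lambda;   // x + lambda for x < -lambda
        return 0.0;                           // 0 otherwise
      });
      return y;
    }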

Constructor & Destructor Documentation

◆ SoftShrinkType()

SoftShrinkType ( const double  lambda = 0.5)

Create SoftShrink object using specified hyperparameter lambda.

Parameters
lambda	The shrinkage threshold. The noise level of an image depends on the settings of the imaging device, and those settings can be used to select appropriate parameters for denoising. Lambda is proportional to the noise level entered by the user, and is calculated by multiplying the noise level sigma of the noisy input image by a coefficient 'a', which is one of the training parameters. The default value of lambda is 0.5.

Referenced by SoftShrinkType< InputType, OutputType >::Clone().
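
For illustration, a layer with a non-default threshold could be constructed as follows (a usage sketch; the variable names are arbitrary):

    mlpack::ann::SoftShrinkType<arma::mat, arma::mat> softShrink(0.3);  // lambda = 0.3
    mlpack::ann::SoftShrinkType<> defaultShrink;                        // lambda = 0.5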

Member Function Documentation

◆ Backward()

void Backward ( const InputType &  input,
const OutputType &  gy,
OutputType &  g 
)

Ordinary feed backward pass of a neural network, calculating the gradient of f(x) by propagating the error backwards through f, using the results from the feed forward pass.

Parameters
input	The propagated input activation f(x).
gy	The backpropagated error.
g	The calculated gradient.

Referenced by SoftShrinkType< InputType, OutputType >::Clone().
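
As a sketch of the chain rule this pass applies (assumed behavior, not the library's actual implementation), the gradient can be formed by masking the backpropagated error with f'(x):

    #include <armadillo>

    // Sketch only: g = gy % f'(x), with f'(x) = 1 for |x| > lambda and 0 otherwise.
    void SoftShrinkBackward(const arma::mat& x, const arma::mat& gy,
                            arma::mat& g, const double lambda = 0.5)
    {
      const arma::mat mask = arma::conv_to<arma::mat>::from(arma::abs(x) > lambda);
      g = gy % mask;
    }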

◆ Clone()

SoftShrinkType * Clone ( ) const

Clone the SoftShrinkType object. This handles polymorphism correctly.

◆ Forward()

void Forward ( const InputType &  input,
OutputType &  output 
)

Ordinary feed forward pass of a neural network, evaluating the function f(x) by propagating the activity forward through f.

Parameters
input	Input data used for evaluating the Soft Shrink function.
output	Resulting output activation.

Referenced by SoftShrinkType< InputType, OutputType >::Clone().
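
A usage sketch, assuming the caller pre-sizes the output matrix to match the input (as is typical when calling mlpack layers directly):

    arma::mat input(10, 4, arma::fill::randn);
    arma::mat output(arma::size(input));
    mlpack::ann::SoftShrinkType<> layer(0.5);
    layer.Forward(input, output);  // output holds f(input), element-wise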

◆ Lambda() [1/2]

double const& Lambda ( ) const
inline

Get the hyperparameter lambda.

Definition at line 89 of file softshrink.hpp.

◆ Lambda() [2/2]

double& Lambda ( )
inline

Modify the hyperparameter lambda.

Definition at line 91 of file softshrink.hpp.

References SoftShrinkType< InputType, OutputType >::serialize().
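
The two overloads form the usual mlpack get/modify accessor pair; for example:

    mlpack::ann::SoftShrinkType<> layer;       // lambda defaults to 0.5
    layer.Lambda() = 0.1;                      // non-const overload modifies lambda
    const double current = layer.Lambda();     // read the current value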

◆ serialize()

void serialize ( Archive &  ar,
const uint32_t   
)

Serialize the layer.

Referenced by SoftShrinkType< InputType, OutputType >::Lambda().


The documentation for this class was generated from the following file:
  • /home/jenkins-mlpack/mlpack.org/_src/mlpack-git/src/mlpack/methods/ann/layer/not_adapted/softshrink.hpp