mlpack  3.0.3
mlpack::ann Namespace Reference

Artificial Neural Network. More...

Namespaces

 augmented
 

Classes

class  Add
 Implementation of the Add module class. More...

 
class  AddMerge
 Implementation of the AddMerge module class. More...

 
class  AddVisitor
 AddVisitor exposes the Add() method of the given module. More...

 
class  AlphaDropout
 The alpha-dropout layer is a regularizer that randomly, with probability 'ratio', sets input values to alphaDash. More...

 
class  AtrousConvolution
 Implementation of the Atrous Convolution class. More...

 
class  BackwardVisitor
 BackwardVisitor executes the Backward() function given the input, error and delta parameter. More...

 
class  BaseLayer
 Implementation of the base layer. More...

 
class  BatchNorm
 Declaration of the Batch Normalization layer class. More...

 
class  BilinearInterpolation
 Definition and Implementation of the Bilinear Interpolation Layer. More...

 
class  Concat
 Implementation of the Concat class. More...

 
class  ConcatPerformance
 Implementation of the concat performance class. More...

 
class  Constant
 Implementation of the constant layer. More...

 
class  ConstInitialization
 This class is used to initialize weight matrix with constant values. More...

 
class  Convolution
 Implementation of the Convolution class. More...

 
class  CopyVisitor
 This visitor supports the copy constructor for neural network modules. More...

 
class  CrossEntropyError
 The cross-entropy performance function measures the network's performance according to the cross-entropy between the input and target distributions. More...

 
class  DeleteVisitor
 DeleteVisitor executes the destructor of the instantiated object. More...

 
class  DeltaVisitor
 DeltaVisitor exposes the delta parameter of the given module. More...

 
class  DeterministicSetVisitor
 DeterministicSetVisitor sets the deterministic parameter given the deterministic value. More...

 
class  DropConnect
 The DropConnect layer is a regularizer that randomly, with probability 'ratio', sets connection values to zero and scales the remaining elements by a factor of 1 / (1 - ratio). More...

 
class  Dropout
 The dropout layer is a regularizer that randomly, with probability 'ratio', sets input values to zero during training and scales the remaining elements by a factor of 1 / (1 - ratio), so that the expected sum stays the same and no rescaling is needed at test time. More...

 
class  ELU
 The ELU activation function, defined by. More...

 
class  FastLSTM
 An implementation of a faster version of the LSTM network layer. More...

 
class  FFN
 Implementation of a standard feed forward network. More...

 
class  FFTConvolution
 Computes the two-dimensional convolution using the fast Fourier transform (FFT). More...

 
class  FlexibleReLU
 The FlexibleReLU activation function, defined by. More...

 
class  ForwardVisitor
 ForwardVisitor executes the Forward() function given the input and output parameter. More...

 
class  FullConvolution
 
class  GaussianInitialization
 This class is used to initialize the weight matrix with a Gaussian distribution. More...

 
class  Glimpse
 The glimpse layer returns a retina-like representation (down-scaled cropped images) of increasing scale around a given location in a given image. More...

 
class  GlorotInitializationType
 This class is used to initialize the weight matrix with the Glorot Initialization method. More...

 
class  GradientSetVisitor
 GradientSetVisitor updates the gradient parameter given the gradient set. More...

 
class  GradientUpdateVisitor
 GradientUpdateVisitor updates the gradient parameter given the gradient set. More...

 
class  GradientVisitor
 GradientVisitor executes the Gradient() method of the given module using the input and delta parameter. More...

 
class  GradientZeroVisitor
 
class  GRU
 An implementation of a GRU network layer. More...

 
class  HardTanH
 The Hard Tanh activation function, defined by. More...

 
class  HeInitialization
 This class is used to initialize the weight matrix with the He initialization rule given by He et al. More...

 
class  IdentityFunction
 The identity function, defined by. More...

 
class  InitTraits
 This is a template class that can provide information about various initialization methods. More...

 
class  InitTraits< KathirvalavakumarSubavathiInitialization >
 Initialization traits of the Kathirvalavakumar-Subavathi initialization rule. More...

 
class  InitTraits< NguyenWidrowInitialization >
 Initialization traits of the Nguyen-Widrow initialization rule. More...

 
class  Join
 Implementation of the Join module class. More...

 
class  KathirvalavakumarSubavathiInitialization
 This class is used to initialize the weight matrix with the method proposed by T. Kathirvalavakumar and S. Subavathi. More...

 
class  KLDivergence
 The Kullback–Leibler divergence is often used for continuous distributions (direct regression). More...

 
class  LayerNorm
 Declaration of the Layer Normalization class. More...

 
class  LayerTraits
 This is a template class that can provide information about various layers. More...

 
class  LeakyReLU
 The LeakyReLU activation function, defined by. More...

 
class  LecunNormalInitialization
 This class is used to initialize weight matrix with the Lecun Normalization initialization rule. More...

 
class  Linear
 Implementation of the Linear layer class. More...

 
class  LinearNoBias
 Implementation of the LinearNoBias class. More...

 
class  LoadOutputParameterVisitor
 LoadOutputParameterVisitor restores the output parameter using the given parameter set. More...

 
class  LogisticFunction
 The logistic function, defined by. More...

 
class  LogSoftMax
 Implementation of the log softmax layer. More...

 
class  Lookup
 Implementation of the Lookup class. More...

 
class  LSTM
 An implementation of an LSTM network layer. More...

 
class  MaxPooling
 Implementation of the MaxPooling layer. More...

 
class  MaxPoolingRule
 
class  MeanPooling
 Implementation of the MeanPooling layer. More...

 
class  MeanPoolingRule
 
class  MeanSquaredError
 The mean squared error performance function measures the network's performance according to the mean of squared errors. More...

 
class  MultiplyConstant
 Implementation of the multiply constant layer. More...

 
class  MultiplyMerge
 Implementation of the MultiplyMerge module class. More...

 
class  NaiveConvolution
 Computes the two-dimensional convolution. More...

 
class  NegativeLogLikelihood
 Implementation of the negative log likelihood layer. More...

 
class  NetworkInitialization
 This class is used to initialize the network with the given initialization rule. More...

 
class  NguyenWidrowInitialization
 This class is used to initialize the weight matrix with the Nguyen-Widrow method. More...

 
class  OivsInitialization
 This class is used to initialize the weight matrix with the OIVS method. More...

 
class  OrthogonalInitialization
 This class is used to initialize the weight matrix with the orthogonal matrix initialization. More...

 
class  OutputHeightVisitor
 OutputHeightVisitor exposes the OutputHeight() method of the given module. More...

 
class  OutputParameterVisitor
 OutputParameterVisitor exposes the output parameter of the given module. More...

 
class  OutputWidthVisitor
 OutputWidthVisitor exposes the OutputWidth() method of the given module. More...

 
class  ParametersSetVisitor
 ParametersSetVisitor updates the parameters set using the given matrix. More...

 
class  ParametersVisitor
 ParametersVisitor exposes the parameters set of the given module and stores the parameters set into the given matrix. More...

 
class  PReLU
 The PReLU activation function, defined by (where alpha is trainable) More...

 
class  RandomInitialization
 This class is used to randomly initialize the weight matrix. More...

 
class  RectifierFunction
 The rectifier function, defined by. More...

 
class  Recurrent
 Implementation of the Recurrent layer class. More...

 
class  RecurrentAttention
 This class implements the Recurrent Model for Visual Attention, using a variety of possible layer implementations. More...

 
class  ReinforceNormal
 Implementation of the reinforce normal layer. More...

 
class  ResetCellVisitor
 ResetCellVisitor executes the ResetCell() function. More...

 
class  ResetVisitor
 ResetVisitor executes the Reset() function. More...

 
class  RewardSetVisitor
 RewardSetVisitor sets the reward parameter given the reward value. More...

 
class  RNN
 Implementation of a standard recurrent neural network container. More...

 
class  SaveOutputParameterVisitor
 SaveOutputParameterVisitor saves the output parameter into the given parameter set. More...

 
class  Select
 The select module selects the specified column from a given input matrix. More...

 
class  Sequential
 Implementation of the Sequential class. More...

 
class  SetInputHeightVisitor
 SetInputHeightVisitor updates the input height parameter with the given input height. More...

 
class  SetInputWidthVisitor
 SetInputWidthVisitor updates the input width parameter with the given input width. More...

 
class  SigmoidCrossEntropyError
 The SigmoidCrossEntropyError performance function measures the network's performance according to the cross-entropy function between the input and target distributions. More...

 
class  SoftplusFunction
 The softplus function, defined by. More...

 
class  SoftsignFunction
 The softsign function, defined by. More...

 
class  SVDConvolution
 Computes the two-dimensional convolution using singular value decomposition. More...

 
class  SwishFunction
 The swish function, defined by. More...

 
class  TanhFunction
 The tanh function, defined by. More...

 
class  TransposedConvolution
 Implementation of the Transposed Convolution class. More...

 
class  ValidConvolution
 
class  VRClassReward
 Implementation of the variance reduced classification reinforcement layer. More...

 
class  WeightSetVisitor
 WeightSetVisitor updates the module parameters given the parameters set. More...

 
class  WeightSizeVisitor
 WeightSizeVisitor returns the number of weights of the given module. More...

 

Typedefs

template<class ActivationFunction = LogisticFunction, typename InputDataType = arma::mat, typename OutputDataType = arma::mat>
using CustomLayer = BaseLayer< ActivationFunction, InputDataType, OutputDataType >
 Standard Sigmoid layer. More...

 
template<typename MatType = arma::mat>
using Embedding = Lookup< MatType, MatType >
 
using GlorotInitialization = GlorotInitializationType< false >
 GlorotInitialization uses a uniform distribution. More...

 
template<class ActivationFunction = IdentityFunction, typename InputDataType = arma::mat, typename OutputDataType = arma::mat>
using IdentityLayer = BaseLayer< ActivationFunction, InputDataType, OutputDataType >
 Standard Identity-Layer using the identity activation function. More...

 
template<typename... CustomLayers>
using LayerTypes = boost::variant< Add< arma::mat, arma::mat > *, AddMerge< arma::mat, arma::mat > *, AtrousConvolution< NaiveConvolution< ValidConvolution >, NaiveConvolution< FullConvolution >, NaiveConvolution< ValidConvolution >, arma::mat, arma::mat > *, BaseLayer< LogisticFunction, arma::mat, arma::mat > *, BaseLayer< IdentityFunction, arma::mat, arma::mat > *, BaseLayer< TanhFunction, arma::mat, arma::mat > *, BaseLayer< RectifierFunction, arma::mat, arma::mat > *, BatchNorm< arma::mat, arma::mat > *, BilinearInterpolation< arma::mat, arma::mat > *, Concat< arma::mat, arma::mat > *, ConcatPerformance< NegativeLogLikelihood< arma::mat, arma::mat >, arma::mat, arma::mat > *, Constant< arma::mat, arma::mat > *, Convolution< NaiveConvolution< ValidConvolution >, NaiveConvolution< FullConvolution >, NaiveConvolution< ValidConvolution >, arma::mat, arma::mat > *, TransposedConvolution< NaiveConvolution< ValidConvolution >, NaiveConvolution< FullConvolution >, NaiveConvolution< ValidConvolution >, arma::mat, arma::mat > *, DropConnect< arma::mat, arma::mat > *, Dropout< arma::mat, arma::mat > *, AlphaDropout< arma::mat, arma::mat > *, ELU< arma::mat, arma::mat > *, FlexibleReLU< arma::mat, arma::mat > *, Glimpse< arma::mat, arma::mat > *, HardTanH< arma::mat, arma::mat > *, Join< arma::mat, arma::mat > *, LayerNorm< arma::mat, arma::mat > *, LeakyReLU< arma::mat, arma::mat > *, Linear< arma::mat, arma::mat > *, LinearNoBias< arma::mat, arma::mat > *, LogSoftMax< arma::mat, arma::mat > *, Lookup< arma::mat, arma::mat > *, LSTM< arma::mat, arma::mat > *, GRU< arma::mat, arma::mat > *, FastLSTM< arma::mat, arma::mat > *, MaxPooling< arma::mat, arma::mat > *, MeanPooling< arma::mat, arma::mat > *, MultiplyConstant< arma::mat, arma::mat > *, MultiplyMerge< arma::mat, arma::mat > *, NegativeLogLikelihood< arma::mat, arma::mat > *, PReLU< arma::mat, arma::mat > *, Recurrent< arma::mat, arma::mat > *, RecurrentAttention< arma::mat, arma::mat > *, ReinforceNormal< arma::mat, arma::mat > *, Select< arma::mat, arma::mat > *, Sequential< arma::mat, arma::mat > *, VRClassReward< arma::mat, arma::mat > *, CustomLayers *... >
 
template<class ActivationFunction = RectifierFunction, typename InputDataType = arma::mat, typename OutputDataType = arma::mat>
using ReLULayer = BaseLayer< ActivationFunction, InputDataType, OutputDataType >
 Standard rectified linear unit non-linearity layer. More...

 
using SELU = ELU< arma::mat, arma::mat >
 
template<class ActivationFunction = LogisticFunction, typename InputDataType = arma::mat, typename OutputDataType = arma::mat>
using SigmoidLayer = BaseLayer< ActivationFunction, InputDataType, OutputDataType >
 Standard Sigmoid-Layer using the logistic activation function. More...

 
template<class ActivationFunction = TanhFunction, typename InputDataType = arma::mat, typename OutputDataType = arma::mat>
using TanHLayer = BaseLayer< ActivationFunction, InputDataType, OutputDataType >
 Standard hyperbolic tangent layer. More...

 
using XavierInitialization = GlorotInitializationType< true >
 XavierInitialization is the popular name for this method. More...

 

Functions

 HAS_ANY_METHOD_FORM (Model, HasModelCheck)
 
 HAS_MEM_FUNC (Gradient, HasGradientCheck)
 
 HAS_MEM_FUNC (Deterministic, HasDeterministicCheck)
 
 HAS_MEM_FUNC (Parameters, HasParametersCheck)
 
 HAS_MEM_FUNC (Add, HasAddCheck)
 
 HAS_MEM_FUNC (Location, HasLocationCheck)
 
 HAS_MEM_FUNC (Reset, HasResetCheck)
 
 HAS_MEM_FUNC (ResetCell, HasResetCellCheck)
 
 HAS_MEM_FUNC (Reward, HasRewardCheck)
 
 HAS_MEM_FUNC (InputWidth, HasInputWidth)
 
 HAS_MEM_FUNC (InputHeight, HasInputHeight)
 
 HAS_MEM_FUNC (Rho, HasRho)
 

Detailed Description

Artificial Neural Network.

Typedef Documentation

◆ CustomLayer

using CustomLayer = BaseLayer< ActivationFunction, InputDataType, OutputDataType>

Standard Sigmoid layer.

Definition at line 31 of file custom_layer.hpp.
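
The alias itself is only a BaseLayer instantiation; its role is to stand in for a user-supplied layer type. Below is a minimal sketch, assuming the mlpack 3.x FFN< OutputLayerType, InitializationRuleType, CustomLayers... > signature, of folding such a layer into a network:

#include <mlpack/methods/ann/ffn.hpp>
#include <mlpack/methods/ann/layer/layer.hpp>
#include <mlpack/methods/ann/init_rules/random_init.hpp>

using namespace mlpack::ann;

void BuildWithCustomLayer()
{
  // The trailing template argument adds CustomLayer<> to the network's
  // LayerTypes variant, so Add() and the visitors can dispatch to it.
  FFN<NegativeLogLikelihood<>, RandomInitialization, CustomLayer<>> model;
  model.Add<Linear<>>(10, 10);
  model.Add<CustomLayer<>>();
  model.Add<Linear<>>(10, 2);
  model.Add<LogSoftMax<>>();
}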

◆ Embedding

using Embedding = Lookup<MatType, MatType>

Definition at line 139 of file lookup.hpp.
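
Embedding is only an alias for Lookup with matching input and output matrix types, so constructing one mirrors Lookup's (inSize, outSize) constructor. A minimal sketch with hypothetical sizes (a 1000-token vocabulary, 64-dimensional embeddings):

#include <mlpack/methods/ann/layer/layer.hpp>

using namespace mlpack::ann;

// Each of the 1000 table rows maps a token to a 64-dimensional vector.
Embedding<> wordEmbedding(1000, 64);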

◆ GlorotInitialization

using GlorotInitialization = GlorotInitializationType< false >

GlorotInitialization uses a uniform distribution.

Definition at line 148 of file glorot_init.hpp.
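
An initialization rule is chosen through the second template argument of a network container. A minimal sketch, assuming the FFN< OutputLayerType, InitializationRuleType > signature:

#include <mlpack/methods/ann/ffn.hpp>
#include <mlpack/methods/ann/layer/layer.hpp>
#include <mlpack/methods/ann/init_rules/glorot_init.hpp>

using namespace mlpack::ann;

void BuildGlorotModel()
{
  // All weights below are initialized by the Glorot (uniform) rule.
  FFN<NegativeLogLikelihood<>, GlorotInitialization> model;
  model.Add<Linear<>>(784, 100);
  model.Add<SigmoidLayer<>>();
  model.Add<Linear<>>(100, 10);
  model.Add<LogSoftMax<>>();
}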

◆ IdentityLayer

using IdentityLayer = BaseLayer< ActivationFunction, InputDataType, OutputDataType>

Standard Identity-Layer using the identity activation function.

Definition at line 147 of file base_layer.hpp.

◆ LayerTypes

using LayerTypes = boost::variant< Add<arma::mat, arma::mat>*, AddMerge<arma::mat, arma::mat>*, AtrousConvolution<NaiveConvolution<ValidConvolution>, NaiveConvolution<FullConvolution>, NaiveConvolution<ValidConvolution>, arma::mat, arma::mat>*, BaseLayer<LogisticFunction, arma::mat, arma::mat>*, BaseLayer<IdentityFunction, arma::mat, arma::mat>*, BaseLayer<TanhFunction, arma::mat, arma::mat>*, BaseLayer<RectifierFunction, arma::mat, arma::mat>*, BatchNorm<arma::mat, arma::mat>*, BilinearInterpolation<arma::mat, arma::mat>*, Concat<arma::mat, arma::mat>*, ConcatPerformance<NegativeLogLikelihood<arma::mat, arma::mat>, arma::mat, arma::mat>*, Constant<arma::mat, arma::mat>*, Convolution<NaiveConvolution<ValidConvolution>, NaiveConvolution<FullConvolution>, NaiveConvolution<ValidConvolution>, arma::mat, arma::mat>*, TransposedConvolution<NaiveConvolution<ValidConvolution>, NaiveConvolution<FullConvolution>, NaiveConvolution<ValidConvolution>, arma::mat, arma::mat>*, DropConnect<arma::mat, arma::mat>*, Dropout<arma::mat, arma::mat>*, AlphaDropout<arma::mat, arma::mat>*, ELU<arma::mat, arma::mat>*, FlexibleReLU<arma::mat, arma::mat>*, Glimpse<arma::mat, arma::mat>*, HardTanH<arma::mat, arma::mat>*, Join<arma::mat, arma::mat>*, LayerNorm<arma::mat, arma::mat>*, LeakyReLU<arma::mat, arma::mat>*, Linear<arma::mat, arma::mat>*, LinearNoBias<arma::mat, arma::mat>*, LogSoftMax<arma::mat, arma::mat>*, Lookup<arma::mat, arma::mat>*, LSTM<arma::mat, arma::mat>*, GRU<arma::mat, arma::mat>*, FastLSTM<arma::mat, arma::mat>*, MaxPooling<arma::mat, arma::mat>*, MeanPooling<arma::mat, arma::mat>*, MultiplyConstant<arma::mat, arma::mat>*, MultiplyMerge<arma::mat, arma::mat>*, NegativeLogLikelihood<arma::mat, arma::mat>*, PReLU<arma::mat, arma::mat>*, Recurrent<arma::mat, arma::mat>*, RecurrentAttention<arma::mat, arma::mat>*, ReinforceNormal<arma::mat, arma::mat>*, Select<arma::mat, arma::mat>*, Sequential<arma::mat, arma::mat>*, VRClassReward<arma::mat, arma::mat>*, CustomLayers*... >

Definition at line 184 of file layer_types.hpp.
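
The trailing CustomLayers parameter pack is how user-defined modules enter the variant: each extra class type is appended as one more pointer alternative. A sketch with a hypothetical MyLayer module:

#include <mlpack/methods/ann/layer/layer_types.hpp>

using namespace mlpack::ann;

// Hypothetical user-defined module (Forward(), Backward(), etc. omitted).
template<typename InputDataType = arma::mat,
         typename OutputDataType = arma::mat>
class MyLayer { };

// The variant now also holds a MyLayer<arma::mat, arma::mat>* alternative.
using MyLayerTypes = LayerTypes<MyLayer<arma::mat, arma::mat>>;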

◆ ReLULayer

using ReLULayer = BaseLayer< ActivationFunction, InputDataType, OutputDataType>

Standard rectified linear unit non-linearity layer.

Definition at line 158 of file base_layer.hpp.

◆ SELU

using SELU = ELU<arma::mat, arma::mat>

Definition at line 259 of file elu.hpp.
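
SELU is the ELU layer with concrete arma::mat types. A minimal sketch, under the assumption (from elu.hpp) that the default constructor selects the self-normalizing scaling constants (alpha ≈ 1.6733, lambda ≈ 1.0507):

#include <mlpack/methods/ann/ffn.hpp>
#include <mlpack/methods/ann/layer/layer.hpp>

using namespace mlpack::ann;

void AddSelu()
{
  FFN<> model;
  model.Add<Linear<>>(64, 32);
  // Default-constructed, so the SELU constants are assumed to apply.
  model.Add<SELU>();
}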

◆ SigmoidLayer

using SigmoidLayer = BaseLayer< ActivationFunction, InputDataType, OutputDataType>

Standard Sigmoid-Layer using the logistic activation function.

Definition at line 136 of file base_layer.hpp.

◆ TanHLayer

using TanHLayer = BaseLayer< ActivationFunction, InputDataType, OutputDataType>

Standard hyperbolic tangent layer.

Definition at line 169 of file base_layer.hpp.

◆ XavierInitialization

using XavierInitialization = GlorotInitializationType< true >

XavierInitialization is the popular name for this method.

Definition at line 143 of file glorot_init.hpp.
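
Since the two names differ only in the boolean template argument of GlorotInitializationType, the relationship stated by the typedefs can be checked at compile time:

#include <type_traits>
#include <mlpack/methods/ann/init_rules/glorot_init.hpp>

using namespace mlpack::ann;

static_assert(
    std::is_same<XavierInitialization, GlorotInitializationType<true>>::value,
    "XavierInitialization is GlorotInitializationType with flag = true.");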

Function Documentation

◆ HAS_ANY_METHOD_FORM()

mlpack::ann::HAS_ANY_METHOD_FORM (Model, HasModelCheck)

◆ HAS_MEM_FUNC() [1/11]

mlpack::ann::HAS_MEM_FUNC (Gradient, HasGradientCheck)

◆ HAS_MEM_FUNC() [2/11]

mlpack::ann::HAS_MEM_FUNC (Deterministic, HasDeterministicCheck)

◆ HAS_MEM_FUNC() [3/11]

mlpack::ann::HAS_MEM_FUNC (Parameters, HasParametersCheck)

◆ HAS_MEM_FUNC() [4/11]

mlpack::ann::HAS_MEM_FUNC (Add, HasAddCheck)

◆ HAS_MEM_FUNC() [5/11]

mlpack::ann::HAS_MEM_FUNC (Location, HasLocationCheck)

◆ HAS_MEM_FUNC() [6/11]

mlpack::ann::HAS_MEM_FUNC (Reset, HasResetCheck)

◆ HAS_MEM_FUNC() [7/11]

mlpack::ann::HAS_MEM_FUNC (ResetCell, HasResetCellCheck)

◆ HAS_MEM_FUNC() [8/11]

mlpack::ann::HAS_MEM_FUNC (Reward, HasRewardCheck)

◆ HAS_MEM_FUNC() [9/11]

mlpack::ann::HAS_MEM_FUNC (InputWidth, HasInputWidth)

◆ HAS_MEM_FUNC() [10/11]

mlpack::ann::HAS_MEM_FUNC (InputHeight, HasInputHeight)

◆ HAS_MEM_FUNC() [11/11]

mlpack::ann::HAS_MEM_FUNC (Rho, HasRho)
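
Each HAS_MEM_FUNC invocation expands, via mlpack's SFINAE utilities, to a trait of the form Name< T, MethodPointerSignature >::value, which the visitors above use to dispatch only to modules that actually implement the method. A minimal sketch; the exact signature checked here is an assumption for illustration:

#include <mlpack/methods/ann/layer/layer.hpp>

using namespace mlpack::ann;

// Compile-time query: does Linear<> expose Gradient() returning arma::mat&?
static const bool linearHasGradient =
    HasGradientCheck<Linear<>, arma::mat&(Linear<>::*)()>::value;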