KLDivergence< InputDataType, OutputDataType > Class Template Reference

The Kullback–Leibler divergence is often used for continuous distributions (direct regression). More...

Public Member Functions

 KLDivergence (const bool takeMean=false)
 Create the Kullback–Leibler Divergence object with the specified parameters. More...

 
template<typename PredictionType, typename TargetType, typename LossType>
void Backward (const PredictionType &prediction, const TargetType &target, LossType &loss)
 Ordinary feed backward pass of a neural network. More...

 
template<typename PredictionType, typename TargetType>
PredictionType::elem_type Forward (const PredictionType &prediction, const TargetType &target)
 Computes the Kullback–Leibler divergence error function. More...

 
OutputDataType & OutputParameter () const
 Get the output parameter. More...

 
OutputDataType & OutputParameter ()
 Modify the output parameter. More...

 
template<typename Archive>
void serialize (Archive &ar, const uint32_t)
 Serialize the loss function. More...

 
bool TakeMean () const
 Get the value of takeMean. More...

 
bool & TakeMean ()
 Modify the value of takeMean. More...

 

Detailed Description


template<typename InputDataType = arma::mat, typename OutputDataType = arma::mat>

class mlpack::ann::KLDivergence< InputDataType, OutputDataType >

The Kullback–Leibler divergence is often used for continuous distributions (direct regression).

For more information, see the following paper.

@article{Kullback1951,
  title   = {On Information and Sufficiency},
  author  = {S. Kullback and R. A. Leibler},
  journal = {The Annals of Mathematical Statistics},
  year    = {1951}
}
Template Parameters
  InputDataType   Type of the input data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).
  OutputDataType  Type of the output data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).

Definition at line 45 of file kl_divergence.hpp.

Constructor & Destructor Documentation

◆ KLDivergence()

KLDivergence ( const bool  takeMean = false)

Create the Kullback–Leibler Divergence object with the specified parameters.

Parameters
  takeMean  If true, the computed loss is averaged (mean) over all elements; otherwise it is summed.

Member Function Documentation

◆ Backward()

void Backward (const PredictionType &prediction, const TargetType &target, LossType &loss)

Ordinary feed backward pass of a neural network.

Parameters
  prediction  Predictions used for evaluating the specified loss function.
  target      The target vector.
  loss        The calculated error.

◆ Forward()

PredictionType::elem_type Forward (const PredictionType &prediction, const TargetType &target)

Computes the Kullback–Leibler divergence error function.

Parameters
  prediction  Predictions used for evaluating the specified loss function.
  target      Target data to compare with.

◆ OutputParameter() [1/2]

OutputDataType& OutputParameter ( ) const
inline

Get the output parameter.

Definition at line 81 of file kl_divergence.hpp.

◆ OutputParameter() [2/2]

OutputDataType& OutputParameter ( )
inline

Modify the output parameter.

Definition at line 83 of file kl_divergence.hpp.

◆ serialize()

void serialize (Archive &ar, const uint32_t)

Serialize the loss function.

Referenced by KLDivergence< InputDataType, OutputDataType >::TakeMean().

◆ TakeMean() [1/2]

bool TakeMean ( ) const
inline

Get the value of takeMean.

Definition at line 86 of file kl_divergence.hpp.

◆ TakeMean() [2/2]

bool& TakeMean ( )
inline

Modify the value of takeMean.

Definition at line 88 of file kl_divergence.hpp.

References KLDivergence< InputDataType, OutputDataType >::serialize().


The documentation for this class was generated from the following file:
  • /home/jenkins-mlpack/mlpack.org/_src/mlpack-git/src/mlpack/methods/ann/loss_functions/kl_divergence.hpp