mlpack 3.4.2
ELU< InputDataType, OutputDataType > Class Template Reference

The ELU activation function, defined below.

#include <elu.hpp>

Public Member Functions

 ELU ()
 	Create the ELU object.

 ELU (const double alpha)
 	Create the ELU object using the specified parameter.

double & Alpha ()
 	Modify the non-zero gradient.

double const & Alpha () const
 	Get the non-zero gradient.

template<typename DataType >
void Backward (const DataType &input, const DataType &gy, DataType &g)
 	Ordinary feed backward pass of a neural network, calculating the gradient of the function f(x) by propagating the error backwards through f.

OutputDataType & Delta ()
 	Modify the delta.

OutputDataType const & Delta () const
 	Get the delta.

bool & Deterministic ()
 	Modify the value of the deterministic parameter.

bool Deterministic () const
 	Get the value of the deterministic parameter.

template<typename InputType , typename OutputType >
void Forward (const InputType &input, OutputType &output)
 	Ordinary feed forward pass of a neural network, evaluating the function f(x) by propagating the activity forward through f.

double const & Lambda () const
 	Get the lambda parameter.

OutputDataType & OutputParameter ()
 	Modify the output parameter.

OutputDataType const & OutputParameter () const
 	Get the output parameter.

template<typename Archive >
void serialize (Archive &ar, const unsigned int)
 	Serialize the layer.
 

Detailed Description

template<typename InputDataType = arma::mat, typename OutputDataType = arma::mat>
class mlpack::ann::ELU< InputDataType, OutputDataType >

The ELU activation function, defined by.

\begin{eqnarray*}
f(x) &=& \left\{
  \begin{array}{lr}
    x & : x > 0 \\
    \alpha(e^x - 1) & : x \le 0
  \end{array}
\right. \\
f'(x) &=& \left\{
  \begin{array}{lr}
    1 & : x > 0 \\
    f(x) + \alpha & : x \le 0
  \end{array}
\right.
\end{eqnarray*}

For more information, read the following paper:

@article{Clevert2015,
  author  = {Djork{-}Arn{\'{e}} Clevert and Thomas Unterthiner and
             Sepp Hochreiter},
  title   = {Fast and Accurate Deep Network Learning by Exponential Linear
             Units (ELUs)},
  journal = {CoRR},
  year    = {2015},
  url     = {https://arxiv.org/abs/1511.07289}
}
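
The two cases above map directly to code. A minimal standalone sketch in plain C++, illustrating only the formula (this is not mlpack's actual implementation):

#include <cmath>

// ELU activation for a single input x, with hyperparameter alpha > 0.
double elu(const double x, const double alpha)
{
  return x > 0 ? x : alpha * (std::exp(x) - 1.0);
}

// Derivative of ELU; for x <= 0 it equals f(x) + alpha,
// since alpha * e^x = alpha * (e^x - 1) + alpha.
double eluDeriv(const double x, const double alpha)
{
  return x > 0 ? 1.0 : elu(x, alpha) + alpha;
}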

The SELU activation function is defined by

\begin{eqnarray*}
f(x) &=& \left\{
  \begin{array}{lr}
    \lambda * x & : x > 0 \\
    \lambda * \alpha(e^x - 1) & : x \le 0
  \end{array}
\right. \\
f'(x) &=& \left\{
  \begin{array}{lr}
    \lambda & : x > 0 \\
    f(x) + \lambda * \alpha & : x \le 0
  \end{array}
\right.
\end{eqnarray*}

For more information, read the following paper:

@article{Klambauer2017,
  author  = {G{\"{u}}nter Klambauer and Thomas Unterthiner and
             Andreas Mayr},
  title   = {Self-Normalizing Neural Networks},
  journal = {Advances in Neural Information Processing Systems},
  year    = {2017},
  url     = {https://arxiv.org/abs/1706.02515}
}
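
SELU is the same function with the fixed constants lambda (approximately 1.0507) and alpha (approximately 1.6733) derived in the paper above. A minimal standalone sketch, again only illustrating the formula:

#include <cmath>

// Fixed SELU constants from Klambauer et al. (2017).
constexpr double seluLambda = 1.0507009873554805;
constexpr double seluAlpha  = 1.6732632423543772;

double selu(const double x)
{
  return x > 0 ? seluLambda * x
               : seluLambda * seluAlpha * (std::exp(x) - 1.0);
}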

In deterministic mode, the derivative is not computed.

Note
During training, deterministic should be set to false; during testing/inference, it should be set to true (see the sketch at the end of this description).
Make sure to use the SELU activation function with normalized inputs and weights initialized with Lecun Normal Initialization.
Template Parameters
InputDataType	Type of the input data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).
OutputDataType	Type of the output data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).

Definition at line 111 of file elu.hpp.
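
A hedged usage sketch of the train/test switch from the note above, using only members documented on this page (template parameters left at their arma::mat defaults; inside a full network the flag is normally managed by the network itself):

mlpack::ann::ELU<> layer(0.3); // ELU with alpha = 0.3.

layer.Deterministic() = false; // Training: the derivative is computed.
// ... Forward()/Backward() calls during training ...

layer.Deterministic() = true;  // Testing/inference: no derivative computation.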

Constructor & Destructor Documentation

◆ ELU() [1/2]

ELU ( )

Create the ELU object.

NOTE: Use this constructor for the SELU activation function.
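
A minimal sketch (the template parameters default to arma::mat):

// Default construction selects the SELU variant, with the fixed
// lambda and alpha given in the class description.
mlpack::ann::ELU<> selu;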

◆ ELU() [2/2]

ELU ( const double  alpha)

Create the ELU object using the specified parameter.

The non-zero gradient for negative inputs can be adjusted by specifying the ELU hyperparameter alpha (alpha > 0).

Note
Use this constructor for the ELU activation function.
Parameters
alpha	Scale parameter for the negative factor.
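
A minimal sketch:

// ELU with a user-chosen alpha (must be positive).
mlpack::ann::ELU<> elu(0.5);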

Member Function Documentation

◆ Alpha() [1/2]

double & Alpha ( )
inline

Modify the non-zero gradient.

Definition at line 166 of file elu.hpp.

◆ Alpha() [2/2]

double const & Alpha ( ) const
inline

Get the non-zero gradient.

Definition at line 164 of file elu.hpp.

◆ Backward()

void Backward ( const DataType &  input,
const DataType &  gy,
DataType &  g 
)

Ordinary feed backward pass of a neural network, calculating the gradient of the function f(x) by propagating the error backwards through f, using the results from the feed forward pass.

Parameters
input	The propagated input activation f(x).
gy	The backpropagated error.
g	The calculated gradient.
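
A hedged sketch of a full forward/backward cycle using the signatures documented on this page, with arma::mat data (the include path below follows mlpack 3.x's layout):

#include <mlpack/methods/ann/layer/elu.hpp>

arma::mat input = arma::randn<arma::mat>(10, 1);
arma::mat output, g;

mlpack::ann::ELU<> layer(1.0);
layer.Deterministic() = false;   // Enable derivative computation.
layer.Forward(input, output);    // f(x); results are cached for Backward().

arma::mat gy = arma::ones<arma::mat>(10, 1); // Backpropagated error.
layer.Backward(output, gy, g);   // Per the parameter docs, pass f(x) as input.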

◆ Delta() [1/2]

OutputDataType & Delta ( )
inline

Modify the delta.

Definition at line 161 of file elu.hpp.

◆ Delta() [2/2]

OutputDataType const & Delta ( ) const
inline

Get the delta.

Definition at line 159 of file elu.hpp.

◆ Deterministic() [1/2]

bool & Deterministic ( )
inline

Modify the value of the deterministic parameter.

Definition at line 171 of file elu.hpp.

◆ Deterministic() [2/2]

bool Deterministic ( ) const
inline

Get the value of the deterministic parameter.

Definition at line 169 of file elu.hpp.

◆ Forward()

void Forward ( const InputType &  input,
OutputType &  output 
)

Ordinary feed forward pass of a neural network, evaluating the function f(x) by propagating the activity forward through f.

Parameters
input	Input data used for evaluating the specified function.
output	Resulting output activation.
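
A minimal sketch, assuming arma::mat input:

arma::mat x = {{ -2.0, -1.0, 0.0, 1.0, 2.0 }};
arma::mat y;

mlpack::ann::ELU<> layer(1.0);
layer.Forward(x, y); // y = x where x > 0, alpha * (e^x - 1) elsewhere.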

◆ Lambda()

double const & Lambda ( ) const
inline

Get the lambda parameter.

Definition at line 174 of file elu.hpp.

◆ OutputParameter() [1/2]

OutputDataType & OutputParameter ( )
inline

Modify the output parameter.

Definition at line 156 of file elu.hpp.

◆ OutputParameter() [2/2]

OutputDataType const & OutputParameter ( ) const
inline

Get the output parameter.

Definition at line 154 of file elu.hpp.

◆ serialize()

void serialize ( Archive &  ar,
const unsigned int   
)

Serialize the layer.
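
mlpack 3.4.2 serializes through Boost.Serialization; a hedged sketch of writing the layer to a text archive (the second argument is the otherwise-unnamed serialization version; whole networks are more commonly saved with mlpack::data::Save):

#include <fstream>
#include <boost/archive/text_oarchive.hpp>

mlpack::ann::ELU<> layer(1.0);

std::ofstream ofs("elu_layer.txt");
boost::archive::text_oarchive ar(ofs);
layer.serialize(ar, 0); // Writes the layer's state to the archive.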


The documentation for this class was generated from the following file:
elu.hpp