mlpack 3.4.2
Public Member Functions | List of all members
LeakyReLU< InputDataType, OutputDataType > Class Template Reference

The LeakyReLU activation function, defined by the formula below. More...

#include <leaky_relu.hpp>

Public Member Functions

 LeakyReLU (const double alpha=0.03)
 Create the LeakyReLU object using the specified parameters. More...
 
double & Alpha ()
 Modify the non-zero gradient. More...
 
double const & Alpha () const
 Get the non-zero gradient. More...
 
template<typename DataType >
void Backward (const DataType &input, const DataType &gy, DataType &g)
 Ordinary feed backward pass of a neural network, calculating the function f(x) by propagating x backwards through f. More...
 
OutputDataType & Delta ()
 Modify the delta. More...
 
OutputDataType const & Delta () const
 Get the delta. More...
 
template<typename InputType , typename OutputType >
void Forward (const InputType &input, OutputType &output)
 Ordinary feed forward pass of a neural network, evaluating the function f(x) by propagating the activity forward through f. More...
 
OutputDataType & OutputParameter ()
 Modify the output parameter. More...
 
OutputDataType const & OutputParameter () const
 Get the output parameter. More...
 
template<typename Archive >
void serialize (Archive &ar, const unsigned int)
 Serialize the layer. More...
 

Detailed Description

template<typename InputDataType = arma::mat, typename OutputDataType = arma::mat>
class mlpack::ann::LeakyReLU< InputDataType, OutputDataType >

The LeakyReLU activation function, defined by:

\begin{eqnarray*}
f(x) &=& \max(x, \alpha x) \\
f'(x) &=& \left\{
  \begin{array}{lr}
    1 & : x > 0 \\
    \alpha & : x \le 0
  \end{array}
\right.
\end{eqnarray*}

Template Parameters
InputDataType	Type of the input data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).
OutputDataType	Type of the output data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).

Definition at line 44 of file leaky_relu.hpp.

Constructor & Destructor Documentation

◆ LeakyReLU()

LeakyReLU ( const double  alpha = 0.03)

Create the LeakyReLU object using the specified parameters.

The non-zero gradient can be adjusted by specifying the parameter alpha in the range 0 to 1. The default is alpha = 0.03.

Parameters
alpha	Non-zero gradient.

Member Function Documentation

◆ Alpha() [1/2]

double & Alpha ( )
inline

Modify the non-zero gradient.

Definition at line 91 of file leaky_relu.hpp.

◆ Alpha() [2/2]

double const & Alpha ( ) const
inline

Get the non-zero gradient.

Definition at line 89 of file leaky_relu.hpp.

◆ Backward()

void Backward ( const DataType &  input,
const DataType &  gy,
DataType &  g 
)

Ordinary feed backward pass of a neural network, calculating the function f(x) by propagating x backwards through f, using the results from the feed forward pass.

Parameters
input	The propagated input activation.
gy	The backpropagated error.
g	The calculated gradient.

◆ Delta() [1/2]

OutputDataType & Delta ( )
inline

Modify the delta.

Definition at line 86 of file leaky_relu.hpp.

◆ Delta() [2/2]

OutputDataType const & Delta ( ) const
inline

Get the delta.

Definition at line 84 of file leaky_relu.hpp.

◆ Forward()

void Forward ( const InputType &  input,
OutputType &  output 
)

Ordinary feed forward pass of a neural network, evaluating the function f(x) by propagating the activity forward through f.

Parameters
input	Input data used for evaluating the specified function.
output	Resulting output activation.

◆ OutputParameter() [1/2]

OutputDataType & OutputParameter ( )
inline

Modify the output parameter.

Definition at line 81 of file leaky_relu.hpp.

◆ OutputParameter() [2/2]

OutputDataType const & OutputParameter ( ) const
inline

Get the output parameter.

Definition at line 79 of file leaky_relu.hpp.

◆ serialize()

void serialize ( Archive &  ar,
const unsigned int   
)

Serialize the layer.


The documentation for this class was generated from the following file:
leaky_relu.hpp