mlpack 3.4.2
Declaration of the Layer Normalization class.
#include <layer_norm.hpp>
Public Member Functions

LayerNorm()
  Create the LayerNorm object.

LayerNorm(const size_t size, const double eps = 1e-8)
  Create the LayerNorm object for a specified number of input units.

template<typename eT>
void Backward(const arma::Mat<eT>& input, const arma::Mat<eT>& gy, arma::Mat<eT>& g)
  Backward pass through the layer.

OutputDataType& Delta()
  Modify the delta.

OutputDataType const& Delta() const
  Get the delta.

double Epsilon() const
  Get the value of epsilon.

template<typename eT>
void Forward(const arma::Mat<eT>& input, arma::Mat<eT>& output)
  Forward pass of Layer Normalization.

OutputDataType& Gradient()
  Modify the gradient.

OutputDataType const& Gradient() const
  Get the gradient.

template<typename eT>
void Gradient(const arma::Mat<eT>& input, const arma::Mat<eT>& error, arma::Mat<eT>& gradient)
  Calculate the gradient using the output delta and the input activations.

size_t InSize() const
  Get the number of input units.

OutputDataType Mean()
  Get the mean across a single training point.

OutputDataType& OutputParameter()
  Modify the output parameter.

OutputDataType const& OutputParameter() const
  Get the output parameter.

OutputDataType& Parameters()
  Modify the parameters.

OutputDataType const& Parameters() const
  Get the parameters.

void Reset()
  Reset the layer parameters.

template<typename Archive>
void serialize(Archive& ar, const unsigned int)
  Serialize the layer.

OutputDataType Variance()
  Get the variance across a single training point.
Declaration of the Layer Normalization class.
The layer transforms the input data to zero mean and unit variance and then scales and shifts it by the parameters gamma and beta, respectively, which are learned by the network. Normalization is performed independently for each training point. Layer Normalization differs from Batch Normalization in that normalization is done per training case: the mean and standard deviation are computed across the layer dimensions rather than across the batch.
For more information, refer to the following paper:

Ba, J. L., Kiros, J. R., and Hinton, G. E., "Layer Normalization", arXiv:1607.06450, 2016.
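As a concrete illustration of this transform (a minimal Armadillo sketch, not mlpack's internal implementation; LayerNormTransform, x, gamma, beta and eps are placeholder names):

#include <armadillo>
#include <cmath>

// Normalize one training point (one column of activations) to zero mean and
// unit variance across the layer dimension, then scale by gamma and shift by
// beta.
arma::vec LayerNormTransform(const arma::vec& x,
                             const arma::vec& gamma,
                             const arma::vec& beta,
                             const double eps = 1e-8)
{
  const double mean = arma::mean(x);
  const double var = arma::var(x, 1);  // Population variance over the layer.
  return gamma % ((x - mean) / std::sqrt(var + eps)) + beta;
}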
Template Parameters

InputDataType   Type of the input data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).
OutputDataType  Type of the output data (arma::colvec, arma::mat, arma::sp_mat or arma::cube).
Definition at line 65 of file layer_norm.hpp.
LayerNorm(const size_t size, const double eps = 1e-8)
Create the LayerNorm object for a specified number of input units.
Parameters

size  The number of input units.
eps   The epsilon added to variance to ensure numerical stability.
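A hedged usage sketch follows (the surrounding FFN layers and defaults are assumptions about the wider mlpack 3.x ANN API, not taken from this page): a LayerNorm layer is typically added with size equal to the output size of the preceding layer.

#include <mlpack/core.hpp>
#include <mlpack/methods/ann/ffn.hpp>
#include <mlpack/methods/ann/layer/layer.hpp>

using namespace mlpack::ann;

int main()
{
  // Normalize the activations of a 10-unit Linear layer before the
  // nonlinearity; eps is left at its default of 1e-8.
  FFN<> model;
  model.Add<Linear<>>(5, 10);   // 5 inputs -> 10 hidden units.
  model.Add<LayerNorm<>>(10);   // LayerNorm(10) with default eps.
  model.Add<ReLULayer<>>();
  model.Add<Linear<>>(10, 2);
  model.Add<LogSoftMax<>>();
}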
template<typename eT>
void Backward(const arma::Mat<eT>& input, const arma::Mat<eT>& gy, arma::Mat<eT>& g)
Backward pass through the layer.
Parameters

input  The input activations.
gy     The backpropagated error.
g      The calculated gradient.
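A minimal standalone sketch of the call sequence (inside an FFN the network makes these calls itself; the sizes and random matrices are illustrative only). Backward() assumes Forward() has already been run, since the backward pass reuses statistics computed in the forward pass.

#include <mlpack/core.hpp>
#include <mlpack/methods/ann/layer/layer_norm.hpp>

using namespace mlpack::ann;

int main()
{
  LayerNorm<> layer(10);
  layer.Reset();  // Initialize the gamma and beta parameters.

  arma::mat input(10, 4, arma::fill::randu);  // 4 training points.
  arma::mat output;
  layer.Forward(input, output);

  arma::mat gy(10, 4, arma::fill::randu);     // Error from the next layer.
  arma::mat g;
  layer.Backward(input, gy, g);               // g: error for the previous layer.
}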
OutputDataType& Delta()  [inline]
Modify the delta.
Definition at line 132 of file layer_norm.hpp.
OutputDataType const& Delta() const  [inline]
Get the delta.
Definition at line 130 of file layer_norm.hpp.
double Epsilon() const  [inline]
Get the value of epsilon.
Definition at line 149 of file layer_norm.hpp.
template<typename eT>
void Forward(const arma::Mat<eT>& input, arma::Mat<eT>& output)
Forward pass of Layer Normalization.
Transforms the input data into zero mean and unit variance, scales the data by a factor gamma and shifts it by beta.
Parameters

input   Input data for the layer.
output  Resulting output activations.
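A short standalone sketch of the forward pass (sizes and data are illustrative; see also the backward example above). After Forward(), Mean() and Variance() return the statistics that were computed for normalization.

#include <mlpack/core.hpp>
#include <mlpack/methods/ann/layer/layer_norm.hpp>

using namespace mlpack::ann;

int main()
{
  LayerNorm<> layer(10);
  layer.Reset();

  arma::mat input(10, 1, arma::fill::randu);
  arma::mat output;
  layer.Forward(input, output);

  // Statistics computed during the forward pass.
  arma::mat mu = layer.Mean();
  arma::mat sigma2 = layer.Variance();
}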
OutputDataType& Gradient()  [inline]
Modify the gradient.
Definition at line 137 of file layer_norm.hpp.
OutputDataType const& Gradient() const  [inline]
Get the gradient.
Definition at line 135 of file layer_norm.hpp.
template<typename eT>
void Gradient(const arma::Mat<eT>& input, const arma::Mat<eT>& error, arma::Mat<eT>& gradient)
Calculate the gradient using the output delta and the input activations.
Parameters

input     The input activations.
error     The calculated error.
gradient  The calculated gradient.
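A brief sketch continuing the Forward()/Backward() examples above (layer and input are the objects from those sketches; error is an illustrative backpropagated error). The resulting gradient holds the derivatives for the layer's trainable parameters, gamma and beta.

// Continuing from the Forward()/Backward() sketches above.
arma::mat error(10, 4, arma::fill::randu);  // Backpropagated error.
arma::mat gradient;
layer.Gradient(input, error, gradient);
// gradient now matches the shape of layer.Parameters() (gamma and beta).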
size_t InSize() const  [inline]
Get the number of input units.
Definition at line 146 of file layer_norm.hpp.
OutputDataType Mean()  [inline]

Get the mean across a single training point.
Definition at line 140 of file layer_norm.hpp.
OutputDataType& OutputParameter()  [inline]
Modify the output parameter.
Definition at line 127 of file layer_norm.hpp.
OutputDataType const& OutputParameter() const  [inline]
Get the output parameter.
Definition at line 125 of file layer_norm.hpp.
OutputDataType& Parameters()  [inline]
Modify the parameters.
Definition at line 122 of file layer_norm.hpp.
OutputDataType const& Parameters() const  [inline]
Get the parameters.
Definition at line 120 of file layer_norm.hpp.
void Reset()
Reset the layer parameters.
template<typename Archive>
void serialize(Archive& ar, const unsigned int)
Serialize the layer.
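serialize() is normally not called directly; it is invoked through mlpack's model serialization utilities. A hedged sketch (the data::Save()/data::Load() usage and file name below are assumptions about the wider mlpack API, not documented on this page):

#include <mlpack/core.hpp>
#include <mlpack/methods/ann/ffn.hpp>
#include <mlpack/methods/ann/layer/layer.hpp>

using namespace mlpack;
using namespace mlpack::ann;

int main()
{
  FFN<> model;
  model.Add<Linear<>>(5, 10);
  model.Add<LayerNorm<>>(10);

  // Saving and reloading the model serializes every layer, including LayerNorm.
  data::Save("model.bin", "layer_norm_model", model, false);

  FFN<> reloaded;
  data::Load("model.bin", "layer_norm_model", reloaded);
}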
OutputDataType Variance()  [inline]

Get the variance across a single training point.
Definition at line 143 of file layer_norm.hpp.