SMAUG
Simulating Machine Learning Applications on gem5-Aladdin
Implements the batch normalization layer.
#include <batch_norm_op.h>
Implements the batch normalization layer.
The four parameter types to BN layers (mean, variance, gamma, and beta) must be provided as separate Tensors. For performance reasons, the variance parameter is often precomputed as 1/sqrt(variance + eps), so the hardware does not need to perform a square root or division operation.
Template Parameters | |
Backend | The Backend specialization of this Operator. |
Public Types | |
enum | { Inputs, Mean, Variance, Gamma, Scaleshift = Gamma, Beta, kNumInputs } |
enum | { Outputs, kNumOutputs } |
Public Member Functions | |
BatchNormOp (const std::string &name, Workspace *workspace) | |
void | run () override |
TensorShape | inferOutputShape () const |
TensorShape | inferWeightsShape () const |
void | createWeightsTensors () |
void | createOutputTensors () |
void | createAllTensors () override |
int | getNumParameters () const override |
std::vector< TensorBase * > | getParameterizableInputs () override |
bool | isSamplingSupported () const override |
void | setSamplingInfo (const SamplingInfo &_sampling) override |
void | run () |
Static Public Attributes | |
static constexpr float | kEpsilon = 1e-5 |
Protected Attributes | |
const std::string | meanName |
const std::string | varianceName |
const std::string | gammaName |
const std::string | betaName |
SamplingInfo | sampling |