SMAUG
Simulating Machine Learning Applications on gem5-Aladdin
smaug::BatchNormOp< Backend > Class Template Reference

Implements the batch normalization layer. More...

#include <batch_norm_op.h>

Detailed Description

template<typename Backend>
class smaug::BatchNormOp< Backend >

Implements the batch normalization layer.

The four parameters to BN layers (mean, variance, gamma, and beta) must be provided as separate Tensors. For performance reasons, the variance parameter is often precomputed as 1/sqrt(variance + eps), so the hardware does not need to perform a square root or division operation.
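As a minimal standalone sketch (not the SMAUG implementation), the hardware-friendly formulation looks like this: the 1/sqrt(variance + eps) factor is computed once ahead of time, so the per-element work reduces to multiplies and adds.

#include <cmath>
#include <cstddef>
#include <vector>

// Sketch only: batch norm with the "variance" input already stored as
// invSqrtVar = 1 / sqrt(variance + eps), so no per-element sqrt or
// division is required.
std::vector<float> batchNorm(const std::vector<float>& x,
                             float mean,
                             float invSqrtVar,
                             float gamma,
                             float beta) {
    std::vector<float> y(x.size());
    for (std::size_t i = 0; i < x.size(); i++)
        y[i] = gamma * (x[i] - mean) * invSqrtVar + beta;
    return y;
}

// Offline precompute step (eps corresponds to kEpsilon = 1e-5 below):
// float invSqrtVar = 1.0f / std::sqrt(variance + 1e-5f);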

Template Parameters
Backend: The Backend specialization of this Operator.

Definition at line 46 of file backend.h.

Public Types

enum  { Inputs, Mean, Variance, Gamma, Scaleshift = Gamma, Beta, kNumInputs }
 
enum  { Outputs, kNumOutputs }
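As a standalone illustration of how these symbolic indices address the op's input list (this mirrors the enum above but is not SMAUG code), note that Scaleshift shares slot 3 with Gamma:

#include <array>
#include <cassert>

// Mirror of the input ordering above, for illustration only; Gamma and
// Scaleshift name the same slot.
enum BatchNormInputs {
    Inputs, Mean, Variance, Gamma,
    Scaleshift = Gamma, Beta, kNumInputs
};

int main() {
    std::array<const char*, kNumInputs> names = {
        "activations", "mean", "variance", "gamma", "beta"
    };
    assert(names[Scaleshift] == names[Gamma]);  // both refer to index 3
    return 0;
}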
 

Public Member Functions

 BatchNormOp (const std::string &name, Workspace *workspace)
 
void run () override
 
TensorShape inferOutputShape () const
 
TensorShape inferWeightsShape () const
 
void createWeightsTensors ()
 
void createOutputTensors ()
 
void createAllTensors () override
 
int getNumParameters () const override
 
std::vector< TensorBase * > getParameterizableInputs () override
 
bool isSamplingSupported () const override
 
void setSamplingInfo (const SamplingInfo &_sampling) override
 
void run ()
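A hedged, standalone sketch of what the shape-inference members conceptually compute for batch norm, assuming an NCHW convolutional input (this is not the SMAUG implementation): the output keeps the input's shape, and each of the four parameter tensors holds one value per channel.

#include <vector>

// Standalone sketch, not the SMAUG implementation. Assumes an
// NCHW-style input shape {N, C, H, W}.
struct Shape {
    std::vector<int> dims;
};

// Batch normalization preserves the input shape.
Shape inferOutputShape(const Shape& input) { return input; }

// Mean, variance, gamma, and beta each hold one value per channel.
Shape inferWeightsShape(const Shape& input) {
    return Shape{ { 1, input.dims[1] } };
}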
 

Static Public Attributes

static constexpr float kEpsilon = 1e-5
 

Protected Attributes

const std::string meanName
 
const std::string varianceName
 
const std::string gammaName
 
const std::string betaName
 
SamplingInfo sampling
 
