NN and Functional

NN

conv1d
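This section is empty in the original; as a minimal sketch of `torch.nn.Conv1d`, the layer expects input of shape (N, C_in, L) and the sizes below (16 input channels, 33 output channels, length 50) are illustrative choices, not from the source:

```python
import torch
import torch.nn as nn

# Conv1d expects input of shape (batch, in_channels, length).
m = nn.Conv1d(in_channels=16, out_channels=33, kernel_size=3, stride=2)
x = torch.randn(20, 16, 50)
y = m(x)

# Output length: floor((50 - (3 - 1) - 1) / 2 + 1) = 24
print(y.shape)  # torch.Size([20, 33, 24])
```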

batchnorm1d

  • CLASS torch.nn.BatchNorm1d(num_features, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)

    Applies Batch Normalization over a 2D or 3D input (a mini-batch of 1D inputs with an optional additional channel dimension), as described in the paper Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift.

    \(y = \frac{x - \mathrm{E}[x]}{\sqrt{\mathrm{Var}[x] + \epsilon}} * \gamma + \beta\)

    The mean and standard deviation are calculated per-dimension over the mini-batches, and \(\gamma\) and \(\beta\) are learnable parameter vectors of size C (where C is the input size). By default, the elements of \(\gamma\) are set to 1 and the elements of \(\beta\) are set to 0. The standard deviation is calculated via the biased estimator, equivalent to torch.var(input, unbiased=False).

Examples:

>>> # With Learnable Parameters
>>> m = nn.BatchNorm1d(100)
>>> # Without Learnable Parameters
>>> m = nn.BatchNorm1d(100, affine=False)
>>> input = torch.randn(20, 100)
>>> output = m(input)
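To check the formula above against the layer, the normalization can be recomputed by hand with the biased variance (matching torch.var(input, unbiased=False)); the sketch below uses affine=False so \(\gamma\) and \(\beta\) drop out:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
m = nn.BatchNorm1d(100, affine=False)
x = torch.randn(20, 100)
out = m(x)

# Per-feature mean and biased variance over the batch dimension,
# as described for BatchNorm1d above.
mean = x.mean(dim=0)
var = x.var(dim=0, unbiased=False)
manual = (x - mean) / torch.sqrt(var + m.eps)
print(torch.allclose(out, manual, atol=1e-6))  # True
```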

Functional
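This section is empty in the original; a minimal sketch of the functional counterpart, `torch.nn.functional.batch_norm`, where statistics and parameters are passed explicitly instead of being held by a module (the shapes mirror the module example above):

```python
import torch
import torch.nn.functional as F

x = torch.randn(20, 100)
# With training=True, batch statistics are computed on the fly,
# so running_mean/running_var may be None.
out = F.batch_norm(x, running_mean=None, running_var=None,
                   training=True, eps=1e-5)
print(out.shape)  # torch.Size([20, 100])
```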

posted @ 2021-03-01 17:18 by zae