IgANets - Isogeometric Analysis Networks
layer.hpp File Reference

Network layer. More...

#include <core.hpp>
#include <utils/fqn.hpp>

Go to the source code of this file.

Classes

class  iganet::ActivationFunction
 Abstract activation function structure. More...
 
class  iganet::BatchNorm
 Batch Normalization as described in the paper Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift, https://arxiv.org/abs/1502.03167. More...
 
class  iganet::CELU
 Continuously Differentiable Exponential Linear Units activation function. More...
 
class  iganet::ELU
 Exponential Linear Units activation function. More...
 
class  iganet::GELU
 Gaussian Error Linear Units activation function. More...
 
class  iganet::GLU
 Gated Linear Units activation function. More...
 
class  iganet::GroupNorm
 Group Normalization over a mini-batch of inputs as described in the paper Group Normalization, https://arxiv.org/abs/1803.08494. More...
 
class  iganet::GumbelSoftmax
 Gumbel-Softmax distribution activation function. More...
 
class  iganet::Hardshrink
 Hardshrink activation function. More...
 
class  iganet::Hardsigmoid
 Hardsigmoid activation function. More...
 
class  iganet::Hardswish
 Hardswish activation function. More...
 
class  iganet::Hardtanh
 Hardtanh activation function. More...
 
class  iganet::InstanceNorm
 Instance Normalization as described in the paper Instance Normalization: The Missing Ingredient for Fast Stylization, https://arxiv.org/abs/1607.08022. More...
 
class  iganet::LayerNorm
 Layer Normalization as described in the paper Layer Normalization, https://arxiv.org/abs/1607.06450. More...
 
class  iganet::LeakyReLU
 Leaky ReLU activation function. More...
 
class  iganet::LocalResponseNorm
 Local Response Normalization. More...
 
class  iganet::LogSigmoid
 LogSigmoid activation function. More...
 
class  iganet::LogSoftmax
 LogSoftmax activation function. More...
 
class  iganet::Mish
 Mish activation function. More...
 
class  iganet::None
 No-op activation function. More...
 
class  iganet::Normalize
 Lp Normalization. More...
 
class  iganet::PReLU
 PReLU activation function. More...
 
class  iganet::ReLU
 ReLU activation function. More...
 
class  iganet::ReLU6
 ReLU6 activation function. More...
 
class  iganet::RReLU
 Randomized ReLU activation function. More...
 
class  iganet::SELU
 SELU activation function. More...
 
class  iganet::Sigmoid
 Sigmoid activation function. More...
 
class  iganet::SiLU
 Sigmoid Linear Unit activation function. More...
 
class  iganet::Softmax
 Softmax activation function. More...
 
class  iganet::Softmin
 Softmin activation function. More...
 
class  iganet::Softplus
 Softplus activation function. More...
 
class  iganet::Softshrink
 Softshrink activation function. More...
 
class  iganet::Softsign
 Softsign activation function. More...
 
class  iganet::Tanh
 Tanh activation function. More...
 
class  iganet::Tanhshrink
 Tanhshrink activation function. More...
 
class  iganet::Threshold
 Threshold activation function. More...
 

Namespaces

namespace  iganet
 

Enumerations

enum class  iganet::activation : short_t {
  iganet::none = 0 , iganet::batch_norm = 1 , iganet::celu = 2 , iganet::elu = 3 ,
  iganet::gelu = 4 , iganet::glu = 5 , iganet::group_norm = 6 , iganet::gumbel_softmax = 7 ,
  iganet::hardshrink = 9 , iganet::hardsigmoid = 8 , iganet::hardswish = 10 , iganet::hardtanh = 11 ,
  iganet::instance_norm = 12 , iganet::layer_norm = 13 , iganet::leaky_relu = 14 , iganet::local_response_norm = 15 ,
  iganet::logsigmoid = 16 , iganet::logsoftmax = 17 , iganet::mish = 18 , iganet::normalize = 19 ,
  iganet::prelu = 20 , iganet::relu = 21 , iganet::relu6 = 22 , iganet::rrelu = 23 ,
  iganet::selu = 24 , iganet::sigmoid = 25 , iganet::silu = 26 , iganet::softmax = 27 ,
  iganet::softmin = 28 , iganet::softplus = 29 , iganet::softshrink = 30 , iganet::softsign = 31 ,
  iganet::tanh = 32 , iganet::tanhshrink = 33 , iganet::threshold = 34
}
 Enumerator for nonlinear activation functions. More...
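 
In code, such a tag is typically translated into behaviour with a switch over the enumerators. The following self-contained sketch is a hypothetical helper, not part of layer.hpp: it reproduces only a few of the enumerators listed above (with short_t simplified to short) and maps each tag to its name, mirroring what a factory over the concrete ActivationFunction classes would look like.

    #include <string>

    // Hypothetical stand-in for iganet::activation; only a few of the
    // enumerators listed above are reproduced, with their documented values.
    enum class activation : short { none = 0, relu = 21, sigmoid = 25, tanh = 32 };

    // Hypothetical helper: translate an activation tag into its name, in the
    // same way a factory would select among the concrete classes.
    inline std::string to_string(activation a) {
      switch (a) {
      case activation::none:    return "none";
      case activation::relu:    return "relu";
      case activation::sigmoid: return "sigmoid";
      case activation::tanh:    return "tanh";
      default:                  return "unknown";
      }
    }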
 

Functions

std::ostream & iganet::operator<< (std::ostream &os, const ActivationFunction &obj)
 Print (as string) an ActivationFunction object.
 

Detailed Description

Network layer.

Author
Matthias Möller

This Source Code Form is subject to the terms of the Mozilla Public License, v. 2.0. If a copy of the MPL was not distributed with this file, You can obtain one at http://mozilla.org/MPL/2.0/.
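 
As an illustration of the pattern this header implements, the following self-contained sketch (standard C++ only; the real classes forward to the corresponding LibTorch operations and act on tensors, not scalars) defines a minimal ActivationFunction hierarchy together with a stream operator analogous to iganet::operator<<. The names apply() and name() are assumptions for this sketch, not the header's actual interface.

    #include <cmath>
    #include <iostream>
    #include <string>

    // Minimal stand-in for the abstract activation function structure.
    struct ActivationFunction {
      virtual ~ActivationFunction() = default;
      virtual double apply(double x) const = 0;   // element-wise rule
      virtual std::string name() const = 0;       // used for printing
    };

    // Two concrete activations mirroring iganet::ReLU and iganet::Sigmoid.
    struct ReLU : ActivationFunction {
      double apply(double x) const override { return x > 0.0 ? x : 0.0; }
      std::string name() const override { return "relu"; }
    };

    struct Sigmoid : ActivationFunction {
      double apply(double x) const override { return 1.0 / (1.0 + std::exp(-x)); }
      std::string name() const override { return "sigmoid"; }
    };

    // Analogue of iganet::operator<<: print an ActivationFunction as a string.
    std::ostream &operator<<(std::ostream &os, const ActivationFunction &obj) {
      return os << obj.name();
    }

    int main() {
      ReLU relu;
      Sigmoid sigmoid;
      std::cout << relu << "(-1.5) = " << relu.apply(-1.5) << '\n';
      std::cout << sigmoid << "(0.0) = " << sigmoid.apply(0.0) << '\n';
    }

Because all concrete activations share one abstract base, a network layer can store any of them behind an ActivationFunction pointer or reference and print it uniformly through the stream operator.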