# velora.models.activation

Utility methods for activation functions.
## ActivationEnum

Bases: `Enum`

An Enum for PyTorch activation functions.

Useful for getting activation functions dynamically using a string name. Refer to the `get()` method for more details.

Source code in `velora/models/activation.py`
### get(name) (classmethod)
Get the torch.nn activation function.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `name` | `Literal['relu', 'tanh', 'elu', 'leaky_relu', 'prelu', 'selu', 'silu', 'softsign', 'sigmoid', 'hardsigmoid', 'lecun_tanh']` | The name of the activation function. | *required* |

Returns:

| Name | Type | Description |
|---|---|---|
| `activation` | `nn.Module` | The PyTorch activation module. |
Source code in velora/models/activation.py
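To illustrate the name-to-function lookup pattern that `get()` provides, here is a minimal, hypothetical stand-in. It maps string names to plain callables instead of `nn.Module` instances, so it runs without PyTorch; the member names, formulas, and `ActivationSketch` class are illustrative assumptions, not the library's actual implementation.

```python
from enum import Enum
import math


class ActivationSketch(Enum):
    # Each member pairs a lookup name with a scalar activation callable.
    # (Hypothetical stand-ins for the real nn.Module instances.)
    RELU = ("relu", lambda x: max(0.0, x))
    TANH = ("tanh", math.tanh)
    SOFTSIGN = ("softsign", lambda x: x / (1.0 + abs(x)))

    @classmethod
    def get(cls, name: str):
        """Return the callable registered under `name`."""
        for member in cls:
            if member.value[0] == name:
                return member.value[1]
        raise ValueError(f"Unknown activation: {name!r}")


# Resolve an activation dynamically from a config string.
relu = ActivationSketch.get("relu")
print(relu(-2.0))  # 0.0
print(relu(3.5))   # 3.5
```

This mirrors how a config file or CLI flag can select an activation by name; the real `get()` returns the corresponding PyTorch module instead of a plain function.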
## LeCunTanh

Implements LeCun's Tanh activation function:
$$
f(x) = 1.7159 \tanh \left( \frac{2}{3} x \right)
$$
Constants are applied to keep the variance of the output close to 1.
Source code in `velora/models/activation.py`
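The formula above can be checked numerically. This scalar sketch uses only `math.tanh`; the actual `LeCunTanh` in velora is an `nn.Module` operating on tensors, so the function name and signature here are illustrative assumptions.

```python
import math


def lecun_tanh(x: float) -> float:
    """Scalar sketch of LeCun's Tanh: f(x) = 1.7159 * tanh((2/3) * x)."""
    return 1.7159 * math.tanh((2.0 / 3.0) * x)


# The constants are chosen so that f(1) is approximately 1 and f(-1) is
# approximately -1, which helps keep the output variance near 1 for
# unit-variance inputs.
print(lecun_tanh(0.0))  # 0.0
print(lecun_tanh(1.0))  # approximately 1.0
```

Note that the slope at the origin is `1.7159 * (2/3) ≈ 1.144`, so the function is roughly identity-like for small inputs and saturates at `±1.7159` for large ones.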