Selu

Selu - 6

Version

  • name: Selu

  • domain: main

  • since_version: 6

  • function: True

  • support_level: SupportType.COMMON

  • shape inference: True

This version of the operator has been available since version 6.

Summary

Selu takes one input data (Tensor&lt;T&gt;) and produces one output data (Tensor&lt;T&gt;) where the scaled exponential linear unit function, y = gamma * (alpha * e^x - alpha) for x &lt;= 0 and y = gamma * x for x &gt; 0, is applied to the tensor elementwise.
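
For reference, the summary's piecewise formula can be written directly in NumPy. A minimal sketch (the selu function below is illustrative, not part of the ONNX API; its defaults are the Selu-6 attribute defaults listed under Attributes):

import numpy as np

def selu(x: np.ndarray,
         alpha: float = 1.6732631921768188,
         gamma: float = 1.0507010221481323) -> np.ndarray:
    # y = gamma * x                      for x > 0
    # y = gamma * (alpha * e^x - alpha)  for x <= 0
    # np.minimum keeps exp() from overflowing on large positive x;
    # those lanes are discarded by np.where anyway.
    neg = gamma * alpha * (np.exp(np.minimum(x, 0.0)) - 1.0)
    return np.where(x > 0.0, gamma * x, neg)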

Attributes

  • alpha: Coefficient of SELU, defaulting to 1.67326319217681884765625 (the float32 approximation of 1.6732632423543772848170429916717). Default value is 1.6732631921768188.

  • gamma: Coefficient of SELU, defaulting to 1.05070102214813232421875 (the float32 approximation of 1.0507009873554804934193349852946). Default value is 1.0507010221481323. The check after this list shows how the two numbers in each description relate.
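
Each attribute description states the same value twice: the long decimal is the exact expansion of the stored float32, and the "Default value" is its shortest round-trip (float64) representation. A quick check, assuming only NumPy:

import numpy as np

alpha32 = np.float32(1.6732632423543772848170429916717)
gamma32 = np.float32(1.0507009873554804934193349852946)

# Exact decimal expansion of the stored float32 values
print(f"{float(alpha32):.23f}")  # 1.67326319217681884765625
print(f"{float(gamma32):.23f}")  # 1.05070102214813232421875

# Shortest round-trip representation, as shown under "Default value"
print(float(alpha32))  # 1.6732631921768188
print(float(gamma32))  # 1.0507010221481323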

Inputs

  • X (heterogeneous) - T: Input tensor

Outputs

  • Y (heterogeneous) - T: Output tensor

Type Constraints

  • T in ( tensor(double), tensor(float), tensor(float16) ): Constrain input and output types to float tensors.

Examples

default

import numpy as np

import onnx
from onnx.backend.test.case.node import expect  # ONNX's node-test helper

# SELU with non-default alpha/gamma
node = onnx.helper.make_node(
    "Selu", inputs=["x"], outputs=["y"], alpha=2.0, gamma=3.0
)

x = np.array([-1, 0, 1]).astype(np.float32)
# expected output [-3.79272318, 0., 3.]
y = (
    np.clip(x, 0, np.inf) * 3.0
    + (np.exp(np.clip(x, -np.inf, 0)) - 1) * 2.0 * 3.0
)
expect(node, inputs=[x], outputs=[y], name="test_selu_example")

x = np.random.randn(3, 4, 5).astype(np.float32)
y = (
    np.clip(x, 0, np.inf) * 3.0
    + (np.exp(np.clip(x, -np.inf, 0)) - 1) * 2.0 * 3.0
)
expect(node, inputs=[x], outputs=[y], name="test_selu")

_selu_default

import numpy as np

import onnx
from onnx.backend.test.case.node import expect  # ONNX's node-test helper

default_alpha = 1.67326319217681884765625
default_gamma = 1.05070102214813232421875
node = onnx.helper.make_node(
    "Selu",
    inputs=["x"],
    outputs=["y"],
)
x = np.random.randn(3, 4, 5).astype(np.float32)
y = (
    np.clip(x, 0, np.inf) * default_gamma
    + (np.exp(np.clip(x, -np.inf, 0)) - 1) * default_alpha * default_gamma
)
expect(node, inputs=[x], outputs=[y], name="test_selu_default")
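
The same defaults can also be read back from the registered operator schema at run time. A sketch using onnx.defs (assuming the standard onnx Python bindings):

import onnx.defs
from onnx.helper import get_attribute_value

schema = onnx.defs.get_schema("Selu")  # latest registered Selu version
for name, attr in sorted(schema.attributes.items()):
    # attr.default_value is an AttributeProto holding the float default
    print(name, get_attribute_value(attr.default_value))
# alpha 1.6732631921768188
# gamma 1.0507010221481323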

Differences

The summary, inputs, outputs, and type constraints are unchanged between Selu-1 and Selu-6. The attributes differ as follows:

  • alpha: the default changed from 1.673200011253357 in Selu-1 to 1.6732631921768188 in Selu-6 (the float32 approximation of 1.6732632423543772848170429916717).

  • consumed_inputs: the legacy optimization attribute present in Selu-1 was removed in Selu-6.

  • gamma: the default changed from 1.0506999492645264 in Selu-1 to 1.0507010221481323 in Selu-6 (the float32 approximation of 1.0507009873554804934193349852946).

Selu - 1

Version

  • name: Selu

  • domain: main

  • since_version: 1

  • function: False

  • support_level: SupportType.COMMON

  • shape inference: False

This version of the operator has been available since version 1.

Summary

Selu takes one input data (Tensor&lt;T&gt;) and produces one output data (Tensor&lt;T&gt;) where the scaled exponential linear unit function, y = gamma * (alpha * e^x - alpha) for x &lt;= 0 and y = gamma * x for x &gt; 0, is applied to the tensor elementwise.

Attributes

  • alpha: Coefficient of SELU, defaulting to 1.6732. Default value is 1.673200011253357.

  • consumed_inputs: Legacy optimization attribute.

  • gamma: Coefficient of SELU, defaulting to 1.0507. Default value is 1.0506999492645264.

Inputs

  • X (heterogeneous) - T: Input tensor

Outputs

  • Y (heterogeneous) - T: Output tensor

Type Constraints

  • T in ( tensor(double), tensor(float), tensor(float16) ): Constrain input and output types to float tensors.