Elu

Elu - 6

Version

  • name: Elu (GitHub)

  • domain: main

  • since_version: 6

  • function: True

  • support_level: SupportType.COMMON

  • shape inference: True

This version of the operator has been available since version 6.

Summary

Elu takes one input data (Tensor<T>) and produces one output data (Tensor<T>) where the function f(x) = alpha * (exp(x) - 1.) for x < 0, f(x) = x for x >= 0, is applied to the tensor elementwise.
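
As a quick illustration, the same function can be written in a few lines of NumPy; this is an explanatory sketch (the name elu is illustrative), not part of the ONNX API:

import numpy as np

def elu(x, alpha=1.0):
    # f(x) = alpha * (exp(x) - 1) for x < 0, f(x) = x for x >= 0
    return np.where(x < 0, alpha * (np.exp(x) - 1.0), x)

print(elu(np.array([-1.0, 0.0, 1.0], dtype=np.float32)))
# -> [-0.6321206  0.         1.       ]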

Attributes

  • alpha: Coefficient of ELU. Default value is 1.0.
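
For example, with alpha = 2.0 (as in the first example below), f(-1) = 2.0 * (exp(-1) - 1.) ≈ -1.2642411, which matches the expected output shown in that example.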

Inputs

  • X (heterogeneous) - T: 1D input tensor

Outputs

  • Y (heterogeneous) - T: 1D output tensor

Type Constraints

  • T in ( tensor(double), tensor(float), tensor(float16) ): Constrain input and output types to float tensors.

Examples

default

import numpy as np
import onnx.helper
from onnx.backend.test.case.node import expect  # test helper used by the ONNX node examples

node = onnx.helper.make_node("Elu", inputs=["x"], outputs=["y"], alpha=2.0)

x = np.array([-1, 0, 1]).astype(np.float32)
# expected output [-1.2642411, 0., 1.]
y = np.clip(x, 0, np.inf) + (np.exp(np.clip(x, -np.inf, 0)) - 1) * 2.0
expect(node, inputs=[x], outputs=[y], name="test_elu_example")

x = np.random.randn(3, 4, 5).astype(np.float32)
y = np.clip(x, 0, np.inf) + (np.exp(np.clip(x, -np.inf, 0)) - 1) * 2.0
expect(node, inputs=[x], outputs=[y], name="test_elu")

elu_default

default_alpha = 1.0
node = onnx.helper.make_node(
    "Elu",
    inputs=["x"],
    outputs=["y"],
)
x = np.random.randn(3, 4, 5).astype(np.float32)
y = np.clip(x, 0, np.inf) + (np.exp(np.clip(x, -np.inf, 0)) - 1) * default_alpha
expect(node, inputs=[x], outputs=[y], name="test_elu_default")
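
The snippets above exercise a single node through the ONNX backend-test harness. As a complementary sketch, the following builds and validates a complete one-node model; the model name, tensor names, and shapes are illustrative choices, not requirements of the operator:

import onnx
import onnx.checker
import onnx.helper

# Y = Elu(X) with alpha = 2.0; X and Y are float tensors of shape (3, 4, 5)
node = onnx.helper.make_node("Elu", inputs=["X"], outputs=["Y"], alpha=2.0)
graph = onnx.helper.make_graph(
    [node],
    "elu_example",  # illustrative graph name
    inputs=[onnx.helper.make_tensor_value_info("X", onnx.TensorProto.FLOAT, [3, 4, 5])],
    outputs=[onnx.helper.make_tensor_value_info("Y", onnx.TensorProto.FLOAT, [3, 4, 5])],
)
model = onnx.helper.make_model(graph, opset_imports=[onnx.helper.make_opsetid("", 6)])
onnx.checker.check_model(model)  # raises ValidationError if the model is malformed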

Differences

The side-by-side comparison of Elu-1 and Elu-6 reduces to three changes; the summary, input X, and type constraints are identical in both versions:

  • alpha: description shortened from "Coefficient of ELU default to 1.0." to "Coefficient of ELU." The default value of 1.0 is unchanged.

  • consumed_inputs: this legacy optimization attribute was removed.

  • Y: description corrected from "1D input tensor" to "1D output tensor".
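
Because consumed_inputs was removed in version 6, models written against opset 1 can be migrated with the ONNX version converter. A minimal sketch, assuming elu_opset1.onnx is an existing opset-1 model file (the file names are illustrative):

import onnx
from onnx import version_converter

model = onnx.load("elu_opset1.onnx")                     # illustrative input path
converted = version_converter.convert_version(model, 6)  # target opset 6
onnx.save(converted, "elu_opset6.onnx")                  # illustrative output path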

Elu - 1

Version

  • name: Elu (GitHub)

  • domain: main

  • since_version: 1

  • function: False

  • support_level: SupportType.COMMON

  • shape inference: False

This version of the operator has been available since version 1.

Summary

Elu takes one input data (Tensor<T>) and produces one output data (Tensor<T>) where the function f(x) = alpha * (exp(x) - 1.) for x < 0, f(x) = x for x >= 0, is applied to the tensor elementwise.

Attributes

  • alpha: Coefficient of ELU default to 1.0. Default value is 1.0.

  • consumed_inputs: legacy optimization attribute.

Inputs

  • X (heterogeneous) - T: 1D input tensor

Outputs

  • Y (heterogeneous) - T: 1D input tensor

Type Constraints

  • T in ( tensor(double), tensor(float), tensor(float16) ): Constrain input and output types to float tensors.