# Elu

## Elu - 6

Version

• name: Elu

• domain: main

• since_version: 6

• function: True

• support_level: SupportType.COMMON

• shape inference: True

This version of the operator has been available since version 6.

Summary

Elu takes one input data (Tensor&lt;T&gt;) and produces one output data (Tensor&lt;T&gt;) where the function f(x) = alpha * (exp(x) - 1.0) for x < 0, and f(x) = x for x >= 0, is applied to the tensor elementwise.
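The piecewise definition above can be sketched directly in NumPy; the function name and signature here are illustrative, not part of the ONNX API:

```python
import numpy as np

def elu(x: np.ndarray, alpha: float = 1.0) -> np.ndarray:
    """Elementwise ELU: alpha * (exp(x) - 1) where x < 0, x elsewhere."""
    return np.where(x < 0, alpha * (np.exp(x) - 1.0), x)

# elu(-1) = exp(-1) - 1 ≈ -0.6321 with the default alpha of 1.0
print(elu(np.array([-1.0, 0.0, 1.0], dtype=np.float32)))
```

Note that the function is continuous at 0 (both branches give 0), and alpha scales only the negative side.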

Attributes

• alpha: Coefficient of ELU. Default value is 1.0.

Inputs

• X (heterogeneous) - T: 1D input tensor

Outputs

• Y (heterogeneous) - T: 1D output tensor

Type Constraints

• T in ( tensor(double), tensor(float), tensor(float16) ): Constrain input and output types to float tensors.

Examples

_elu

```python
import numpy as np
import onnx

# expect() is the helper from the ONNX backend test suite and is
# available when these cases run there.
node = onnx.helper.make_node("Elu", inputs=["x"], outputs=["y"], alpha=2.0)

x = np.array([-1, 0, 1]).astype(np.float32)
# expected output [-1.2642411, 0., 1.]
y = np.clip(x, 0, np.inf) + (np.exp(np.clip(x, -np.inf, 0)) - 1) * 2.0
expect(node, inputs=[x], outputs=[y], name="test_elu_example")

x = np.random.randn(3, 4, 5).astype(np.float32)
y = np.clip(x, 0, np.inf) + (np.exp(np.clip(x, -np.inf, 0)) - 1) * 2.0
expect(node, inputs=[x], outputs=[y], name="test_elu")
```

_elu_default

```python
default_alpha = 1.0
node = onnx.helper.make_node(
    "Elu",
    inputs=["x"],
    outputs=["y"],
)
x = np.random.randn(3, 4, 5).astype(np.float32)
y = np.clip(x, 0, np.inf) + (np.exp(np.clip(x, -np.inf, 0)) - 1) * default_alpha
expect(node, inputs=[x], outputs=[y], name="test_elu_default")
```

Differences

Elu-6 differs from Elu-1 as follows:

• The alpha attribute description changed from "Coefficient of ELU default to 1.0. Default value is 1.0." to "Coefficient of ELU. Default value is 1.0."

• The legacy optimization attribute consumed_inputs was removed.

• The description of output Y was corrected from "1D input tensor" to "1D output tensor".

The Summary, Inputs, and Type Constraints sections are unchanged.

## Elu - 1

Version

• name: Elu

• domain: main

• since_version: 1

• function: False

• support_level: SupportType.COMMON

• shape inference: False

This version of the operator has been available since version 1.

Summary

Elu takes one input data (Tensor&lt;T&gt;) and produces one output data (Tensor&lt;T&gt;) where the function f(x) = alpha * (exp(x) - 1.0) for x < 0, and f(x) = x for x >= 0, is applied to the tensor elementwise.

Attributes

• alpha: Coefficient of ELU default to 1.0. Default value is 1.0.

• consumed_inputs: legacy optimization attribute.

Inputs

• X (heterogeneous) - T: 1D input tensor

Outputs

• Y (heterogeneous) - T: 1D input tensor

Type Constraints

• T in ( tensor(double), tensor(float), tensor(float16) ): Constrain input and output types to float tensors.