Examples

  1. Compute predictions with any runtime

  2. Convert ONNX into DOT

  3. Convert ONNX into JSON

  4. Convert ONNX into graph

  5. Convert a function into ONNX code

  6. Convert a function into ONNX code and run

  7. Get the tree of a simple function

Compute predictions with any runtime

The following example compares the predictions of scikit-learn with those computed by this package's Python runtime.

<<<

import numpy
from sklearn.linear_model import LinearRegression
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from mlprodict.onnxrt import OnnxInference
from mlprodict.onnx_conv import to_onnx

iris = load_iris()
X, y = iris.data, iris.target
X_train, X_test, y_train, _ = train_test_split(X, y)
clr = LinearRegression()
clr.fit(X_train, y_train)

exp = clr.predict(X_test[:5])
print(exp)

model_def = to_onnx(clr, X_train.astype(numpy.float32))
oinf = OnnxInference(model_def)
y = oinf.run({'X': X_test[:5].astype(numpy.float32)})  # match the model's float32 input
print(y)

>>>

    [ 1.518 -0.053 -0.048  1.152 -0.098]
    {'variable': array([[ 1.518],
           [-0.053],
           [-0.048],
           [ 1.152],
           [-0.098]])}

(original entry : onnx_inference.py:docstring of mlprodict.onnxrt.onnx_inference.OnnxInference.run, line 17)
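The runtime returns a dictionary keyed by output name ('variable' here) holding a column vector, while scikit-learn returns a flat array; OnnxInference can also delegate to other backends through its runtime argument (e.g. runtime='onnxruntime1'). A minimal sketch of comparing the two results, with the values copied from the output above:

```python
import numpy

# Values copied from the output above: scikit-learn predictions (flat array)
# and the runtime result stored under the output name 'variable' (column vector).
exp = numpy.array([1.518, -0.053, -0.048, 1.152, -0.098])
res = {'variable': numpy.array([[1.518], [-0.053], [-0.048], [1.152], [-0.098]])}

# Flatten the column vector before comparing.
assert numpy.allclose(exp, res['variable'].ravel())
print("predictions match")
```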

Convert ONNX into DOT

An example showing how to convert an ONNX graph into the DOT language.

<<<

import numpy
from skl2onnx.algebra.onnx_ops import OnnxLinearRegressor
from skl2onnx.common.data_types import FloatTensorType
from mlprodict.onnxrt import OnnxInference

pars = dict(coefficients=numpy.array([1., 2.]),
            intercepts=numpy.array([1.]),
            post_transform='NONE')
onx = OnnxLinearRegressor('X', output_names=['Y'], **pars)
model_def = onx.to_onnx({'X': pars['coefficients'].astype(numpy.float32)},
                        outputs=[('Y', FloatTensorType([1]))])
oinf = OnnxInference(model_def)
print(oinf.to_dot())

>>>

    digraph{
      orientation=portrait;
      ranksep=0.25;
      nodesep=0.05;
    
      X [shape=box color=red label="X\nfloat((0,))" fontsize=10];
    
      Y [shape=box color=green label="Y\nfloat((1,))" fontsize=10];
    
    
      Li_LinearRegressor [shape=box style="filled,rounded" color=orange label="LinearRegressor\n(Li_LinearRegressor)\ncoefficients=[1. 2.]\nintercepts=[1.]\npost_transform=b'NONE'" fontsize=10];
      X -> Li_LinearRegressor;
      Li_LinearRegressor -> Y;
    }

See an example of representation in notebook ONNX visualization.

(original entry : onnx_inference_exports.py:docstring of mlprodict.onnxrt.onnx_inference_exports.OnnxInferenceExport.to_dot, line 24)
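The returned string follows the Graphviz DOT language, so it can be written to a file and rendered with the dot tool (for instance dot -Tpng model.dot -o model.png). A short sketch, using a simplified stand-in for the text produced above:

```python
# A simplified stand-in for the DOT text returned by to_dot().
dot_text = 'digraph{\n  X [shape=box];\n  Y [shape=box];\n  X -> Y;\n}'

# Write it to disk; Graphviz can then render the file.
with open('model.dot', 'w') as f:
    f.write(dot_text)
```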

Convert ONNX into JSON

An example showing how to convert an ONNX graph into JSON.

<<<

import numpy
from skl2onnx.algebra.onnx_ops import OnnxLinearRegressor
from skl2onnx.common.data_types import FloatTensorType
from mlprodict.onnxrt import OnnxInference

pars = dict(coefficients=numpy.array([1., 2.]),
            intercepts=numpy.array([1.]),
            post_transform='NONE')
onx = OnnxLinearRegressor('X', output_names=['Y'], **pars)
model_def = onx.to_onnx({'X': pars['coefficients'].astype(numpy.float32)},
                        outputs=[('Y', FloatTensorType([1]))])
oinf = OnnxInference(model_def)
print(oinf.to_json())

>>>

    {
      "domain": "ai.onnx",
      "producer_version": "1.5.999992",
      "model_version": 0,
      "ir_version": 6,
      "producer_name": "skl2onnx",
      "doc_string": "",
      "inputs": [
        {
          "name": "X",
          "type": {
            "tensor_type": {
              "elem_type": 1,
              "shape": {
                "dim": {}
              }
            }
          }
        }
      ],
      "outputs": [
        {
          "name": "Y",
          "type": {
            "tensor_type": {
              "elem_type": 1,
              "shape": {
                "dim": {
                  "dim_value": 1
                }
              }
            }
          }
        }
      ],
      "initializers": {},
      "nodes": [
        {
          "name": "Li_LinearRegressor",
          "op_type": "LinearRegressor",
          "domain": "ai.onnx.ml",
          "inputs": [
            "X"
          ],
          "outputs": [
            "Y"
          ],
          "attributes": {
            "coefficients": {
              "floats": [
                1.0,
                2.0
              ],
              "type": "FLOATS"
            },
            "intercepts": {
              "floats": [
                1.0
              ],
              "type": "FLOATS"
            },
            "post_transform": {
              "s": "NONE",
              "type": "STRING"
            }
          }
        }
      ]
    }

(original entry : onnx_inference_exports.py:docstring of mlprodict.onnxrt.onnx_inference_exports.OnnxInferenceExport.to_json, line 6)
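Because the output is plain JSON, the standard library is enough to inspect it. A short sketch parsing a trimmed version of the text above (only the fields used here are kept):

```python
import json

# A trimmed version of the JSON emitted above.
text = '''
{
  "nodes": [
    {"name": "Li_LinearRegressor",
     "op_type": "LinearRegressor",
     "attributes": {
       "coefficients": {"floats": [1.0, 2.0], "type": "FLOATS"},
       "intercepts": {"floats": [1.0], "type": "FLOATS"}
     }}
  ]
}
'''
model = json.loads(text)
node = model["nodes"][0]
print(node["op_type"])                               # LinearRegressor
print(node["attributes"]["coefficients"]["floats"])  # [1.0, 2.0]
```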

Convert ONNX into graph

An example showing how to convert an ONNX graph into the internal structure the runtime executes: inputs, outputs, initializers, and the sequence of nodes in topological order.

<<<

import pprint
import numpy
from skl2onnx.algebra.onnx_ops import OnnxLinearRegressor
from skl2onnx.common.data_types import FloatTensorType
from mlprodict.onnxrt import OnnxInference

pars = dict(coefficients=numpy.array([1., 2.]),
            intercepts=numpy.array([1.]),
            post_transform='NONE')
onx = OnnxLinearRegressor('X', output_names=['Y'], **pars)
model_def = onx.to_onnx({'X': pars['coefficients'].astype(numpy.float32)},
                        outputs=[('Y', FloatTensorType([1]))])
oinf = OnnxInference(model_def)
pprint.pprint(oinf.to_sequence())

>>>

    {'inits': {},
     'inputs': {'X': {'name': 'X',
                      'type': {'elem': 'float', 'kind': 'tensor', 'shape': (0,)}}},
     'intermediate': {'Y': None},
     'nodes': {'Li_LinearRegressor': Onnx-LinearRegressor(X) -> Y},
     'outputs': {'Y': {'name': 'Y',
                       'type': {'elem': 'float', 'kind': 'tensor', 'shape': (1,)}}},
     'sequence': [Onnx-LinearRegressor(X) -> Y],
     'targets': {'ai.onnx.ml': 11}}

See an example of representation in notebook ONNX visualization.

(original entry : onnx_inference.py:docstring of mlprodict.onnxrt.onnx_inference.OnnxInference.to_sequence, line 5)
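The 'sequence' entry lists the nodes in the order they must run. A minimal sketch (not mlprodict's actual code, and with a hypothetical node layout) of what executing such a sequence amounts to: run each node in topological order, reading and writing a dictionary of named values.

```python
import numpy

def linear_regressor(X, coefficients=None, intercepts=None):
    # LinearRegressor semantics for this single-target model.
    return X @ coefficients + intercepts

values = {'X': numpy.array([1., 2.], dtype=numpy.float32)}
# Each entry: (output name, operator, input names, attributes).
sequence = [('Y', linear_regressor, ['X'],
             dict(coefficients=numpy.array([1., 2.]),
                  intercepts=numpy.array([1.])))]
for name, fct, inputs, attributes in sequence:
    values[name] = fct(*[values[i] for i in inputs], **attributes)

print(values['Y'])  # [6.]  (1*1 + 2*2 + 1)
```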

Convert a function into ONNX code

The following code parses a Python function and returns another Python function which, when executed, produces an ONNX graph.

<<<

import numpy
from mlprodict.onnx_grammar import translate_fct2onnx


def trs(x, y):
    z = x + numpy.transpose(y, axes=[1, 0])
    return x * z


onnx_code = translate_fct2onnx(
    trs, context={'numpy.transpose': numpy.transpose})
print(onnx_code)

>>>

    def trs(x, y, dtype=numpy.float32):
        z = (
            OnnxAdd(
                x,
                OnnxTranspose(
                    y,
                    perm=[1, 0]
                )
            )
        )
        return (
            OnnxMul(
                x,
                z
            )
        )

(original entry : onnx_translation.py:docstring of mlprodict.onnx_grammar.onnx_translation.translate_fct2onnx, line 23)
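The translation is essentially a source-to-source mapping from Python operators to ONNX operator constructors (OnnxAdd for +, OnnxMul for *, ...). A minimal, hypothetical sketch of that idea with the standard ast module, not the actual implementation:

```python
import ast

# Hypothetical operator table mirroring the mapping shown above.
OPS = {ast.Add: 'OnnxAdd', ast.Mult: 'OnnxMul', ast.Sub: 'OnnxSub'}

def to_onnx_call(expr):
    "Render a Python expression as nested ONNX operator calls."
    node = ast.parse(expr, mode='eval').body

    def render(n):
        if isinstance(n, ast.BinOp):
            return "%s(%s, %s)" % (OPS[type(n.op)],
                                   render(n.left), render(n.right))
        if isinstance(n, ast.Name):
            return n.id
        raise NotImplementedError(type(n))

    return render(node)

print(to_onnx_call("x * (x + y)"))  # OnnxMul(x, OnnxAdd(x, y))
```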

Convert a function into ONNX code and run

The following code parses a Python function and returns another Python function which, when executed, produces an ONNX graph. The example executes that function, builds the ONNX model, then uses OnnxInference to compute predictions. Finally, it compares them with the output of the original function.

<<<

import numpy
from mlprodict.onnx_grammar import translate_fct2onnx
from mlprodict.onnxrt import OnnxInference
from skl2onnx.algebra.onnx_ops import (
    OnnxAdd, OnnxTranspose, OnnxMul, OnnxIdentity
)

ctx = {'OnnxAdd': OnnxAdd,
       'OnnxTranspose': OnnxTranspose,
       'OnnxMul': OnnxMul,
       'OnnxIdentity': OnnxIdentity}


def trs(x, y):
    z = x + numpy.transpose(y, axes=[1, 0])
    return x * z


inputs = {'x': numpy.array([[1, 2]], dtype=numpy.float32),
          'y': numpy.array([[-0.3, 0.4]], dtype=numpy.float32).T}

original = trs(inputs['x'], inputs['y'])

print('original output:', original)

onnx_fct = translate_fct2onnx(
    trs, context={'numpy.transpose': numpy.transpose},
    cpl=True, context_cpl=ctx, output_names=['Z'])

onnx_code = onnx_fct('x', 'y')
print('ONNX code:', onnx_code)

onnx_g = onnx_code.to_onnx(inputs)

oinf = OnnxInference(onnx_g)
res = oinf.run(inputs)

print("ONNX inference:", res['Z'])
print("ONNX graph:", onnx_g)

>>>

    original output: [[0.7 4.8]]
    ONNX code: <skl2onnx.algebra.onnx_ops.OnnxIdentity object at 0x7fdeae870cc0>
    ONNX inference: [[0.7 4.8]]
    ONNX graph: ir_version: 6
    producer_name: "skl2onnx"
    producer_version: "1.5.999992"
    domain: "ai.onnx"
    model_version: 0
    graph {
      node {
        input: "y"
        output: "Tr_transposed0"
        name: "Tr_Transpose"
        op_type: "Transpose"
        attribute {
          name: "perm"
          ints: 1
          ints: 0
          type: INTS
        }
        domain: ""
      }
      node {
        input: "x"
        input: "Tr_transposed0"
        output: "Ad_C0"
        name: "Ad_Add"
        op_type: "Add"
        domain: ""
      }
      node {
        input: "x"
        input: "Ad_C0"
        output: "Mu_C0"
        name: "Mu_Mul"
        op_type: "Mul"
        domain: ""
      }
      node {
        input: "Mu_C0"
        output: "Z"
        name: "Id_Identity"
        op_type: "Identity"
        domain: ""
      }
      name: "OnnxIdentity"
      input {
        name: "x"
        type {
          tensor_type {
            elem_type: 1
            shape {
              dim {
              }
              dim {
                dim_value: 2
              }
            }
          }
        }
      }
      input {
        name: "y"
        type {
          tensor_type {
            elem_type: 1
            shape {
              dim {
              }
              dim {
                dim_value: 1
              }
            }
          }
        }
      }
      output {
        name: "Z"
        type {
          tensor_type {
            elem_type: 1
            shape {
              dim {
                dim_value: 0
              }
              dim {
                dim_value: 2
              }
            }
          }
        }
      }
    }
    opset_import {
      domain: ""
      version: 11
    }

(original entry : onnx_translation.py:docstring of mlprodict.onnx_grammar.onnx_translation.translate_fct2onnx, line 48)

Get the tree of a simple function

The following code parses a simple Python function and prints an indented view of its syntax tree.

<<<

import ast
import inspect
from textwrap import dedent
from mlprodict.onnx_grammar import CodeNodeVisitor


def norm2(x, y):
    delta = x - y
    n = delta ** 2
    return n


code = dedent(inspect.getsource(norm2))
node = ast.parse(code)
v = CodeNodeVisitor()
v.visit(node)
for r in v.Rows:
    print("{0}{1}: {2}".format("    " * r["indent"], r["type"], r["str"]))

>>>

    Module: 
        FunctionDef: norm2
            arguments: 
                arg: x
                arg: y
            Assign: 
                Name: delta
                BinOp: 
                    Name: x
                    Sub: 
                    Name: y
            Assign: 
                Name: n
                BinOp: 
                    Name: delta
                    Pow: 
                    Num: 2
            Return: 
                Name: n

(original entry : node_visitor_translator.py:docstring of mlprodict.onnx_grammar.node_visitor_translator.CodeNodeVisitor, line 3)
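A comparable traversal can be done with the standard library alone; a short sketch using ast.walk, which yields a flat iteration rather than the indented view above:

```python
import ast

source = '''
def norm2(x, y):
    delta = x - y
    n = delta ** 2
    return n
'''

# ast.walk yields every node of the tree; show the node type and,
# when the node carries one, its name or identifier.
tree = ast.parse(source)
for node in ast.walk(tree):
    label = getattr(node, 'name', getattr(node, 'id', ''))
    print(type(node).__name__, label)
```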