module onnxrt.onnx_inference_exports

Inheritance diagram of mlprodict.onnxrt.onnx_inference_exports

Short summary

module mlprodict.onnxrt.onnx_inference_exports

Extensions to class OnnxInference.

source on GitHub

Classes

OnnxInferenceExport – Implements methods to export an instance of OnnxInference into json, dot, text, python. …

Methods

__init__

to_dot – Produces a DOT language string for the graph.

to_json – Converts an ONNX model into JSON.

to_onnx_code – Exports the ONNX graph into onnx code which replicates it.

to_python – Converts the ONNX runtime into independent python code. The function creates multiple files starting with …

to_text – Calls onnx2bigraph() to return the ONNX graph as text.

Documentation

Extensions to class OnnxInference.

source on GitHub

class mlprodict.onnxrt.onnx_inference_exports.OnnxInferenceExport(oinf)

Bases: object

Implements methods to export an instance of OnnxInference into json, dot, text, python.

source on GitHub

Parameters

oinf – OnnxInference
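
In practice these export methods are reached directly from an OnnxInference instance (oinf.to_dot(), oinf.to_json(), …), as in the examples below, but the class can also be instantiated explicitly. A minimal sketch, assuming model_def is an ONNX model such as the ones built in the examples on this page:

from mlprodict.onnxrt import OnnxInference
from mlprodict.onnxrt.onnx_inference_exports import OnnxInferenceExport

# model_def: an ONNX ModelProto built elsewhere (assumed available)
oinf = OnnxInference(model_def)
exporter = OnnxInferenceExport(oinf)
print(exporter.to_json())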

source on GitHub

__init__(oinf)
Parameters

oinf – OnnxInference

source on GitHub

to_dot(recursive=False, prefix='', add_rt_shapes=False, use_onnx=False, **params)

Produces a DOT language string for the graph.

Parameters
  • params – additional params to draw the graph

  • recursive – also show subgraphs inside operators such as Scan

  • prefix – prefix for every node name

  • add_rt_shapes – adds shapes inferred by the python runtime

  • use_onnx – use the dot format produced by onnx instead of this one

Returns

string

Default options for the graph are:

options = {
    'orientation': 'portrait',
    'ranksep': '0.25',
    'nodesep': '0.05',
    'width': '0.5',
    'height': '0.1',
    'size': '7',
}

One example:

Convert ONNX into DOT

An example of how to convert an ONNX graph into DOT.

<<<

import numpy
from skl2onnx.algebra.onnx_ops import OnnxLinearRegressor
from skl2onnx.common.data_types import FloatTensorType
from mlprodict.onnxrt import OnnxInference

pars = dict(coefficients=numpy.array([1., 2.]),
            intercepts=numpy.array([1.]),
            post_transform='NONE')
onx = OnnxLinearRegressor('X', output_names=['Y'], **pars)
model_def = onx.to_onnx({'X': pars['coefficients'].astype(numpy.float32)},
                        outputs=[('Y', FloatTensorType([1]))],
                        target_opset=12)
oinf = OnnxInference(model_def)
print(oinf.to_dot())

>>>

    digraph{
      ranksep=0.25;
      size=7;
      nodesep=0.05;
      orientation=portrait;
    
      X [shape=box color=red label="X\nfloat((0,))" fontsize=10];
    
      Y [shape=box color=green label="Y\nfloat((1,))" fontsize=10];
    
    
      Li_LinearRegressor [shape=box style="filled,rounded" color=orange label="LinearRegressor\n(Li_LinearRegressor)\ncoefficients=[1. 2.]\nintercepts=[1.]\npost_transform=b'NONE'" fontsize=10];
      X -> Li_LinearRegressor;
      Li_LinearRegressor -> Y;
    }

See an example of this representation in the notebook ONNX visualization.
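
The returned string can be rendered with any Graphviz front end. A minimal sketch, assuming the optional graphviz Python package is installed and reusing oinf from the example above (file name is arbitrary):

from graphviz import Source

dot = oinf.to_dot()
# saves the DOT source as 'linear_regressor' and renders 'linear_regressor.png'
Source(dot).render('linear_regressor', format='png')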

source on GitHub

to_json(indent=2)

Converts an ONNX model into JSON.

Parameters

indent – indentation

Returns

string

Convert ONNX into JSON

An example of how to convert an ONNX graph into JSON.

<<<

import numpy
from skl2onnx.algebra.onnx_ops import OnnxLinearRegressor
from skl2onnx.common.data_types import FloatTensorType
from mlprodict.onnxrt import OnnxInference

pars = dict(coefficients=numpy.array([1., 2.]),
            intercepts=numpy.array([1.]),
            post_transform='NONE')
onx = OnnxLinearRegressor('X', output_names=['Y'], **pars)
model_def = onx.to_onnx({'X': pars['coefficients'].astype(numpy.float32)},
                        outputs=[('Y', FloatTensorType([1]))],
                        target_opset=12)
oinf = OnnxInference(model_def)
print(oinf.to_json())

>>>

    {
      "model_version": 0,
      "producer_name": "skl2onnx",
      "ir_version": 6,
      "doc_string": "",
      "domain": "ai.onnx",
      "producer_version": "1.10.4",
      "inputs": [
        {
          "name": "X",
          "type": {
            "tensor_type": {
              "elem_type": 1,
              "shape": {
                "dim": {}
              }
            }
          }
        }
      ],
      "outputs": [
        {
          "name": "Y",
          "type": {
            "tensor_type": {
              "elem_type": 1,
              "shape": {
                "dim": {
                  "dim_value": 1
                }
              }
            }
          }
        }
      ],
      "initializers": {},
      "nodes": [
        {
          "name": "Li_LinearRegressor",
          "op_type": "LinearRegressor",
          "domain": "ai.onnx.ml",
          "inputs": [
            "X"
          ],
          "outputs": [
            "Y"
          ],
          "attributes": {
            "coefficients": {
              "t": {
                "dims": 2,
                "data_type": 11,
                "name": "coefficients",
                "double_data": 2.0
              },
              "type": "TENSOR"
            },
            "intercepts": {
              "t": {
                "dims": 1,
                "data_type": 11,
                "name": "intercepts",
                "double_data": 1.0
              },
              "type": "TENSOR"
            },
            "post_transform": {
              "s": "NONE",
              "type": "STRING"
            }
          }
        }
      ]
    }
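
Because the result is a plain JSON string, it can be loaded back with the standard json module, for instance to inspect the node list. A small sketch reusing oinf from the example above:

import json

doc = json.loads(oinf.to_json())
print([node['op_type'] for node in doc['nodes']])
# ['LinearRegressor'] for the model above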

source on GitHub

to_onnx_code()

Exports the ONNX graph into onnx code which replicates it.

Returns

string
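
A minimal usage sketch, reusing the oinf instance built in the previous examples; the exact content of the generated code is not reproduced here:

from mlprodict.onnxrt.onnx_inference_exports import OnnxInferenceExport

code = OnnxInferenceExport(oinf).to_onnx_code()
print(code)  # python code which rebuilds the same ONNX graph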

source on GitHub

to_python(prefix='onnx_pyrt_', dest=None, inline=True)

Converts the ONNX runtime into independent python code. The function creates multiple files starting with prefix and saves them to folder dest.

Parameters
  • prefix – file prefix

  • dest – destination folder

  • inline – constant matrices are put in the python file itself as byte arrays

Returns

file dictionary

The function does not work if the chosen runtime is not python.

<<<

import numpy
from skl2onnx.algebra.onnx_ops import OnnxAdd
from mlprodict.onnxrt import OnnxInference

idi = numpy.identity(2).astype(numpy.float32)
onx = OnnxAdd('X', idi, output_names=['Y'],
              op_version=12)
model_def = onx.to_onnx({'X': idi},
                        target_opset=12)
X = numpy.array([[1, 2], [3, 4]], dtype=numpy.float32)
oinf = OnnxInference(model_def, runtime='python')
res = oinf.to_python()
print(res['onnx_pyrt_main.py'])

>>>

    # coding: utf-8
    '''
    Python code equivalent to an ONNX graph.
    It was generated by module *mlprodict*.
    '''
    from io import BytesIO
    from numpy import array, float32, ndarray
    import numpy
    import pickle
    
    
    def pyrt_Add(X, Ad_Addcst):
        # inplaces not take into account False-False
        return numpy.add(X, Ad_Addcst)
    
    
    class OnnxPythonInference:
    
        def __init__(self):
            self._load_inits()
    
        @property
        def metadata(self):
            return {'model_version': 0, 'producer_name': 'skl2onnx', 'ir_version': 7, 'doc_string': '', 'domain': 'ai.onnx', 'producer_version': '1.10.4'}
    
        @property
        def inputs(self):
            return ['X']
    
        @property
        def outputs(self):
            return ['Y']
    
        def _load_inits(self):
            self._inits = {}
            iocst = b'\x80\x04\x95\x9a\x00\x00\x00\x00\x00\x00\x00\x8c\x15numpy.core.multiarray\x94\x8c\x0c_reconstruct\x94\x93\x94\x8c\x05numpy\x94\x8c\x07ndarray\x94\x93\x94K\x00\x85\x94C\x01b\x94\x87\x94R\x94(K\x01K\x02K\x02\x86\x94h\x03\x8c\x05dtype\x94\x93\x94\x8c\x02f4\x94\x89\x88\x87\x94R\x94(K\x03\x8c\x01<\x94NNNJ\xff\xff\xff\xffJ\xff\xff\xff\xffK\x00t\x94b\x89C\x10\x00\x00\x80?\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x80?\x94t\x94b.'
            self._inits['Ad_Addcst'] = pickle.loads(iocst)
    
        def run(self, X):
            # constant
            Ad_Addcst = self._inits['Ad_Addcst']
    
            # graph code
            Y = pyrt_Add(X, Ad_Addcst)
    
            # return
            return Y

source on GitHub

to_text(recursive=False, grid=5, distance=5, kind='bi')

Calls function onnx2bigraph() to return the ONNX graph as text.

Parameters
  • recursive – dig into subgraphs too

  • grid – align text to this grid

  • distance – distance to the text

  • kind – see below

Returns

text

Possible values for kind:

  • 'bi': use onnx2bigraph

  • 'seq': use onnx_simple_text_plot
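
A minimal usage sketch, reusing the oinf instance from the examples above; the exact layout of the returned text depends on the selected kind:

from mlprodict.onnxrt.onnx_inference_exports import OnnxInferenceExport

exporter = OnnxInferenceExport(oinf)
# bi-dimensional rendering based on onnx2bigraph (default)
print(exporter.to_text(kind='bi'))
# sequential rendering based on onnx_simple_text_plot
print(exporter.to_text(kind='seq'))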

source on GitHub