module onnxrt.onnx_inference_exports#

Inheritance diagram of mlprodict.onnxrt.onnx_inference_exports

Short summary#

module mlprodict.onnxrt.onnx_inference_exports

Extensions to class OnnxInference.

source on GitHub

Classes#

class

truncated documentation

OnnxInferenceExport

Implements methods to export an instance of OnnxInference into json, dot, text, python. …

Methods#

method

truncated documentation

__init__

to_dot

Produces a DOT language string for the graph.

to_json

Converts an ONNX model into JSON.

to_onnx_code

Exports the ONNX graph into an onnx code which replicates it.

to_python

Converts the ONNX runtime into independent Python code. The function creates multiple files starting with …

to_text

Calls function onnx2bigraph() to return the ONNX graph as text.

Documentation#

Extensions to class OnnxInference.

source on GitHub

class mlprodict.onnxrt.onnx_inference_exports.OnnxInferenceExport(oinf)#

Bases: object

Implements methods to export an instance of OnnxInference into json, dot, text, python.

source on GitHub

Parameters:

oinf – OnnxInference

source on GitHub

__init__(oinf)#
Parameters:

oinf – OnnxInference

source on GitHub

to_dot(recursive=False, prefix='', add_rt_shapes=False, use_onnx=False, add_functions=True, **params)#

Produces a DOT language string for the graph.

Parameters:
  • params – additional params to draw the graph

  • recursive – also show subgraphs inside operator like Scan

  • prefix – prefix for every node name

  • add_rt_shapes – adds shapes inferred from the python runtime

  • use_onnx – use onnx dot format instead of this one

  • add_functions – add functions to the graph

Returns:

string

Default options for the graph are:

options = {
    'orientation': 'portrait',
    'ranksep': '0.25',
    'nodesep': '0.05',
    'width': '0.5',
    'height': '0.1',
    'size': '7',
}

One example:

Convert ONNX into DOT

An example on how to convert an ONNX graph into DOT.

<<<

import numpy
from mlprodict.npy.xop import loadop
from mlprodict.onnxrt import OnnxInference

OnnxAiOnnxMlLinearRegressor = loadop(
    ('ai.onnx.ml', 'LinearRegressor'))

pars = dict(coefficients=numpy.array([1., 2.]),
            intercepts=numpy.array([1.]),
            post_transform='NONE')
onx = OnnxAiOnnxMlLinearRegressor(
    'X', output_names=['Y'], **pars)
model_def = onx.to_onnx(
    {'X': pars['coefficients'].astype(numpy.float32)},
    outputs={'Y': numpy.float32},
    target_opset=12)
oinf = OnnxInference(model_def)
print(oinf.to_dot())

>>>

    digraph{
      size=7;
      ranksep=0.25;
      orientation=portrait;
      nodesep=0.05;
    
      X [shape=box color=red label="X\nfloat((2,))" fontsize=10];
    
      Y [shape=box color=green label="Y\nfloat(('?',))" fontsize=10];
    
    
      _linearregressor [shape=box style="filled,rounded" color=orange label="LinearRegressor\n(_linearregressor)\ncoefficients=[1. 2.]\nintercepts=[1.]\npost_transform=b'NONE'" fontsize=10];
      X -> _linearregressor;
      _linearregressor -> Y;
    }

See an example of representation in notebook ONNX visualization.

source on GitHub

to_json(indent=2)#

Converts an ONNX model into JSON.

Parameters:

indent – indentation

Returns:

string

Convert ONNX into JSON

An example on how to convert an ONNX graph into JSON.

<<<

import numpy
from mlprodict.npy.xop import loadop
from mlprodict.onnxrt import OnnxInference

OnnxAiOnnxMlLinearRegressor = loadop(
    ('ai.onnx.ml', 'LinearRegressor'))

pars = dict(coefficients=numpy.array([1., 2.]),
            intercepts=numpy.array([1.]),
            post_transform='NONE')
onx = OnnxAiOnnxMlLinearRegressor(
    'X', output_names=['Y'], **pars)
model_def = onx.to_onnx(
    {'X': pars['coefficients'].astype(numpy.float32)},
    outputs={'Y': numpy.float32},
    target_opset=12)
oinf = OnnxInference(model_def)
print(oinf.to_json())

>>>

    {
      "doc_string": "",
      "ir_version": 8,
      "producer_version": "",
      "domain": "",
      "producer_name": "",
      "model_version": 0,
      "inputs": [
        {
          "name": "X",
          "type": {
            "tensor_type": {
              "elem_type": 1,
              "shape": {
                "dim": {
                  "dim_value": 2
                }
              }
            }
          }
        }
      ],
      "outputs": [
        {
          "name": "Y",
          "type": {
            "tensor_type": {
              "elem_type": 1
            }
          }
        }
      ],
      "initializers": {},
      "nodes": [
        {
          "name": "_linearregressor",
          "op_type": "LinearRegressor",
          "domain": "ai.onnx.ml",
          "inputs": [
            "X"
          ],
          "outputs": [
            "Y"
          ],
          "attributes": {
            "coefficients": {
              "floats": [
                1.0,
                2.0
              ],
              "type": "FLOATS"
            },
            "intercepts": {
              "floats": [
                1.0
              ],
              "type": "FLOATS"
            },
            "post_transform": {
              "s": "NONE",
              "type": "STRING"
            }
          }
        }
      ]
    }

source on GitHub

to_onnx_code()#

Exports the ONNX graph into an onnx code which replicates it.

Returns:

string

source on GitHub

to_python(prefix='onnx_pyrt_', dest=None, inline=True)#

Converts the ONNX runtime into independent Python code. The function creates multiple files starting with prefix and saved to folder dest.

Parameters:
  • prefix – file prefix

  • dest – destination folder

  • inline – constant matrices are put in the python file itself as byte arrays

Returns:

file dictionary

The function does not work if the chosen runtime is not python.

<<<

import numpy
from mlprodict.npy.xop import loadop
from mlprodict.onnxrt import OnnxInference

OnnxAdd = loadop('Add')

idi = numpy.identity(2).astype(numpy.float32)
onx = OnnxAdd('X', idi, output_names=['Y'],
              op_version=12)
model_def = onx.to_onnx({'X': idi},
                        target_opset=12)
X = numpy.array([[1, 2], [3, 4]], dtype=numpy.float32)
oinf = OnnxInference(model_def, runtime='python')
res = oinf.to_python()
print(res['onnx_pyrt_main.py'])

>>>

    # coding: utf-8
    '''
    Python code equivalent to an ONNX graph.
    It was was generated by module *mlprodict*.
    '''
    from io import BytesIO
    from numpy import array, float32, ndarray
    import numpy
    import pickle
    
    
    def pyrt_Add(X, init):
        # inplaces not take into account False-False
        return numpy.add(X, init)
    
    
    class OnnxPythonInference:
    
        def __init__(self):
            self._load_inits()
    
        @property
        def metadata(self):
            return {'doc_string': '', 'ir_version': 7, 'producer_version': '', 'domain': '', 'producer_name': '', 'model_version': 0}
    
        @property
        def inputs(self):
            return ['X']
    
        @property
        def outputs(self):
            return ['Y']
    
        def _load_inits(self):
            self._inits = {}
            iocst = b'\x80\x04\x95\x9a\x00\x00\x00\x00\x00\x00\x00\x8c\x15numpy.core.multiarray\x94\x8c\x0c_reconstruct\x94\x93\x94\x8c\x05numpy\x94\x8c\x07ndarray\x94\x93\x94K\x00\x85\x94C\x01b\x94\x87\x94R\x94(K\x01K\x02K\x02\x86\x94h\x03\x8c\x05dtype\x94\x93\x94\x8c\x02f4\x94\x89\x88\x87\x94R\x94(K\x03\x8c\x01<\x94NNNJ\xff\xff\xff\xffJ\xff\xff\xff\xffK\x00t\x94b\x89C\x10\x00\x00\x80?\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x80?\x94t\x94b.'
            self._inits['init'] = pickle.loads(iocst)
    
        def run(self, X):
            # constant
            init = self._inits['init']
    
            # graph code
            Y = pyrt_Add(X, init)
    
            # return
            return Y

source on GitHub

to_text(recursive=False, grid=5, distance=5, kind='bi')#

Calls function onnx2bigraph to return the ONNX graph as text.

Parameters:
  • recursive – dig into subgraphs too

  • grid – align text to this grid

  • distance – distance to the text

  • kind – see below

Returns:

text

Possible values for kind:
  • ‘bi’: use onnx2bigraph

  • ‘seq’: use onnx_simple_text_plot

source on GitHub