
mlprodict


mlprodict was initially started to help implement converters to ONNX. Its main features are a Python runtime for ONNX (class OnnxInference), visualization tools (see Visualization), and a numpy API for ONNX. The package also provides tools to compare predictions and to benchmark models converted with sklearn-onnx.

import numpy
from sklearn.linear_model import LinearRegression
from sklearn.datasets import load_iris
from mlprodict.onnxrt import OnnxInference
from mlprodict.onnxrt.validate.validate_difference import measure_relative_difference
from mlprodict import __max_supported_opset__, get_ir_version

iris = load_iris()
X = iris.data[:, :2]
y = iris.target
lr = LinearRegression()
lr.fit(X, y)

# Predictions with scikit-learn.
expected = lr.predict(X[:5])
print(expected)

# Conversion into ONNX.
from mlprodict.onnx_conv import to_onnx
model_onnx = to_onnx(lr, X.astype(numpy.float32),
                     black_op={'LinearRegressor'},
                     target_opset=__max_supported_opset__)
print("ONNX:", str(model_onnx)[:200] + "\n...")

# Predictions with onnxruntime.
model_onnx.ir_version = get_ir_version(__max_supported_opset__)
oinf = OnnxInference(model_onnx, runtime='onnxruntime1')
ypred = oinf.run({'X': X[:5].astype(numpy.float32)})
print("ONNX output:", ypred)

# Measuring the maximum relative difference.
print("max relative diff:", measure_relative_difference(expected, ypred['variable']))

# And the python runtime
oinf = OnnxInference(model_onnx, runtime='python')
ypred = oinf.run({'X': X[:5].astype(numpy.float32)},
                 verbose=1, fLOG=print)
print("ONNX output:", ypred)

Installation

Installation from pip should work unless you need the latest development features.

pip install mlprodict
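
For the latest development features, installing directly from the GitHub repository should also work (a standard pip pattern; building from source assumes a working compiler toolchain since the package ships compiled operators):

pip install git+https://github.com/sdpython/mlprodict.git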

The package includes a runtime for ONNX, which is why it only has a limited number of dependencies. However, some features rely on sklearn-onnx, onnxruntime, and scikit-learn. They can be installed with the following instruction:

pip install mlprodict[all]
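
A quick way to check the installation is to print the installed version and the highest supported opset (the attribute __max_supported_opset__ is the same one imported in the example above):

import mlprodict
# __version__ is the usual package version string; __max_supported_opset__ is
# the highest ONNX opset the package supports (used above as target_opset).
print(mlprodict.__version__, mlprodict.__max_supported_opset__)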

The code is available at GitHub/mlprodict and has online documentation.