Runtimes for ONNX#

Python Runtime = ‘python’#

This module implements a Python runtime for ONNX. It is constantly evolving. It was started to facilitate the implementation of scikit-learn converters in sklearn-onnx. The main class is OnnxInference.

<<<

import numpy
from sklearn.linear_model import LinearRegression
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from mlprodict.onnxrt import OnnxInference
from mlprodict.onnx_conv import to_onnx

iris = load_iris()
X, y = iris.data, iris.target
X_train, X_test, y_train, _ = train_test_split(X, y)
clr = LinearRegression()
clr.fit(X_train, y_train)

# predictions with scikit-learn
exp = clr.predict(X_test[:5])
print(exp)

# predictions with the Python runtime
model_def = to_onnx(clr, X_train.astype(numpy.float32),
                    target_opset=12)
oinf = OnnxInference(model_def)
y = oinf.run({'X': X_test[:5]})
print(y)

>>>

    [1.346 0.056 1.714 0.117 1.885]
    {'variable': array([[1.346],
           [0.056],
           [1.714],
           [0.117],
           [1.885]])}

Some of the ONNX operators used by the converters were not available in older versions of ONNX. Each ONNX release is identified by an opset number: ONNX 1.4.0 is opset 9, ONNX 1.5.0 is opset 10… The next table shows which operator is available in which opset. An empty cell means it is not available. The other cells contain concatenated flags whose meaning is the following (a short sketch showing how to read the opsets of a converted model follows the list):

  • ERROR means the automated process failed to return an appropriate status or the runtime produces predictions too far from the original predictions; the second part of the constant gives an approximate diagnostic, the last column gives the exception message,

  • OK: the converter works fine and the runtime produces predictions almost equal to the original predictions, the relative difference is below 1e-5,

  • e<%f: the converter works fine and the runtime produces predictions close to the original predictions, the relative difference is below the threshold,

  • i/j: the model was converted for a specific opset but the converted ONNX is compatible with a smaller opset; i is the smallest compatible opset for the main domain, j is the smallest compatible opset for the ai.onnx.ml domain.
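
As a small illustration, not part of the original example, the following sketch prints the opset required for each domain of the model produced by to_onnx in the first example above (model_def is assumed to still be in scope):

    # Sketch: print the opset declared for every domain of a converted model.
    # An empty domain string stands for the main domain 'ai.onnx'.
    for opset in model_def.opset_import:
        print(opset.domain or 'ai.onnx', opset.version)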

The models are tested on simple problems built from the Iris dataset. The dataset is split into train and test sets. Function find_suitable_problem gives the list of problems every scikit-learn model is tested on (a small usage sketch follows the list). The main ones are the following:

  • b-cl: binary classification,

  • m-cl: multi-class classification,

  • reg: regression,

  • cluster: clustering,

  • outlier: outlier detection,

  • num-tr: no label, only numerical features
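
As a small illustration, not part of the original page, the sketch below queries this mapping for one estimator; the import path is an assumption and may differ between mlprodict versions:

    # Sketch: list the problems a given scikit-learn class is tested on.
    # The import path below is an assumption; adjust it to your mlprodict version.
    from sklearn.linear_model import LogisticRegression
    from mlprodict.onnxrt.validate.validate_problems import find_suitable_problem

    # Expected to return identifiers such as 'b-cl' or 'm-cl' for a classifier.
    print(find_suitable_problem(LogisticRegression))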

The full list is given by find_suitable_problem. The next table tracks what is available, what is working, and gives some indication about the cause of the error when it does not work.

<<<

from logging import getLogger
from pyquickhelper.loghelper import noLOG
from pandas import DataFrame
from pyquickhelper.pandashelper import df2rst
from sklearn.exceptions import ConvergenceWarning
from sklearn.utils._testing import ignore_warnings
from mlprodict.onnxrt.validate import enumerate_validated_operator_opsets, summary_report


@ignore_warnings(category=(UserWarning, ConvergenceWarning, RuntimeWarning, FutureWarning))
def build_table():
    logger = getLogger('skl2onnx')
    logger.disabled = True
    rows = list(enumerate_validated_operator_opsets(
        0, debug=None, fLOG=noLOG,
        models=['LinearRegression', 'LogisticRegression'],
        benchmark=True))
    df = DataFrame(rows)
    piv = summary_report(df)

    if "ERROR-msg" in piv.columns:
        def shorten(text):
            text = str(text)
            if len(text) > 75:
                text = text[:75] + "..."
            return text

        piv["ERROR-msg"] = piv["ERROR-msg"].apply(shorten)

    print(df2rst(piv, number_format=2,
                 replacements={'nan': '', 'ERR: 4convert': ''}))


build_table()

>>>

| name | problem | scenario | optim | onx_size | onx_nnodes | onx_ninits | opset17 | RT/SKL-N=1 | N=10 | N=100 | N=1000 | N=10000 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| LinearRegression | b-reg | default | | 259 | 1 | 0 | OK 17/1 | 0.41 | 0.41 | 0.41 | 0.43 | 0.51 |
| LinearRegression | m-reg | default | | 301 | 1 | 0 | OK 17/1 | 0.4 | 0.41 | 0.42 | 0.51 | 0.76 |
| LinearRegression | ~b-reg-64 | default | | 365 | 3 | 3 | OK 13/ | 1.2 | 1.2 | 1.2 | 1.1 | 1.1 |
| LinearRegression | ~m-reg-64 | default | | 405 | 3 | 3 | OK 13/ | 1.1 | 1.2 | 1.1 | 1.1 | 1 |
| LogisticRegression | b-cl | liblinear | | 652 | 4 | 0 | OK 9/1 | 0.78 | 0.84 | 0.83 | 1.1 | 1.6 |
| LogisticRegression | b-cl | liblinear | {'zipmap': False} | 496 | 2 | 0 | OK 17/1 | 0.61 | 0.65 | 0.66 | 0.96 | 1.5 |
| LogisticRegression | b-cl | liblinear | onnx | 652 | 4 | 0 | OK 9/1 | | | | | |
| LogisticRegression | b-cl | liblinear | onnx/{'zipmap': False} | 496 | 2 | 0 | OK 17/1 | | | | | |
| LogisticRegression | m-cl | liblinear | | 681 | 4 | 0 | OK 9/1 | 0.95 | 0.96 | 0.9 | 0.76 | 0.67 |
| LogisticRegression | m-cl | liblinear | {'zipmap': False} | 523 | 2 | 0 | OK 17/1 | 0.74 | 0.75 | 0.72 | 0.67 | 0.65 |
| LogisticRegression | m-cl | liblinear | onnx | 681 | 4 | 0 | OK 9/1 | | | | | |
| LogisticRegression | m-cl | liblinear | onnx/{'zipmap': False} | 523 | 2 | 0 | OK 17/1 | | | | | |
| LogisticRegression | ~b-cl-64 | liblinear | | 1161 | 13 | 5 | OK 13/1 | 2.3 | 2.5 | 2.6 | 3.4 | 5.1 |
| LogisticRegression | ~b-cl-64 | liblinear | {'zipmap': False} | 1004 | 11 | 5 | OK 13/1 | 2.1 | 2.2 | 2.3 | 3.1 | 5 |
| LogisticRegression | ~b-cl-64 | liblinear | onnx | 1161 | 13 | 5 | OK 13/1 | | | | | |
| LogisticRegression | ~b-cl-64 | liblinear | onnx/{'zipmap': False} | 1004 | 11 | 5 | OK 13/1 | | | | | |
| LogisticRegression | ~b-cl-dec | liblinear-dec | {'raw_scores': True, 'zipmap': False} | 399 | 1 | 0 | OK 17/1 | 0.45 | 0.47 | 0.48 | 0.66 | 1.2 |
| LogisticRegression | ~m-cl-dec | liblinear-dec | {'raw_scores': True, 'zipmap': False} | 426 | 1 | 0 | OK 17/1 | 0.5 | 0.5 | 0.51 | 0.63 | 0.83 |

Full results are available at l-onnx-bench-python.

python_compiled#

This runtime is almost the same as the previous one, but it creates and compiles a dedicated function which calls every node of the graph. Graph execution is faster, but it is no longer possible to inspect every intermediate node.

<<<

import numpy
from sklearn.ensemble import AdaBoostRegressor
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from mlprodict.onnxrt import OnnxInference
from mlprodict.onnx_conv import to_onnx

iris = load_iris()
X, y = iris.data, iris.target
X_train, X_test, y_train, _ = train_test_split(X, y)
clr = AdaBoostRegressor(n_estimators=5)
clr.fit(X_train, y_train)
model_def = to_onnx(clr, X_train.astype(numpy.float32),
                    target_opset=12)
oinf = OnnxInference(model_def, runtime="python_compiled")
print(oinf)

>>>

    OnnxInference(...)
        def compiled_run(dict_inputs, yield_ops=None, context=None, attributes=None):
            if yield_ops is not None:
                raise NotImplementedError('yields_ops should be None.')
            # init: axis_name (axis_name)
            # init: estimators_weights (estimators_weights)
            # init: half_scalar (half_scalar)
            # init: k_value (k_value)
            # init: last_index (last_index)
            # init: negate (negate)
            # init: shape_tensor (shape_tensor)
            # inputs
            X = dict_inputs['X']
            (est_label_0, ) = n0_treeensembleregressor_3(X)
            (est_label_4, ) = n1_treeensembleregressor_3(X)
            (est_label_2, ) = n2_treeensembleregressor_3(X)
            (est_label_1, ) = n3_treeensembleregressor_3(X)
            (est_label_3, ) = n4_treeensembleregressor_3(X)
            (concatenated_labels, ) = n5_concat(est_label_0, est_label_1, est_label_2, est_label_3, est_label_4)
            (negated_labels, ) = n6_mul(concatenated_labels, negate)
            (sorted_values, sorted_indices, ) = n7_topk_11(negated_labels, k_value)
            (array_feat_extractor_output, ) = n8_arrayfeatureextractor(estimators_weights, sorted_indices)
            (reshaped_weights, ) = n9_reshape_5(array_feat_extractor_output, shape_tensor)
            (weights_cdf, ) = n10_cumsum(reshaped_weights, axis_name)
            (median_value, ) = n11_arrayfeatureextractor(weights_cdf, last_index)
            (comp_value, ) = n12_mul(median_value, half_scalar)
            (median_or_above, ) = n13_less(weights_cdf, comp_value)
            (cast_result, ) = n14_cast(median_or_above)
            (median_idx, ) = n15_argmin_12(cast_result)
            (median_estimators, ) = n16_gatherelements(sorted_indices, median_idx)
            (variable, ) = n17_gatherelements(concatenated_labels, median_estimators)
            return {
                'variable': variable,
            }

onnxruntime1#

onnxruntime loads the ONNX data in a single session and calls it only once to compute the predictions. We create a table similar to the one in Python Runtime = ‘python’.
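
Before building the table, here is a minimal usage sketch, not part of the original page: the runtime is selected through the runtime argument of OnnxInference, everything else is unchanged (model_def and X_test are assumed to come from the first example above):

    # Sketch: run the converted model with onnxruntime loading the whole graph
    # in a single InferenceSession.
    import numpy
    from mlprodict.onnxrt import OnnxInference

    oinf = OnnxInference(model_def, runtime="onnxruntime1")
    print(oinf.run({'X': X_test[:5].astype(numpy.float32)}))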

<<<

from logging import getLogger
from pyquickhelper.loghelper import noLOG
from pandas import DataFrame
from pyquickhelper.pandashelper import df2rst
from sklearn.exceptions import ConvergenceWarning
from sklearn.utils._testing import ignore_warnings
from mlprodict.onnxrt.validate import enumerate_validated_operator_opsets, summary_report


@ignore_warnings(category=(UserWarning, ConvergenceWarning, RuntimeWarning, FutureWarning))
def build_table():
    logger = getLogger('skl2onnx')
    logger.disabled = True
    rows = list(enumerate_validated_operator_opsets(
        0, debug=None, fLOG=noLOG, runtime='onnxruntime1',
        models=['LinearRegression', 'LogisticRegression'],
        benchmark=True))
    df = DataFrame(rows)
    piv = summary_report(df)

    if "ERROR-msg" in piv.columns:
        def shorten(text):
            text = str(text)
            if len(text) > 75:
                text = text[:75] + "..."
            return text

        piv["ERROR-msg"] = piv["ERROR-msg"].apply(shorten)

    print(df2rst(piv, number_format=2,
                 replacements={'nan': '', 'ERR: 4convert': ''}))


build_table()

>>>

| name | problem | scenario | optim | onx_size | onx_nnodes | onx_ninits | opset17 | ERROR-msg | RT/SKL-N=1 | N=10 | N=100 | N=1000 | N=10000 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| LinearRegression | b-reg | default | | 259 | 1 | 0 | OK 17/1 | | 1.5 | 0.82 | 0.76 | 0.89 | 1.7 |
| LinearRegression | m-reg | default | | 301 | 1 | 0 | OK 17/1 | | 0.73 | 0.72 | 0.71 | 0.77 | 0.96 |
| LinearRegression | ~b-reg-64 | default | | 365 | 3 | 3 | OK 13/ | | 2.5 | 1.6 | 1.6 | 2.1 | 3.7 |
| LinearRegression | ~m-reg-64 | default | | 405 | 3 | 3 | OK 13/ | | 1.5 | 1.5 | 1.5 | 1.7 | 1.9 |
| LogisticRegression | b-cl | liblinear | | 652 | 4 | 0 | OK 9/1 | | 0.99 | 1.1 | 1.3 | 4 | 7.9 |
| LogisticRegression | b-cl | liblinear | {'zipmap': False} | 496 | 2 | 0 | OK 17/1 | | 0.63 | 0.64 | 0.73 | 0.77 | 0.87 |
| LogisticRegression | b-cl | liblinear | onnx | 652 | 4 | 0 | OK 9/1 | | | | | | |
| LogisticRegression | b-cl | liblinear | onnx/{'zipmap': False} | 496 | 2 | 0 | OK 17/1 | | | | | | |
| LogisticRegression | m-cl | liblinear | | 681 | 4 | 0 | OK 9/1 | | 1.2 | 1.2 | 1.4 | 3.1 | 4.7 |
| LogisticRegression | m-cl | liblinear | {'zipmap': False} | 523 | 2 | 0 | OK 17/1 | | 0.77 | 0.72 | 0.75 | 0.54 | 0.45 |
| LogisticRegression | m-cl | liblinear | onnx | 681 | 4 | 0 | OK 9/1 | | | | | | |
| LogisticRegression | m-cl | liblinear | onnx/{'zipmap': False} | 523 | 2 | 0 | OK 17/1 | | | | | | |
| LogisticRegression | ~b-cl-64 | liblinear | | 1161 | 13 | 5 | ERR: 5ort_load | Unable to create InferenceSession due to '[ONNXRuntimeError] : 10 : INVALID… | | | | | |
| LogisticRegression | ~b-cl-64 | liblinear | {'zipmap': False} | 1004 | 11 | 5 | OK 13/1 | | 2.5 | 2.7 | 2.6 | 2.9 | 3.8 |
| LogisticRegression | ~b-cl-64 | liblinear | onnx | 1161 | 13 | 5 | ERR: 5ort_load | Unable to create InferenceSession due to '[ONNXRuntimeError] : 10 : INVALID… | | | | | |
| LogisticRegression | ~b-cl-64 | liblinear | onnx/{'zipmap': False} | 1004 | 11 | 5 | OK 13/1 | | | | | | |
| LogisticRegression | ~b-cl-dec | liblinear-dec | {'raw_scores': True, 'zipmap': False} | 399 | 1 | 0 | OK 17/1 | | 0.64 | 0.64 | 0.65 | 0.85 | 1.6 |
| LogisticRegression | ~m-cl-dec | liblinear-dec | {'raw_scores': True, 'zipmap': False} | 426 | 1 | 0 | OK 17/1 | | 0.71 | 0.7 | 0.71 | 0.79 | 0.97 |

Full results are available at Availability of scikit-learn model for runtime onnxruntime1.

Profiling

onnxruntime has a tool to verify and test ONNX graphs: onnxruntime_perf_test. It measures the execution time of a graph. It can also be used to profile the code of onnxruntime. On Windows (it also works on Linux):

  • Create an ONNX graph and its inputs as protobuf files. Place them in a folder as explained on the onnxruntime_perf_test page (a sketch of this step follows the list).

  • Clone and compile onnxruntime in release mode with debug information: python tools/ci_build/build.py --build_dir build_dir --config RelWithDebInfo --build_wheel --use_openmp --use_mklml --numpy_version= --skip_onnx_tests

  • Open Visual Studio and modify the command line arguments of onnxruntime_perf_test.exe to: -s -t 30 <model.onnx> <anything.txt>. Select it as the startup project.

  • Start the profiling.
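
The sketch below illustrates the first step only; it is not part of the original page. It saves a model and one input tensor following the test_data_set_0 layout commonly used by onnx test tools, which onnxruntime_perf_test can read; the folder and file names are assumptions to adapt to your case:

    # Sketch: save a model and one input as protobuf files for onnxruntime_perf_test.
    import os
    import numpy
    from onnx import numpy_helper

    folder = "perf_test_model"  # hypothetical folder name
    os.makedirs(os.path.join(folder, "test_data_set_0"), exist_ok=True)

    # model_def is assumed to be an ONNX ModelProto, e.g. produced by to_onnx.
    with open(os.path.join(folder, "model.onnx"), "wb") as f:
        f.write(model_def.SerializeToString())

    # One input saved as a TensorProto; add input_1.pb, ... for more inputs.
    tensor = numpy_helper.from_array(X_test[:5].astype(numpy.float32), name="X")
    with open(os.path.join(folder, "test_data_set_0", "input_0.pb"), "wb") as f:
        f.write(tensor.SerializeToString())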

onnxruntime2: independent onnxruntime for every node#

This runtime does not load the ONNX data in a single session but instead calls onnxruntime for each node independently. It was developed mostly to facilitate the implementation of converters from scikit-learn objects to ONNX. We create a table similar to Python Runtime = ‘python’.
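
As with the other runtimes, only the runtime argument changes; the minimal sketch below is not part of the original page and reuses the model from the first example:

    # Sketch: every node is executed by its own onnxruntime session, which helps
    # locating the node responsible for a discrepancy.
    import numpy
    from mlprodict.onnxrt import OnnxInference

    oinf = OnnxInference(model_def, runtime="onnxruntime2")
    print(oinf.run({'X': X_test[:5].astype(numpy.float32)}))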

<<<

from logging import getLogger
from pyquickhelper.loghelper import noLOG
from pandas import DataFrame
from pyquickhelper.pandashelper import df2rst
from sklearn.exceptions import ConvergenceWarning
from sklearn.utils._testing import ignore_warnings
from mlprodict.onnxrt.validate import enumerate_validated_operator_opsets, summary_report


@ignore_warnings(category=(UserWarning, ConvergenceWarning, RuntimeWarning, FutureWarning))
def build_table():
    logger = getLogger('skl2onnx')
    logger.disabled = True
    rows = list(enumerate_validated_operator_opsets(
        0, debug=None, fLOG=noLOG, runtime='onnxruntime2',
        models=['LinearRegression', 'LogisticRegression'],
        benchmark=True))
    df = DataFrame(rows)
    piv = summary_report(df)

    if "ERROR-msg" in piv.columns:
        def shorten(text):
            text = str(text)
            if len(text) > 75:
                text = text[:75] + "..."
            return text

        piv["ERROR-msg"] = piv["ERROR-msg"].apply(shorten)

    print(df2rst(piv, number_format=2,
                 replacements={'nan': '', 'ERR: 4convert': ''}))


build_table()

>>>

| name | problem | scenario | optim | onx_size | onx_nnodes | onx_ninits | opset17 | ERROR-msg | RT/SKL-N=1 | N=10 | N=100 | N=1000 | N=10000 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| LinearRegression | b-reg | default | | 259 | 1 | 0 | OK 17/1 | | 0.57 | 0.57 | 0.58 | 0.69 | 1.6 |
| LinearRegression | m-reg | default | | 301 | 1 | 0 | OK 17/1 | | 0.56 | 0.55 | 0.56 | 0.65 | 0.72 |
| LinearRegression | ~b-reg-64 | default | | 365 | 3 | 3 | OK 13/ | | 1.6 | 1.6 | 1.7 | 2.2 | 5.6 |
| LinearRegression | ~m-reg-64 | default | | 405 | 3 | 3 | OK 13/ | | 1.6 | 1.6 | 1.6 | 1.8 | 2.1 |
| LogisticRegression | b-cl | liblinear | | 652 | 4 | 0 | ERR: 5ort_load | Unable to load node 'ZipMap' (output type was guessed) inputs=[('probabilit… | | | | | |
| LogisticRegression | b-cl | liblinear | {'zipmap': False} | 496 | 2 | 0 | OK 17/1 | | 0.57 | 0.59 | 0.6 | 0.73 | 1 |
| LogisticRegression | b-cl | liblinear | onnx | 652 | 4 | 0 | ERR: 5ort_load | Unable to load node 'ZipMap' (output type was guessed) inputs=[('probabilit… | | | | | |
| LogisticRegression | b-cl | liblinear | onnx/{'zipmap': False} | 496 | 2 | 0 | OK 17/1 | | | | | | |
| LogisticRegression | m-cl | liblinear | | 681 | 4 | 0 | ERR: 5ort_load | Unable to load node 'ZipMap' (output type was guessed) inputs=[('probabilit… | | | | | |
| LogisticRegression | m-cl | liblinear | {'zipmap': False} | 523 | 2 | 0 | OK 17/1 | | 0.7 | 0.67 | 0.63 | 0.51 | 0.47 |
| LogisticRegression | m-cl | liblinear | onnx | 681 | 4 | 0 | ERR: 5ort_load | Unable to load node 'ZipMap' (output type was guessed) inputs=[('probabilit… | | | | | |
| LogisticRegression | m-cl | liblinear | onnx/{'zipmap': False} | 523 | 2 | 0 | OK 17/1 | | | | | | |
| LogisticRegression | ~b-cl-64 | liblinear | | 1161 | 13 | 5 | ERR: 5ort_load | Unable to load node 'ZipMap' (output type was guessed) inputs=[('probabilit… | | | | | |
| LogisticRegression | ~b-cl-64 | liblinear | {'zipmap': False} | 1004 | 11 | 5 | OK 13/1 | | 3 | 3.1 | 3.1 | 3.3 | 1.3 |
| LogisticRegression | ~b-cl-64 | liblinear | onnx | 1161 | 13 | 5 | ERR: 5ort_load | Unable to load node 'ZipMap' (output type was guessed) inputs=[('probabilit… | | | | | |
| LogisticRegression | ~b-cl-64 | liblinear | onnx/{'zipmap': False} | 1004 | 11 | 5 | OK 13/1 | | | | | | |
| LogisticRegression | ~b-cl-dec | liblinear-dec | {'raw_scores': True, 'zipmap': False} | 399 | 1 | 0 | OK 17/1 | | 0.64 | 0.49 | 0.51 | 0.71 | 0.042 |
| LogisticRegression | ~m-cl-dec | liblinear-dec | {'raw_scores': True, 'zipmap': False} | 426 | 1 | 0 | OK 17/1 | | 0.55 | 0.54 | 0.55 | 0.67 | 0.89 |

Full results are available at Availability of scikit-learn model for runtime onnxruntime2.