
FuseReluClip Unexpected data type for Clip 'min' input of 11 #23116

Open
Cookiee235 opened this issue Dec 16, 2024 · 3 comments
Labels
core runtime (issues related to core runtime)

Comments

@Cookiee235

Describe the issue

A valid model, verified with onnx.checker.check_model(model), crashes unexpectedly while the inference session is being created.
The crash message is shown below (data type 11 is TensorProto.DOUBLE):

The Crash stacktrace

 Traceback (most recent call last):
  File "test.py", line 10, in <module>
    original_session = ort.InferenceSession(model_path)
  File "C:\software\conda\envs\OPTFuzz\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 419, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "C:\software\conda\envs\OPTFuzz\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 491, in _create_inference_session
    sess.initialize_session(providers, provider_options, disabled_optimizers)
onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Exception during initialization: D:\a\_work\1\s\onnxruntime\core\optimizer\relu_clip_fusion.cc:77 onnxruntime::FuseReluClip::Apply Unexpected data type for Clip 'min' input of 11

To reproduce

Step 1: Download the model via this link

Step 2: run the following script:

import onnx
import onnxruntime as ort

model_path = "model_with_clip.onnx"
model = onnx.load(model_path)
onnx.checker.check_model(model)  # passes: the model is structurally valid
print(onnx.helper.printable_graph(model.graph))

original_session = ort.InferenceSession(model_path)  # crashes during session initialization
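
For what it's worth, the stacktrace shows the failure happens inside the graph-optimization phase, so the session can be created with optimizations disabled as a diagnostic workaround. A minimal sketch using the public SessionOptions API; note ORT_DISABLE_ALL turns off every graph transform, not just this fusion:

import onnxruntime as ort

# Disable all graph optimizations so the FuseReluClip transform never runs.
# This is a diagnostic workaround, not a recommended production setting.
sess_options = ort.SessionOptions()
sess_options.graph_optimization_level = ort.GraphOptimizationLevel.ORT_DISABLE_ALL
session = ort.InferenceSession("model_with_clip.onnx", sess_options)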

Urgency

No response

Platform

Linux

OS Version

Ubuntu 20.04

ONNX Runtime Installation

Built from Source

ONNX Runtime Version or Commit ID

5c1b7cc

ONNX Runtime API

Python

Architecture

X64

Execution Provider

Default CPU

Execution Provider Library Version

No response

@yuslepukhin added the core runtime label on Dec 16, 2024
@skottmckay
Contributor

ORT doesn't support fp64 for the min/max; we typically support common production use cases. There's likely no need for fp64 min/max, so I'd suggest changing those to fp32.
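
A sketch of that conversion with the onnx Python API, assuming the bounds are stored as graph initializers. One caveat: Clip-11 requires min/max to have the same element type as the data input, so if the Clip's data path is also float64 it would need the same treatment for the model to stay valid.

import numpy as np
import onnx
from onnx import numpy_helper

model = onnx.load("model_with_clip.onnx")

# Collect the names of tensors feeding any Clip node's optional min/max inputs.
clip_bound_names = set()
for node in model.graph.node:
    if node.op_type == "Clip":
        clip_bound_names.update(n for n in node.input[1:] if n)

# Rewrite float64 bound initializers as float32 in place.
for i, init in enumerate(model.graph.initializer):
    if init.name in clip_bound_names and init.data_type == onnx.TensorProto.DOUBLE:
        arr = numpy_helper.to_array(init).astype(np.float32)
        model.graph.initializer[i].CopyFrom(numpy_helper.from_array(arr, init.name))

onnx.save(model, "model_with_clip_fp32.onnx")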

@xadupre
Member

xadupre commented Dec 17, 2024

onnxruntime itself should support float64 for min/max, but the optimizer FuseReluClip does not (see https://github.com/microsoft/onnxruntime/blob/main/onnxruntime/core/optimizer/relu_clip_fusion.cc#L61). I wonder whether we should fail in this case.

@skottmckay
Contributor

If the implementation has limited type support, FuseReluClip::SatisfyCondition should be checking the input types.
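
For illustration only, here is the shape of that guard expressed in Python over the ONNX graph; the real fix belongs in the C++ SatisfyCondition override, and the supported-type set below is an assumption:

import onnx

# Hypothetical mirror of the guard: only report the Clip as fusable when its
# 'min' input is absent or is a constant initializer of a supported type.
SUPPORTED_TYPES = {onnx.TensorProto.FLOAT, onnx.TensorProto.FLOAT16}

def clip_min_is_fusable(graph: onnx.GraphProto, clip_node: onnx.NodeProto) -> bool:
    if len(clip_node.input) < 2 or not clip_node.input[1]:
        return True  # no explicit 'min'; the default value applies
    for init in graph.initializer:
        if init.name == clip_node.input[1]:
            return init.data_type in SUPPORTED_TYPES
    return False  # 'min' is not a constant initializer; skip the fusion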
