Conflicting constraint checks/descriptions for PoolAttributes #23088

Open
Cookiee235 opened this issue Dec 12, 2024 · 2 comments

Comments

@Cookiee235

Describe the issue

The simple model below passes onnx.checker's validity check, so it appears to be a valid model.

However, when an inference session is created for this model, ONNX Runtime fails unexpectedly during initialization with a message showing that MaxPool enforces a constraint between the pads and kernel_shape attributes ("Pad should be smaller than kernel").

Yet the official documentation (https://onnx.ai/onnx/operators/onnx__MaxPool.html) states that the pads "can take any value greater than or equal to 0".

I'm confused: which of these is wrong? 1. the model checker provided by onnx; 2. the documentation; 3. the constraint enforced by ONNXRuntime?
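
One way to arbitrate is to run the model through onnx's pure-Python reference evaluator, which implements the operator specification directly. A minimal sketch, assuming onnx >= 1.13 is installed; the input name "X" and its shape are placeholders for the model's actual input:

import numpy as np
from onnx.reference import ReferenceEvaluator

ref = ReferenceEvaluator("model5.onnx")
# placeholder input name and shape; substitute the model's real input here
x = np.random.rand(1, 1, 4, 4).astype(np.float32)
print(ref.run(None, {"X": x}))

If the reference evaluator executes the model, the pads < kernel_shape restriction is specific to ONNXRuntime rather than part of the spec.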

Stacktrace

Traceback (most recent call last):
  File "/share_container/optfuzz/ONNX/debug.py", line 83, in <module>
    optimize_and_infer("../res/onnx_ut/onnx_models_gf/RuleTransformerL1/RuleTransformerL1_4.onnx")
  File "/share_container/optfuzz/ONNX/debug.py", line 60, in optimize_and_infer
    original_session = ort.InferenceSession(model_path)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/software/onnxruntime/build/Linux/Release/onnxruntime/capi/onnxruntime_inference_collection.py", line 465, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/software/onnxruntime/build/Linux/Release/onnxruntime/capi/onnxruntime_inference_collection.py", line 537, in _create_inference_session
    sess.initialize_session(providers, provider_options, disabled_optimizers)
onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Exception during initialization: /software/onnxruntime/onnxruntime/core/providers/cpu/nn/pool_attributes.h:78 onnxruntime::PoolAttributes::PoolAttributes(const onnxruntime::OpNodeProtoHelper<onnxruntime::ProtoHelperNodeContext>&, const string&, int) pads[dim] < kernel_shape[dim] && pads[dim + kernel_shape.size()] < kernel_shape[dim] was false. Pad should be smaller than kernel.

To reproduce

Step 1. Download the simple model via this link

Step 2. Run the script:

import onnx
import onnxruntime as ort

model_path = "model5.onnx"
model = onnx.load(model_path)
onnx.checker.check_model(model)  # passes: the checker accepts the model
print(onnx.helper.printable_graph(model.graph))

# raises RUNTIME_EXCEPTION during session initialization (see stacktrace above)
original_session = ort.InferenceSession(model_path)
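
For readers without access to the linked model, here is a self-contained sketch that builds a comparable model: a single MaxPool whose pads equal the kernel size. The names, shapes, and opset are illustrative, but the checker is expected to pass while session creation fails with the PoolAttributes error above:

import onnx
import onnxruntime as ort
from onnx import TensorProto, helper

node = helper.make_node(
    "MaxPool", ["X"], ["Y"],
    kernel_shape=[2, 2],
    pads=[2, 2, 2, 2],  # pads >= kernel_shape: any value >= 0 is legal per the spec
)
graph = helper.make_graph(
    [node], "maxpool_pads_ge_kernel",
    [helper.make_tensor_value_info("X", TensorProto.FLOAT, [1, 1, 4, 4])],
    [helper.make_tensor_value_info("Y", TensorProto.FLOAT, [1, 1, 7, 7])],
)
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 13)])
onnx.checker.check_model(model)             # passes
onnx.save(model, "maxpool_repro.onnx")
ort.InferenceSession("maxpool_repro.onnx")  # expected to raise the error above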



Urgency

No response

Platform

Linux

OS Version

Ubuntu 20.04

ONNX Runtime Installation

Built from Source

ONNX Runtime Version or Commit ID

5c1b7cc

ONNX Runtime API

Python

Architecture

X64

Execution Provider

Default CPU

Execution Provider Library Version

No response

@xadupre
Member

xadupre commented Dec 17, 2024

Is it possible to reduce the padding? Do torch or tensorflow support this case?

@Cookiee235
Author

@xadupre Thank you for your reply. Indeed, reducing padding is a workaround to avoid this crash.

After investigating, I found that PyTorch allows padding larger than the kernel size and does not throw an error in similar cases. Thus, the ONNXRuntime source code may need to be fixed to align with the ONNX specification.
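
For anyone blocked by this in the meantime, here is a hedged sketch of the pad-clamping workaround mentioned above. It shrinks each pad below the kernel size so that ONNXRuntime's PoolAttributes check passes; note that this changes the output shape, so it is a stopgap rather than a spec-compliant fix:

import onnx

model = onnx.load("model5.onnx")
for node in model.graph.node:
    if node.op_type != "MaxPool":
        continue
    attrs = {a.name: a for a in node.attribute}
    kernel = list(attrs["kernel_shape"].ints)
    if "pads" in attrs:
        pads = attrs["pads"].ints
        # pads is laid out [x1_begin, x2_begin, ..., x1_end, x2_end, ...],
        # so pad index i maps to kernel dimension i % len(kernel)
        for i in range(len(pads)):
            pads[i] = min(pads[i], kernel[i % len(kernel)] - 1)
onnx.save(model, "model5_clamped.onnx")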
