LogSoftmax produces different results when opset=11 or 13 #25402

@coffezhou

Description

Describe the issue

The following ONNX model contains only a LogSoftmax operator with 'axis=1' and a 4D input:

[Image: model graph with a single LogSoftmax node]

Depending on whether opset=11 or opset=13 is used, LogSoftmax produces different results.
For opset=11, the results are as follows:

[array([[[[-2.6983917, -2.7265973],
         [-3.1844459, -3.047412 ]],

        [[-3.316408 , -3.2627435],
         [-3.2462628, -3.225058 ]],

        [[-1.977868 , -2.062889 ],
         [-1.1356447, -3.2581956]]]], dtype=float32)]

However, when opset=13 or higher, the results are as follows:

 [array([[[[-1.2794111 , -1.2604415 ],
         [-2.2719865 , -0.973498  ]],

        [[-1.8974271 , -1.7965877 ],
         [-2.3338037 , -1.151144  ]],

        [[-0.5588873 , -0.5967334 ],
         [-0.22318542, -1.1842817 ]]]], dtype=float32)]
Mismatched elements: 12 / 12 (100%)
Max absolute difference among violations: 2.073914
Max relative difference among violations: 4.088346

Although the definition of LogSoftmax changed between opset version 11 and version 13, I expect the results to be identical when the input is a 4D tensor and 'axis=1'.
I found this issue while running a model with TensorRT, which produced results different from ONNX Runtime's.
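For reference, my reading of the two operator definitions can be sketched in NumPy (this follows the ONNX operator changelog: before opset 13 the input is coerced to a 2D matrix at `axis` and the log-softmax is taken over the flattened trailing dimensions, while from opset 13 on it is taken along the single given axis; the function names and the [1, 3, 2, 2] shape are illustrative):

```python
import numpy as np

def logsoftmax_opset13(x, axis=1):
    # Opset >= 13: log-softmax along the single given axis,
    # with the usual max-subtraction for numerical stability.
    m = x.max(axis=axis, keepdims=True)
    return (x - m) - np.log(np.exp(x - m).sum(axis=axis, keepdims=True))

def logsoftmax_opset11(x, axis=1):
    # Opset < 13: coerce the input to 2D at `axis`, take the
    # log-softmax over the flattened trailing dims, restore the shape.
    coerced = x.reshape(int(np.prod(x.shape[:axis])), -1)
    return logsoftmax_opset13(coerced, axis=1).reshape(x.shape)

x = np.random.rand(1, 3, 2, 2).astype(np.float32)
print("max abs diff:", np.abs(logsoftmax_opset11(x) - logsoftmax_opset13(x)).max())
```

Under this reading, a 4D input with axis=1 is a case where the two definitions generally give different values, which may be relevant to which of the two outputs above is "correct" for each opset.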

To reproduce

Environment

OS: Ubuntu 20.04
onnxruntime: 1.23.0.dev20250714001
Python Version: 3.12.9

Steps to reproduce

This bug can be reproduced with the following code, using the model and inputs in the attachment.

import pickle

import numpy as np
import onnx
import onnxruntime


def test() -> None:
    # Load the attached model; it declares opset 11.
    onnx_model1 = onnx.load("2.onnx")
    onnx_model1.ir_version = 10
    print("opset:", onnx_model1.opset_import[0].version)

    with open("inputs.pkl", "rb") as fp:
        inputs = pickle.load(fp)

    ort_session1 = onnxruntime.InferenceSession(
        onnx_model1.SerializeToString(), providers=["CPUExecutionProvider"]
    )
    # Passing None requests all model outputs.
    ort_output1 = ort_session1.run(None, inputs)
    print("ONNXRuntime (opset 11):\n", ort_output1)

    # Reload the same model and bump the declared opset to 13.
    onnx_model2 = onnx.load("2.onnx")
    onnx_model2.ir_version = 10
    onnx_model2.opset_import[0].version = 13

    ort_session2 = onnxruntime.InferenceSession(
        onnx_model2.SerializeToString(), providers=["CPUExecutionProvider"]
    )
    ort_output2 = ort_session2.run(None, inputs)
    print("ONNXRuntime (opset 13):\n", ort_output2)

    np.testing.assert_allclose(ort_output1[0], ort_output2[0], rtol=0.1, atol=0.1)


if __name__ == "__main__":
    test()

testcase.zip

Urgency

No response

Platform

Linux

OS Version

Ubuntu 20.04

ONNX Runtime Installation

Released Package

ONNX Runtime Version or Commit ID

1.23.0.dev20250714001

ONNX Runtime API

Python

Architecture

X64

Execution Provider

Default CPU

Execution Provider Library Version

No response
