On Linux, ONNX Runtime consumes high anon memory that is not released #23117

Open
jiangjiongyu opened this issue Dec 16, 2024 · 2 comments
Labels
api:Java issues related to the Java API

Comments

@jiangjiongyu

Describe the issue

Heap info of the running process:
leak_2189015_107.pdf

To reproduce

Running Code

// Imports for the repro (StringUtils is assumed to be Apache Commons Lang).
import ai.onnxruntime.OnnxTensor;
import ai.onnxruntime.OrtEnvironment;
import ai.onnxruntime.OrtSession;
import org.apache.commons.lang3.StringUtils;
import org.opencv.core.Mat;
import org.opencv.core.Size;
import org.opencv.imgcodecs.Imgcodecs;
import org.opencv.imgproc.Imgproc;

import java.util.Collections;

public class Test {

    private static OrtEnvironment env;

    private static OrtSession session;

    public static long count;

    public static long channels;

    public static long netHeight;

    public static long netWidth;

    public synchronized static void init(String onnxPath) {
        if (StringUtils.isEmpty(onnxPath)) {
            return;
        }
        if (env == null || session == null) {
            loadModel(onnxPath);
        }
    }

    private static void loadModel(String onnxPath) {
        long startTime = System.currentTimeMillis();
        try {
            env = OrtEnvironment.getEnvironment("createSessionFromPath");
            OrtSession.SessionOptions options = new OrtSession.SessionOptions();
            session = env.createSession(onnxPath, options);
            count = 1; 
            channels = 3; 
            netHeight = 224;
            netWidth = 224;
        } catch (Throwable e) {
            // model load failures are silently ignored in this repro
        }
    }


    public static String detect(String imgPath) {
        if (env == null || session == null) {
            return "";
        }
        long startTime = System.currentTimeMillis();
        Mat img = Imgcodecs.imread(imgPath);
        Mat resizedImage = new Mat();
        float[][] output1 = new float[0][];
        try {
            Size size = new Size(netWidth, netHeight);
            Imgproc.resize(img, resizedImage, size);
            float[][][][] convertArray = convertToFloatArray(resizedImage);
            long modelStartTime = System.currentTimeMillis();
            try (OnnxTensor tensor = OnnxTensor.createTensor(env, convertArray)) {
                String inputName = session.getInputNames().iterator().next();
                try (OrtSession.Result output = session.run(Collections.singletonMap(inputName, tensor))) {
                    output1 = (float[][]) output.get(0).getValue();
                }
            }
        } catch (Throwable e) {
            e.printStackTrace();
        } finally {
            resizedImage.release();
            img.release();
        }

        if (output1.length == 0) {
            return "";
        }

        return  getResult(output1);
    }

    private static String getResult(float[][] output1) {
        int index = 0;
        float max = Float.MIN_VALUE;
        for (int i = 0; i < output1[0].length; i++) {
            if (output1[0][i] > max) {
                max = output1[0][i];
                index = i;
            }
        }
        return String.valueOf(index);
    }


    private static float[][][][] convertToFloatArray(Mat mat) {
        int channels = mat.channels();
        int rows = mat.rows();
        int cols = mat.cols();
        float[][][][] floatArray = new float[1][3][rows][cols];
        for (int i = 0; i < rows; i++) {
            for (int j = 0; j < cols; j++) {
                for (int c = 0; c < channels; c++) {
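                    // Note: Mat.get(i, j) returns a new double[] on every call,
                    // so this loop allocates a small array per pixel per channel.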
                    floatArray[0][c][i][j] = (float)mat.get(i, j)[c] / 255.0f;
                }
            }
        }
        return floatArray;
    }
}
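
For context, a hypothetical driver (not part of the report above) that uses the class the way the issue describes, loading the model once and then calling detect repeatedly, might look like this:

public class TestDriver {

    public static void main(String[] args) {
        // Hypothetical paths; the model is loaded once, then inference runs in a loop.
        Test.init("/path/to/model.onnx");
        for (int i = 0; i < 1_000_000; i++) {
            Test.detect("/path/to/image.jpg");
        }
    }
}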

Urgency

12.17

Platform

Linux

OS Version

Linux 81-228 5.15.0-122-generic (ldd (Ubuntu GLIBC 2.35-0ubuntu3.8) 2.35)

ONNX Runtime Installation

Released Package

ONNX Runtime Version or Commit ID

1.15.0

ONNX Runtime API

Java

Architecture

X86

Execution Provider

Default CPU

Execution Provider Library Version

No response

github-actions bot added the api:Java (issues related to the Java API) label on Dec 16, 2024
@Craigacp
Contributor

I can't see why it's allocating lots of strings unless you're recreating the OrtSession and OrtSession.SessionOptions a lot inside a loop without closing both of them. The code you've given doesn't close the SessionOptions, which it should, but that typically isn't instantiated that frequently.

Does it replicate on ORT 1.20.0?
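
For reference, a minimal sketch of loadModel that closes the SessionOptions via try-with-resources (a sketch of the pattern only, reusing the fields from the repro above):

private static void loadModel(String onnxPath) {
    try {
        env = OrtEnvironment.getEnvironment("createSessionFromPath");
        // try-with-resources closes the SessionOptions once the session has been created
        try (OrtSession.SessionOptions options = new OrtSession.SessionOptions()) {
            session = env.createSession(onnxPath, options);
        }
    } catch (Throwable e) {
        // log the failure instead of swallowing it
    }
}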

@jiangjiongyu
Author

I can't see why it's allocating lots of strings unless you're recreating the OrtSession and OrtSession.SessionOptions a lot inside a loop without closing both of them. The code you've given doesn't close the SessionOptions, which it should, but that typically isn't instantiated that frequently.

Does it replicate on ORT 1.20.0?

The loadModel function is executed only once; only the detect function is called frequently.
