
[CANN] Bug: CANN run error on 300I DUO #16628

@stellarzhou

Description

Name and Version

llama.cpp release version: b6765
built with gcc 12.3.1
openEuler 22.03 LTS-SP4, aarch64
CANN: 8.0.0

Compiled with:
cmake -B build -DGGML_CANN=on -DCMAKE_BUILD_TYPE=Debug -DSOC_TYPE=Ascend310P3
cmake --build build --config Debug
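A "context pointer null" error from the CANN runtime frequently indicates that the Ascend driver/toolkit environment was not initialized in the shell before running the binary. As a hedged sanity check (the paths below are the default Ascend toolkit install locations and may differ on this system):

```
# Confirm the NPU driver can see the Atlas 300I DUO cards
npu-smi info

# Source the CANN toolkit environment before building or running llama.cpp
# (default install path; adjust if the toolkit lives elsewhere)
source /usr/local/Ascend/ascend-toolkit/set_env.sh
```

If `npu-smi info` itself fails or the devices are missing, the problem is below llama.cpp, in the driver or firmware layer.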

Operating systems

Linux

GGML backends

CANN

Hardware

Atlas 300I DUO
Kunpeng 920 5220

Models

No response

Problem description & steps to reproduce

Running ./llama-cli --version aborts with the CANN error shown in the "Relevant log output" section below.
First Bad Commit

No response

Relevant log output

```
./llama-cli --version
CANN error: EE1001: [PID: 753761] 2025-10-17-11:53:32.738.425 The argument is invalid.Reason: rtMemGetInfoEx execute failed, reason=[context pointer null]
        Solution: 1.Check the input parameter range of the function. 2.Check the function invocation relationship.
        TraceBack (most recent call last):
        Check param failed, curCtx can not be null.[FUNC:MemGetInfoEx][FILE:api_impl.cc][LINE:2297]
        The argument is invalid.Reason: rtMemGetInfoEx execute failed, reason=[context pointer null]
        get memory information failed, runtime result = 107002[FUNC:ReportCallError][FILE:log_inner.cpp][LINE:161]
        ctx is NULL![FUNC:GetDevErrMsg][FILE:api_impl.cc][LINE:5372]
        The argument is invalid.Reason: rtGetDevMsg execute failed, reason=[context pointer null]

  current device: 0, in function ggml_backend_cann_get_device_memory at /dahuacloud/user/data/data6701/llama.cpp-b6765/ggml/src/ggml-cann/ggml-cann.cpp:3072
  aclrtGetMemInfo(ACL_HBM_MEM, free, total)
/dahuacloud/user/data/data6701/llama.cpp-b6765/ggml/src/ggml-cann/ggml-cann.cpp:69: CANN error
/dahuacloud/user/data/data6701/llama.cpp-b6765/build/bin/libggml-base.so(+0x13ba4)[0xffffb0033ba4]
/dahuacloud/user/data/data6701/llama.cpp-b6765/build/bin/libggml-base.so(ggml_print_backtrace+0x1ec)[0xffffb0033f7c]
/dahuacloud/user/data/data6701/llama.cpp-b6765/build/bin/libggml-base.so(ggml_abort+0xfc)[0xffffb00340fc]
/dahuacloud/user/data/data6701/llama.cpp-b6765/build/bin/libggml-cann.so(+0x277e8)[0xffffb00f77e8]
/dahuacloud/user/data/data6701/llama.cpp-b6765/build/bin/libggml-cann.so(_Z14ggml_cann_infov+0x0)[0xffffb00f9af0]
/dahuacloud/user/data/data6701/llama.cpp-b6765/build/bin/libggml-cann.so(_Z14ggml_cann_infov+0xec)[0xffffb00f9bdc]
/dahuacloud/user/data/data6701/llama.cpp-b6765/build/bin/libggml-cann.so(ggml_backend_cann_reg+0x7c)[0xffffb00f9d7c]
/dahuacloud/user/data/data6701/llama.cpp-b6765/build/bin/libggml.so(+0x295c)[0xffffb029295c]
/dahuacloud/user/data/data6701/llama.cpp-b6765/build/bin/libggml.so(+0x3f90)[0xffffb0293f90]
/dahuacloud/user/data/data6701/llama.cpp-b6765/build/bin/libggml.so(ggml_backend_load_all_from_path+0x30)[0xffffb0295490]
./llama-cli[0x42e620]
./llama-cli[0x44a728]
./llama-cli[0x417550]
/usr/lib64/libc.so.6(+0x276c4)[0xffffafa976c4]
/usr/lib64/libc.so.6(__libc_start_main+0x98)[0xffffafa977a8]
./llama-cli[0x41c330]
Aborted (core dumped)
```
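The backtrace shows the abort inside ggml_backend_cann_get_device_memory, where aclrtGetMemInfo is called while the runtime reports "curCtx can not be null" — i.e. no device/context is bound to the calling thread at that point. Below is a minimal diagnostic sketch of the call sequence the ACL runtime expects (using the public AscendCL APIs aclInit, aclrtSetDevice, aclrtGetMemInfo); this is an illustration for isolating the failure on this machine, not the actual llama.cpp code path or a proposed fix:

```cpp
#include <acl/acl.h>
#include <cstdio>

// Sketch: query HBM memory on device 0, making sure a device
// (and therefore an implicit default context) is bound to this
// thread before aclrtGetMemInfo is called.
int main() {
    // Initialize the ACL runtime once per process.
    if (aclInit(nullptr) != ACL_SUCCESS) {
        fprintf(stderr, "aclInit failed\n");
        return 1;
    }
    // Binding a device sets the default context for this thread;
    // skipping this step reproduces "context pointer null".
    if (aclrtSetDevice(0) != ACL_SUCCESS) {
        fprintf(stderr, "aclrtSetDevice failed\n");
        return 1;
    }
    size_t free_mem = 0, total_mem = 0;
    if (aclrtGetMemInfo(ACL_HBM_MEM, &free_mem, &total_mem) != ACL_SUCCESS) {
        fprintf(stderr, "aclrtGetMemInfo failed\n");
        return 1;
    }
    printf("HBM free: %zu, total: %zu\n", free_mem, total_mem);
    aclrtResetDevice(0);
    aclFinalize();
    return 0;
}
```

If this standalone probe also fails at aclrtSetDevice or aclrtGetMemInfo, the problem is in the CANN installation rather than in llama.cpp; if it succeeds, the issue is likely in how the llama.cpp CANN backend initializes devices during registration (ggml_cann_info / ggml_backend_cann_reg in the trace above).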

Labels

Ascend NPU (issues specific to Ascend NPUs)
