wasm-micro-runtime/core/iwasm/libraries/wasi-nn/src
Latest commit: 4f86468670 by YAMAMOTO Takashi, 2025-09-14 14:01:55 +08:00
wasi-nn: retire is_model_loaded flag (#4613)

This flag no longer makes much sense because:
- backends validate the given graph/ctx by themselves
- some backends support loading multiple models in a single context
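To make the rationale concrete, here is a minimal, self-contained C sketch of the kind of frontend check such a change removes. The names below (`WASINNContext`, `backend_compute`, `compute_sketch`) and the enum values are simplified stand-ins assumed for illustration, not the repository's actual definitions; the real code dispatches through the backend function tables declared in wasi_nn_backend.h.

```c
/*
 * Sketch only: why a per-context is_model_loaded flag becomes redundant.
 * All types and functions here are simplified stand-ins, not the real
 * wasi-nn definitions.
 */
#include <stddef.h>
#include <stdint.h>

typedef uint32_t graph_execution_context;
typedef enum { success = 0, invalid_argument, runtime_error } wasi_nn_error;

typedef struct {
    void *backend_ctx; /* backend-private state (models, execution contexts) */
    /*
     * bool is_model_loaded;
     * Removed: a single context may hold several loaded models, and each
     * backend already rejects calls on a graph/ctx it never created, so a
     * frontend-side flag adds no safety.
     */
} WASINNContext;

/* Hypothetical backend hook standing in for a wasi_nn_backend.h entry. */
static wasi_nn_error
backend_compute(void *backend_ctx, graph_execution_context exec_ctx)
{
    (void)exec_ctx;
    /* The backend validates its own state instead of trusting a flag. */
    return backend_ctx != NULL ? success : invalid_argument;
}

static wasi_nn_error
compute_sketch(WASINNContext *ctx, graph_execution_context exec_ctx)
{
    /*
     * Before: if (!ctx->is_model_loaded) return runtime_error;
     * After:  forward unconditionally and let the backend validate.
     */
    return backend_compute(ctx->backend_ctx, exec_ctx);
}

int
main(void)
{
    WASINNContext ctx = { NULL };
    /* With no backend state attached, the backend itself reports the error. */
    return compute_sketch(&ctx, 0) == invalid_argument ? 0 : 1;
}
```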
| Name | Last commit | Date |
|---|---|---|
| utils | wasi-nn: fix set_input memory address validation for the legacy abi (#4534) | 2025-08-06 10:40:56 +08:00 |
| wasi_nn_backend.h | wasi-nn: reduce code duplication a bit (#4433) | 2025-07-01 10:37:12 +08:00 |
| wasi_nn_llamacpp.c | wasi-nn: do not pretend to support legacy abi in openvino and llamacpp (#4468) | 2025-07-10 08:28:08 +08:00 |
| wasi_nn_onnx.cpp | wasi_nn_onnx.cpp: fix debug build (#4564) | 2025-08-19 08:23:00 +08:00 |
| wasi_nn_openvino.c | wasi-nn: do not pretend to support legacy abi in openvino and llamacpp (#4468) | 2025-07-10 08:28:08 +08:00 |
| wasi_nn_private.h | wasi-nn: retire is_model_loaded flag (#4613) | 2025-09-14 14:01:55 +08:00 |
| wasi_nn_tensorflowlite.cpp | wasi_nn_tensorflowlite.cpp: make this compatible with wasmedge (#4517) | 2025-08-01 14:31:02 +08:00 |
| wasi_nn.c | wasi-nn: retire is_model_loaded flag (#4613) | 2025-09-14 14:01:55 +08:00 |