wasm-micro-runtime/core/iwasm/libraries/wasi-nn/src
YAMAMOTO Takashi 56f87b7ee9
wasi-nn: do not pretend to support legacy abi in openvino and llamacpp (#4468)
as tested by core/iwasm/libraries/wasi-nn/test/test_tensorflow.c,
the legacy "wasi_nn" abi expects get_output to report the output size
as a number of fp32 elements. because these backends don't implement
that abi, bail out explicitly at build time (sketched below).

cf.
https://github.com/bytecodealliance/wasm-micro-runtime/issues/4376
2025-07-10 08:28:08 +08:00
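
For context, the build-time bail-out described above amounts to a preprocessor guard in the backend sources. The following is a minimal sketch, assuming a WASM_ENABLE_WASI_EPHEMERAL_NN build macro like the one used by the wasi-nn build options; the exact condition and message in wasi_nn_llamacpp.c / wasi_nn_openvino.c may differ:

    /* Refuse to build this backend for the legacy "wasi_nn" abi, since it
     * only implements the wasi_ephemeral_nn semantics. The macro name here
     * is an assumption; check the actual diff in #4468 for the real guard. */
    #if WASM_ENABLE_WASI_EPHEMERAL_NN == 0
    #error "this backend does not implement the legacy wasi_nn abi; build with WASM_ENABLE_WASI_EPHEMERAL_NN=1"
    #endif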
utils                         wasi-nn: make the host use the wasi_ephemeral_nn version of tensor_data (#4411)  2025-06-27 07:41:42 +08:00
wasi_nn_backend.h             wasi-nn: reduce code duplication a bit (#4433)  2025-07-01 10:37:12 +08:00
wasi_nn_llamacpp.c            wasi-nn: do not pretend to support legacy abi in openvino and llamacpp (#4468)  2025-07-10 08:28:08 +08:00
wasi_nn_openvino.c            wasi-nn: do not pretend to support legacy abi in openvino and llamacpp (#4468)  2025-07-10 08:28:08 +08:00
wasi_nn_private.h             wasi-nn: make the host use the wasi_ephemeral_nn version of tensor_data (#4411)  2025-06-27 07:41:42 +08:00
wasi_nn_tensorflowlite.cpp    wasi-nn: reduce code duplication a bit (#4433)  2025-07-01 10:37:12 +08:00
wasi_nn.c                     wasi-nn: make the host use the wasi_ephemeral_nn version of tensor_data (#4411)  2025-06-27 07:41:42 +08:00