WASI-NN
How to use
Enable WASI-NN in WAMR by specifying it in the cmake build configuration as follows,
set (WAMR_BUILD_WASI_NN 1)
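Equivalently, the flag can be passed on the cmake command line when configuring a build. A minimal sketch (the build directory and relative path are assumptions; adjust them to your platform):

```shell
# Configure a build with WASI-NN enabled, then compile.
# The path ".." assumes you are in a build/ subdirectory of the
# platform you are targeting (path is illustrative).
cmake -DWAMR_BUILD_WASI_NN=1 ..
make
```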
The functions provided by WASI-NN are defined in the header file core/iwasm/libraries/wasi-nn/wasi_nn.h. Simply including this file in your WASM application binds WASI-NN into your module.
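To illustrate the flow, here is a minimal sketch of a WASM application that loads a TensorFlow Lite model and runs one inference. It assumes the types and entry points declared in wasi_nn.h (`load`, `init_execution_context`, `set_input`, `compute`, `get_output`, and the `graph_builder_array`/`tensor` structures); exact field names, enum values, the model path, and the tensor shapes below are assumptions for illustration and may differ from the actual header:

```c
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include "wasi_nn.h" /* core/iwasm/libraries/wasi-nn/wasi_nn.h */

int main(void)
{
    /* Read the serialized TensorFlow Lite model into memory
       (file name is illustrative). */
    FILE *f = fopen("model.tflite", "rb");
    if (!f)
        return 1;
    fseek(f, 0, SEEK_END);
    long model_size = ftell(f);
    fseek(f, 0, SEEK_SET);
    uint8_t *model_bytes = malloc(model_size);
    fread(model_bytes, 1, model_size, f);
    fclose(f);

    /* Build the graph from the model bytes, targeting
       TensorFlow Lite on CPU (the only supported combination). */
    graph_builder builder = { .buf = model_bytes,
                              .size = (uint32_t)model_size };
    graph_builder_array builder_array = { .buf = &builder, .size = 1 };
    graph g;
    if (load(&builder_array, tensorflowlite, cpu, &g) != success)
        return 1;

    /* Create an execution context for the graph. */
    graph_execution_context ctx;
    if (init_execution_context(g, &ctx) != success)
        return 1;

    /* Bind one F32 input tensor; the 1x2 shape is illustrative. */
    float input[2] = { 1.0f, 2.0f };
    uint32_t dims[2] = { 1, 2 };
    tensor_dimensions input_dims = { .buf = dims, .size = 2 };
    tensor input_tensor = { .dimensions = &input_dims,
                            .type = fp32,
                            .data = (uint8_t *)input };
    if (set_input(ctx, 0, &input_tensor) != success)
        return 1;

    /* Run inference and read back the first output tensor. */
    if (compute(ctx) != success)
        return 1;
    float output[1];
    uint32_t output_size = sizeof(output);
    if (get_output(ctx, 0, (uint8_t *)output, &output_size) != success)
        return 1;

    printf("output[0] = %f\n", output[0]);
    free(model_bytes);
    return 0;
}
```

Compiled with a WASI-enabled toolchain (e.g. wasi-sdk) and run under iwasm built with `WAMR_BUILD_WASI_NN`, the calls above are dispatched to the native TensorFlow Lite backend.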
Tests
To run the tests we assume that the current directory is the root of the repository.
- Build the Docker image,
docker build -t wasi-nn -f core/iwasm/libraries/wasi-nn/test/Dockerfile .
- Run the container,
docker run wasi-nn
If all the tests have run properly you will see the following message in the terminal,
Tests: passed!
What is missing
- Only 1 model at a time is supported. `graph` and `graph-execution-context` are ignored.
- Only `tensorflow` (lite) is supported.
- Only `cpu` is supported.