Add wasi-nn example as smoke test case (#3501)

liang.he 2024-06-07 10:26:09 +08:00 committed by GitHub
parent dc21c62431
commit bd44117676
2 changed files with 112 additions and 18 deletions

core/iwasm/libraries/wasi-nn/README.md

To run the tests, we assume that the current directory is the root of the repository.
### Build the runtime
Build the runtime image for your execution target type.
`EXECUTION_TYPE` can be:
- `cpu`
- `nvidia-gpu`
- `vx-delegate`
- `tpu`
```
EXECUTION_TYPE=cpu
docker build -t wasi-nn-${EXECUTION_TYPE} -f core/iwasm/libraries/wasi-nn/test/Dockerfile.${EXECUTION_TYPE} .
```
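The same command pattern works for the other targets in the list; for example, to build the NVIDIA GPU image instead:
```
EXECUTION_TYPE=nvidia-gpu
docker build -t wasi-nn-${EXECUTION_TYPE} -f core/iwasm/libraries/wasi-nn/test/Dockerfile.${EXECUTION_TYPE} .
```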
### Build wasm app
```
docker build -t wasi-nn-compile -f core/iwasm/libraries/wasi-nn/test/Dockerfile.compile .
docker run -v $PWD/core/iwasm/libraries/wasi-nn:/wasi-nn wasi-nn-compile
```
### Run wasm app
If all the tests have run properly, you will see the following message in the terminal:
```
Tests: passed!
```
- CPU
```
docker run \
    ...
/assets/test_tensorflow.wasm
```
- (NVIDIA) GPU
  - Requirements:
    - [NVIDIA docker](https://github.com/NVIDIA/nvidia-docker).
```
docker run \
    ...
/assets/test_tensorflow.wasm
```
- vx-delegate for NPU (x86 simulator)
```
docker run \
    ...
/assets/test_tensorflow_quantized.wasm
```
- (Coral) TPU
  - Requirements:
    - [Coral USB](https://coral.ai/products/accelerator/).
```
docker run \
    ...
```
Supported:
- Graph encoding: `tensorflowlite`.
- Execution target: `cpu`, `gpu` and `tpu`.
- Tensor type: `fp32`.
## Smoke test
Use [classification-example](https://github.com/bytecodealliance/wasi-nn/tree/main/rust/examples/classification-example) as a smoke test case to verify that the wasi-nn support in WAMR works properly.
> [!Important]
> It requires OpenVINO.
### Prepare the model and the wasm
``` bash
$ pwd
/workspaces/wasm-micro-runtime/core/iwasm/libraries/wasi-nn/test
$ docker build -t wasi-nn-example:v1.0 -f Dockerfile.wasi-nn-example .
```
The *wasi-nn-example:v1.0* image contains the model files (*mobilenet\**) and the wasm file (*wasi-nn-example.wasm*) in the directory */workspaces/wasi-nn/rust/examples/classification-example/build*.
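To confirm those artifacts are present without starting an interactive shell, you can list them directly (the image tag and path are the ones above):
``` bash
$ docker run --rm wasi-nn-example:v1.0 \
    ls /workspaces/wasi-nn/rust/examples/classification-example/build
```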
### Build iwasm and test
*TODO: may need alternative steps to build iwasm and run the test inside the wasi-nn-example:v1.0 container.*
``` bash
$ pwd
/workspaces/wasm-micro-runtime
$ docker run --rm -it -v $(pwd):/workspaces/wasm-micro-runtime wasi-nn-example:v1.0 /bin/bash
```
> [!Caution]
> The following steps are executed inside the wasi-nn-example:v1.0 container.
``` bash
$ cd /workspaces/wasm-micro-runtime/product-mini/platforms/linux
$ cmake -S . -B build -DWAMR_BUILD_WASI_NN=1 -DWAMR_BUILD_WASI_EPHEMERAL_NN=1
$ cmake --build build
$ ./build/iwasm -v=5 --map-dir=/workspaces/wasi-nn/rust/examples/classification-example/build/::fixture /workspaces/wasi-nn/rust/examples/classification-example/build/wasi-nn-example.wasm
```
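The image also ships OpenVINO under */opt/intel/openvino* and a wasmtime v21 binary (see the Dockerfile below), so the same wasm can be cross-checked outside WAMR. A sketch, assuming wasmtime's `-S nn` flag enables its wasi-nn support and reusing the same *fixture* directory mapping as the iwasm command above:
``` bash
$ source /opt/intel/openvino/setupvars.sh
$ cd /workspaces/wasi-nn/rust/examples/classification-example/build
$ wasmtime run -S nn --dir=.::fixture wasi-nn-example.wasm
```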

core/iwasm/libraries/wasi-nn/test/Dockerfile.wasi-nn-example (new file)

# Copyright (C) 2019 Intel Corporation. All rights reserved.
# SPDX-License-Identifier: Apache-2.0 WITH LLVM-exception
FROM mcr.microsoft.com/devcontainers/rust:1-1-bullseye
ARG DEBIAN_FRONTEND=noninteractive
ENV TZ=Asia/Shanghai
# hadolint ignore=DL3009
RUN apt-get update \
&& apt-get upgrade -y
#
# Rust targets
RUN rustup target add wasm32-wasi wasm32-unknown-unknown
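# (wasm32-wasi is presumably the target used for the wasi-nn example built below)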
#
# Openvino
# Refer to
# - https://docs.openvino.ai/2022.3/openvino_docs_install_guides_installing_openvino_from_archive_linux.html
# - https://docs.openvino.ai/2023.3/openvino_docs_install_guides_installing_openvino_from_archive_linux.html
# - https://docs.openvino.ai/2024/get-started/install-openvino/install-openvino-archive-linux.html
#
# FIXME: upgrade to 2024.1 or latest after wasi-nn(rust binding) is ready
WORKDIR /opt/intel
RUN wget -q https://storage.openvinotoolkit.org/repositories/openvino/packages/2022.3.2/linux/l_openvino_toolkit_ubuntu20_2022.3.2.9279.e2c7e4d7b4d_x86_64.tgz
RUN tar -xf l_openvino_toolkit_ubuntu20_2022.3.2.9279.e2c7e4d7b4d_x86_64.tgz \
&& rm l_openvino_toolkit_ubuntu20_2022.3.2.9279.e2c7e4d7b4d_x86_64.tgz \
&& mv l_openvino_toolkit_ubuntu20_2022.3.2.9279.e2c7e4d7b4d_x86_64 /opt/intel/openvino
WORKDIR /opt/intel/openvino
RUN ./install_dependencies/install_openvino_dependencies.sh -y \
&& ./setupvars.sh
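# NOTE: setupvars.sh only affects this RUN layer; its environment does not
# persist into the final image. Source /opt/intel/openvino/setupvars.sh again
# in the running container before using OpenVINO.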
#
# wasmtime
WORKDIR /opt
RUN wget -q https://github.com/bytecodealliance/wasmtime/releases/download/v21.0.0/wasmtime-v21.0.0-x86_64-linux.tar.xz
RUN tar -xf wasmtime-v21.0.0-x86_64-linux.tar.xz \
&& rm wasmtime-v21.0.0-x86_64-linux.tar.xz \
&& ln -sf "$(realpath ./wasmtime-v21.0.0-x86_64-linux/wasmtime)" /usr/local/bin/wasmtime
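# wasmtime is now on PATH, presumably so the example wasm can also be
# exercised outside WAMR (see the smoke test section of the README)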
#
# wasi-nn
WORKDIR /workspaces/wasi-nn
RUN git clone --depth 1 https://github.com/bytecodealliance/wasi-nn.git .
# hadolint ignore=DL3059
RUN ./build.sh rust
# There are model files (mobilenet*) and a wasm file (wasi-nn-example.wasm) in the directory,
# /workspaces/wasi-nn/rust/examples/classification-example/build
RUN apt-get autoremove -y \
&& apt-get clean -y \
&& rm -rf /tmp/*
WORKDIR /workspaces