"XNNPACK" sample introduction
This sample demonstrates how to build the XNNPACK benchmarks into WebAssembly with the emsdk toolchain and run them with iwasm.
Install toolchains
- bazel. Please install the latest bazel release.
- emsdk. Please install emsdk to /opt/emsdk:
cd /opt
git clone https://github.com/emscripten-core/emsdk.git
cd emsdk
./emsdk install latest
./emsdk activate latest
And set up the emsdk environment:
source /opt/emsdk/emsdk_env.sh
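As a quick sanity check (assuming both toolchains are installed as above and the environment script has been sourced in the current shell), you can query their versions:
bazel --version
emcc --version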
Build XNNPACK
cd <wamr-dir>/samples/workload/XNNPACK
mkdir build
cd build
cmake ..
The wasm files are generated under the folder samples/workload/XNNPACK/xnnpack/bazel-bin.
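For example, the generated benchmark modules can be listed (the exact set of files depends on the XNNPACK revision that bazel fetches):
ls <wamr-dir>/samples/workload/XNNPACK/xnnpack/bazel-bin/*.wasm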
Run benchmarks
First, please build iwasm with SIMD, libc-emcc and lib-pthread support:
$ cd <wamr-dir>/product-mini/platforms/linux/
$ mkdir build && cd build
$ cmake .. -DWAMR_BUILD_SIMD=1 -DWAMR_BUILD_LIBC_EMCC=1 -DWAMR_BUILD_LIB_PTHREAD=1
$ make
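The run step below invokes iwasm by name, so it is assumed to be somewhere on PATH. One optional way to arrange that (not required if you prefer to call the binary by its full path) is:
$ sudo cp iwasm /usr/local/bin/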
And please build wamrc, the WAMR AOT compiler:
cd <wamr-dir>/wamr-compiler
./build_llvm.sh
mkdir build && cd build
cmake ..
make
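Similarly, the compile step below calls wamrc by name; as with iwasm, you can optionally copy it onto PATH, for example:
sudo cp wamrc /usr/local/bin/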
Then compile the wasm file to an AOT file and run it:
$ cd <wamr-dir>/samples/workload/XNNPACK/xnnpack/bazel-bin
$ wamrc --enable-simd -o average_pooling_bench.aot average_pooling_bench.wasm (or other wasm files)
$ iwasm average_pooling_bench.aot
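To run all of the generated benchmarks rather than a single one, a small shell loop can be used. This is a sketch only: it assumes wamrc and iwasm are on PATH and that the benchmark modules follow the *_bench.wasm naming shown above.
$ cd <wamr-dir>/samples/workload/XNNPACK/xnnpack/bazel-bin
$ for f in *_bench.wasm; do wamrc --enable-simd -o "${f%.wasm}.aot" "$f" && iwasm "${f%.wasm}.aot"; done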