"XNNPACK" sample introduction

This sample demonstrates how to build the XNNPACK benchmarks into WebAssembly with the emsdk toolchain and run them with iwasm.

## Install toolchains

Please refer to the installation instructions.
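
For reference, a typical emsdk setup looks roughly like the following; the `latest` SDK pinned here is only an example, so use whichever version the installation instructions recommend:

```bash
# Minimal emsdk setup sketch (the SDK version chosen here is an assumption)
git clone https://github.com/emscripten-core/emsdk.git
cd emsdk
./emsdk install latest
./emsdk activate latest
source ./emsdk_env.sh   # puts emcc/em++ on PATH for the current shell
```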

## Build XNNPACK

```bash
cd <wamr-dir>/samples/workload/XNNPACK
mkdir build
cd build
cmake ..
```

The wasm files are generated under the folder `samples/workload/XNNPACK/xnnpack/bazel-bin`.
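
Once the build finishes, a quick listing can confirm that the benchmark modules were produced (the file name below is just one of the benchmarks generated by the build):

```bash
$ cd <wamr-dir>/samples/workload/XNNPACK/xnnpack/bazel-bin
$ ls *.wasm    # e.g. average_pooling_bench.wasm and the other benchmark modules
```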

## Run benchmarks

First, build iwasm with SIMD, libc-emcc and lib-pthread support:

```bash
$ cd <wamr-dir>/product-mini/platforms/linux/
$ mkdir build && cd build
$ cmake .. -DWAMR_BUILD_LIBC_EMCC=1 -DWAMR_BUILD_LIB_PTHREAD=1
$ make
```

Then build wamrc:

```bash
$ cd <wamr-dir>/wamr-compiler
$ ./build_llvm.sh
$ mkdir build && cd build
$ cmake ..
$ make
```

Finally, compile the wasm file to an AOT file and run it:

```bash
$ cd <wamr-dir>/samples/workload/XNNPACK/xnnpack/bazel-bin
$ wamrc -o average_pooling_bench.aot average_pooling_bench.wasm   # or another benchmark wasm file
$ iwasm average_pooling_bench.aot
```
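
To go through the whole benchmark suite instead of a single module, a simple shell loop over the generated wasm files works; this is just a convenience sketch reusing the wamrc and iwasm invocations shown above (both tools are assumed to be on PATH or referenced by full path):

```bash
# Convert every benchmark wasm to AOT and run it with iwasm
cd <wamr-dir>/samples/workload/XNNPACK/xnnpack/bazel-bin
for wasm in *_bench.wasm; do
    aot="${wasm%.wasm}.aot"
    wamrc -o "$aot" "$wasm"
    iwasm "$aot"
done
```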