
wasi-nn: do not pretend to support legacy abi in openvino and llamacpp (#4468)

As tested by core/iwasm/libraries/wasi-nn/test/test_tensorflow.c,
the legacy "wasi_nn" ABI expresses the get_output size as a number of
fp32 elements. Because these backends don't implement that ABI, bail
out explicitly at build time.

cf.
https://github.com/bytecodealliance/wasm-micro-runtime/issues/4376
YAMAMOTO Takashi 6 months ago
parent
commit
56f87b7ee9

+ 4 - 0
core/iwasm/libraries/wasi-nn/src/wasi_nn_llamacpp.c

@@ -17,6 +17,10 @@ extern char const *LLAMA_COMMIT;
 extern char const *LLAMA_COMPILER;
 extern char const *LLAMA_BUILD_TARGET;
 
+#if WASM_ENABLE_WASI_EPHEMERAL_NN == 0
+#error This backend doesn't support legacy "wasi_nn" abi. Please enable WASM_ENABLE_WASI_EPHEMERAL_NN.
+#endif
+
 // compatible with WasmEdge
 // https://github.com/second-state/WasmEdge-WASINN-examples/blob/master/wasmedge-ggml/README.md#parameters
 // https://github.com/WasmEdge/WasmEdge/blob/master/plugins/wasi_nn/ggml.cpp

+ 4 - 0
core/iwasm/libraries/wasi-nn/src/wasi_nn_openvino.c

@@ -9,6 +9,10 @@
 
 #include "openvino/c/openvino.h"
 
+#if WASM_ENABLE_WASI_EPHEMERAL_NN == 0
+#error This backend doesn't support legacy "wasi_nn" abi. Please enable WASM_ENABLE_WASI_EPHEMERAL_NN.
+#endif
+
 /*
  * refer to
  * https://docs.openvino.ai/2024/openvino-workflow/running-inference/integrate-openvino-with-your-application.html