NPU Integration Information

Steps

Environment Preparation

Besides the basic Environment Preparation, the HUAWEI HiAI DDK is required to use the NPU. The DDK contains the APIs for building and loading models and running computations, together with the dynamic libraries that implement them (namely libhiai*.so). Download the DDK and set the environment variable ${HWHIAI_DDK} to the directory of the extracted files; the build script uses this variable to locate the DDK.
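For example, assuming the DDK archive was extracted to /opt/hwhiai-ddk (an illustrative path), the variable can be set as follows:

export HWHIAI_DDK=/opt/hwhiai-ddk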

Build

Under the Linux operating system, you can build a MindSpore Lite package that integrates the NPU interfaces and libraries by running build.sh in the root directory of the MindSpore source code. The command is as follows. It builds the MindSpore Lite package in the output directory under the source code root directory; the package contains the NPU dynamic library, the libmindspore-lite dynamic library, and the test tool Benchmark.

bash build.sh -I arm64 -e npu

For more information about compilation, see Linux Environment Compilation.

Integration

  • Integration instructions

    When integrating NPU features, developers should note the following:

    • Configure the NPU backend. For more information about using Runtime to perform inference, see Using Runtime for Model Inference (C++).

    • Compile and execute the binary. If you use dynamic linking, set the environment variable so that libhiai.so, libhiai_ir.so, and libhiai_ir_build.so can be linked dynamically at run time. For example,

      export LD_LIBRARY_PATH=mindspore-lite-{version}-inference-android-{arch}/inference/third_party/hiai_ddk/lib/:$LD_LIBRARY_PATH
      

      For more information about the compilation output, see Compilation Output with the compilation option -I arm64 or -I arm32.
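      As a minimal sketch, a cross-compilation command for an application linking libmindspore-lite might look like the following (the NDK toolchain path, API level, source file name, and include directory are assumptions for illustration; the HiAI libraries are resolved at run time through LD_LIBRARY_PATH rather than linked directly):

      ${ANDROID_NDK}/toolchains/llvm/prebuilt/linux-x86_64/bin/aarch64-linux-android29-clang++ main.cc -o demo \
          -I mindspore-lite-{version}-inference-android-{arch}/inference/include \
          -L mindspore-lite-{version}-inference-android-{arch}/inference/lib \
          -lmindspore-lite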

  • Testing NPU inference with Benchmark

    Users can also test NPU inference using MindSpore Lite’s Benchmark tool. For the tool’s location, see Compilation Output. Push the built package to the /data/local/tmp/ directory of an Android phone equipped with an NPU chip (for supported chips, see Chipset Platforms and Supported HUAWEI HiAI Versions) and run the Benchmark tool on the phone, as shown in the examples below:
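    For example, the necessary files might be pushed to the device with adb (the local file names are illustrative):

    adb shell mkdir -p /data/local/tmp/models
    adb push ./benchmark /data/local/tmp/
    adb push ./models/test_benchmark.ms /data/local/tmp/models/
    adb shell chmod +x /data/local/tmp/benchmark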

    • Test performance

    ./benchmark --device=NPU --modelFile=./models/test_benchmark.ms --timeProfiling=true
    
    • Test precision

    ./benchmark --device=NPU --modelFile=./models/test_benchmark.ms --inDataFile=./input/test_benchmark.bin --inputShapes=1,32,32,1 --accuracyThreshold=3 --benchmarkDataFile=./output/test_benchmark.out
    

For more information about the use of Benchmark, see Benchmark Use.

For environment variable settings, add the directory containing libmindspore-lite.so (mindspore-lite-{version}-inference-android-{arch}/inference/lib) and the directory containing the NPU libraries (mindspore-lite-{version}-inference-android-{arch}/inference/third_party/hiai_ddk/lib/) to ${LD_LIBRARY_PATH}. These directories are specified in Compilation Output with the compilation option -I arm64 or -I arm32.
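For instance, both directories can be added in a single export (a sketch; substitute the actual {version} and {arch} of your package):

export LD_LIBRARY_PATH=mindspore-lite-{version}-inference-android-{arch}/inference/lib:mindspore-lite-{version}-inference-android-{arch}/inference/third_party/hiai_ddk/lib:${LD_LIBRARY_PATH}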

Supported Chips

For supported NPU chips, see Chipset Platforms and Supported HUAWEI HiAI Versions.

Supported Operators

For supported NPU operators, see Lite Operator List.