Building MindSpore Lite

This chapter introduces how to quickly compile MindSpore Lite, which consists of the following modules:

Modules in inference version:

| Module | Supported Platform | Description |
| --- | --- | --- |
| converter | Linux, Windows | Model Conversion Tool |
| runtime (cpp, java) | Linux, Windows, Android | Model Inference Framework (the Windows platform does not support the Java runtime) |
| benchmark | Linux, Windows, Android | Benchmarking Tool |
| cropper | Linux | Static library crop tool for libmindspore-lite.a |
| minddata | Linux, Android | Image Processing Library |

Modules in training version:

| Module | Supported Platform | Description |
| --- | --- | --- |
| converter | Linux | Model Conversion Tool |
| runtime (cpp) | Linux, Android | Model Training Framework (Java is not supported) |
| cropper | Linux | Static library crop tool for libmindspore-lite.a |
| minddata | Linux, Android | Image Processing Library |
| benchmark_train | Linux, Android | Performance and Accuracy Validation Tool |
| obfuscator | Linux | Model Obfuscation Tool |

Linux Environment Compilation

Environment Requirements

  • To install and use the Android NDK, you need to configure the ANDROID_NDK environment variable, for example: export ANDROID_NDK=${NDK_PATH}/android-ndk-r20b.

  • After Gradle is installed, you need to add its installation path to the PATH: export PATH=${GRADLE_PATH}/bin:$PATH.

  • To install the Android SDK via the Android command line tools, first create a new directory and set its path in the environment variable ${ANDROID_SDK_ROOT}, then create the SDK via sdkmanager: ./sdkmanager --sdk_root=${ANDROID_SDK_ROOT} "cmdline-tools;latest", and finally accept the licenses through sdkmanager under the ${ANDROID_SDK_ROOT} directory: yes | ./sdkmanager --licenses.

  • Compiling AAR relies on Android SDK Build-Tools, Android SDK Platform-Tools and other Android SDK related components. If the Android SDK in the environment does not have related components, the required dependencies will be automatically downloaded during compilation.

  • When compiling the NPU operators, you need to download DDK V500.010 and set the directory of the decompressed package as the environment variable ${HWHIAI_DDK}. A consolidated sketch of these environment settings follows this list.
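
For reference, the settings above can be collected in one place; a minimal sketch, where ${NDK_PATH}, ${GRADLE_PATH}, ${SDK_PATH} and ${DDK_PATH} are placeholders to adapt to your machine:

export ANDROID_NDK=${NDK_PATH}/android-ndk-r20b   # Android NDK (NDK_PATH is a placeholder)
export PATH=${GRADLE_PATH}/bin:$PATH              # Gradle (GRADLE_PATH is a placeholder)
export ANDROID_SDK_ROOT=${SDK_PATH}               # Android SDK root directory (SDK_PATH is a placeholder)
export HWHIAI_DDK=${DDK_PATH}                     # HiAI DDK, needed only for NPU builds (DDK_PATH is a placeholder)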

Compilation Options

MindSpore Lite provides the compilation script build.sh for one-click compilation, located in the root directory of MindSpore. This script can be used to compile both the inference and the training code. The following describes the compilation options of MindSpore Lite.

| Parameter | Parameter Description | Value Range | Mandatory or Not |
| --- | --- | --- | --- |
| -I | Selects an applicable architecture. This option is required when compiling MindSpore Lite. | arm64, arm32, or x86_64 | No |
| -d | If this parameter is set, the debug version is compiled; otherwise, the release version is compiled. | None | No |
| -i | If this parameter is set, incremental compilation is performed; otherwise, full compilation is performed. | None | No |
| -j[n] | Sets the number of threads used during compilation; the default is 8. | Integer | No |
| -e | In the ARM architecture, selects the backend operators; otherwise, all operators of the framework are compiled at the same time. | cpu, gpu, npu | No |
| -h | Displays the compilation help information. | None | No |
| -n | Compiles the lightweight image processing module. | lite_cv | No |
| -A | Language used by MindSpore Lite; the default is cpp. If set to java, the AAR and the JAR for Linux x86 are compiled. | cpp, java | No |
| -C | If this parameter is set, the converter is compiled; the default is on. | on, off | No |
| -o | If this parameter is set, the benchmark and static library crop tool are compiled; the default is on. | on, off | No |
| -t | If this parameter is set, the testcase is compiled; the default is off. | on, off | No |
| -T | If this parameter is set, the MindSpore Lite training version is compiled; this option is required when compiling the training version. The default is off. | on, off | No |
| -W | Enables the x86_64 SSE or AVX instruction set; the default is off. | sse, avx, off | No |

  • When the -I parameter changes, for example from -I x86_64 to -I arm64, adding -i for incremental compilation does not take effect.

  • When compiling the AAR package, the -A java parameter must be added, and there is no need to add the -I parameter. By default, the built-in CPU and GPU operators are compiled at the same time.

  • The compiler only generates training packages when -T on is set.

  • Whatever the -e compilation option, the CPU operators are always compiled in.

Compilation Example

First, download the source code from the MindSpore code repository.

git clone https://gitee.com/mindspore/mindspore.git -b r1.2

Then, run the following commands in the root directory of the source code to compile MindSpore Lite of different versions:

  • Debug version of the x86_64 architecture:

    bash build.sh -I x86_64 -d
    
  • Release version of the x86_64 architecture, with the number of threads set:

    bash build.sh -I x86_64 -j32
    
  • Release version of the x86_64 architecture, with the testcase compiled:

    bash build.sh -I x86_64 -t on
    
  • Release version of the ARM 64-bit architecture in the incremental compilation mode, with the number of threads set:

    bash build.sh -I arm64 -i -j32
    
  • Release version of the ARM 64-bit architecture, with the built-in CPU operators compiled:

    bash build.sh -I arm64 -e cpu
    
  • Release version of the ARM 64-bit architecture, with the built-in CPU and GPU operators compiled:

    bash build.sh -I arm64 -e gpu
    
  • Release version of the ARM 64-bit architecture, with the built-in CPU and NPU operators compiled:

    bash build.sh -I arm64 -e npu
    
  • Compile ARM64 with image preprocessing module:

    bash build.sh -I arm64 -n lite_cv
    
  • Compile MindSpore Lite AAR and the JAR for Linux x86; the AAR contains the built-in CPU and GPU operators, while the JAR contains only the built-in CPU operators:

    bash build.sh -A java
    
  • Compile MindSpore Lite AAR and JAR for Linux x86, with the built-in CPU operators compiled:

    bash build.sh -A java -e cpu
    
  • Release version of the x86_64 architecture, with the benchmark, cropper and converter compiled:

    bash build.sh -I x86_64
    
  • Release version of the x86_64 architecture, with the converter compiled and on-device training enabled:

    bash build.sh -I x86_64 -T on
    

Inference Output Description

After the compilation is complete, go to the mindspore/output directory of the source code to view the files generated after compilation. The output is divided into the following parts.

  • mindspore-lite-{version}-inference-{os}-{arch}.tar.gz: Contains the model inference framework runtime (cpp) and related tools.

  • mindspore-lite-maven-{version}.zip: Contains the model inference framework AAR package.

  • version: Version of the output, consistent with that of MindSpore.

  • os: Operating system on which the output will be deployed.

  • arch: System architecture on which the output will be deployed.

Execute the decompression command to obtain the compiled output:

tar -xvf mindspore-lite-{version}-inference-{os}-{arch}.tar.gz
unzip mindspore-lite-maven-{version}.zip

Description of Converter’s Directory Structure

The conversion tool is only available under the -I x86_64 compilation option, and the content includes the following parts:

mindspore-lite-{version}-inference-linux-x64
└── tools
    └── converter
        ├── converter                # Model conversion tool
        │   └── converter_lite       # Executable program
        ├── lib                      # The dynamic link libraries that converter depends on
        │   └── libmindspore_gvar.so # A dynamic library that stores some global variables
        └── third_party              # Third-party dynamic link libraries that converter depends on
            └── glog
                └── lib
                    └── libglog.so.0 # Dynamic library of Glog

Description of CodeGen’s Directory Structure

The codegen executable program is only available under the -I x86_64 compilation option; under the -I arm64 and -I arm32 compilation options, only the operator library required by the inference code that codegen generates is packaged.

  • When the compilation option is -I x86_64:

    mindspore-lite-{version}-inference-linux-x64
    └── tools
        └── codegen # Code generation tool
            ├── codegen          # Executable program
            ├── include          # Header files of inference framework
            │   ├── nnacl        # nnacl operator header file
            │   └── wrapper
            ├── lib
            │   └── libwrapper.a # Static library of operators that the code generated by CodeGen depends on
            └── third_party
                ├── include
                │   └── CMSIS    # ARM CMSIS NN operator header files
                └── lib
                    └── libcmsis_nn.a # ARM CMSIS NN operator static library
    
  • When the compilation option is -I arm64 or -I arm32:

    mindspore-lite-{version}-inference-android-{arch}
    └── tools
        └── codegen # Code generation tool
            └── operator_library # Operator library
                ├── include   # Header files of inference framework
                │   ├── nnacl # nnacl operator header file
                │   └── wrapper
                └── lib       # Inference framework library
                    └── libwrapper.a # Static library of operators that the code generated by CodeGen depends on
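
For reference, a minimal sketch of invoking the codegen executable built under -I x86_64; the model file mnist.ms and the output directory are assumptions for illustration:

./codegen --modelPath=./mnist.ms --codePath=./generated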
    

Description of Obfuscator’s Directory Structure

The obfuscation tool is only available when the -I x86_64 compilation option is used and the ENABLE_MODEL_OBF compilation option in mindspore/mindspore/lite/CMakeLists.txt is turned on. The content includes the following parts:

mindspore-lite-{version}-inference-linux-x64
└── tools
    └── obfuscator # Model obfuscation tool
        └── msobfuscator          # Executable program

Description of Runtime and Other Tools’ Directory Structure

The inference framework can be obtained under -I x86_64, -I arm64 and -I arm32 compilation options, and the content includes the following parts:

  • When the compilation option is -I x86_64:

    mindspore-lite-{version}-inference-linux-x64
    ├── inference
    │   ├── include  # Header files of inference framework
    │   ├── lib      # Inference framework library
    │   │   ├── libmindspore-lite.a     # Static library of inference framework in MindSpore Lite
    │   │   ├── libmindspore-lite.so    # Dynamic library of inference framework in MindSpore Lite
    │   │   └── libmsdeobfuscator-lite.so  # Dynamic library for loading obfuscated models
    │   └── minddata # Image processing dynamic library
    │       ├── include
    │       └── lib
    │           └── libminddata-lite.so # The files of image processing dynamic library
    └── tools
        ├── benchmark # Benchmarking tool
        │   └── benchmark # Executable program
        ├── codegen   # Code generation tool
        │   ├── codegen   # Executable program
        │   ├── include   # operator header file
        │   ├── lib       # operator static library
        │   └── third_party # ARM CMSIS NN static library
        ├── converter  # Model conversion tool
        ├── obfuscator # Model obfuscation tool
        └── cropper    # Static library crop tool
            ├── cropper                 # Executable file of static library crop tool
            └── cropper_mapping_cpu.cfg # Crop cpu library related configuration files
    
  • When the compilation option is -I arm64 or -I arm32:

    mindspore-lite-{version}-inference-android-{arch}
    ├── inference
    │   ├── include     # Header files of inference framework
    │   ├── lib         # Inference framework library
    │   │   ├── libmindspore-lite.a  # Static library of inference framework in MindSpore Lite
    │   │   ├── libmindspore-lite.so # Dynamic library of inference framework in MindSpore Lite
    │   │   └── libmsdeobfuscator-lite.so # Dynamic library for loading obfuscated models
    │   ├── minddata    # Image processing dynamic library
    │   │   ├── include
    │   │   └── lib
    │   │       └── libminddata-lite.so # The files of image processing dynamic library
    │   └── third_party # NPU library
    │       └── hiai_ddk
    └── tools
        ├── benchmark # Benchmarking tool
        │   └── benchmark # Executable program
        └── codegen   # Code generation tool
            ├── include  # operator header file
            └── lib      # operator static library
    
  • When the compilation option is -A java:

    mindspore-lite-maven-{version}
    └── mindspore
        └── mindspore-lite
            └── {version}
                └── mindspore-lite-{version}.aar # MindSpore Lite runtime aar
    
    mindspore-lite-{version}-inference-linux-x64-jar
    └── jar
        ├── libmindspore-lite-jni.so # MindSpore Lite JNI dynamic library
        ├── libmindspore-lite.so     # Dynamic library of the MindSpore Lite inference framework
        └── mindspore-lite-java.jar  # MindSpore Lite inference framework jar package
    
  • Compiling with -I arm64 produces the inference framework output for cpu/gpu/npu by default; if you add -e gpu, you get the output for cpu/gpu. ARM32 supports only the CPU.

  • Before running the tools in the converter or benchmark directory, you need to configure environment variables so that the directories containing the MindSpore Lite dynamic libraries are on the path where the system searches for dynamic libraries.

  • The dynamic library libmsdeobfuscator-lite.so is only available if the ENABLE_MODEL_OBF compilation option in mindspore/mindspore/lite/CMakeLists.txt is turned on.

Configure converter:

export LD_LIBRARY_PATH=./output/mindspore-lite-{version}-inference-{os}-{arch}/tools/converter/lib:./output/mindspore-lite-{version}-inference-{os}-{arch}/tools/converter/third_party/glog/lib:${LD_LIBRARY_PATH}
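
With the library path configured, a conversion might look like the following minimal sketch; the TFLite input model model.tflite is an assumption for illustration (see the converter tool documentation for the full flag list):

./output/mindspore-lite-{version}-inference-{os}-{arch}/tools/converter/converter/converter_lite --fmk=TFLITE --modelFile=./model.tflite --outputFile=./model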

Configure benchmark:

export LD_LIBRARY_PATH=./output/mindspore-lite-{version}-inference-{os}-{arch}/inference/lib:${LD_LIBRARY_PATH}
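
Similarly, a minimal benchmark run might look like the following sketch; the converted model model.ms is an assumption:

./output/mindspore-lite-{version}-inference-{os}-{arch}/tools/benchmark/benchmark --modelFile=./model.ms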

Training Output Description

If -T on is added to the MindSpore Lite compilation command, go to the mindspore/output directory of the source code to view the files generated after compilation. The output is divided into the following parts.

  • mindspore-lite-{version}-train-{os}-{arch}.tar.gz: Contains the model training framework and performance analysis tools.

  • version: Version of the output, consistent with that of MindSpore.

  • os: Operating system on which the output will be deployed.

  • arch: System architecture on which the output will be deployed.

Execute the decompression command to obtain the compiled output:

tar -xvf mindspore-lite-{version}-train-{os}-{arch}.tar.gz
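
The training tools are then run in the same way as the inference tools; a sketch, assuming the training package mirrors the inference package layout and that a training model model_train.ms exists:

export LD_LIBRARY_PATH=$PWD/mindspore-lite-{version}-train-{os}-{arch}/train/lib:${LD_LIBRARY_PATH}
./mindspore-lite-{version}-train-{os}-{arch}/tools/benchmark_train/benchmark_train --modelFile=./model_train.ms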

Windows Environment Compilation

Environment Requirements

  • System environment: Windows 7, Windows 10; 64-bit.

  • Compilation dependencies are:

The compilation script will execute git clone to obtain the code of the third-party dependent libraries. Please make sure that the git network settings are correct and available in advance.

Compilation Options

The compilation options of MindSpore Lite are as follows:

| Parameter | Parameter Description | Mandatory or Not |
| --- | --- | --- |
| lite | Set this parameter to compile the MindSpore Lite project. | Yes |
| [n] | Sets the number of threads used during compilation; the default is 6. | No |

Compilation Example

First, use the git tool to download the source code from the MindSpore code repository.

git clone https://gitee.com/mindspore/mindspore.git -b r1.2

Then, use the cmd tool in the root directory of the source code and execute the following commands to compile MindSpore Lite.

  • Compile the Windows version with the default number of threads (6 threads):

    call build.bat lite

  • Compile the Windows version with the number of threads set to 8:

    call build.bat lite 8

Output Description

After the compilation is complete, go to the mindspore/output directory of the source code to view the files generated after compilation. The output is divided into the following parts.

  • mindspore-lite-{version}-inference-win-x64.zip: Contains the model inference framework and related tools.

version: Version of the output, consistent with that of MindSpore.

Execute the decompression command to obtain the compiled output:

unzip mindspore-lite-{version}-inference-win-x64.zip
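
Before running the tools on Windows, add the directory containing the MindSpore Lite dynamic libraries to PATH; a sketch, assuming the Windows package mirrors the Linux package layout:

set PATH=%CD%\mindspore-lite-{version}-inference-win-x64\inference\lib;%PATH%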

Docker Environment Compilation

Environmental Preparation

Download the docker image

docker pull swr.cn-south-1.myhuaweicloud.com/mindspore-build/mindspore-lite:ubuntu18.04.2-20210323

  • Before downloading the image, please make sure Docker has been installed.

  • The Docker image does not currently support Windows compilation.

Create a container

docker run -tid --net=host --name=docker01 swr.cn-south-1.myhuaweicloud.com/mindspore-build/mindspore-lite:ubuntu18.04.2-20210323

Enter the container

docker exec -ti -u 0 docker01 bash
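
Inside the container, clone the source and compile it exactly as described in the Linux sections above, for example:

git clone https://gitee.com/mindspore/mindspore.git -b r1.2
cd mindspore
bash build.sh -I x86_64 -j8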

Compilation Options

Refer to Linux Environment Compilation

Compilation Example

Refer to Linux Environment Compilation

Output Description

Refer to Linux Environment Compilation