MindSpore Lite

Quick Start

  • Quick Start to Device-side Inference

Building

  • Building Device-side MindSpore Lite

Model Converter

  • Device-side Model Conversion

Model Inference

  • Model Inference (C++)
  • Model Inference (Java)
  • Device-side Model Inference Sample

MindIR Offline Inference

  • Building Cloud-side MindSpore Lite
  • Performing Inference
  • Performing Concurrent Inference
  • Distributed Inference
  • Model Converter
  • Benchmark Tool

Device-side Training

  • Device-side Training Model Conversion
  • Executing Model Training
  • Device-side Training Sample

Advanced Development

  • Data Preprocessing
  • Quantization
  • Performing Inference or Training on MCU or Small Systems
  • Third-party Access
    • Custom Kernel
      • Building Custom Operators Offline
      • Building Custom Operators Online
    • Using Delegate to Support Third-party AI Framework (Device)
    • Application Specific Integrated Circuit Integration Instructions

Tools

  • Visualization Tool
  • Benchmark Tool
  • Static Library Cropper Tool
  • Model Obfuscation Tool

References

  • Overall Architecture (Lite)
  • Lite Operator List
  • Codegen Operator List
  • Model List
  • Troubleshooting
  • Log

Release Notes

  • Release Notes

Custom Kernel

  • Building Custom Operators Offline
  • Building Custom Operators Online