MindSpore Lite is an ultra-fast, intelligent, and simplified AI engine that enables intelligent applications in all scenarios, provides E2E solutions for users, and helps users enable AI capabilities.
Advantages of MindSpore Lite
With efficient kernel algorithms and assembly-level optimization, MindSpore Lite supports heterogeneous scheduling of CPUs, GPUs, and NPUs, fully unleashing hardware computing power and minimizing inference latency and power consumption.
Provides an ultra-lightweight solution. With model quantization and compression, models become smaller and run faster, enabling AI model deployment in extremely resource-constrained environments.
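To illustrate why quantization shrinks a model, here is a minimal, self-contained NumPy sketch of affine int8 post-training quantization. The function names and the quantization scheme are illustrative only; they are not MindSpore Lite APIs, which apply far more sophisticated techniques.

```python
import numpy as np

def quantize_int8(weights):
    """Affine (asymmetric) 8-bit quantization of a float32 weight tensor.

    Maps the observed [min, max] range of the weights onto the int8
    range [-128, 127], storing each weight in 1 byte instead of 4.
    """
    w_min, w_max = float(weights.min()), float(weights.max())
    scale = (w_max - w_min) / 255.0 or 1.0  # guard against all-equal weights
    zero_point = int(round(-w_min / scale)) - 128
    q = np.clip(np.round(weights / scale) + zero_point, -128, 127).astype(np.int8)
    return q, scale, zero_point

def dequantize_int8(q, scale, zero_point):
    """Recover an approximate float32 tensor from the int8 representation."""
    return (q.astype(np.float32) - zero_point) * scale

w = np.random.randn(4, 4).astype(np.float32)
q, scale, zp = quantize_int8(w)
w_hat = dequantize_int8(q, scale, zp)
# int8 storage is 4x smaller than float32; the per-element
# reconstruction error is bounded by roughly one quantization step.
```

The trade-off shown here is the core idea: a small, bounded loss of numeric precision in exchange for a 4x reduction in weight storage and faster integer arithmetic at inference time.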
Supports mainstream operating systems (iOS, Android, and LiteOS) and AI applications on various intelligent devices, including mobile phones, large screens, tablets, and IoT devices.
With model compression, data processing, and unified intermediate representations (IRs) for training and inference, MindSpore Lite is compatible with MindSpore, TensorFlow Lite, Caffe, and ONNX models, facilitating quick deployment.
MindSpore Lite Workflow
Select a new model or retrain an existing model.
Use a tool to convert a model into an on-device model that is easy to deploy.
Integrate the model into an application and load it onto a mobile or embedded device.
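As an illustration of the conversion step, MindSpore Lite ships a `converter_lite` command-line tool. A typical invocation looks like the following; the input file name is a placeholder, and the exact flags should be checked against the documentation for your release:

```shell
# Convert a TensorFlow Lite model into an on-device MindSpore Lite model.
# --fmk specifies the source framework (e.g. MINDIR, TFLITE, CAFFE, ONNX);
# --modelFile is the source model; --outputFile is the output path (extension added by the tool).
./converter_lite --fmk=TFLITE --modelFile=mobilenet_v2.tflite --outputFile=mobilenet_v2
```

The same tool covers all supported source formats, so the rest of the workflow is identical regardless of which framework the model was trained in.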
MindSpore Lite Users
HMS ML Kit
Use the machine learning kit provided by Huawei to quickly develop on-device machine learning applications.
An open AI capability platform for smart devices that accelerates your development cycle and makes apps smarter.
SiteAI builds a leading lightweight, efficient, safe, and easy-to-use three-layer collaborative embedded AI platform that enables intelligent network elements and autonomous driving networks.
You can use the preset object detection model to detect and identify objects in the input frames of a camera, add labels to the objects, and mark them with bounding boxes.
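To show what an application typically does with a detector's raw output, here is a small, framework-independent sketch of the usual post-processing: keep boxes above a confidence threshold and map class IDs to label text. All values, labels, and function names below are illustrative, not MindSpore Lite outputs or APIs.

```python
# Mock detector output: each entry is (x1, y1, x2, y2, class_id, score),
# where (x1, y1)-(x2, y2) is the bounding box in pixel coordinates.
detections = [
    (12, 30, 120, 200, 0, 0.91),
    (200, 40, 260, 110, 1, 0.35),
    (48, 52, 150, 190, 2, 0.77),
]
labels = {0: "person", 1: "cat", 2: "dog"}  # illustrative label map

def filter_detections(dets, threshold=0.5):
    """Keep detections above the confidence threshold and attach label text."""
    results = []
    for x1, y1, x2, y2, cls, score in dets:
        if score >= threshold:
            results.append({"box": (x1, y1, x2, y2),
                            "label": labels[cls],
                            "score": score})
    return results

kept = filter_detections(detections)
# The "person" and "dog" boxes pass the threshold;
# the low-confidence "cat" box is dropped.
```

An app would then draw each surviving box and its label over the camera frame.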
Image segmentation is used to detect the position of an object in an image and to determine which object each pixel belongs to.
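The per-pixel assignment behind segmentation can be sketched in a few lines: a model produces one score map per class, and each pixel is assigned the class with the highest score. The tiny 2x2 score maps below are made up for illustration.

```python
import numpy as np

# Mock per-class score maps for a 2x2 image with 3 classes,
# shaped (num_classes, height, width). A real segmentation model
# outputs one such score map per class at full image resolution.
scores = np.array([
    [[0.1, 0.9], [0.8, 0.2]],   # class 0 (e.g. background)
    [[0.7, 0.05], [0.1, 0.3]],  # class 1
    [[0.2, 0.05], [0.1, 0.5]],  # class 2
])

# Per-pixel argmax over the class axis yields the segmentation mask:
# each entry is the index of the winning class at that pixel.
mask = scores.argmax(axis=0)
# mask == [[1, 0], [0, 2]]
```

The resulting mask is what an app overlays on the image, coloring each pixel by the class it was assigned to.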