Inference Using MindSpore

Inference Model

  • Multi-Platform Inference Overview
  • Inference on the Ascend 910 AI processor
  • Inference on Ascend 310
  • Inference on a GPU
  • Inference on a CPU
  • On-Device Inference
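
Each tutorial above targets a specific platform, but they share a common pattern: export a trained network to the MindIR format, then load it again for inference. The sketch below illustrates that pattern only in outline. It is a minimal sketch assuming MindSpore with a CPU backend; the toy network and file name are illustrative and do not come from the tutorials listed above.

```python
# Minimal export-and-infer sketch (assumptions: MindSpore installed, CPU backend).
import numpy as np
from mindspore import Tensor, context, export, load, nn

context.set_context(mode=context.GRAPH_MODE, device_target="CPU")

# Toy network standing in for a trained model (illustrative only).
net = nn.Dense(4, 2)
dummy_input = Tensor(np.ones((1, 4), np.float32))

# Export the network to the MindIR format used for inference deployment.
export(net, dummy_input, file_name="toy_net", file_format="MINDIR")

# Load the exported graph and run inference through nn.GraphCell.
graph = load("toy_net.mindir")
infer_net = nn.GraphCell(graph)
output = infer_net(dummy_input)
print(output.shape)
```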

Inference Service

  • MindSpore-based Inference Service Deployment
  • Access the MindSpore Serving Service Based on the gRPC Interface
  • Access the MindSpore Serving Service Based on the RESTful Interface
  • Servable Provided Through Model Configuration
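
The Serving tutorials above cover deploying a servable and accessing it over the gRPC or RESTful interfaces. As a rough orientation, the sketch below shows what a gRPC client call can look like. It is a sketch under stated assumptions: a running MindSpore Serving instance exposing the "add" example servable with an "add_common" method on localhost:5500, and the Client constructor form shown here (ip, port, servable name, method name), which may differ between Serving releases.

```python
# Minimal gRPC client sketch for MindSpore Serving.
# Assumptions: servable "add" with method "add_common" is deployed on localhost:5500.
import numpy as np
from mindspore_serving.client import Client

# Constructor signature follows the 1.x style; newer releases may expect "ip:port".
client = Client("localhost", 5500, "add", "add_common")

# Each instance maps input names to numpy arrays; the names must match the
# inputs declared by the servable.
instances = [{"x1": np.ones((2, 2), np.float32),
              "x2": np.ones((2, 2), np.float32)}]

result = client.infer(instances)
print(result)
```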