MindSpore

Inference Model

  • Inference Model Overview
  • Inference on the Ascend 910 AI Processor
  • Inference on the Ascend 310 AI Processor
  • Inference on a GPU
  • Inference on a CPU
  • On-Device Inference

Inference Service

  • MindSpore Serving-based Inference Service Deployment
  • MindSpore Serving-based Distributed Inference Service Deployment
  • gRPC-based MindSpore Serving Access
  • RESTful-based MindSpore Serving Access
  • Servable Provided Through Model Configuration

Application Practice

  • Multi-hop Knowledge Reasoning Question-answering Model TPRR