Tutorials

Quick Start

  • Overview
  • Quick Start
  • Tensor
  • Data Loading and Processing
  • Building a Network
  • Automatic Differentiation
  • Model Training
  • Saving and Loading the Model
  • Accelerating with Static Graphs
  • Automatic Mixed Precision

Data Processing

  • Data Sampling
  • Converting Dataset to MindRecord
  • Lightweight Data Processing
  • Supporting Python Objects in Dataset Pipeline
  • Auto Augmentation
  • Single-Node Data Cache
  • Optimizing Data Processing

Compilation

  • Introduction to Graph Mode Programming
  • Graph Mode Syntax - Operators
  • Graph Mode Syntax - Python Statements
  • Graph Mode Syntax - Python Built-in Functions
  • Graph Mode - Programming Techniques

Parallel

  • Distributed Parallelism Overview
  • Distributed Parallel Startup Methods
  • Data Parallelism
  • Operator-level Parallelism
  • Optimizer Parallelism
  • Pipeline Parallelism
  • Optimization Techniques
  • Distributed High-Level Configuration Case

Debugging and Tuning

  • Dynamic Graph Debugging
  • Using Dump in Graph Mode
  • Feature Value Detection
  • Ascend Performance Tuning
  • Error Reporting Analysis
  • DryRun

Custom Programming

  • Custom Operators
  • Custom Fusion
  • Hook Programming

Inference

  • MindSpore Large Language Model Inference
  • Obtaining and Preparing Large Language Model Weights
  • Building a Large Language Model Inference Network from Scratch
  • Building a Parallel Large Language Model Network
  • Multi-device Model Weight Sharding
  • Model Quantization
  • Lite Inference Overview

High Availability

  • Fault Recovery
  • Graceful Exit of the Training Process

Orange Pi

  • OrangePi AIpro Development
  • Environment Setup Guide
  • Model Online Inference
  • Quick Start

Model Cases

  • Model Migration
  • Computer Vision
  • Natural Language Processing
  • Generative Models