# Lite Operator Support

`Linux` `Ascend` `Device` `Inference Application` `Beginner` `Intermediate` `Advanced`

[![View Source](https://gitee.com/mindspore/docs/raw/r1.6/resource/_static/logo_source.png)](https://gitee.com/mindspore/docs/blob/r1.6/docs/lite/docs/source_zh_cn/operator_list_lite.md)

This document lists the operators supported by MindSpore Lite.

| Operation | CPU<br>FP16 | CPU<br>FP32 | CPU<br>Int8 | CPU<br>UInt8 | GPU<br>FP16 | GPU<br>FP32 | NPU | TensorRT | Ascend<br>(Ascend310) | Supported TensorFlow Lite operators | Supported Caffe operators | Supported ONNX operators | Supported TensorFlow operators |
| --------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: | :---------: | :---------: | :---------: | ------------------------ | ------------------------ | ------------------------ | ------------------------ |
| Abs | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Abs | | Abs | Abs |
| Add | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Add | | Add, Int8Add | Add, AddV2 |
| Adder | | ✅ | | | | | | | | | | adder_f | |
| AddGrad | | ✅ | | | | | | | | | | | |
| AddN | ✅ | ✅ | | | | | | | | AddN | | | |
| Assert | ✅ | ✅ | | | | | | | | | | | Assert |
| Argmax | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Argmax | ArgMax | ArgMax | ArgMax |
| Argmin | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | ✅ | Argmin | | | ArgMin |
| AvgPool | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | MeanPooling | Pooling | AveragePool, GlobalAveragePool, Int8AveragePool | AvgPool |
| AvgPoolGrad | ✅ | ✅ | | | | | | | | | | | |
| BatchNorm | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | ✅ | | BatchNorm | BatchNormalization | |
| BatchNormGrad | ✅ | ✅ | | | | | | | | | | | |
| BatchToSpace | | ✅ | ✅ | ✅ | ✅ | ✅ | | | | BatchToSpace, BatchToSpaceND | | | BatchToSpace, BatchToSpaceND |
| BiasAdd | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | | | BiasAdd | BiasAdd |
| BiasAddGrad | ✅ | ✅ | | | | | | | | | | | |
| BroadcastTo | ✅ | ✅ | | | | | | | | BroadcastTo | | Expand | BroadcastTo |
| Cast | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Cast, QUANTIZE, DEQUANTIZE | | Cast | Cast |
| Ceil | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | Ceil | | Ceil | Ceil |
| Concat | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Concat | Concat | Concat | ConcatV2 |
| ConstantOfShape | ✅ | ✅ | | | | | | | | | | ConstantOfShape | |
| Conv2d | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Conv2D | Convolution | Conv, Int8Conv, ConvRelu, Int8ConvRelu | Conv2D |
| Conv2DBackpropFilterFusion | ✅ | ✅ | | | | | | | | | | | |
| Conv2DBackpropInputFusion | ✅ | ✅ | | | | | | | | | | | |
| Conv2dGrad | | ✅ | | | | | | | | | | | |
| Conv2dTranspose | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | DeConv2D | Deconvolution | ConvTranspose | Conv2DBackpropInput |
| Conv2dTransposeGrad | | ✅ | | | | | | | | | | | |
| Cos | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | Cos | | Cos | Cos |
| Crop | ✅ | ✅ | ✅ | ✅ | | | | | | | Crop | | |
| CropAndResize | | ✅ | | | | | ✅ | | | | | | CropAndResize |
| CumSum | | ✅ | | | | | | | | | | | Cumsum |
| CustomExtractFeatures | | ✅ | | | | | | | | ExtractFeatures | | | |
| CustomNormalize | | ✅ | | | | | | | | Normalize | | | |
| CustomPredict | | ✅ | | | | | | | | Predict | | | |
| Deconvolution | | | | | | | | | ✅ | | | | |
| DeDepthwiseConv2D | | ✅ | ✅ | ✅ | ✅ | ✅ | | | | | Deconvolution | | |
| DepthToSpace | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | | DepthToSpace | | DepthToSpace | DepthToSpace |
| DepthwiseConv2dNative | ✅ | ✅ | ✅ | ✅ | | | ✅ | | ✅ | DepthwiseConv2D | Convolution | | DepthwiseConv2dNative |
| DetectionPostProcess | | ✅ | ✅ | ✅ | | | | | | Custom | | | |
| Div | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Div, RealDiv | | Div | Div, RealDiv |
| DivGrad | | ✅ | | | | | | | | | | | |
| DropoutGrad | ✅ | ✅ | | | | | | | | | | | |
| Eltwise | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Eltwise | Sum, Max[3] | |
| Elu | ✅ | ✅ | | | | | | | ✅ | | ELU | Elu, NonMaxSuppression | NonMaxSuppressionV3 |
| EluGrad | | ✅ | | | | | | | | | | | |
| Equal | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Equal | | Equal | Equal |
| ExpFusion | ✅ | ✅ | | | ✅ | ✅ | | | | Exp | Exp | Exp | Exp |
| ExpandDims | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ExpandDims | | | ExpandDims |
| Fill | ✅ | ✅ | | | ✅ | ✅ | | | ✅ | Fill | | | Fill |
| Flatten | ✅ | ✅ | | | | | | ✅ | ✅ | | Flatten | | |
| FlattenGrad | ✅ | ✅ | | | | | | | | | | | |
| Floor | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | Floor | | Floor | Floor |
| FloorDiv | ✅ | ✅ | | | ✅ | ✅ | ✅ | | | FloorDiv | | | FloorDiv |
| FloorMod | ✅ | ✅ | | | ✅ | ✅ | ✅ | | | FloorMod | | | FloorMod |
| FullConnection | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | FullyConnected | InnerProduct | | |
| FusedBatchNorm | ✅ | ✅ | ✅ | ✅ | | | ✅ | | ✅ | FusedBatchNorm | | | FusedBatchNorm, FusedBatchNormV3 |
| GatherNd | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | ✅ | GatherND | | | GatherNd |
| Gather | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Gather | | Gather | GatherV2 |
| Greater | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | Greater | | Greater | Greater |
| GreaterEqual | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | GreaterEqual | | | GreaterEqual |
| GRU | ✅ | ✅ | | | | | | | | | | | |
| HardTanh | ✅ | ✅ | | | | | | | | | | | |
| HashtableLookup | | ✅ | | | | | | | | HashtableLookup | | | |
| HSigmoid | ✅ | ✅ | | ✅ | | | | | | | | | |
| Hswish | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | HardSwish | | | |
| HswishGrad | | ✅ | | | | | | | | | | | |
| InstanceNorm | ✅ | ✅ | | | | | ✅ | | | InstanceNorm | | InstanceNormalization | |
| InvertPermutation | ✅ | ✅ | | | | | | | | | | | InvertPermutation |
| L2Norm | | ✅ | ✅ | | | | | | | L2_NORMALIZATION | | | |
| LayerNorm | ✅ | ✅ | ✅ | | ✅ | ✅ | | | | | | | |
| LayerNormGrad | ✅ | ✅ | | | | | | | | | | | |
| LeakyReLU | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | LeakyRelu | | LeakyRelu | LeakyRelu |
| LeakyReLUGrad | | ✅ | | | | | | | | | | | |
| Less | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | Less | | Less | Less |
| LessEqual | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | LessEqual | | | LessEqual |
| LRN | | ✅ | | | | | | | | LocalResponseNorm | | Lrn, LRN | |
| Log | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | Log | | Log | Log |
| LogGrad | ✅ | ✅ | | | | | | | | | | | |
| LogicalAnd | ✅ | ✅ | | | ✅ | ✅ | ✅ | | | LogicalAnd | | And | LogicalAnd |
| LogicalNot | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | LogicalNot | | Not | LogicalNot |
| LogicalOr | ✅ | ✅ | | | ✅ | ✅ | ✅ | | | LogicalOr | | Or | LogicalOr |
| LogSoftmax | ✅ | ✅ | | | | | | | | LogSoftmax | | LogSoftmax | |
| LshProjection | | ✅ | | | | | | | | LshProjection | | | |
| LSTM | ✅ | ✅ | | | | | | | | | | LSTM | |
| MatMul | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | BatchMatMul | | MatMul, Gemm | MatMul, BatchMatMul, BatchMatMulV2 |
| MatMulGrad | | ✅ | | | | | | | | | | | |
| Maximum | ✅ | ✅ | | | ✅ | ✅ | ✅ | | ✅ | Maximum | | Max | Maximum |
| MaximumGrad | ✅ | ✅ | | | | | | | | | | | |
| MaxPool | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | MaxPooling | Pooling | MaxPool, GlobalMaxPool | MaxPool |
| MaxPoolGrad | ✅ | ✅ | | | | | | | | | | | |
| Merge | ✅ | ✅ | | | | | | | | | | | Merge |
| Minimum | ✅ | ✅ | | | ✅ | ✅ | ✅ | | ✅ | Minimum | | Min | Minimum |
| MinimumGrad | ✅ | ✅ | | | | | | | | | | | |
| Mul | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Mul | | Mul | Mul |
| MulGrad | | ✅ | | | | | | | | | | | |
| Neg | ✅ | ✅ | | | ✅ | ✅ | ✅ | | | Neg | | Neg | Neg |
| NegGrad | ✅ | ✅ | | | | | | | | | | | |
| NotEqual | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | NotEqual | | | NotEqual |
| OneHot | ✅ | ✅ | | | ✅ | ✅ | | | | OneHot | | OneHot | OneHot |
| Pad | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Pad, MirrorPad, PadV2 | | Pad | MirrorPad, Pad, PadV2 |
| Pow | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | Pow | Power | Pow[2] | Pow |
| PowGrad | | ✅ | | | | | | | | | | | |
| PReLU | ✅ | ✅ | | | ✅ | ✅ | | | ✅ | PRELU | PReLU | PRelu | |
| QuantDTypeCast | ✅ | ✅ | ✅ | ✅ | | | | | | | | | |
| RaggedRange | ✅ | ✅ | | | | | | | | | | | RaggedRange |
| RandomStandardNormal | ✅ | ✅ | | | | | | | | | | | RandomStandardNormal |
| RandomUniform | | ✅ | | | | | | | | | | | RandomUniform |
| Range | ✅ | ✅ | | | | | | | | Range | | Range | Range |
| Rank | ✅ | ✅ | | | | | | | | Rank | | | Rank |
| RealDiv | ✅ | ✅ | | | | | | | ✅ | | | | |
| Reciprocal | ✅ | ✅ | ✅ | | | | ✅ | | | | | Reciprocal | |
| ReduceAll | | ✅ | | | | | | | | | | | All |
| ReduceASum | ✅ | ✅ | | | ✅ | ✅ | | | | | Reduction | | |
| ReduceMax | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | | ReduceMax | | ReduceMax | Max |
| ReduceMean | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | Mean | Reduction | ReduceMean | Mean |
| ReduceMin | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | | ReduceMin | | ReduceMin | Min |
| ReduceProd | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | | ReduceProd | | ReduceProd | Prod |
| ReduceSum | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | ✅ | Sum | Reduction | ReduceSum | Sum |
| ReduceSumSquare | ✅ | ✅ | ✅ | ✅ | | | | | | | Reduction | ReduceSumSquare | |
| ReLU | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Relu | ReLU | Relu | Relu |
| ReLUGrad | ✅ | ✅ | | | | | | | | | | | |
| ReLU6 | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | Relu6 | ReLU6 | Clip[1] | Relu6 |
| ReLU6Grad | ✅ | ✅ | | | | | | | | | | | |
| Reshape | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Reshape | Reshape | Reshape, Flatten | Reshape |
| Resize | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ResizeBilinear, NearestNeighbor | Interp | Resize, Upsample | ResizeBilinear, ResizeBicubic, ResizeNearestNeighbor |
| ResizeGrad | ✅ | ✅ | | | | | | | | | | | |
| ResizeNearestNeighbor | | | | | | | | | ✅ | | | | |
| Reverse | | ✅ | | | | | | | | reverse | | | ReverseV2 |
| ReverseSequence | | ✅ | | | | | | | | ReverseSequence | | | ReverseSequence |
| Round | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | Round | | Round | Round |
| Rsqrt | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | Rsqrt | | | Rsqrt |
| Select | | ✅ | | | | | | | | | | | Select |
| Selu | | | | | | | | | | | | | Selu |
| Scale | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | Scale | | |
| ScatterNd | ✅ | ✅ | | | | | | | | ScatterNd | | ScatterND | |
| ScatterNdUpdate | ✅ | ✅ | | | | | | | | ScatterNdUpdate | | ScatterNdUpdate | |
| Shape | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | ✅ | Shape | | Shape | Shape |
| Sigmoid | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | Logistic | Sigmoid | Sigmoid | Sigmoid |
| SigmoidGrad | ✅ | ✅ | | | | | | | | | | | |
| Sin | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | Sin | | Sin | Sin |
| Size | ✅ | ✅ | | | | | | | | | | | Size |
| Slice | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | Slice | Slice | Slice | Slice |
| SkipGram | | ✅ | | | | | | | | SkipGram | | | |
| Softmax | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Softmax | Softmax | Softmax | Softmax |
| SoftmaxGrad | | ✅ | | | | | | | | | | | |
| Softplus | ✅ | ✅ | | | | | | | | | | | Softplus |
| SpaceToBatch | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | | SpaceToBatch | | | |
| SpaceToBatchND | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | | SpaceToBatchND | | | SpaceToBatchND |
| SpaceToDepth | ✅ | ✅ | | | ✅ | ✅ | | | | SpaceToDepth | | SpaceToDepth | |
| SparseToDense | ✅ | ✅ | | | ✅ | ✅ | | | | SparseToDense | | | |
| Splice | ✅ | ✅ | | | | | | | | | | Splice | |
| Split | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | Split, SplitV | | Split | Split, SplitV |
| SplitWithOverlap | ✅ | ✅ | | | | | | | | | | | |
| Sqrt | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Sqrt | | Sqrt | Sqrt |
| Square | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | Square | | | Square |
| SquaredDifference | ✅ | ✅ | | | ✅ | ✅ | | | | SquaredDifference | | | SquaredDifference |
| Squeeze | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | ✅ | | Squeeze | | Squeeze | Squeeze |
| StridedSlice | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | StridedSlice | | Slice, DynamicSlice | StridedSlice |
| StridedSliceGrad | ✅ | ✅ | | | | | | | | | | | |
| Stack | ✅ | ✅ | | | ✅ | ✅ | | | ✅ | Stack | | | Pack |
| Sub | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | Sub | | Sub | Sub |
| SubGrad | | ✅ | | | | | | | | | | | |
| Swish | ✅ | ✅ | | | | | | | | | | | |
| Switch | ✅ | ✅ | | | | | | | | | | | Switch |
| Tanh | ✅ | ✅ | | | ✅ | ✅ | ✅ | ✅ | ✅ | Tanh | TanH | Tanh, Sign | Tanh |
| TanhGrad | | ✅ | | | | | | | | | | | |
| TensorListFromTensor | ✅ | ✅ | | | | | | | | | | | TensorListFromTensor |
| TensorListGetItem | ✅ | ✅ | | | | | | | | | | | TensorListGetItem |
| TensorListReserve | ✅ | ✅ | | | | | | | | | | | TensorListReserve |
| TensorListSetItem | ✅ | ✅ | | | | | | | | | | | TensorListSetItem |
| TensorListStack | ✅ | ✅ | | | | | | | | | | | TensorListStack |
| Tile | ✅ | ✅ | | | | | ✅ | | | Tile | Tile | Tile | Tile |
| TopK | ✅ | ✅ | ✅ | ✅ | | | | | | TopKV2 | | TopK | TopKV2 |
| Transpose | ✅ | ✅ | ✅ | | ✅ | ✅ | ✅ | ✅ | ✅ | Transpose | Permute | Transpose, Int8Transpose | Transpose |
| UniformReal | | ✅ | | | | | | | | | | | |
| Unique | ✅ | ✅ | | | | | | | | Unique | | | |
| UnsortedSegmentSum | ✅ | ✅ | | | | | | | | | | | UnsortedSegmentSum |
| Unsqueeze | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | | | | Unsqueeze | |
| Unstack | ✅ | ✅ | | | | | | | | Unstack | | | |
| Upsample | | | | | | | | | ✅ | | | | |
| Where | ✅ | ✅ | | | | | | | | Where | | NonZero, Where | Where |
| ZerosLike | ✅ | ✅ | | | | | | | | ZerosLike | | | ZerosLike |
| Other operators supported by the converter tool[4] | | | | | | | | | | | | Constant, Atan, Asin, Tan, Erf, Loop, Dropout, If, Identity, Int8GivenIntTensorFill, Int8GivenTensorFill, Int8Quantize, Int8Dequantize, LpNormalization | Dropout, Enter, Exit, If, IsFinite, LinSpace, LoopCond, NextIteration, StatelessIf, StatelessWhile, TensorArrayGatherV3, TensorArrayReadV3, TensorArrayScatterV3, TensorArraySizeV3, TensorArrayV3, TensorArrayWriteV3, While |

[1] Clip: only supports converting clip(0, 6) to Relu6.

[2] Pow: only supported when the exponent is a single constant.

[3] Sum and Max: only supported with two inputs.

[4] Operators that are supported by the [converter tool](https://www.mindspore.cn/lite/docs/zh-CN/r1.6/use/converter_tool.html) but do not need a concrete implementation. Such operators are generally optimized away during conversion, e.g., fused into or replaced by other operators.
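Note [1] means that, of all ONNX `Clip` variants, only `clip(0, 6)` (i.e. ReLU6) is converted. The snippet below is a minimal sketch, assuming the `onnx` Python package and illustrative tensor/file names (not from this document), of building a graph in exactly that form; other min/max combinations would not be accepted.

```python
# Build an ONNX graph containing only Clip(min=0, max=6), the ReLU6 form
# that note [1] says the converter maps to MindSpore Lite ReLU6.
import onnx
from onnx import TensorProto, helper

x = helper.make_tensor_value_info("x", TensorProto.FLOAT, [1, 8])
y = helper.make_tensor_value_info("y", TensorProto.FLOAT, [1, 8])

# Since ONNX opset 11, Clip takes min/max as inputs rather than attributes.
clip_min = helper.make_tensor("clip_min", TensorProto.FLOAT, [], [0.0])
clip_max = helper.make_tensor("clip_max", TensorProto.FLOAT, [], [6.0])
node = helper.make_node("Clip", ["x", "clip_min", "clip_max"], ["y"])

graph = helper.make_graph([node], "relu6_as_clip", [x], [y],
                          initializer=[clip_min, clip_max])
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 13)])
onnx.checker.check_model(model)
onnx.save(model, "relu6_as_clip.onnx")  # hypothetical file name for conversion
```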