mindspore.set_dump(target, enabled=True)

Enable or disable dump for the target and its contents.

The target should be an instance of Cell or Primitive. Note that this API takes effect only when asynchronous dump is enabled and the dump_mode field in the dump config file is set to 2. See the dump documentation for details. By default, the enabled status of a cell or primitive is False.


Warning:

This is an experimental prototype that is subject to change or deletion.


Note:

  1. This API takes effect only in GRAPH_MODE with the Ascend backend.

  2. When target is a cell and enabled is True, this API enables dump for the primitive operator members of the cell instance and of its child cell instances, recursively. Operators that are not members of the cell instance (e.g. functional operators used directly in the construct method) will not have the dump flag set. To make this API effective for an operator, assign it as an attribute in your cell's __init__ method, e.g. self.some_op = SomeOp().

  3. After calling set_dump(cell, True), operators in the forward computation of the cell will be dumped. Most backward computation (computation generated by grad operations) is not dumped by design. However, due to graph optimization, a small amount of backward computation data may still be dumped; you can ignore the dump files that contain "Gradients" in their filenames.

  4. This API should be called before training starts. If it is called during training, it may not take effect.

  5. For the nn.SparseSoftmaxCrossEntropyWithLogits layer, the forward computation and the backward computation use the same set of operators, so you can only see dump data from the backward computation. Note that the nn.SoftmaxCrossEntropyWithLogits layer also uses these operators internally when initialized with sparse=True and reduction="mean".
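The recursive flag-setting behavior described in note 2 can be sketched in plain Python. This is a toy illustration under stated assumptions, not MindSpore's actual implementation: the Op and Cell classes below are hypothetical stand-ins that only model how member registration decides which operators a recursive set_dump can reach.

```python
# Toy sketch of recursive dump-flag propagation (hypothetical classes,
# not MindSpore internals): set_dump reaches only operators registered
# as attributes of a cell or of its child cells.

class Op:
    """Stand-in for a primitive operator carrying a dump flag."""
    def __init__(self, name):
        self.name = name
        self.dump = False

class Cell:
    """Stand-in for a cell that tracks Op/Cell attributes as members."""
    def __init__(self):
        object.__setattr__(self, "_members", {})

    def __setattr__(self, key, value):
        if isinstance(value, (Op, Cell)):
            self._members[key] = value  # registered: reachable by set_dump
        object.__setattr__(self, key, value)

def set_dump(target, enabled=True):
    if isinstance(target, Op):
        target.dump = enabled
    elif isinstance(target, Cell):
        for member in target._members.values():
            set_dump(member, enabled)  # recurse into member ops and cells

class Net(Cell):
    def __init__(self):
        super().__init__()
        self.conv = Op("conv")      # member op: flag will be set
        self.sub = Cell()
        self.sub.relu = Op("relu")  # nested member: flag set recursively

net = Net()
set_dump(net)
print(net.conv.dump, net.sub.relu.dump)  # True True
```

An operator created only inside construct is never registered in this scheme, which mirrors why note 2 asks for self.some_op = SomeOp() in __init__.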

Parameters:

  • target (Union[Cell, Primitive]) – The Cell instance or Primitive instance to which the dump flag is set.

  • enabled (bool) – True means enable dump; False means disable dump. Default: True.

Supported Platforms:

Ascend

Examples:
>>> # Please set the dump config file and environment variable before
>>> # running this example to actually get the dump data.
>>> # See the document of this API for details.
>>> import numpy as np
>>> import mindspore.nn as nn
>>> import mindspore.context as context
>>> from mindspore import Tensor, set_dump
>>> context.set_context(device_target="Ascend", mode=context.GRAPH_MODE)
>>> class MyNet(nn.Cell):
...     def __init__(self):
...         super().__init__()
...         self.conv1 = nn.Conv2d(5, 6, 5, pad_mode='valid')
...         self.relu1 = nn.ReLU()
...     def construct(self, x):
...         x = self.conv1(x)
...         x = self.relu1(x)
...         return x
>>> net = MyNet()
>>> set_dump(net.conv1)
>>> input_tensor = Tensor(np.ones([1, 5, 10, 10], dtype=np.float32))
>>> output = net(input_tensor)
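Per note 3, any backward-computation data that still gets dumped can be identified by "Gradients" in the filenames. A minimal sketch of filtering such files out of a dump directory listing (the filenames below are made up for illustration):

```python
# Keep only forward-computation dump files by excluding filenames
# that contain "Gradients" (example filenames are invented).
dump_files = [
    "Conv2D.Conv2D-op1.0.0.output.0.npy",
    "Gradients.Conv2DBackpropInput-op2.0.0.output.0.npy",
]
forward_files = [f for f in dump_files if "Gradients" not in f]
print(forward_files)  # ['Conv2D.Conv2D-op1.0.0.output.0.npy']
```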