mindspore.profiler.schedule
- class mindspore.profiler.schedule(*, wait: int, active: int, warmup: int = 0, repeat: int = 0, skip_first: int = 0)
This class is used to determine the action to take at each step. The schedule is as follows:
(NONE)        (NONE)          (NONE)       (WARM_UP)       (RECORD)      (RECORD)     (RECORD_AND_SAVE)    None
START------->skip_first------->wait-------->warmup-------->active........active.........active----------->stop
                              |                                                             |
                              |                            repeat_1                         |
                              ---------------------------------------------------------------
The profiler will skip the first skip_first steps, then wait for wait steps, then do the warm-up for the next warmup steps, then do the active recording for the next active steps, and then repeat the cycle starting with wait steps. The optional number of cycles is specified with the repeat parameter; a value of zero means the cycles will continue until the profiling is finished.
- Keyword Arguments
wait (int) – The number of steps to wait before starting the warm-up phase.
active (int) – The number of steps to record data during the active phase.
warmup (int, optional) – The number of steps to perform the warm-up phase. Default: 0.
repeat (int, optional) – The number of times to repeat the cycle. Default: 0.
skip_first (int, optional) – The number of steps to skip at the beginning. Default: 0.
- Raises
ValueError – When any of the parameters is less than 0.
- Supported Platforms:
Ascend
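To make the cycle above concrete, here is a minimal pure-Python sketch (not part of MindSpore; step_action is a hypothetical helper written for illustration) that maps a global step index to the action the schedule would take:

```python
def step_action(step, *, wait, active, warmup=0, repeat=0, skip_first=0):
    """Return the action name for a given global step index (sketch)."""
    if step < skip_first:
        return "NONE"              # steps skipped at the very beginning
    step -= skip_first
    cycle = wait + warmup + active
    # With repeat > 0, steps beyond the last cycle take no action.
    if repeat > 0 and step >= repeat * cycle:
        return "NONE"
    pos = step % cycle             # position within the current cycle
    if pos < wait:
        return "NONE"              # waiting phase
    if pos < wait + warmup:
        return "WARM_UP"           # warm-up phase
    if pos == cycle - 1:
        return "RECORD_AND_SAVE"   # last active step saves the data
    return "RECORD"                # active recording

# With wait=1, warmup=1, active=2, repeat=1, skip_first=2 the first
# seven steps map to:
# NONE, NONE, NONE, WARM_UP, RECORD, RECORD_AND_SAVE, NONE
```

This mirrors the diagram: two skipped steps, one wait step, one warm-up step, then two active steps of which the last also saves, after which the single cycle (repeat=1) is done.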
Examples
>>> import numpy as np
>>> import mindspore as ms
>>> import mindspore.dataset as ds
>>> from mindspore import context, nn, Profiler
>>> from mindspore.profiler import schedule, tensor_board_trace_handler
>>>
>>> class Net(nn.Cell):
...     def __init__(self):
...         super(Net, self).__init__()
...         self.fc = nn.Dense(2, 2)
...
...     def construct(self, x):
...         return self.fc(x)
>>>
>>> def generator_net():
...     for _ in range(2):
...         yield np.ones([2, 2]).astype(np.float32), np.ones([2]).astype(np.int32)
>>>
>>> def train(test_net):
...     optimizer = nn.Momentum(test_net.trainable_params(), 1, 0.9)
...     loss = nn.SoftmaxCrossEntropyWithLogits(sparse=True)
...     data = ds.GeneratorDataset(generator_net(), ["data", "label"])
...     model = ms.train.Model(test_net, loss, optimizer)
...     model.train(1, data)
>>>
>>> if __name__ == '__main__':
...     context.set_context(mode=ms.PYNATIVE_MODE, device_target="Ascend")
...
...     net = Net()
...     STEP_NUM = 15
...
...     with Profiler(schedule=schedule(wait=1, warmup=1, active=2, repeat=1, skip_first=2),
...                   on_trace_ready=tensor_board_trace_handler) as prof:
...         for i in range(STEP_NUM):
...             train(net)
...             prof.step()
- to_dict()
Convert schedule to a dict.
- Returns
dict, the parameters of schedule and their values.
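A minimal sketch (plain Python, not the MindSpore implementation) of what such a conversion could look like, assuming the dict keys simply mirror the constructor parameters:

```python
from dataclasses import dataclass, asdict

@dataclass
class ScheduleSketch:
    # Hypothetical stand-in for mindspore.profiler.schedule,
    # used only to illustrate the to_dict() conversion.
    wait: int
    active: int
    warmup: int = 0
    repeat: int = 0
    skip_first: int = 0

    def to_dict(self):
        # Return the parameters of the schedule and their values.
        return asdict(self)

# ScheduleSketch(wait=1, active=2).to_dict()
# → {'wait': 1, 'active': 2, 'warmup': 0, 'repeat': 0, 'skip_first': 0}
```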