mindspore.get_grad
- mindspore.get_grad(gradients, identifier)[source]
When return_ids of
mindspore.grad() or mindspore.value_and_grad() is set to True, use the return value of mindspore.grad(), or the second return value of mindspore.value_and_grad(), as gradients. Then find the specific gradient from gradients according to identifier. The identifier covers two typical cases:
identifier is the position of the tensor whose gradient is requested.
identifier is a Parameter of a network.
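Conceptually, gradients returned with return_ids set to True pair each gradient with the position or parameter it belongs to, and get_grad searches that structure for a match. The following is a minimal, hypothetical sketch of such a lookup using plain Python pairs in place of Tensors; it is not MindSpore's implementation, only an illustration of the (identifier, gradient) pairing:

```python
def find_grad(gradients, identifier):
    # gradients: a sequence of (identifier, gradient) pairs, mirroring
    # the tagged structure that return_ids=True is documented to produce.
    for tag, value in gradients:
        if tag == identifier:
            return value
    # Mirrors the documented behavior: a RuntimeError when no match exists.
    raise RuntimeError(f"No gradient found for identifier {identifier!r}")

# Plain lists stand in for Tensors in this sketch.
grads = ((1, [0.0, 6.0]), (2, [-2.0, 6.0]))
print(find_grad(grads, 1))   # [0.0, 6.0]
```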
- Parameters
gradients (Union[tuple[int, Tensor], tuple[tuple, tuple]]) – The return value of
mindspore.grad() when return_ids is set to True.
identifier (Union[int, Parameter]) – The position of a tensor, or a Parameter that is used in
mindspore.grad().
- Returns
The Tensor gradient value corresponding to the identifier.
- Raises
RuntimeError – If the gradient corresponding to the identifier is not found.
TypeError – If the type of any argument is not one of the required types.
- Supported Platforms:
Ascend GPU CPU
Examples
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> from mindspore import grad, get_grad
>>>
>>> # Cell object to be differentiated
>>> class Net(nn.Cell):
...     def construct(self, x, y, z):
...         return x * y * z
>>> x = Tensor([1, 2], mindspore.float32)
>>> y = Tensor([-2, 3], mindspore.float32)
>>> z = Tensor([0, 3], mindspore.float32)
>>> net = Net()
>>> out_grad = grad(net, grad_position=(1, 2), return_ids=True)(x, y, z)
>>> output = get_grad(out_grad, 1)
>>> print(output)
[0. 6.]