mindspore.ops.scatter_nd_div

mindspore.ops.scatter_nd_div(input_x, indices, updates, use_locking=False)[source]

Applies sparse division to individual values or slices in a tensor.

Using the given values and the input indices, this operation updates input_x through a division (div) operation. It outputs input_x after the update is done, which makes it convenient to use the updated value.

input_x has rank P and indices has rank Q, where Q >= 2.

indices has shape \((i_0, i_1, ..., i_{Q-2}, N)\) where N <= P.

The last dimension of indices (with length N) indicates slices along the N-th dimension of input_x.

updates is a tensor of rank Q-1+P-N. Its shape is: \((i_0, i_1, ..., i_{Q-2}, x\_shape_N, ..., x\_shape_{P-1})\).
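To make the shape rules above concrete, the following is a minimal NumPy sketch of the update rule (for illustration only; scatter_nd_div_reference is a hypothetical helper, not part of the MindSpore API): each row of indices addresses an element or slice of input_x, which is divided in place by the matching slice of updates.

import numpy as np

def scatter_nd_div_reference(input_x, indices, updates):
    # Hypothetical NumPy reference of the update rule, not the MindSpore kernel.
    out = np.array(input_x, dtype=float)
    idx = np.asarray(indices)                       # shape (i_0, ..., i_{Q-2}, N)
    upd = np.asarray(updates, dtype=float)          # shape (i_0, ..., i_{Q-2}, x_shape_N, ..., x_shape_{P-1})
    n = idx.shape[-1]                               # N, with N <= P
    flat_idx = idx.reshape(-1, n)                   # one index row per update slice
    flat_upd = upd.reshape(-1, *upd.shape[idx.ndim - 1:])
    for row, u in zip(flat_idx, flat_upd):
        out[tuple(row)] = out[tuple(row)] / u       # divide the addressed element or slice
    return out

# Reproduces the first example below: positions 2, 4, 1, 7 become 3/6, 5/7, 2/8 and 8/9.
print(scatter_nd_div_reference([1, 2, 3, 4, 5, 6, 7, 8], [[2], [4], [1], [7]], [6, 7, 8, 9]))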

Parameters
  • input_x (Parameter) – The target tensor to be updated; it must be of type Parameter.

  • indices (Tensor) – The indices for the div operation; the data type must be mindspore.int32 or mindspore.int64. The rank of indices must be at least 2, and indices.shape[-1] <= len(input_x.shape).

  • updates (Tensor) – The tensor to perform the div operation with input_x. Its data type is the same as that of input_x, and its shape is indices.shape[:-1] + input_x.shape[indices.shape[-1]:], as the sketch after this list illustrates.

  • use_locking (bool) – Whether to protect the assignment by a lock. Default: False.
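
As a quick sanity check of the shape constraint on updates, here is a small sketch (expected_updates_shape is a hypothetical helper shown only for illustration, not a MindSpore function):

# Hypothetical helper, not part of the MindSpore API: derives the required
# shape of `updates` from the shapes of `input_x` and `indices`.
def expected_updates_shape(input_x_shape, indices_shape):
    n = indices_shape[-1]                                    # N, with N <= rank of input_x
    return tuple(indices_shape[:-1]) + tuple(input_x_shape[n:])

# Second example below: input_x is (4, 4, 4) and indices is (2, 1),
# so updates must have shape (2,) + (4, 4) = (2, 4, 4).
assert expected_updates_shape((4, 4, 4), (2, 1)) == (2, 4, 4)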

Returns

Tensor, the updated input_x, has the same shape and type as input_x.

Raises
  • TypeError – If the dtype of use_locking is not bool.

  • TypeError – If the dtype of indices is not int32 or int64.

  • TypeError – If the dtypes of input_x and updates are not the same.

  • ValueError – If the shape of updates is not equal to indices.shape[:-1] + input_x.shape[indices.shape[-1]:].

  • RuntimeError – If the data types of input_x and updates require a data type conversion of the Parameter, while data type conversion of Parameter is not supported.

Supported Platforms:

GPU CPU

Examples

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, Parameter, ops
>>> input_x = Parameter(Tensor(np.array([1, 2, 3, 4, 5, 6, 7, 8]), mindspore.float32), name="x")
>>> indices = Tensor(np.array([[2], [4], [1], [7]]), mindspore.int32)
>>> updates = Tensor(np.array([6, 7, 8, 9]), mindspore.float32)
>>> output = ops.scatter_nd_div(input_x, indices, updates, False)
>>> print(output)
[1.         0.25       0.5        4.         0.71428573 6.
 7.         0.8888889 ]
>>> input_x = Parameter(Tensor(np.ones((4, 4, 4)), mindspore.float32))
>>> indices = Tensor(np.array([[0], [2]]), mindspore.int32)
>>> updates = Tensor(np.array([[[1, 1, 1, 1], [2, 2, 2, 2], [3, 3, 3, 3], [4, 4, 4, 4]],
...                            [[5, 5, 5, 5], [6, 6, 6, 6], [7, 7, 7, 7], [8, 8, 8, 8]]]), mindspore.float32)
>>> output = ops.scatter_nd_div(input_x, indices, updates, False)
>>> print(output)
[[[1.         1.         1.         1.        ]
  [0.5        0.5        0.5        0.5       ]
  [0.33333334 0.33333334 0.33333334 0.33333334]
  [0.25       0.25       0.25       0.25      ]]
 [[1.         1.         1.         1.        ]
  [1.         1.         1.         1.        ]
  [1.         1.         1.         1.        ]
  [1.         1.         1.         1.        ]]
 [[0.2        0.2        0.2        0.2       ]
  [0.16666667 0.16666667 0.16666667 0.16666667]
  [0.14285715 0.14285715 0.14285715 0.14285715]
  [0.125      0.125      0.125      0.125     ]]
 [[1.         1.         1.         1.        ]
  [1.         1.         1.         1.        ]
  [1.         1.         1.         1.        ]
  [1.         1.         1.         1.        ]]]