mindspore.ops.BatchMatMul

class mindspore.ops.BatchMatMul(transpose_a=False, transpose_b=False)

Computes matrix multiplication between two tensors in batches.

\[\text{output}[..., :, :] = \text{matrix}(a[..., :, :]) * \text{matrix}(b[..., :, :])\]

The two input tensors must have the same rank, and the rank must not be less than 3.
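For intuition, BatchMatMul applied to equal-rank inputs matches NumPy's batched matmul on the same data. A minimal sketch of this equivalence (not part of the original examples; the variable names are illustrative):

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> a = np.ones(shape=[2, 3, 4]).astype(np.float32)
>>> b = np.ones(shape=[2, 4, 5]).astype(np.float32)
>>> output = ops.BatchMatMul()(Tensor(a), Tensor(b))
>>> np.allclose(output.asnumpy(), np.matmul(a, b))
True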

Parameters
  • transpose_a (bool) – If True, the last two dimensions of input_x are transposed before multiplication. Default: False.

  • transpose_b (bool) – If True, the last two dimensions of input_y are transposed before multiplication. Default: False.

Inputs:
  • input_x (Tensor) - The first tensor to be multiplied. The shape of the tensor is \((*B, N, C)\), where \(*B\) represents the batch size which can be multidimensional, and \(N\) and \(C\) are the sizes of the last two dimensions. If transpose_a is True, its shape must be \((*B, C, N)\).

  • input_y (Tensor) - The second tensor to be multiplied. The shape of the tensor is \((*B, C, M)\). If transpose_b is True, its shape must be \((*B, M, C)\).

Outputs:

Tensor, the shape of the output tensor is \((*B, N, M)\).

Raises
  • TypeError – If transpose_a or transpose_b is not a bool.

  • ValueError – If the rank of input_x is not equal to the rank of input_y, or the rank of input_x is less than 3.

Supported Platforms:

Ascend GPU CPU

Examples

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> input_x = Tensor(np.ones(shape=[2, 4, 1, 3]), mindspore.float32)
>>> input_y = Tensor(np.ones(shape=[2, 4, 3, 4]), mindspore.float32)
>>> batmatmul = ops.BatchMatMul()
>>> output = batmatmul(input_x, input_y)
>>> print(output)
[[[[3. 3. 3. 3.]]
  [[3. 3. 3. 3.]]
  [[3. 3. 3. 3.]]
  [[3. 3. 3. 3.]]]
 [[[3. 3. 3. 3.]]
  [[3. 3. 3. 3.]]
  [[3. 3. 3. 3.]]
  [[3. 3. 3. 3.]]]]
>>> input_x = Tensor(np.ones(shape=[2, 4, 3, 1]), mindspore.float32)
>>> input_y = Tensor(np.ones(shape=[2, 4, 3, 4]), mindspore.float32)
>>> batmatmul = ops.BatchMatMul(transpose_a=True)
>>> output = batmatmul(input_x, input_y)
>>> print(output)
[[[[3. 3. 3. 3.]]
  [[3. 3. 3. 3.]]
  [[3. 3. 3. 3.]]
  [[3. 3. 3. 3.]]]
 [[[3. 3. 3. 3.]]
  [[3. 3. 3. 3.]]
  [[3. 3. 3. 3.]]
  [[3. 3. 3. 3.]]]]
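
A further sketch (not in the original examples) showing transpose_b, where input_y is supplied as \((*B, M, C)\) and its last two dimensions are transposed internally before multiplication:

>>> input_x = Tensor(np.ones(shape=[2, 4, 1, 3]), mindspore.float32)
>>> input_y = Tensor(np.ones(shape=[2, 4, 4, 3]), mindspore.float32)
>>> batmatmul = ops.BatchMatMul(transpose_b=True)
>>> output = batmatmul(input_x, input_y)
>>> print(output.shape)
(2, 4, 1, 4)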