mindspore.Layout

class mindspore.Layout(device_matrix, alias_name)[source]

Parallel layout describes the detailed sharding information.

Note

  • It is valid only in semi auto parallel or auto parallel mode.

  • The product of the elements of device_matrix must equal the device count in a pipeline stage.

  • When a Layout instance is called to construct a sharding strategy, each alias name may be used at most once to shard a tensor; see the sketch after this list.
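
A minimal sketch of these constraints (the 8-device count is only an assumption implied by the product rule above, not something taken from this page):

>>> from mindspore import Layout
>>> # (2, 2, 2) assumes the pipeline stage holds 2 * 2 * 2 == 8 devices.
>>> layout = Layout((2, 2, 2), ("dp", "sp", "mp"))
>>> # "dp" shards tensor dimension 0 and "mp" shards tensor dimension 1;
>>> # reusing the same alias twice in one call is not allowed.
>>> in_layout = layout("dp", "mp")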

Parameters
  • device_matrix (tuple) – Describes the shape of the device arrangement; each element is of type int.

  • alias_name (tuple) – The alias name for each axis of device_matrix; its length should be equal to the length of device_matrix, and each element is of type str.

Raises
  • TypeError – device_matrix is not a tuple type.

  • TypeError – alias_name is not a tuple type.

  • ValueError – device_matrix length is not equal to alias_name length.

  • TypeError – The element of device_matrix is not int type.

  • TypeError – The element of alias_name is not a str type.

  • ValueError – The element of alias_name is an empty str.

  • ValueError – The element of alias_name is “None”.

  • ValueError – alias_name contains repeated elements.
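
For example, a hedged sketch of the length check above (the mismatched arguments are invented purely for illustration):

>>> from mindspore import Layout
>>> try:
...     _ = Layout((2, 4), ("dp",))  # two device axes but only one alias
... except ValueError as err:
...     print(type(err).__name__)
ValueError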

Examples

>>> from mindspore import Layout
>>> layout = Layout((2, 2, 2), ("dp", "sp", "mp"))
>>> layout0 = layout("dp", "mp")
>>> print(layout0.to_dict())
{"device_matrix": (2, 2, 2), "tensor_map": (2, 0)}

to_dict()[source]

Transform layout to a dictionary.
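
A hedged sketch of inspecting another selection through to_dict() (the key names follow the printed output above; the exact formatting of the returned dictionary may differ between versions):

>>> from mindspore import Layout
>>> layout = Layout((4, 2), ("dp", "mp"))
>>> info = layout("mp", "dp").to_dict()
>>> # "mp" is the rightmost device axis (index 0) and "dp" the leftmost (index 1),
>>> # so info["tensor_map"] is expected to be (0, 1) for this selection.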