mindspore.mint.distributed.new_group

mindspore.mint.distributed.new_group(ranks=None, timeout=None, backend=None, pg_options=None, use_local_synchronization=False, group_desc=None)[source]

Create a new distributed group.

Parameters
  • ranks (list[int], optional) – List of ranks of group members. Default: None, which means the world group will be created.

  • timeout (int, invalid) – Currently it is a reserved parameter.

  • backend (str, invalid) – The backend library to use. Currently supports "hccl" and "mccl". When backend is "hccl", the Huawei Collective Communication Library (HCCL) is used. When backend is "mccl", the MindSpore Collective Communication Library (MCCL) is used. Default: None, which means "hccl" on Ascend.

  • pg_options (GroupOptions, optional) –

    Additional communication group configuration parameters. The backend will automatically select supported parameters and apply them during group initialization. E.g., for the HCCL backend, hccl_config can be specified so that group initialization configurations can be applied. Default: None.

    GroupOptions is defined as a class that can be instantiated as a Python object.

    GroupOptions {
        hccl_config(dict)
    }
    

    hccl_config currently only supports "hccl_buffer_size" or "hccl_comm".

    • hccl_buffer_size (uint32): Specifies the size of the HCCL communication buffer.

    • hccl_comm (int64): Specifies an existing HcclComm pointer. If "hccl_comm" is set, "hccl_buffer_size" will be ignored.

  • use_local_synchronization (bool, invalid) – Currently it is a reserved parameter.

  • group_desc (str, invalid) – Currently it is a reserved parameter.
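The key filtering and precedence rule described under pg_options (supported keys only; "hccl_comm" overrides "hccl_buffer_size") can be sketched as a small helper. This is a hypothetical illustration, not MindSpore's actual implementation; resolve_hccl_config and SUPPORTED_HCCL_KEYS are invented names:

```python
# Hypothetical sketch (not MindSpore source): how a backend might filter a
# GroupOptions-style hccl_config down to the keys it supports, giving
# "hccl_comm" precedence over "hccl_buffer_size" as documented above.
SUPPORTED_HCCL_KEYS = {"hccl_buffer_size", "hccl_comm"}

def resolve_hccl_config(hccl_config):
    # Keep only the keys this backend understands.
    cfg = {k: v for k, v in hccl_config.items() if k in SUPPORTED_HCCL_KEYS}
    # If an existing HcclComm pointer is supplied, the buffer size is ignored.
    if "hccl_comm" in cfg:
        cfg.pop("hccl_buffer_size", None)
    return cfg
```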

Returns

str. The name of the created group. Returns "" in abnormal scenarios.

Raises

TypeError – If the list of ranks contains duplicate rank IDs.
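The duplicate-rank condition above can be sketched as a standalone check. validate_ranks is a hypothetical helper for illustration, not the function MindSpore uses internally:

```python
# Hypothetical sketch (not MindSpore source): reject a ranks list that
# contains duplicate rank IDs, as the Raises section describes.
def validate_ranks(ranks):
    if ranks is None:
        return  # None means the world group; nothing to validate.
    if len(set(ranks)) != len(ranks):
        raise TypeError("ranks contains duplicate rank IDs")
```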

Supported Platforms:

Ascend

Examples

Note

Before running the following examples, you need to configure the communication environment variables. For Ascend devices, it is recommended to use the msrun startup method, which has no third-party or configuration file dependencies. Please see the msrun startup for more details.

>>> import mindspore as ms
>>> from mindspore.mint.distributed import init_process_group, new_group
>>> ms.set_device(device_target="Ascend")
>>> init_process_group()
>>> group = new_group()
>>> print("group is: ", group)
group is: hccl_world_group
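Building on the example above, a subgroup over specific ranks can also be created. This sketch assumes a multi-device msrun launch on Ascend hardware (it is not runnable on a single CPU process), and the printed group name is backend-dependent:

```python
>>> import mindspore as ms
>>> from mindspore.mint.distributed import init_process_group, new_group
>>> ms.set_device(device_target="Ascend")
>>> init_process_group()
>>> # Create a subgroup containing only ranks 0 and 1; processes whose rank
>>> # is not listed do not join the new group.
>>> group = new_group(ranks=[0, 1])
>>> print("group is: ", group)
```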